
Ukraine using AI facial recognition software to uncover Russian infiltrators

15th March 2022
Paige West

On Saturday, Ukraine began using Clearview AI’s facial recognition technology to help uncover Russian infiltrators as well as identify the dead, reunite refugees, and stop misinformation.

As reported by Reuters, Clearview AI Chief Executive Hoan Ton-That sent a letter to Kyiv offering the technology for free.

The letter outlined several use cases for Ukraine, including identifying the dead, reuniting refugees separated from their families, identifying Russian operatives, and debunking false social media posts.

The company has a database of more than 10 billion publicly available facial images, believed to be the largest in the industry, and says that a large portion of those images was gathered from Russian social media sites.

In a statement to the BBC, Ton-That said: “I'm pleased to confirm that Clearview AI has provided its ground-breaking facial recognition technology to Ukrainian officials for their use during the crisis they are facing.”

However, in recent months Clearview’s technology has faced much scrutiny.

In November 2021, the Information Commissioner’s Office (ICO) announced its provisional intent to fine the company just over £17 million for failing to comply with UK data protection laws. The ICO reported that “the images in Clearview AI Inc’s database … may have been gathered without people’s knowledge from publicly available information online, including social media platforms”.

Its technology is also supplied to US law enforcement agencies as an investigative tool, and Clearview is currently facing lawsuits in the US over its scraping of images from the internet.

One critic also warned that the technology has the potential to misidentify people.

Albert Fox Cahn, Executive Director of the Surveillance Technology Oversight Project in New York, told Reuters that we could likely see “well-intentioned technology backfiring and harming the very people it’s supposed to help”.

In 2020, a faulty facial recognition match led to a man, Robert Williams, being wrongfully arrested for shoplifting. After reviewing grainy footage from video surveillance, the software identified Williams as a potential match based on his driver’s licence photo.

However, facial recognition technology is not 100% accurate, particularly when analysing low-quality images. A study by NIST (the National Institute of Standards and Technology) also found that many facial recognition algorithms are biased with respect to age, race, and ethnicity. One reason for this could be a lack of diversity in the datasets used to train the AI-based systems.

Amnesty International launched a global campaign in January 2021 to ban the use of facial recognition technology, which it said amplifies racist policing, violates privacy rights, and threatens the right to peaceful assembly.
