Ukraine is using facial recognition to contact the relatives of dead Russian soldiers


(Photo: Gerhard Reus / Unsplash)
Critics of facial recognition software have long warned that the technology could be put to a number of harmful uses, but this one is unexpected: the Ukrainian military is using facial recognition AI to identify fallen Russian soldiers, then contact their relatives.

Ukraine has used the technology to analyze the faces of more than 8,600 dead or captured Russian soldiers since Putin ordered the invasion in February, according to a new report by the Washington Post. The scans have been used to identify and contact the families of 582 of them – a delicate task that falls to the Ukrainian IT Army, a volunteer force coordinated by the government. Some death notices even include pictures of the dead soldiers.

The facial recognition AI in question comes from US software company Clearview AI, which has made headlines before for threatening to effectively end anonymity as we know it. The company's eponymous app works by matching facial scans against a huge database of photos pulled from social media and countless other websites and mobile apps. These photos are linked to other data, revealing where the target lived, worked, shopped, or traveled, among other things. While Clearview AI isn't said to exist for any specific purpose, it is used by hundreds of law enforcement agencies, who apply the app in numerous (and sometimes disturbing) ways, from assisting in identity theft or child exploitation cases to identifying activists at protests.
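Clearview's exact pipeline is proprietary, but matching services of this kind generally reduce each face photo to a numeric embedding vector and compare embeddings by cosine similarity, returning the closest entry above a confidence threshold. A minimal illustrative sketch of that matching step (the function name, threshold, and toy vectors here are hypothetical, not Clearview's actual API):

```python
import numpy as np

def best_match(query, database, threshold=0.8):
    """Find the database entry whose embedding is most similar to the query.

    query: 1-D face embedding vector.
    database: mapping of identity name -> embedding vector.
    Returns (name, cosine_score), or (None, cosine_score) if the best
    score falls below the threshold (i.e., no confident match).
    """
    names = list(database)
    mat = np.stack([database[n] for n in names])
    # Normalize rows and query so dot products become cosine similarities.
    mat = mat / np.linalg.norm(mat, axis=1, keepdims=True)
    q = query / np.linalg.norm(query)
    scores = mat @ q
    i = int(np.argmax(scores))
    score = float(scores[i])
    return (names[i] if score >= threshold else None, score)
```

In a real system the embeddings would come from a trained face-recognition network and the database would hold billions of scraped photos; the threshold trades off false matches against missed ones, which is exactly where the misidentification risk discussed below comes in.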

Image matches within the Clearview AI app. (Photo: Clearview AI)

The use of Clearview AI's technology in warfare is new, though the technology itself has been available for years. Those familiar with the subject appear divided on whether the benefits outweigh the harms. Some argue that facial recognition-driven death notices help cut through the smokescreen that keeps Russian citizens from fully understanding what is happening in Ukraine. Meanwhile, a surveillance researcher told the Washington Post that targeting soldiers' families in this way amounts to psychological warfare and sets a dangerous precedent for future military conflicts. Other experts fear that a misidentification could result in the wrong family being told that a loved one has died.

Clearview AI CEO Hoan Ton-That maintains that the advantages of wartime facial recognition clearly outweigh its perils (and his defense isn't obviously money-motivated, as the access granted to Ukraine is apparently free). Ton-That told the Washington Post that the technology could help deter Russian forces from committing further war crimes, because soldiers risk being personally identified. If Russia's record of war crimes so far is anything to go by, though, facial recognition software is unlikely to help there.
