A 21-year-old from Ukraine has found her image has been taken and duplicated using artificial intelligence (AI) to create an alter-ego on Chinese social media networks.
Olga Loiek, a University of Pennsylvania student, launched a YouTube channel in November last year to share her lifestyle and experiences. She was looking to build an audience for her content, but things have taken an unexpected turn.
It has emerged that her image has been lifted and processed through AI programs to create a series of personas, such as “Natasha,” who appears to be a Russian woman fluent in Mandarin, expressing gratitude for Chinese support of Russia while offering products for sale on the side to earn some money.
Loiek’s channel is progressing well, with just under 18,000 subscribers at the time of writing, but, frustratingly for her, some of the fake Chinese accounts have hundreds of thousands of followers.
“This is literally like my face speaking Mandarin and, in the background, I’m seeing the Kremlin and Moscow, and I’m talking about how great Russia and China are,” said Loiek.
“That was really creepy, because these are things I would never say in life,” she added.
There is a deeper context to this story of deception through AI and that is geopolitics.
Accounts, images, and avatars like the one of ‘Natasha’ play off the close political relationship between Russia and China, with the former still at war with Ukraine following the invasion in February 2022.
On the surface, Chinese social media is frequented by Russian women who want to show their warmth and respect toward China, helped by their apparent fluency in the language. They also seem to support the war effort by selling products imported from their country, but it is all a mirage.
This is AI at work, duplicating images and exploiting real-world situations to influence and mislead unsuspecting people. Often, the videos and products are targeted at single Chinese men to evoke an emotional response and drive sales.
Some of the accounts appear to have sold tens of thousands of dollars’ worth of products, including candies, while others even carry a disclaimer stating that AI may have been used to create the avatars.
This is another example of how AI remains a potent tool for propagating disinformation and a problem for governments and tech companies right now, not in the future.
Featured image via Ideogram