
Chinese AI turns Ukrainian YouTuber into Mandarin-speaking Russian



A 21-year-old from Ukraine has discovered that her image has been taken and duplicated using artificial intelligence (AI) to create an alter ego on Chinese social media networks.

Olga Loiek, a University of Pennsylvania student, launched a YouTube channel in November last year to share her lifestyle and experiences. She was looking to build an audience for her content, but things have taken an unexpected turn.

It has emerged that her image has been lifted and run through AI programs to create a series of personas, such as “Natasha”, who appears to be a Russian woman fluent in Mandarin. “Natasha” expresses her gratitude for Chinese support of Russia and offers products for sale on the side to earn some money.

Loiek’s channel is growing steadily, with just under 18,000 subscribers at the time of writing. Frustratingly, some of the fake accounts on Chinese platforms have hundreds of thousands of followers.

Things I would never say

“This is literally like my face speaking Mandarin and, in the background, I’m seeing the Kremlin and Moscow, and I’m talking about how great Russia and China are,” said Loiek.

“That was really creepy, because these are things I would never say in life,” she added.

There is a deeper context to this story of AI-driven deception: geopolitics.

Accounts, images, and avatars like “Natasha” play off the close political relationship between Russia and China, with Russia still at war with Ukraine following its invasion in February 2022.

At face value, Chinese social media is full of Russian women eager to show their warmth and respect toward China, helped by their fluent Mandarin. They also appear to support the war effort by selling products imported from their country. But it is all a mirage.

This is AI at work, duplicating images and exploiting real-world situations to influence and mislead unsuspecting people. The videos and products are often targeted at single Chinese men to provoke a response and drive sales.

Some of the accounts appear to have sold tens of thousands of dollars’ worth of products, including candies, while others even carry a disclaimer stating that AI may have been used to create the avatars.

This is another example of how AI remains a potent tool for propagating disinformation and a problem for governments and tech companies right now, not in the future.

Featured image via Ideogram


