Researcher exposes crypto scam network exploiting YouTube

A massive network of fake YouTube videos promoted by automated sock puppet accounts is reeling in hundreds of cryptocurrency enthusiasts and persuading them to hand over their money, WithSecure researchers found

WithSecure researchers have exposed a network of fraudulent YouTube videos, channels and associated web applications that are manipulating users into joining dodgy cryptocurrency investment scams.

The fraud operation appears to be promoting an investment scheme built around USDT (also known as Tether), a so-called stablecoin pegged to the US dollar. USDT has itself been heavily criticised over its opaque practices and has been the subject of multiple regulatory and legal probes.

The network comprises well over a thousand videos, many of which are receiving inauthentic and probably automated engagement from hundreds of distinct sock puppet YouTube channels – some of them verified – set up to lend the operation a sense of legitimacy. The whole setup seems to be run by a group of 30 scammers who use the encrypted Telegram application to coordinate their work.

Led by WithSecure’s Andy Patel – who earlier this year reported on the malicious use of AI language models – the team pored over many of the five- to 10-minute videos, which all follow approximately the same script and are presented in multiple languages. The team’s findings have been published in full.

“The scripts show you how to bring up an app or website where you can register with a username and password, and recharge the account with USDT cryptocurrency,” said Patel. “If you put in more money, you get a reward. [Of course] putting money into the app is putting it into the scammer’s wallet.”

The team found more than 700 distinct URLs masquerading as investment web apps, each of them nothing more than a front for a cryptocurrency wallet run by the scammers. Once funds are transferred from the victim’s cryptocurrency wallet to the scammers’, the victim supposedly begins earning commission and rewards and, in common with other similar scams, is often shown what appears to be evidence of these earnings – which never actually materialise.

The web apps also offer withdrawal functionality, which, according to Patel, “basically doesn’t work”. The WithSecure team saw no evidence of any transfers back to the victims’ wallets. “It’s not even a pyramid scheme,” said Patel. “It’s just convincing people to give away their money.”

Hunting a white whale

Patel said the network he observed seemed to be targeting existing cryptocurrency enthusiasts, but that the videos were of low quality and did not appear to be localised, beyond being translated, suggesting the scam is largely an opportunistic one.

“Typically this results in a large volume of small transactions,” he said. “But as that volume increases, so do the odds of them getting lucky and finding someone able and willing to invest more substantial amounts.”

Indeed, based only on the data his team was able to pull during the last six months of 2022, the fraudulent apps generated returns of barely $100,000 from about 900 victims – an average of little more than $100 per victim.

This suggests the perpetrators are playing a numbers game, content to extract small sums from victims who are unlikely to object too strenuously, while waiting for the occasional white whale to swim by.

Patel said the somewhat hands-off approach of the scammers via inauthentic videos and apps contrasted with the hands-on confidence-based social engineering methodology used in so-called pig butchering scams.

He suggested that one reason the scammers use YouTube’s infrastructure is that it lets them tap into a pool of victims without having to pay social engineers who speak those victims’ languages fluently.

“This doesn’t appear to be a very lucrative business when you consider the costs of registering domains, creating apps, paying creators to publish and boost videos, and managing the flow of currency they were able to extract,” he said.


However, Patel pointed out that this does not mean the scam should be considered less problematic. “They [the scammers] have clearly figured out how to game YouTube’s recommendation algorithms by using a fairly straightforward approach,” he said.

“Moderating social media content is a huge challenge for platforms, but the successful amplification of this content using pretty simple, well-known techniques makes me think that more could be done to protect people from these scams.”

Indeed, crypto scams aimed at defrauding potential investors are becoming a significant problem on social media.

Cryptocurrency enthusiasts, known by some as “cryptobros”, are popularly stereotyped as more likely to take risks with their money, and prone to evangelising their “successes” to others. These stereotypes may make them a tempting target for criminals.

Indeed, as the volume of large-scale crypto frauds and rug pulls in recent history shows, enthusiasts are prone to being exploited by cyber crime gangs and fraudsters. According to the US Federal Trade Commission, 46,000 people reported losing more than $1bn to crypto scams between January 2021 and June 2022, with almost half saying the scam originated on social media.

How YouTube can help

Patel said that given the number of channels involved, how often they were active, and how long the scam has been running, it was somewhat surprising that YouTube had not acted. He conceded, however, that the platform has many pressing issues for its moderation teams to deal with, and added that this may change now that the scam has been exposed.

“Videos of this nature should be thoroughly enumerated and removed by the YouTube safety team, along with any other channels participating in similar operations,” he said. “If this isn’t something YouTube is willing to do, they should, at the very least, suppress their algorithm’s recommendation of these videos.

“YouTube should also make an effort to understand how the SEO text found in the description fields of these videos might affect YouTube’s search and recommendation algorithms. A cursory glance at results returned by an internet search for ‘buy YouTube views’ illuminates the existence of many services selling YouTube likes, views, comments and subscribes.

“It is clear that inauthentic amplification is being used to boost engagement numbers on many of the videos highlighted in this report,” said Patel. “While we’re aware that detecting inauthentic activity on social networks is a difficult endeavour, with regards to the videos highlighted in this report, determining patterns and channels involved in their actions was a straightforward task that required very little API usage. It would be nice to know that YouTube’s administrators take inauthentic amplification seriously and are devising more generic methods to detect and counter such activity in the future.”
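Patel’s observation that very little API usage was needed can be illustrated with a minimal sketch. The following Python snippet is not WithSecure’s actual tooling; it simply shows how a handful of calls to the public YouTube Data API v3 commentThreads endpoint could list the channels commenting on a set of suspect videos and flag those that appear across many of them – a typical signal of coordinated amplification. The API key and video IDs are placeholders the reader would supply.

import requests
from collections import Counter

API_KEY = "YOUR_YOUTUBE_DATA_API_KEY"         # placeholder – supply your own key
SUSPECT_VIDEO_IDS = ["videoId1", "videoId2"]  # placeholder list of suspect video IDs

def commenter_channels(video_id):
    """Yield the channel IDs behind top-level comments on a single video."""
    resp = requests.get(
        "https://www.googleapis.com/youtube/v3/commentThreads",
        params={"part": "snippet", "videoId": video_id,
                "maxResults": 100, "key": API_KEY},
        timeout=30,
    )
    resp.raise_for_status()
    for item in resp.json().get("items", []):
        snippet = item["snippet"]["topLevelComment"]["snippet"]
        channel = snippet.get("authorChannelId", {}).get("value")
        if channel:
            yield channel

# Count, per channel, how many of the suspect videos it has commented on.
counts = Counter()
for vid in SUSPECT_VIDEO_IDS:
    counts.update(set(commenter_channels(vid)))

# Channels commenting across many of the same suspect videos are candidates
# for coordinated, inauthentic amplification.
for channel, n in counts.most_common(20):
    if n > 1:
        print(f"{channel} commented on {n} of {len(SUSPECT_VIDEO_IDS)} suspect videos")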

He added that the involvement of verified YouTube accounts was worrying, as it suggests that verified status cannot be trusted and that the badges are handed out too easily.
