Two days after Russia invaded Ukraine, an account on the Telegram messaging platform posing as President Volodymyr Zelensky urged his armed forces to surrender.
The message was not authentic, and the real Zelensky soon denied the claim on his official Telegram channel, but the incident highlighted a major problem: disinformation spreads quickly and unchecked on the encrypted app.
The fake Zelensky account reached 20,000 followers on Telegram before it was shut down, a remedial action that experts say is all too rare.
For Oleksandra Tsekhanovska, head of the Hybrid Warfare Analytical Group at the Kyiv-based Ukraine Crisis Media Center, the effects are both near- and far-reaching.
“For Telegram, accountability has always been a problem, which is why it was so popular even before the full-scale war with far-right extremists and terrorists from all over the world,” she told AFP from her safe house outside the Ukrainian capital.
Telegram boasts 500 million users, who share information individually and in groups in relative security. But Telegram’s use as a one-way broadcast channel — which followers can join but not reply to — means content from inauthentic accounts can easily reach large, captive and eager audiences.
False news often spreads via public groups, or chats, with potentially fatal effects.
“Someone posing as a Ukrainian citizen just joins the chat and starts spreading misinformation, or gathers data, like the location of shelters,” Tsekhanovska said, noting how false messages have urged Ukrainians to turn off their phones at a specific time of night, citing cybersafety.
Such instructions could actually endanger people — citizens receive air strike warnings via smartphone alerts.
In addition, Telegram’s architecture limits the ability to slow the spread of false information: the lack of a central public feed, and the fact that comments are easily disabled in channels, reduce the space for public pushback.
Although some channels have been removed, the curation process is considered opaque and insufficient by analysts.
Emerson Brooking, a disinformation expert at the Atlantic Council’s Digital Forensic Research Lab, said: “Back in the Wild West period of content moderation, like 2014 or 2015, maybe they could have gotten away with it, but it stands in marked contrast with how other companies run themselves today.”
WhatsApp, a rival messaging platform, introduced some measures to counter disinformation when Covid-19 was first sweeping the world.
For example, WhatsApp restricted the number of times a user could forward something, and developed automated systems that detect and flag objectionable content.
Brooking said that, unlike Silicon Valley giants such as Facebook and Twitter, which run very public anti-disinformation programs, “Telegram is famously lax or absent in its content moderation policy.”
As a result, the pandemic saw many newcomers to Telegram, including prominent anti-vaccine activists who used the app’s hands-off approach to share false information on shots, a study from the Institute for Strategic Dialogue shows.
Again, in contrast to Facebook, Google, and Twitter, Telegram’s founder Pavel Durov runs his company in relative secrecy from Dubai.
On February 27, however, he admitted from his Russian-language account that “Telegram channels are increasingly becoming a source of unverified information related to Ukrainian events.”
He said that since his platform does not have the capacity to check all channels, it may restrict some in Russia and Ukraine “for the duration of the conflict,” but then reversed course hours later after many users complained that Telegram was an important source of information.
Oleksandra Matviichuk, a Kyiv-based lawyer and head of the Center for Civil Liberties, called Durov’s position “very weak,” and urged concrete improvements.
“He has to start being more proactive and to find a real solution to this situation, not stay in standby without interfering. It’s a very irresponsible position from the owner of Telegram,” she said.
In the United States, Telegram’s lower public profile has helped it mostly avoid high-level scrutiny from Congress, but it has not gone unnoticed.
Some people used the platform to organize ahead of the storming of the US Capitol in January 2021, and last month Senator Mark Warner sent a letter to Durov urging him to curb Russian information operations on Telegram.
Asked about its stance on disinformation, Telegram spokesperson Remi Vaughn told AFP: “As noted by our CEO, the sheer volume of information being shared on channels makes it extremely difficult to verify, so it’s important that users double-check what they read.”
But the Ukraine Crisis Media Center’s Tsekhanovska points out that communications are often down in zones most affected by the war, making this sort of cross-referencing a luxury many cannot afford.