By Dean Jackson
On August 19, 2019, Twitter and Facebook announced for the first time that they had detected and removed a network of accounts, groups, and pages engaged in “coordinated inauthentic behavior” on behalf of the Chinese Communist Party (CCP). Although both platforms are blocked in mainland China, a state-backed influence operation reportedly used VPNs to access them and spread disinformation aimed at discrediting the mass demonstrations in the Hong Kong special administrative region.
These announcements were made as part of Twitter and Facebook’s respective efforts to unmask inauthentic activity and influence operations on their platforms. These efforts began in response to sustained public pressure, which was effective in part because the platforms were developed in democratic settings, where space exists for civil society to push for transparency, accountability, and change. Recently, however, new applications have emerged from within authoritarian settings, where the space for independent scrutiny and pressure is far more limited. With this development, the information landscape is becoming more fractured, and these parallel platforms represent a new frontier that is even more vulnerable to authoritarian manipulation of political discourse worldwide.
Consider WeChat, the Chinese messaging application developed by Tencent, which has an estimated 100 to 200 million users outside mainland China. The app is so prevalent among the Chinese diaspora and Chinese-speaking populations that politicians in Australia, Canada, and the United States use it to communicate with constituents, and its use as a mobile payment system is fueling its growth worldwide.
At first blush, this story is about politicians finding new ways to communicate with constituents. But WeChat communications are subject to censorship and surveillance by China’s authorities. Even outside China, WeChat has censored news coverage and statements from Western politicians on topics Beijing would rather go undiscussed.
Further, every message and scrap of metadata moving across WeChat is stored on Tencent’s servers in China. In a country where the rule of law comes second to the will of the Party, data is vulnerable to CCP demands.
Because censorship on WeChat focuses primarily on topics the CCP considers sensitive, it has not prevented the kind of hatemongering seen on other messaging apps. Anti-Muslim content from abroad circulates within the People’s Republic of China, where the government has incarcerated over a million Uighurs (a predominantly Muslim ethnic minority) under the guise of counterterrorism. While WeChat removes content that China’s authorities deem impermissible, anti-Muslim content remains.
The spread of rumors and disinformation on WeChat has already affected political processes outside China. In May 2019, Chinese-Australian voters were targeted with false messages on WeChat about a specific political party’s alleged refugee policy. Attributing disinformation to its original source with complete certainty is often difficult, and it is unclear who was behind this effort. The episode highlights the need for greater transparency from platforms beholden to the interests of powerful authoritarian states.
All of this is relevant to TikTok, a Chinese video-sharing app experiencing explosive global growth. In 2018, TikTok was the third most-installed app worldwide. Its business model is similar to that of other social media platforms: collection of personal and other data fuels a predictive algorithm designed to convert user attention into revenue.
Unlike WeChat, TikTok does not straddle China’s Great Firewall. Bytedance, which owns TikTok and a variety of other platforms powered by data collection and artificial intelligence, offers a similar but completely separate app, Douyin, in mainland China. This separation makes it easier for Bytedance’s content moderators to screen the Chinese app for content that violates censorship rules. It also allows for better targeting of official Chinese state content: over the past few years, CCP sources have increasingly published nationalist content on Douyin, pairing it with an algorithm that allows for more tailored messaging to individual users. (It is worth noting that TikTok’s parent company also owns a Chinese-language search engine which appears to have censored coverage of the Hong Kong demonstrations, promoting state-sponsored content instead.)
This does not mean the company takes a liberal attitude toward political speech abroad. Bytedance has indicated in the past that it would not prominently feature political criticism on the app in India, where TikTok has a large user base. That has not stopped Indians from using TikTok to share political content, but Bytedance has implied that it would prefer its application not be used for political discourse at all. Nor has this approach spared TikTok from darker forms of online content: as with WeChat and many other social media applications, hate speech and extremism can be found on TikTok in India and beyond.
As with WeChat, TikTok faces concerns about its handling of user data. Because Bytedance must ultimately answer to the Chinese authorities, data collected through its applications may be vulnerable to access by those authorities for whatever purposes they choose. Privacy critics have alleged that TikTok lacks nuanced privacy settings and may collect excessive data from users. (After altering its previous policies, Bytedance now says it stores TikTok data outside of China and that the Chinese government has no access to it.)
TikTok and WeChat are not the only platforms developed in authoritarian settings to raise international concern about censorship and disinformation. In 2017, the Ukrainian government took the controversial step of banning VKontakte, a popular Russian social network, citing the threat of ongoing Russian information warfare against Ukraine. In 2014, VKontakte made headlines when its founder resigned as CEO and sold his shares in the company after sustained pressure stemming from his refusal to comply with Russian government requests that VKontakte remove opposition and activist pages. Since then, the Kremlin has pursued ever tighter control of the internet in ways reminiscent of the Chinese model. Notably, major Russian and Chinese internet companies ranked at the very bottom of the 2019 Ranking Digital Rights Index for their policies and practices affecting free expression and privacy.
Ultimately, the challenge is that these parallel platforms are shielded from public debate and a robust, democratic policy process, and are therefore doubly compromised: they exhibit the hallmarks of both surveillance capitalism and an emerging digital authoritarianism. With the help of vocal and vibrant civil societies, democracies around the world may find ways to curtail the former, though this is far from certain. The latter is emerging as a long-term challenge requiring sustained attention and resolve.
Dean Jackson is a program officer at the National Endowment for Democracy’s International Forum for Democratic Studies, where he works on issues relating to disinformation and media freedom. Follow him on Twitter @DWJ88.
The views expressed in this post represent the opinions and analysis of the author and do not necessarily reflect those of the National Endowment for Democracy or its staff.
Image credit: naamphawa / Shutterstock.com