International Grand Committee on Big Data, Privacy, and Democracy
May 27, 2019, Ottawa, Canada
Remarks by Dr. Heidi Tworek, Assistant Professor of International History at the University of British Columbia, Visiting Fellow at the Joint Center for History and Economics at Harvard University, Non-Resident Fellow at the German Marshall Fund of the United States and the Canadian Global Affairs Institute.
Thank you, Mr. Chair, and thank you to the distinguished members of the International Grand Committee for the kind invitation to speak before you. My statement represents my own views and not those of my employer.
I wear two hats in my work: history and policy. Wearing two hats is a strange fashion choice, I admit. But the combination can inspire more practical, robust, and durable solutions. Wearing my policy hat, I have examined Canadian, American, and European approaches to hate speech, violent extremism, and disinformation. I am also a member of the steering committee of the Transatlantic High-Level Working Group on Content Moderation Online and Freedom of Expression.
Wearing my history hat, I just published a book on how Germany tried to control world communications from 1900 to 1945—and almost succeeded. Amongst other things, I explain how Germany’s democracy with its vibrant media landscape could descend into an authoritarian, Nazi regime spreading anti-Semitic, homophobic, and racist content around the world.
While I was writing this book, the present caught up with history. Far-right groups revived Nazi terminology like Lügenpresse (lying press) or Systempresse (system press) to decry the media. News is falsified for political and economic purposes. Minority groups are targeted and blamed for societal ills that they did not cause. As with radio, technologies designed with utopian aims have become tools for demagogues and dictators.
Some aspects of the Internet are unprecedented: the microtargeting, the scale, the granular level of surveillance. But many patterns look surprisingly familiar. Allow me to offer five lessons from history that can guide future policy.
First, disinformation is an international relations problem. Information warfare may seem new. In fact, it is a long-standing feature of the international system. Countries feeling encircled or internationally weak may use communications to project international prowess. This was as true for Germany in the past as it is for Russia today. We are returning to a world of geopolitical jockeying over news. If the causes of information warfare are geopolitical, so are many of the solutions. These must address the underlying foreign policy reasons why states engage in information warfare.
Second, we must pay attention to physical infrastructure. Information warfare is enabled by infrastructure, whether submarine cables a century ago or fiber-optic cables today. One of Britain's first acts during World War I was to cut the submarine cables connecting Germany to the world. Germans invested in new radio technology to bypass those cables. In 1938, an American radio executive called one Nazi radio tower "the most potent agency for political doctrine the world has ever known."
Infrastructure remains crucial. The Internet may seem wireless, but fiber-optic cables actually carry 95 to 99 percent of international data. Google partly owns 8.5 percent of all submarine cables. Sometimes sharks accidentally bite through cables and cause disruption. States might bite too. Russia and China are both surveilling cables. The Chinese government and Chinese companies are investing in 5G infrastructure while building international information networks through the news agency Xinhua, an English-language satellite TV news channel (CGTN), a Belt and Road News Network, and apps like TikTok (recently the most downloaded app in the US).
Third, business structures are often more crucial than individual pieces of content. It is tempting to focus on the harm created by particular viral posts. But virality is enabled by a few major companies who control the bottlenecks of information. Only 29 percent of Americans and Britons understand that their Facebook newsfeed is organized algorithmically; the most aware were the Finns, at 39 percent. This affords social media platforms huge power, a power that is not neutral. At a minimum, any solutions will have to include greater transparency from companies on how their algorithms and platforms actually function. Evidence-based policy needs good evidence.
Fourth, we need to design robust regulatory institutions. Spoken radio emerged in the 1920s. One German bureaucrat, Hans Bredow, was tasked with regulation. At first, he thought radio could unite Germans after the devastating loss of World War I. As the Weimar Republic grew politically unstable, however, radio regulation was reformed to mandate direct state supervision of content. Bredow believed that state supervision would protect Weimar democracy by not broadcasting news that could provoke violence. Ironically, the Nazis could then seize immediate control over radio when they came to power in 1933. Well-intentioned regulation had tragic unintended consequences.
Any productive approach to regulation should consider how to democracy-proof our systems. Institutional design is key here. Robust institutions would, for instance, consistently include civil society. They would bolster data security and privacy. They would also be designed not to lock in the current big players and shut down possibilities for further innovation.
One suggestion I have made to balance between curbing harmful speech and respecting freedom of expression is to create social media councils. These would be multi-stakeholder fora, convened regularly to address content moderation and other challenges. The exact format and geographical scope remain up for debate. The idea is supported by others, including the UN Special Rapporteur on the Right to Freedom of Opinion and Expression. By mandating regular meetings and information-sharing, social media councils could become robust institutions to address problems that we have not yet even imagined.
Fifth, solutions must address the societal divisions exploited on social media. The seeds of authoritarianism need fertile soil to grow; if we don’t address underlying economic and social issues, communications cannot obscure discontent forever.
Let me recapitulate the five lessons. First, disinformation is an international relations problem. Second, physical infrastructure matters. Third, business structures are more important than individual pieces of content. Fourth, we must build robust regulatory institutions. Fifth, we must address the societal divisions exploited on social media.
In all five of these areas, international cooperation is key. That is why I am honored to have had the opportunity to support this international committee today.
Thank you very much.