When it came to our online lives, 2018 was revealing in its dysfunction.
The just-expired year's parade of scandals at Facebook alone was relentless: Cambridge Analytica, the inflated video-viewing stats credited with convincing legacy media companies to "pivot to video" and away from print, data breaches, playing fast and loose with users' data and, of course, its role in enabling Russian interference in the 2016 U.S. presidential election.
The company also stands accused by a United Nations agency of contributing to a genocide in Myanmar by failing to effectively police hate speech on its platform. Others have noted how radio played a similar role in the 1994 Rwandan genocide.
Facebook is only the most obviously awful of the social-media platforms that have become so central to our social, economic and political lives. All of the major (for-profit, American) social media platforms have been tainted by scandals, from Instagram's role in Russia's 2016 U.S. election-interference operation to YouTube's algorithmic propensity to serve up neo-Nazi propaganda and Twitter's ongoing failure to police white supremacists on its platform.
Regulation is inevitable
These and other socially destabilizing behaviours have brought us to the point where even U.S. tech companies, long strident libertarians, have resigned themselves to the fact that greater government regulation is inevitable. Tim Cook, Apple's CEO, said in November 2018 that "the free market is not working" when it comes to regulating tech companies' use of personal data, and that government regulation is "inevitable."
The form that this government regulation may take will be a critical debate in 2019. A new year offers a fresh start for thinking about how best to regulate social media companies' use of personal data.
Calls to regulate social media companies are now coming from scholars and politicians. In December 2018, Canada's federal Standing Committee on Access to Information, Privacy and Ethics proposed tough new rules on political advertisements on social media.
But what should these rules look like and what should they address?
As researchers studying internet governance and the regulation of personal data, we identify two elements at the heart of the social media problem.
First, if, as commonly argued, social media platforms are our contemporary town squares, they are being operated as for-profit enterprises dependent on the accumulation and monetization of personal data, a practice that Harvard Business School Professor Shoshana Zuboff calls surveillance capitalism.
Second, although these social media companies operate worldwide, they are based in the United States and operate through American rules and norms. The exceptions of course are China-based social media giants like WeChat and Weibo.
Regulation strategies
The coming year is likely to see many debates on possible regulatory strategies. We offer several ideas to help shape those debates.
First, it's necessary to prohibit the data-intensive, micro-targeted advertising-dependent business model that is at the heart of the problem. In line with what the Public Policy Forum has recommended, reforms in this area should eliminate incentives for the collection and hoarding of data for purposes unrelated to delivering services.
As the search engine DuckDuckGo demonstrates, advertising-based business models need not rely upon selling detailed data profiles of customers. DuckDuckGo relies upon advertising keywords based on users' search queries but, unlike Google, it does not collect data on its users.
Second, it's vital that countries craft rules appropriate to their particular domestic social, legal and political contexts. A common criticism of such domestic rules is that they amount to state censorship. But all speech is already subject to some form of regulation, such as the prohibition of hate speech.
Domestically crafted legislation would reflect the fact that Canada and Germany, for example, regulate hate speech more strictly than the United States.
Globally operating tech giants tend to resist being subject to different countries' laws, arguing that global standards are best suited to govern the internet, but these standards often reflect U.S.-style rules and norms that may conflict with local values.
Third, and most provocatively, it's time to consider non-commercial ownership of social-media entities—including non-profit or some form of public ownership. This has been recommended by several U.S. and UK scholars, as well as one of us, to replace the fundamentally flawed for-profit companies that dominate these spaces.
Government-managed digital infrastructure
Along the same lines, some scholars are also calling for dominant tech platforms to be regulated as public utilities given their power in operating private informational infrastructure.
If social media platforms are the new town squares that are essential to facilitating public dialogue, then such spaces are too important to be left to foreign, profit-focused enterprises that are unaccountable to Canadians. Instead of paying for social media with our data, such platforms could be supported through user fees or taxes, or be operated as a Crown corporation.
While this may seem radical, remember that other important elements of infrastructure, such as telecoms, railways and energy companies, have historically been publicly owned. Others, like banks, are very strictly regulated. If we've learned anything from 2018, it's that industry self-regulation is a recipe for ongoing disasters.
We recognize that many are uncomfortable with the idea of the government imposing strict regulation or ownership rules on social media.
This isn't a call for an authoritarian internet, but rather an acknowledgement that someone will be making the rules. If our choice is between government and business (and it is), only government can credibly provide the accountability and responsiveness needed to protect the public and safeguard democratic integrity.