Since late last year, Facebook has been actively shutting down accounts responsible for spreading hoaxes, particularly in countries holding general elections.
In Indonesia, for instance, Facebook has shut down thousands of accounts believed to be disseminating misinformation.
Unfortunately, Facebook's policy on closing accounts is not transparent enough. Questions arose when Facebook restored an account belonging to social media activist Permadi Arya, a supporter of the incumbent presidential candidate, Joko "Jokowi" Widodo. Facebook blocked and later restored his account after Permadi threatened to sue the company for around Rp 1 billion (about US$70,000) if it did not reopen his account and clear his name.
Facebook has not clearly explained its policy on shutting down or restoring accounts. Transparency is needed to ensure its decisions are free from political pressure. It would also empower users to take action against hoaxes and misinformation.
A responsibility finally fulfilled
Social media platforms should play a key role in stopping the spread of misinformation. Misinformation has polarised societies in the US, Brazil, Moldova and Indonesia. The platforms' lack of action has allowed misinformation and hate speech to spread faster, triggering genocide (in Myanmar), homicides (in India) and health crises (in Liberia and Nigeria).
Until recently, social media companies justified their inaction by arguing that they are not media companies and therefore should not have to regulate the information circulating on their platforms. Twitter CEO Jack Dorsey told CNN last year that Twitter should not be the "arbiter of truth" on its platform.
However, platforms should be responsible for preventing the spread of misinformation and hoaxes, for at least four reasons.
First, the actors who spread misinformation are often part of an organised group. One person, no matter how well resourced, cannot protect himself or herself from these networks. A study of individuals' behaviours in spreading misinformation that I conducted with my colleague, M. Laeeq Khan, showed that people may still spread misinformation on social media even when they know the information is false. In other words, platforms should minimise the spread of hoaxes because relying on individuals is not enough.
Second, social media platforms design the algorithms and artificial intelligence behind the timelines where hoaxes and misinformation circulate. They are therefore the ones with the knowledge and understanding to prevent information bubbles among their users.
Third, social media companies have unlimited access to all the data to help them identify misinformation on their networks.
Last, as a part of society, social media companies have an ethical responsibility to manage the circulation and quality of information.
How will Facebook's policy affect democracy?
In his book Ethics and the Media: An Introduction, media ethics scholar Stephen Ward describes the internet as a new public sphere where people can share a wide range of expressions and opinions. It is also a new marketplace of ideas that helps promote democracy.
Nevertheless, Ward believes that conversations online sometimes do not contribute to public discourse and democracy. Instead, the internet becomes just another channel of communication.
Conversations on social media often revolve around superficial topics, mostly popular issues and entertainment. They therefore contribute little to democratic public discussion. And when conversations are full of hatred and cause harm, the internet's capacity to create democratic discourse is also called into question.
After the second presidential debate in Indonesia, for instance, people on social media mostly discussed one candidate's unfamiliarity with a tech business term. Discussions of more critical issues, such as the qualities of both candidates, received little public attention.
Moreover, Ward says the primary objective of democracy is neither a free press nor the marketplace of ideas. Its main goals are to create harmony in society and to promote open, equal and respectful participation in social life. To achieve these objectives, democracy requires fact-based participation from the public.
Indonesia has been suffering the damage caused by misinformation circulated on social media since the 2014 general election. The same pattern appeared in Jakarta's 2017 gubernatorial election and continues in the current campaign for the 2019 elections.
This year, the polarisation of opinion on social media has become more noticeable. A growing number of actors are spreading misinformation to shape public opinion and inflame hatred.
By shutting down accounts associated with spreading hate speech and misinformation, social media platforms are playing their roles in maintaining democracy and public order.
Facebook still needs to be transparent
Other social media platforms need to follow Facebook's lead in shutting down suspect accounts. However, Facebook still needs to explain to the public its policy for shutting down these accounts.
In Indonesia, Facebook deactivated accounts associated with the Saracen group, which was accused of spreading hate speech, racism and intolerance. The group acted as a mercenary operation, attacking targets on demand. The National Police arrested its members at the end of 2017.
Facebook said it had partnered with governments, experts and other platforms to stop misinformation from spreading. The question is: whose recommendation would Facebook listen to in its attempt to fight misinformation?
Thus far, Facebook's explanations have only scratched the surface. Indonesia has the most Facebook users in Southeast Asia, with 130 million active users, yet Facebook still needs to explain why it shuts down one account and not another. Transparency is needed to ensure Facebook's decisions are free from political pressure and that it does not only shut down accounts that criticise the government.
By explaining how its algorithms and systems detect misinformation, Facebook would also help users analyse their own timelines.
The huge amount of information circulating on Facebook every day makes anyone susceptible to exposure to misinformation. Facebook's transparency is essential to help the public better understand what is happening, so they can empower themselves to fight the spread of misinformation.
Facebook, which also runs WhatsApp and Instagram, the most popular social media platforms in Indonesia, must also work faster to prevent the spread of misinformation. The general election is less than a month away, and misinformation can spread at any time.