The purpose of this research is to establish whether traditional accountability mechanisms, such as ombudsmen, NGOs, citizen activism, codes of conduct, industry codes and similar instruments, can be applied to online social platforms in their assumed gatekeeping position between users and content. The research is especially concerned with the threats that accompany the use of 'personalisation' by means of algorithmic selection and filtering. Online platforms such as Facebook and Google, and the majority of their peers, personalise so that the content displayed on their platforms suits each user's individual preferences, in order to maximise the quality of the user experience. However, personalisation has side-effects, often grouped under the umbrella term 'the filter bubble', which are undesirable in (democratic) societies. A filter bubble occurs when a user is no longer confronted with contrasting or new views and instead simply has his own views reaffirmed over and over, whereas democracies thrive on the information plurality and diversity that citizens need to base their opinions upon. Traditionally, in content-distributing companies, such problems were addressed by means of accountability mechanisms, so as to minimise government interference, which is more a facet of totalitarian regimes. This debate largely follows Habermas' idea of the public sphere, applied to the online world. For platforms, however, these mechanisms are scarce. Expert interviews were therefore conducted to analyse whether these mechanisms are applicable. It became clear that, while the risks posed by the filter bubble theory need more empirical research, there is a perceived need for more accountability mechanisms. Although the platform industry is difficult to define, it appears that at least a basis of legislation is required. This legislation should take an affirmative approach, encouraging and prescribing actions to be taken by the platforms themselves as well as by governments. For the platforms, codes of conduct, transparency in communication towards users, and responsiveness are found to be important elements of accountability. Algorithmic transparency remains a complex subject on which opinions are deeply divided. Meanwhile, governments should try to stimulate independent actors such as NGOs and citizens to become engaged in holding platforms accountable. This can be done by increasing media literacy, rerouting taxes into accountability initiatives, supporting independent research, and the like. In summary, the urgency of legislation is low. Independent research must therefore be conducted to find support (or not) for the perceived risks of personalisation and algorithmic filtering and selection. Meanwhile, platforms should be encouraged to behave responsibly. Governments should limit their involvement to affirmative actions aimed at stimulating desired behaviour and other indirect measures, so as to keep their involvement in content-related issues at a distance.

P.M. Leendertse
hdl.handle.net/2105/43574
Media & Business
Erasmus School of History, Culture and Communication

Van der Laan, A. (2018, June 25). Media accountability: Holding online social platforms accountable. Media & Business. Retrieved from http://hdl.handle.net/2105/43574