Press "Enter" to skip to content

Principles, Not Government’s Discretion, Can Fix Social Media

According to the IT Rules 2021, the government can order social media platforms to take down any content that it determines to be unlawful. This paves the way for unaccountable censorship. Instead of such ad hoc censorship, the government should be drafting objective principles and guidelines.

Editor’s Note: This article was originally published by Takshashila Institution.

The issues of hate speech, disinformation, misinformation and polarisation on one side, and the arbitrary exercise of power over public discourse and information on the other, together make up what I call the broken social media problem.

The urge to fix social media has been well recognised by various policy actors, including policy experts, the government and the general public, ever since the Cambridge Analytica fallout. While the problem remained a routine one for a long time, some recent developments have pushed it from ordinary visibility towards an unanticipated crisis.

Social media is a deliberative medium. The nation's mood on some significant problems, problems that have directly dented the image of the incumbent government, opened up the window for regulating social media.

Since August 2020, Indian farmers have been protesting against the three farm laws passed in September 2020. Recently, the government ordered Twitter to remove nearly 1,200 accounts (including that of an eminent publication, The Caravan) accused of spreading disinformation about the farmers' protests. While Twitter complied with the order for a brief period, it restored the accounts after a few hours. This episode and others (such as the fallout over the web series Tandav) prompted the government to develop the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (the IT Rules 2021).

Intermediary Guidelines and Digital Media Ethics Code

The window for social media regulation first opened in 2018, when the Ministry of Electronics and Information Technology (MEITY) put out draft intermediary guidelines for public comments. The matter was then left untouched until 2021, when political will reopened the issue and MEITY notified the IT Rules 2021. Compared to the 2018 draft guidelines, the IT Rules 2021 added much that was new, such as rules for digital media and online curated content, on which policy actors had no chance to comment.

As the IT Rules 2021 were passed as an executive order, there is still scope for further legislation. It is difficult to say when the window of opportunity for such legislation will open up. Still, policy commentators and advocacy groups have already started pointing out flaws in the IT Rules 2021 and seeking a more democratic route through legislation.

When the window for legislation opens up, some nuanced aspects of social media content and its moderation must be considered to form robust legislation. Social media platforms perform two kinds of content moderation: hard moderation (content takedowns and flagging) and soft moderation (algorithmic recommendation of content). The law must recognise this difference and consider the suggestions below.

Principle-Based Approach for Hard Moderation

Social media platforms follow their own community guidelines as the basis for hard moderation of third-party content. Under the IT Rules 2021, the government can also order platforms to take down content it determines to be unlawful, and platforms must comply within 36 hours of notification.

While performing hard moderation, both social media platforms and the government adopt a content-based approach, deciding at their own discretion which content should be taken down; the approach is backed by no principles or supervision. This paves the way for unaccountable censorship over our access to information and public discourse.

This ad hoc approach also means that one cannot determine the rationale behind a removal: whether it was for collective user welfare, for political reasons or to address wrongdoing. It is therefore critical to move towards a principle-based approach, where we define principles (in line with fiduciary responsibility) to govern the hard moderation process.

The government's role has to be limited to defining the principles, through deliberation with various stakeholders, and to monitoring due diligence, by mandating disclosure mechanisms for social media platforms. Hard moderation by the platforms must then follow the defined principles on a case-by-case basis, accounting for the context, significance and veracity of the information.
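To make the contrast with ad hoc takedowns concrete, here is a minimal sketch of what a principle-based removal check could look like. The principle names, fields and checks are invented for illustration; they are not drawn from the IT Rules 2021 or from any platform's actual process.

```python
# A minimal, hypothetical sketch of principle-based hard moderation.
from dataclasses import dataclass

# Principles would be defined by the government after stakeholder deliberation.
PRINCIPLES = {
    "P1": "Incitement to violence",
    "P2": "Demonstrably false claims likely to cause public harm",
}

@dataclass
class TakedownDecision:
    content_id: str
    principle_id: str  # every removal must cite a defined principle
    context: str       # case-by-case context that was considered
    rationale: str     # reasoning, published through disclosure mechanisms

def take_down(content_id: str, principle_id: str, context: str,
              rationale: str, audit_log: list) -> TakedownDecision:
    """Remove content only under a defined principle, never ad hoc."""
    if principle_id not in PRINCIPLES:
        raise ValueError("No defined principle cited; removal refused.")
    decision = TakedownDecision(content_id, principle_id, context, rationale)
    audit_log.append(decision)  # due-diligence record for regulator review
    return decision

log: list = []
take_down("post-123", "P1", "shared during ongoing unrest",
          "explicit call to attack a named group", log)
print(len(log))  # 1: the removal is recorded and disclosed, not silent
```

The point of the sketch is the constraint: a removal that cites no defined principle is refused, and every removal leaves a disclosure trail that regulators and users can inspect.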

Besides, the legislation should also set out a charter of rights for internet users, to counterbalance and check the power that both the government and social media platforms wield over our access to information and freedom of expression.

Outcome Analysis for Enhancing Soft Moderation

When we see and read posts on social media, we rarely realise that the content appearing on our screens is not coincidental. Social media platforms collect information (input data) on our preferences, behaviour, relationships and so on to build algorithmic systems that recommend content (output) designed to retain our attention for longer.
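To see why the feed is anything but coincidental, consider the toy sketch below of an engagement-driven ranking function. All the signals, weights and field names are invented assumptions for exposition; they do not describe any real platform's algorithm.

```python
# A toy sketch of engagement-driven feed ranking.
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    predicted_clicks: float      # inferred from the user's past behaviour
    predicted_dwell_secs: float  # how long the user is expected to linger
    author_is_friend: bool       # inferred from the user's relationships

def engagement_score(post: Post) -> float:
    """Score a post purely by the attention it is expected to capture."""
    score = 2.0 * post.predicted_clicks + 0.1 * post.predicted_dwell_secs
    if post.author_is_friend:
        score *= 1.5  # relationship data boosts the ranking
    return score

def rank_feed(candidates: list[Post]) -> list[Post]:
    """Order the feed to maximise predicted engagement."""
    return sorted(candidates, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("a", predicted_clicks=0.9, predicted_dwell_secs=40, author_is_friend=False),
    Post("b", predicted_clicks=0.3, predicted_dwell_secs=120, author_is_friend=True),
])
print([p.post_id for p in feed])  # ranked by expected attention captured
```

Notice that nothing in this loop measures whether the user is better off; the objective is attention, full stop.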

What algorithmic recommendation systems lack is feedback from outcome analysis. While social media platforms adjust their recommendation algorithms to account for changes in user behaviour, that does not mean they do so for the welfare of the user. It is therefore necessary to mandate audits of algorithmic recommendation systems.

An independent auditing agency has to perform the audit, based on defined norms and principles. The results and inferences derived from the audit have to be fed back into the algorithm (its input data and code) to improve the recommendation system's outcomes for users. In addition, as a stimulative policy measure, a market for algorithmic rating systems should be fostered, to push platforms to perform better on user outcomes.
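As a hedged sketch of what such an audit-and-feedback loop could look like: the auditor checks measured outcomes against thresholds attached to the defined norms, and failures translate into parameter adjustments in the recommender. Every norm, metric, threshold and parameter named below is hypothetical.

```python
# A hypothetical sketch of an outcome audit feeding back into a recommender.
AUDIT_NORMS = {
    # norm -> (measured metric, minimum acceptable value)
    "limit_misinfo_reach": ("share_of_verified_sources", 0.70),
    "viewpoint_diversity": ("distinct_viewpoints_per_feed", 0.40),
    "user_wellbeing":      ("self_reported_satisfaction", 0.50),
}

def run_audit(measured: dict[str, float]) -> dict[str, bool]:
    """Check each measured outcome against its norm's threshold."""
    return {
        norm: measured[metric] >= threshold
        for norm, (metric, threshold) in AUDIT_NORMS.items()
    }

def apply_findings(params: dict[str, float],
                   results: dict[str, bool]) -> dict[str, float]:
    """Feed audit results back into (hypothetical) recommender parameters."""
    adjusted = dict(params)
    if not results["limit_misinfo_reach"]:
        adjusted["unverified_source_weight"] *= 0.8  # demote unverified sources
    if not results["viewpoint_diversity"]:
        adjusted["exploration_rate"] += 0.05         # surface more varied content
    return adjusted

measured = {"share_of_verified_sources": 0.62,
            "distinct_viewpoints_per_feed": 0.35,
            "self_reported_satisfaction": 0.55}
params = {"unverified_source_weight": 1.0, "exploration_rate": 0.10}
print(apply_findings(params, run_audit(measured)))
```

The design choice worth noting is that the norms and thresholds live outside the platform's code: the independent auditor owns them, and the platform's job is only to satisfy them.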


Kamesh Shekar is a tech policy enthusiast. He is currently pursuing the PGP in Public Policy at the Takshashila Institution. His views are personal and do not represent any organisation.