January 29, 2020

Regulation of Content on Social Media: Strict or Liberal Approach?

In today’s era of being surrounded by technology, social media is one of the mainstream platforms for present and future generations to articulate and share their views around the globe. Social media revolves around user-generated content, facilitating wider discussion. At the same time, these platforms have raised a new concern: striking a balance between freedom of expression on the one hand and the right to privacy on the other. The content in question on social media includes fake news as well as videos depicting violence, terrorism and sexual violence. Recently, a video of the murder of Afrazul in Rajasthan went viral within a short span of time through social media, and people started supporting the accused on grounds of religion. Taking note of the haste and scale with which a message can spread on these platforms, the question of the conditions under which freedom of expression may legitimately be restricted has consistently been raised.
Judicial Perspective on the Regulation of Content
Freedom of speech and expression under Article 19(1)(a) of the Constitution is broadly understood as the right to express one’s views without fear or censorship. But it is an intricate right, because it carries certain responsibilities and may be subjected, on these social media platforms, to restrictions provided by law. In this context, the Apex Court in Secretary, Ministry of Information and Broadcasting v. Cricket Association of Bengal held that, in order to ensure citizens’ right to free speech, it is essential that there be a plurality of opinions on all public issues. In light of this judgment, it can be said that social media content needs to be regulated in a manner that upholds users’ rights rather than vetoing or censoring content rigorously.
Further, in the landmark case of Shreya Singhal v. Union of India, the Supreme Court struck down Section 66A of the Information Technology Act, considering it a draconian provision that arbitrarily and unreasonably invaded freedom of expression. The Court opined that it disturbed the balance between the right concerned and the restrictions imposed upon it. Besides this, the Court absolved social media websites of any obligation to constantly monitor content posted on their platforms, clearly holding that only authorized government agencies and courts can request intermediaries to take down objectionable content.
Reading these judgments together, it is apparent that content on social media platforms must be regulated in keeping with the provisions of law. Though social media platforms like Facebook have published community guidelines stating that they do not permit objectionable videos on their platform, this has not been enough to curb objectionable content. WhatsApp, a messaging application, after facing complaints about the sharing of violent videos on its platform, introduced end-to-end encryption, under which a sent message can be unlocked only by the intended receiver. However, there have been many instances in which these platforms were the main source of incitement to violence and the spread of fake news. The reasons are the sheer volume of content and the fact that the contractors who filter it are often unaware of the politics and culture of the regions concerned; speed is preferred over accuracy.
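The end-to-end principle mentioned above can be illustrated with a deliberately simplified sketch in Python. This is a toy asymmetric-encryption miniature with tiny fixed numbers, not WhatsApp’s actual protocol (which is based on the Signal protocol) and not secure in any real sense; it only shows why the platform, which relays ciphertext, cannot read the message that the receiver’s private key unlocks.

```python
# Toy illustration of end-to-end encryption (NOT real cryptography):
# the platform relays only ciphertext; only the holder of the
# private key can recover the plaintext.

def make_keys():
    # Tiny fixed RSA-style parameters, for demonstration only.
    p, q = 61, 53
    n = p * q                  # modulus shared by both keys
    phi = (p - 1) * (q - 1)
    e = 17                     # public exponent
    d = pow(e, -1, phi)        # private exponent (Python 3.8+)
    return (e, n), (d, n)

def encrypt(message, public_key):
    e, n = public_key
    return [pow(ord(ch), e, n) for ch in message]

def decrypt(cipher, private_key):
    d, n = private_key
    return "".join(chr(pow(c, d, n)) for c in cipher)

public, private = make_keys()
cipher = encrypt("hi", public)           # what the platform relays
plain = decrypt(cipher, private)         # what only the receiver sees
assert plain == "hi"
```

The intermediary in this model handles only the list of numbers in `cipher`; without `private`, it has no practical way to recover the text, which is exactly why end-to-end encryption complicates the traceability demands discussed later in this piece.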
Legislative Provisions for Regulation of Content
Turning to the legal provisions, Section 69A of the Information Technology Act provides an effective measure to take down illicit content. It empowers the Central Government to direct intermediaries to block posted content where this is necessary in the interest of the sovereignty and integrity of India, the defence of India, the security of the State, friendly relations with foreign States, public order, or for preventing incitement to an offence. In addition, Section 43A of the Act makes a company or body corporate liable to pay compensation for negligently handling sensitive personal data or failing to maintain reasonable security practices.
Further, Rule 3 of the Information Technology (Intermediaries Guidelines) Rules, 2011 lays down due-diligence requirements to be observed by intermediaries. Rules 3(1) and 3(2) provide that intermediaries shall inform users of their privacy policy and require them to refrain from posting content that is grossly harmful, harassing, unlawful or obscene, among other things. But the scope of the term ‘unlawful’ is nowhere defined in the Rules and is left open to broad interpretation. Rule 3(4) provides that an intermediary shall act within thirty-six hours of obtaining knowledge in writing from an affected person regarding objectionable content. The Rules also require the intermediary to take all reasonable measures to secure its computer resources and to inform users about the Grievance Officer. The flaw in this provision is that intermediaries are not obliged to give notice to the user concerned before terminating his access rights.
Moreover, a specific set of rules has also been framed for blocking access to objectionable content: the Information Technology (Procedure and Safeguards for Blocking for Access of Information by Public) Rules, 2009. Under these, an affected person can complain to the Nodal Officer to seek the takedown of objectionable content. Rules 6 and 8 provide for a committee to examine takedown requests concerning allegedly objectionable content, and Rule 9 provides an expedited measure for emergencies in which the objectionable content must be taken down immediately and no delay is acceptable.
Changes Suggested in Regulation Guidelines
In addition, the Central Government has introduced draft amendments to the Intermediaries Guidelines Rules with certain new provisions and invited comments from the public at large. The draft introduces a new prohibited category of information, ‘public health or safety’. However, the term ‘public health or safety’ is not defined in the Rules and is left open to broad interpretation. Moreover, a restriction on grounds of ‘public health or safety’ is not among the reasonable restrictions enumerated under Article 19(2) and could therefore be held unconstitutional. The draft also requires intermediaries to assist government agencies in tracing the originator of objectionable content on their platforms.
The draft rules further suggest the use of automated technologies for filtering content. It has been widely argued that artificial intelligence is inappropriate for this task, as it lacks the human understanding needed to assess context. The draft also requires intermediaries with fifty lakh users in India to establish a registered office and incorporate under the Companies Act, 2013. This requirement is unclear on its own terms, as it does not specify how the size of the user base is to be determined, whether by registered users or by active users.
Why Regulation Should Be Done by the Tech Companies Themselves
The government has made up its mind to regulate social media through its own agencies, and the updated regulations are expected by early next year. Their goal will be the traceability of content and making intermediaries liable for the content they host. This, however, will also pose the challenge of drawing a distinction between prohibited and permitted speech. Instead, regulation should be carried out by the tech companies themselves rather than the Government. The reasons are, first, that they deal with public data and have an obligation to curb the spread of misinformation, hateful content and the like. Second, failure on their part will cost them the trust of users if the content on their platforms is untrustworthy. Third, regulating content requires modern technologies and tools, which these companies already possess. A recent example: YouTube employs about 10,000 people worldwide to remove objectionable content and took down approximately 8 million such videos in 2018. Intermediaries should indeed be made liable if they fail to remove illicit content within a time frame set by the courts. Additionally, the companies should publish compliance data regularly to encourage transparency and should update their content standards periodically.
In conclusion, it is clear that we have plenty of laws and regulations for keeping a check on objectionable content. Even if regulation is to be done by the Government, there is a pressing need for a separate governmental body, consisting of technical experts from across the globe, to exhaustively examine the contemporary problems of the cyber world and build a better Digital India. Further, a suitable definition of objectionable content needs to be settled, as it depends heavily on the language, culture and social life of each region.


Juned Akhter is a third-year law student at Dr. Ram Manohar Lohia National Law University, Lucknow.








