In order to govern online content, the Government of India notified new guidelines, called the ‘Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021’, on February 25, 2021. The rules were framed under Section 87 of the Information Technology Act, 2000, in supersession of the Information Technology (Intermediaries Guidelines) Rules, 2011, and operate within the framework of Section 79 of the Act. Part II of these rules shall be administered by the Ministry of Electronics and IT (MeitY), while Part III, relating to the Code of Ethics and the procedure and safeguards in relation to digital media, shall be administered by the Ministry of Information and Broadcasting (MIB).

The new guidelines aim to bring digital news publishers under the ambit of Section 69(A) of the Information Technology Act, 2000 (IT Act, 2000). This section empowers the government to block public access to any information online—on websites and mobile apps—particularly when there is a threat to India’s defence, its sovereignty and integrity, friendly relations with foreign countries, and public order. These rules build on the existing laws and statutes of the country to provide a level playing field for all concerned.

Need of the Guidelines

In the recent past, many disturbing developments have been observed on social media platforms like WhatsApp, YouTube, Facebook, Instagram, Twitter, etc. Some incidents jolted the secular fabric of India and challenged its age-old cultural practices and traditional values: the constant spread of fake news, and rampant abuse of the platforms through indecent images that threatened the dignity of women. Misuse of social media for settling corporate rivalries in a blatantly unethical manner became a major concern for businesses. To make the situation worse, instances of abusive language, defamatory and obscene content, and glaring disrespect for religious sentiments across platforms have risen rapidly. The most disturbing threats from the misuse of digital media platforms relate to politically motivated bias and risks to national security.

For years, criminals and anti-national elements have been using social media as a safe haven to execute their plans. Terrorist organisations and individuals abuse these platforms not only to influence youth and raise funds for their sinister motives, but also to incite violence. Even law enforcement agencies often fail to trace the origin of such acts.

At present, India does not have a robust transparency and complaint redressal mechanism through which ordinary users of social media and over-the-top (OTT) platforms can register complaints and have them redressed within a specified timeframe. Traditional media are subject to certification and compliance with codes of ethics: the print media is monitored by the Press Council of India, news channels are monitored by the News Broadcasters Association (NBA) and the Cable Television Networks (Regulation) Rules, 1994, and film content is regulated by the Central Board of Film Certification (CBFC). By contrast, digital video services operating in India have largely been unregulated in the absence of an applicable jurisdictional framework. Consequently, users are entirely dependent on the whims and fancies of social media platforms. The monopolistic tendencies of these platforms have harassed users in different ways. For example, a user who devotes valuable time, energy, and money to developing a social media profile is left with no remedy if that profile is restricted or removed by the platform. These pressing concerns compelled many media platforms to create fact-check mechanisms.

Given the present threats from both outside and inside the country, including cyber threats, it was necessary to safeguard the national interest along with cyber sovereignty. It is a primary duty of the government to look after the interests of its citizens and safeguard their privacy. So, the government has every right to control and know how the data of its citizens is handled.

Since the operationalisation of OTT platforms in India, they have attracted widespread criticism. Many public interest litigations (PILs) have been filed in the Supreme Court alleging that OTT content is obscene and even insults the religious sentiments of the people.

In the wake of the rising popularity of the OTT platforms as well as the controversy associated with their content, the Supreme Court in October 2020, sought the Centre’s response on the issue of regulating these platforms. Responding to it, the Centre asked the OTT platforms to formulate a self-regulatory model to look into the complaints.

Therefore, digital video content providers—often referred to as OTT platforms—attempted to develop their own systems. In September 2020, the industry players signed a code of self-regulation, but it was turned down by the MIB.

Present guidelines will regulate offensive contents that breed violence and vulgarity. It will also preserve constitutional values while keeping intact citizens’ right to freedom of expression. These guidelines were long overdue because of the recklessness and irresponsibility of some of these platforms.

The government has proposed self-regulation and has said that the OTT entities should get together, evolve a code, and come up with content classification so that a mechanism is evolved to preclude non-adults from viewing adult content.

The rules propose a three-tier grievance redressal mechanism, with the publishers and self-regulating bodies forming the first two tiers. The third tier is a central government oversight committee. The proposed policy requires publishers to appoint grievance redressal officers and ensure time-bound acknowledgement and disposal of grievances. In addition, there can be a self-regulating body headed by a retired judge.

Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules 2021

Some highlights of the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, are as follows:

  1. Guidelines to be Administered by Ministry of Electronics and IT
  • The new rules define digital media for the first time as ‘digitised content’ that can be transmitted over the internet or computer networks including content received, stored, transmitted, edited or processed by an intermediary, or a publisher of news and current affairs content or publisher of online curated content.
  • Social media platforms have been categorised into two, namely, ‘social media intermediary’ and ‘significant social media intermediary’.
  • A ‘social media intermediary’ is one which primarily enables online interaction between two or more users, allowing them to create, upload, share, modify or access information using its services. On the other hand, a ‘significant social media intermediary’ is one whose number of registered users in India exceeds a threshold notified by the Centre. A significant social media intermediary has to follow certain additional due diligence requirements, such as the appointment of a Chief Compliance Officer and a resident grievance officer, identification of the first originator of information, giving users an opportunity to be heard, a voluntary user verification mechanism, etc.

Rule 3 of the Intermediary Rules, 2021, requires all intermediaries (social media platforms like Twitter, Facebook, YouTube, and WhatsApp) to exercise due diligence with respect to the content they host. If due diligence is not observed, and generally for non-observance of the Intermediary Rules, as applicable, the safe harbour provisions under Section 79 of the IT Act, 2000, would cease to apply. The due diligence requirements come into effect three months from the notification of the rules, i.e., February 25, 2021.

[Section 79 of the IT Act, 2000, provides a ‘safe harbour’ to intermediaries that host user-generated content, and exempts them from liability for the actions of users if they adhere to government-prescribed guidelines.]

  • A grievance redressal mechanism is prescribed: intermediaries, including social media platforms, must establish a mechanism for receiving and resolving complaints from users. The platforms are required to appoint a grievance officer to deal with such complaints. The grievance officer must acknowledge a complaint within 24 hours, and resolve it within 15 days of receipt. The names and contact details of such grievance officers must be prominently displayed on the website or mobile application of the intermediary.
  • Social media platforms will now be required to appoint a chief compliance officer, resident in India, who will be responsible for ensuring compliance with the rules. They will also be required to appoint a nodal contact person for 24×7 coordination with law enforcement agencies and a resident grievance officer, who is a resident of India. He shall perform the functions mentioned under grievance redressal mechanism.
  • The platforms are required to publish a monthly compliance report mentioning the details of complaints received and action taken on the complaints, as well as details of contents removed proactively by the significant social media intermediary.
  • In principle, 10 categories of content that social media should not host are prescribed under the rules. These are: content that threatens the unity, integrity, defence, security or sovereignty of India, friendly relations with foreign states, or public order, or that incites the commission of any cognisable offence, prevents investigation of any offence, or insults any foreign state. The list also includes content that is defamatory, obscene, pornographic, paedophilic, invasive of another’s privacy (including bodily privacy), insulting or harassing on the basis of gender, libellous, racially or ethnically objectionable, relating to or encouraging money laundering or gambling, or otherwise inconsistent with or contrary to the laws of India.
  • If a platform receives information from a court or the appropriate government agency that it is hosting prohibited content, it must remove that content within 36 hours.
  • There is provision for identification of the first originator of information, as may be required by a judicial order. This will be the sole responsibility of the social media intermediary. The intermediary shall not be required to disclose the contents of any message or any other information related to the first originator.
  • Social media platforms are required to give users a chance for explanation and a fair opportunity to be heard before restricting access to their accounts.
  • The rules also provide emergency blocking powers. These powers are conferred upon the secretary, Ministry of Information and Broadcasting, who may, ‘if she/he is satisfied that it is necessary or expedient and justifiable’, give orders to block access. Such orders can be issued ‘without giving an opportunity of hearing’ to the publishing platform.
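The time-bound obligations above can be illustrated with a small sketch. This is a hypothetical helper, not part of the rules or any real compliance system; the function and constant names are illustrative. It simply computes the deadlines an intermediary would face for a complaint received at a given time: acknowledgement within 24 hours, disposal within 15 days, and content removal within 36 hours of a court or agency order.

```python
from datetime import datetime, timedelta

# Deadlines drawn from the rules (illustrative constants).
ACK_WINDOW = timedelta(hours=24)       # acknowledge complaint within 24 hours
DISPOSAL_WINDOW = timedelta(days=15)   # dispose of complaint within 15 days
TAKEDOWN_WINDOW = timedelta(hours=36)  # remove flagged content within 36 hours

def grievance_deadlines(received_at: datetime) -> dict:
    """Return compliance deadlines for a complaint received at a given time."""
    return {
        "acknowledge_by": received_at + ACK_WINDOW,
        "dispose_by": received_at + DISPOSAL_WINDOW,
    }

def takedown_deadline(order_received_at: datetime) -> datetime:
    """Deadline for removing prohibited content after a court/agency order."""
    return order_received_at + TAKEDOWN_WINDOW

# Example: a complaint received on March 1, 2021, at 10:00.
received = datetime(2021, 3, 1, 10, 0)
deadlines = grievance_deadlines(received)
print(deadlines["acknowledge_by"])   # 2021-03-02 10:00:00
print(deadlines["dispose_by"])       # 2021-03-16 10:00:00
print(takedown_deadline(received))   # 2021-03-02 22:00:00
```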
  2. Rules to be Administered by Ministry of Information and Broadcasting

Following rules are applicable to publishers of news and current affairs and publishers of online curated content. These rules shall be administered by the Ministry of Information and Broadcasting (MIB).

  • The intermediary is required to inform the publishers of news and current affairs content that their user account details must be furnished to the Ministry of Information and Broadcasting as may be required under Rule 18 of the Intermediary Rules, 2021.
  • Publishers of news and current affairs must observe the Norms of Journalistic Conduct of the Press Council of India under the Press Council Act, 1978, and the Programme Code under Section 5 of the Cable Television Networks (Regulation) Act, 1995. Content which is prohibited under any law for the time being in force shall not be published or transmitted.
  • For OTT service providers such as YouTube, Netflix, etc., the government has prescribed self-classification of content into five categories based on age suitability: (i) Online curated content that is suitable for children and for people of all ages shall be classified as ‘U’ (universal rating); (ii) content that is suitable for persons aged 7 years and older, and which can be viewed by a person under the age of seven years with parental guidance, shall be classified as ‘U/A 7+’ rating; (iii) content that is suitable for persons aged 13 years and above, and can be viewed by a person under the age of 13 years with parental guidance, shall be classified as ‘U/A 13+’ rating; (iv) content which is suitable for persons aged 16 years and above, and can be viewed by a person under the age of 16 years with parental guidance, shall be classified as ‘U/A 16+’ rating; and (v) online curated content which is restricted to adults shall be classified as ‘A’.
  • The content may be classified on the basis of—(i) themes and messages; (ii) violence; (iii) nudity; (iv) sex; (v) language; (vi) drug and substance abuse; and (vii) horror.
  • The platforms are required to implement parental locks for content classified as ‘U/A 13+’ or higher, and reliable age verification mechanisms for content that is classified as ‘A’.
  • Regarding obligations in relation to news and current affairs content, an intermediary has to give proper notice on its website that all publishers of news and current affairs content must furnish details about their user accounts to the Ministry of Information and Broadcasting.
  • There is provision of forming one or more self-regulatory bodies of publishers consisting of not more than six members, headed either by a retired Supreme Court or high court judge or an independent eminent person from the field of media, broadcasting, entertainment, child rights, human rights, etc., to ensure the compliance to the Code of Ethics and rules by online digital news platforms.
  • There will be an oversight mechanism under the Ministry of Information and Broadcasting. It will be entrusted with the task of publishing a charter for self-regulating bodies, including Codes of Practices. It will also set up an Inter-Departmental Committee for hearing grievances and issue proper guidance and advisories to publishers, and will ensure adherence to the Code of Ethics and maintenance thereof. For this, the information and broadcasting ministry will appoint an officer not below the rank of a Joint Secretary to the Indian government as ‘Authorised Officer’.
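The five-tier self-classification scheme and its attendant obligations can be sketched as a small lookup. This is a hypothetical encoding for illustration only: the rating names follow the rules, while the function name, field names, and the minimum-age figures for each tier are assumptions drawn from the descriptions above (parental locks for ‘U/A 13+’ and higher, age verification for ‘A’).

```python
# Illustrative mapping of each content rating to its access obligations.
RATINGS = {
    "U":       {"min_age": 0,  "parental_lock": False, "age_verification": False},
    "U/A 7+":  {"min_age": 7,  "parental_lock": False, "age_verification": False},
    "U/A 13+": {"min_age": 13, "parental_lock": True,  "age_verification": False},
    "U/A 16+": {"min_age": 16, "parental_lock": True,  "age_verification": False},
    "A":       {"min_age": 18, "parental_lock": True,  "age_verification": True},
}

def access_requirements(rating: str) -> dict:
    """Return the platform obligations attached to a content rating."""
    if rating not in RATINGS:
        raise ValueError(f"unknown rating: {rating}")
    return RATINGS[rating]

# Example: adult content requires both a parental lock and age verification.
print(access_requirements("A"))
```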

Provisions for Penalties

As the offences under the IT Act vary, the penal provisions (Sections 65 to 78 and their sub-sections) also vary accordingly, from imprisonment of up to three years to a maximum of seven years, or a fine starting from Rs 2 lakh, or both. A person who intentionally tampers with, conceals, destroys, or alters any computer source code is liable for a penalty of up to Rs 2 lakh, simple imprisonment of up to three years, or both.

Under Section 66 of the IT Act, a person who, without the permission of the owner or any other person in charge of a computer or computer network, damages the said property is liable to pay a penalty of up to Rs 5 lakh, or be jailed for up to three years, or both. As per Section 67A of the IT Act, for publishing or transmitting material containing a sexually explicit act or conduct, a person is liable, in the first instance, to pay a penalty of up to Rs 10 lakh and face imprisonment of up to five years; in the second instance, the jail term can go up to seven years.

As per Section 69 of the IT Act, executives of intermediaries that fail to follow an order issued by the government, citing reasons related to a threat to the sovereignty or integrity, defence, or security of the state, or public order, can be jailed for up to seven years.

Implications

The new regulations expand the scope of digital media regulation manifold. The rules require social media companies in India to change their operating models to accommodate the new mandates. Messaging apps such as WhatsApp or Signal would have to dilute end-to-end encryption to trace the ‘first originator’ of flagged messages. Facebook will also have to create a new interface for India that gives users the option to verify themselves through authorised know-your-customer (KYC) processes, and will display a verification tag for those who seek it. WhatsApp will devise a means of showing the verification tag, and Twitter will implement the verified blue-tick feature for those who want it. The new rules mandate social media companies to alter their interfaces to clearly distinguish verified users from others. They also have to set up automated tools for content filtration and inform users, with explanations, if their accounts have been blocked.

Global Perspective

Growing reliance on digital technologies has created new ways for individuals, companies, and governments to intentionally cause harm or to act irresponsibly. Every day brings new stories about hatred being spread on social media, invasion of privacy by businesses and governments, cyber-attacks using weaponised digital technologies, or states violating the rights of political opponents. India is not spared from this deadly trap.

According to Ram Seethepalli, CEO, Cyberior by Europ Assistance India, “India is among the most cyber-attacked countries in the world and hence it is imperative to have stricter cybersecurity and data protection laws to mitigate data thefts and cybercrimes.”

India’s present rules are a landmark step in controlling and regulating digital platforms and OTTs, and pave the way for other countries to follow. According to some experts, India’s bold move could be followed by other countries, such as the US, the UK, and Australia. The new rules are part of a global shift following the mass shooting in New Zealand that was live-streamed by the gunman on Facebook, and a US executive order to revisit a law that gave absolute immunity to social media platforms. In this context, the Indian rules align with developments happening globally. This could be termed the first decisive step taken by a big nation to establish a strong policy for intermediaries and their liabilities. Recently, Australia has been pushing companies such as Google and Facebook to share revenue with news publishers. The UK and the European Union (EU) will watch how India implements the new social media rules. Many countries have also been pressing social networks to take responsibility for content on their platforms, besides wanting tighter data handling practices.

India is among the top three internet markets, with close to 700 million users. Therefore, its digital policy making is closely followed the world over. If companies comply with Indian laws, it becomes difficult for them to refuse similar demands in other countries. Recently, India’s ban of the Chinese short-video app TikTok was cited in a US executive order seeking a similar halt on the ByteDance-owned company. In the past, India’s demand for traceability has also echoed in countries such as the US, the UK, and Australia.

According to Pavan Duggal, a Supreme Court advocate and cyber expert, “Rules, like the present ones, help strengthen national interest and cyber sovereignty of countries, give governments much more control over how the data of their citizens is handled, apart from giving them a way to govern companies which are generating revenues from users from the country by having an establishment there and a grievance redressal mechanism.”

Criticism of the Rules

The Government of India claims that the new mechanism will deal with issues such as the continuous spread of fake news and other misinformation. But with the coming of the guidelines, digital platforms will come under the government’s fold, which is a matter of concern. The codes make the governmental panel the final authority for the resolution of grievances. The Internet Freedom Foundation (IFF), a digital liberties organisation, has called this move ‘excessive governmental control over digital news and OTT content’. Excessive censorship and surveillance could also be used to suppress the truth.

Some experts opine that the rules will curtail freedom of expression and privacy as granted by the Constitution of India. Article 19 of the Constitution, which deals with right to freedom, says all citizens shall have the right to freedom of speech and expression. But it clarifies that freedom of speech and expression should not affect the operation of any existing law, or prevent the State from making any law, in so far as such law imposes reasonable restrictions on the exercise of the right if it comes into conflict with the sovereignty and integrity of India, the security of the State, friendly relations with foreign States, public order, decency or morality, or in relation to contempt of court, defamation, or incitement to an offence.

Udbhav Tiwari, public policy adviser at Mozilla, opines that some provisions of the rules, such as those enabling traceability of encrypted content and automated filtering, are fundamentally incompatible with end-to-end encryption and will weaken protections that millions of users have come to rely on in their daily lives.

According to Mayank Bidawatka, co-founder of Koo (the made-in-India alternative to Twitter), the guidelines helped clarify the responsibilities of intermediaries. “Only a small fraction of the social media users are found to be making posts which may be against the laws of the land. The social media guidelines make addressing these situations uniform across all social media platforms and ensure the safety of the majority social media users across India. Enabling and maintaining freedom of speech is core to social media platforms. This policy will help protect the interest of citizens at large and keep nefarious elements at bay.”

Obstacles in the Path of OTT

The social media platforms have to face the following constraints:

  1. The industry will have to keep a check on the contents to be aired.
  2. They would have to apply for certification and approval of the content to be streamed.
  3. Such certification would create many problems, as much OTT content could end up censored by the certification boards.
  4. Social media platforms like Facebook, Twitter, etc., would now also come under the I&B ministry, whereas earlier they came under the IT ministry.
  5. The fate of digital media is at stake. The rules spell uncertainty for both the media practitioners and the media entrepreneurs who have been the vibrant face of contemporary journalism.
  6. Platforms need to disclose what metadata (age, educational qualification, choice of content, time spent viewing or reading it, etc.) is collected, and the algorithms that define how content is served to users.
  7. Aspects related to metadata will come under the IT Act, while the part related to content will be regulated by the I&B ministry.

© Spectrum Books Pvt Ltd.
