The new amendments to the IT rules impose a legal obligation on social media companies to make all-out efforts to prevent barred content and misinformation, the government said on Saturday, making it clear that platforms such as Twitter and Facebook operating in India will have to abide by local laws and the constitutional rights of Indian users.

The new rules provide for setting up appellate committees which can overrule decisions of the big tech firms on takedown or blocking requests.

The hardened stance against big tech companies comes at a time when discontent has been brewing over the allegedly arbitrary actions of social media platforms on flagged content, or their failure to respond quickly enough to grievances.

Amid concerns over the rising clout of Big Tech globally, the CEO of electric car maker Tesla, Elon Musk, on Friday completed his $44 billion (roughly Rs. 3,62,300 crore) takeover of Twitter, placing the world’s richest man at the helm of one of the most influential social media apps in the world. Incidentally, the microblogging platform has had multiple run-ins with the government in the past.

India’s tweaking of the IT rules allows the formation of Centre-appointed panels that will settle often-ignored user grievances against the content decisions of social media companies, Minister of State for IT Rajeev Chandrasekhar said, adding that this was necessitated by the “casual” and “tokenism” approach of digital platforms towards user complaints so far.

“That is not acceptable,” Chandrasekhar said at a media briefing explaining the amended rules.

The minister said that lakhs of messages around unresolved user complaints reflected the “broken” grievance redressal mechanism currently offered by platforms. He added that while the government will partner with social media companies towards the common goal of ensuring the Internet remains open, safe and trusted for Indians, it will not hesitate to act and crack down where public interest is compromised.

On whether penalties will be imposed on platforms for not complying, he said the government would not like to bring in punitive action at this stage, but warned that it could be considered if the situation demands it in future. As the internet evolves, so will the laws.

“We are not getting into the business of punity, but there is an opinion that there should be punitive penalties for those platforms not following rules…it is an area we have steered clear of, but that is not to say it is not on our mind,” he cautioned.

The tighter IT norms raise the due diligence and accountability requirements on platforms to fight illegal content proactively (the government has added deliberate misinformation to that list too), with a 72-hour window to take down flagged content. Until now, intermediaries were only required to inform users about not uploading certain categories of harmful or unlawful content.

“The obligations of intermediaries earlier was limited to notifying users of the rules but now there will be much more definite obligation on platforms. Intermediaries have to make efforts that no unlawful content is posted on platform,” the minister said.

These amendments impose a legal obligation on intermediaries to take reasonable efforts to prevent users from uploading such content, an official release said.

Simply put, the new provision will ensure that the intermediary’s obligation is not a “mere formality”.

“In the category of obligation we have added misinformation…intermediary should not be party to not just illegal content, but they can’t be party to any deliberate misinformation as content on platforms. Misinformation not just about media it is about advertising…illegal products and services, online betting, misinformation can be in fintech community, misrepresenting products and services. Misinformation also refers to false information about person or entity,” the minister said.

For effective outreach, platforms will have to communicate the rules and regulations in regional Indian languages.

The government has, in the new rules, added objectionable religious content (with intent to incite violence) alongside pornography, trademark infringements, fake information and content that could threaten the sovereignty of the nation to the categories that users can flag to social media platforms.

The words ‘defamatory’ and ‘libellous’ have been removed; whether any content is defamatory or libellous will be determined through judicial review.

Some of the content categories have been rephrased to deal particularly with misinformation, and content that could incite violence between different religious/caste groups (that is information promoting enmity between different groups on the grounds of religion or caste with the intent to incite violence).

The rules come in the backdrop of complaints regarding the action/inaction on the part of the intermediaries on user grievances regarding objectionable content or suspension of their accounts.

“The intermediaries now will be expected to ensure that there is no uploading of content that intentionally communicates any misinformation or information that is patently false or untrue hence entrusting an important responsibility on intermediaries,” the official release said.

The rules have also made it explicit that intermediaries must respect the rights accorded to Indian citizens under Articles 14 (non-discrimination), 19 (freedom of speech, subject to certain restrictions) and 21 (right to privacy) of the Indian Constitution.

In a strong message to Big Tech companies, the minister asserted that the community guidelines of platforms – regardless of whether they are headquartered in the US, Europe, or elsewhere – cannot undermine the constitutional rights of Indians when such platforms operate in India. Chandrasekhar said platforms will have an obligation to remove, within 72 hours of flagging, any “misinformation” or illegal content, or content that promotes enmity between different groups on the grounds of religion or caste with the intent to incite violence. He said that the effort should be to take down illegal content “as fast as possible”.

The complaints around illegal content could range from child sexual abuse material to nudity to trademark and patent infringements, misinformation, impersonation of another person, content threatening the unity and integrity of the country as well as “objectionable” content that promotes “enmity between different groups on the grounds of religion or caste with the intent to incite violence”.

The modalities defining the structure and scope of the Grievance Appellate Committees will be worked out soon, he promised, adding that the process will start with one or two such panels, which will be expanded based on requirements. The panels will not have suo motu powers.

“Government is not interested in playing role of ombudsman. It is a responsibility we are taking reluctantly, because the grievance mechanism is not functioning properly,” the minister said. The idea is not to target any company or intermediary, or to make things difficult for them. The government sees the internet and online safety as a shared responsibility of all, the minister noted.

It is pertinent to mention that big social media platforms have drawn flak in the past over hate speech, misinformation and fake news circulating on their platforms, and there have been persistent calls to make them more accountable. Microblogging platform Twitter has had several confrontations with the government over a slew of issues.

The government, in February 2021, notified IT rules that required social media platforms to appoint a grievance officer. Non-compliance with the IT rules results in these social media companies losing their intermediary status, which provides them exemption from liability for any third-party information and data hosted by them.

