New amendments to the IT rules impose a legal obligation on social media companies to make their best efforts to prevent banned content and misinformation, the government said on Saturday, making it clear that platforms such as Twitter and Facebook operating in India must comply with local laws and respect the constitutional rights of Indian users. The new rules provide for the creation of appeal panels that can overrule big tech companies’ decisions on takedown or blocking requests.
The hardening stance against big tech companies comes at a time when discontent is simmering over alleged arbitrary action by social media platforms on flagged content, or their failure to respond to grievances quickly enough. Amid concerns over the growing influence of Big Tech around the world, Tesla Inc CEO Elon Musk completed his $44 billion takeover of Twitter on Friday, placing the world’s richest man at the head of one of the most influential social media apps in the world. Incidentally, the microblogging platform has had several run-ins with the government in the past.
India’s amended IT rules allow the formation of Centre-appointed panels that will settle often-overlooked user grievances against social media companies’ content decisions, Minister of State for IT Rajeev Chandrasekhar said, adding that this was necessitated by the digital platforms’ “casual” and “tokenistic approach” to user complaints so far. “That is not acceptable,” Chandrasekhar told a news conference explaining the changed rules.
The minister said hundreds of thousands of posts about unresolved user complaints reflected the “broken” grievance mechanism currently offered by platforms. He added that while the government will partner with social media companies toward the common goal of keeping the internet open, safe and trusted for Indians, it will not hesitate to act or crack down where the public interest is compromised.
On whether sanctions will be imposed on platforms for non-compliance, he said the government did not want to take punitive action at this stage, but warned that it could be considered if the situation required it in the future. The internet is changing, and so are the laws. “We are not addressing the issue of punishment, but there is a view that there should be punitive sanctions for platforms that don’t follow the rules…this is an area we’ve avoided, but that doesn’t mean it’s not on our minds,” he cautioned.
The stricter IT norms increase due diligence and accountability requirements for platforms, obliging them to proactively tackle illegal content (the government has also added wilful misinformation to this list), with a 72-hour window to remove flagged content. Until now, intermediaries were only required to inform users not to upload certain categories of harmful or illegal content. “Previously, the obligations of intermediaries were limited to informing users of the rules, but now there will be a much more precise obligation on the platforms. Intermediaries must make efforts to ensure that no illegal content is published on the platform,” the minister said. These changes impose a legal obligation on intermediaries to make reasonable efforts to prevent users from uploading such content, according to an official statement. Simply put, the new provision will ensure that the intermediary’s obligation is not a “mere formality”.
“In the category of obligations, we have added misinformation…the intermediary must not be a party not only to illegal content, but also to deliberate misinformation as content on the platforms. Misinformation is not only about media, but also advertising…illegal products and services, online betting; misinformation can be found in the fintech space, in misrepresented products and services. Misinformation also covers false information about any person or entity,” the minister said. For effective outreach, the rules and regulations are to be communicated in regional Indian languages through the platforms.
The government has, in the new rules, added objectionable religious content (posted with the intention of inciting violence) to the list of content — alongside pornography, trademark infringement, false information and anything that could pose a threat to the sovereignty of the nation — that users can report to social media platforms. The words “defamatory” and “libelous” were deleted; whether any content is defamatory or libelous will be determined through judicial review. Some of the content categories have been reworded to specifically address misinformation and content that could incite violence between different religious or caste groups (i.e. information promoting enmity between different groups on religious or caste grounds with the intent to incite violence). The rules come against the backdrop of complaints about action, or inaction, by intermediaries on user grievances regarding objectionable content or the suspension of their accounts.
“Intermediaries will now have to ensure that there is no uploading of content that intentionally communicates misinformation or patently false or untrue information, thereby placing significant responsibility on intermediaries,” the official statement read. The rules also make explicit reference to the rights granted to Indian citizens under Articles 14 (equality before the law), 19 (freedom of speech, subject to certain restrictions) and 21 (right to life and personal liberty, under which the right to privacy has been recognised) of the Indian Constitution.
In a strong message to Big Tech companies, the minister asserted that the platforms’ community guidelines — whether they are headquartered in the United States, Europe or elsewhere — cannot undermine the constitutional rights of Indians when these platforms operate in India. Chandrasekhar said platforms will be required to remove, within 72 hours of reporting, any “misinformation” or illegal content, or content that promotes enmity between different groups on the grounds of religion or caste with the intention of inciting violence. He said the effort should be to remove illegal content “as quickly as possible”. Complaints about illegal content could range from child sexual abuse material, nudity, trademark and patent violations, misinformation and impersonation of another person, to content that threatens the unity and integrity of the country, as well as “objectionable” content that promotes “enmity between different groups on grounds of religion or caste with the intent to incite violence”.
Modalities defining the structure and scope of the grievance appellate committees will be worked out soon, he promised, adding that the process will start with one or two such panels, to be expanded as needed. The panels will not have suo moto powers. “The government is not interested in playing the role of ombudsman. It is a responsibility we take on reluctantly, as the grievance mechanism is not working properly,” the minister said. The idea is not to target any company or intermediary, or to make things difficult for them. The government sees internet and online safety as a responsibility shared by all, the minister noted.
It is pertinent to mention that major social media platforms have drawn attention in the past over hate speech, misinformation and fake news circulating on their platforms, and there have been persistent calls to hold them more accountable. The microblogging platform Twitter has had several confrontations with the government over a host of issues. In February 2021, the government notified IT rules that required social media platforms to appoint a grievance officer. Failure to comply with the IT rules can cost these social media companies their intermediary status, which exempts them from liability for third-party information and data hosted by them.