As reported in today’s front-page story in the Indian Express, the government and large online platforms are privately discussing how to censor your social media and break the encryption of your messaging: by taking down your posts “pro-actively” and by requiring traceability.
Why is it being done secretly? Why are you not being involved? We first explain what these rules do, then pose the top 5 concerns for users in a rule-by-rule analysis.
Given the substantial public interest at stake, we are also making a complete copy of these draft rules available here.
Intermediary Rules Made in 2011 Were ‘Vague’
These rules are the Intermediary Rules, 2011, made under the Information Technology Act, 2000. They provide immunity to online platforms and ISPs, big and small, for content transmitted and published by end users.
This allows these conduits of information to facilitate a core function of free expression and prevents them from throttling content or engaging in overbroad censorship, which is termed a “chilling effect”.
In return, they have to comply with legal requests to take down content and provide information on users – basically, comply with the law. This principle was recognised in Section 79 of the Information Technology Act, 2000 (as amended in 2008).
But parent provisions such as Section 79 leave the details to subordinate rules. That is exactly what the Intermediary Rules, 2011 are: rules made after a public consultation around March 2011. There was dispute over how that consultation was carried out, but the draft rules were at least published online and comments were invited by the Ministry of Electronics and IT.
However, these rules were unclear and vague. For instance, they did not clearly state what “actual knowledge” was, and because of this, in the Shreya Singhal case (yes, the Section 66A one), the Supreme Court held that “actual knowledge” arose only when a platform received a legal notice from the police or a court, not from private parties. So what is being changed? And what is at stake?
5 Top Concerns
1. On Process:
First, let us start with how these draft rules are being made. This is a serious development and is eerily reminiscent of the calls for “pre-censorship” made in December 2011.
As reported by the Indian Express, the process was meant to be closed, held between officials of the Ministry of Electronics and IT and a handful of large social media and messaging companies, who were allowed to give comments by 7 January.
But the changes, as we go on to explain, will impact users like you and me. They will impact our right to privacy and freedom of speech and expression. For such serious changes to be made behind closed doors, with inputs from only a select few companies and without a public consultation, would have seriously tainted whatever rules were finally adopted.
After the news broke, the Ministry of Electronics and IT has now opened the document to a public consultation of sorts, inviting replies from the public until 15 January 2019. While this is a welcome move, the fact that it only happened now raises serious questions about the process and about how committed the government is to reviewing public responses – the fact that the deadline is 15 January rather than 7 January indicates this was a post-facto decision.
2. Breaking Encryption:
Draft Rule 3(5) introduces the requirement of traceability, which would break end-to-end encryption.
- Many platforms (WhatsApp, Signal, Telegram and others) retain minimal user data for electronic information exchange and deploy end-to-end encryption to provide reliability, security and privacy to users. These services are used by millions of Indians to prevent identity theft and code injection attacks. Encryption becomes all the more important as, for most of us, our lives now run on personal data. Without deliberation, without involving technical experts in an open consultative process, and without any data protection law or surveillance reform, this is being tinkered with by introducing a requirement of “traceability”.
- This has important consequences for everyday users of online services and should also be seen in the context of the MHA notification activating the 2009 rules, which carry the power to direct “decryption”. We do not have any proper parliamentary oversight or judicial check on surveillance, and if the latest draft rules go through, they would be a tremendous expansion of the government’s power over ordinary citizens, eerily reminiscent of China’s blocking and breaking of user encryption to surveil its citizens.
3. Longer, Even Indefinite Data Retention:
Draft Rule 3(8) increases the data retention period from 90 to 180 days and provides for further retention at the discretion of “government agencies”. The phrase “government agencies” is not defined, and no specific conditions or outer time limit are set for how long the online platform must retain the data.
Hence, by a mere letter on behalf of any government department, a private platform can arguably be required to store a user’s data indefinitely, without even letting them know about it. It is important to remember that such retention will be possible even after the user has deleted the data from the intermediary’s servers.
4. Pro-active Censorship:
Draft Rule 3(9) is the most dangerous bit, a sledgehammer to online free speech. It targets not abuse, harassment or threats but legitimate speech, by requiring online platforms to become pro-active arbiters and judges of legality (not of their own terms of use, which are a contract between the user and the platform).
- Requiring a platform to actively sweep its own service in order to obtain immunity from prosecution would result in widespread takedowns without any legal process or natural justice. This violates the reasoning of the Shreya Singhal judgment, which noted that “it would be very difficult for intermediaries like Google, Facebook etc. to act when millions of requests are made and the intermediary is then to judge as to which of such requests are legitimate and which are not.” It shifts a duty of the state onto a private party.
- What is worse? This will be done by “technology based automated tools or appropriate mechanisms”. Such tools have been shown to be faulty, to carry coding biases, and to be prone to overbroad censorship. Should our fundamental right to free speech be subjected to a still-developing technology? AI censorship is the Chinese model of censorship.
5. The Nanny Requirement:
Draft Rule 3(4) inserts a requirement to inform users, at least monthly, about legal requirements such as the terms and conditions and the privacy policy. At first blush, with rampant online abuse and trolling, this may seem a needed measure. But consider the change in environment from a public park to a guarded schoolyard, in which you are constantly reminded that you are under watch and had better behave yourself.
It will turn the internet in India into a disciplinary environment, which is bad for users. Rather than letting market mechanisms figure out a notification for good conduct, which is in the best interests of platforms themselves, such a measure mandated by law will also require product-side changes from smaller startups and entrepreneurs.
There are many more problems which we will comment upon and analyse during the day and the coming week.
At IFF we believe there are better ways to check misinformation and threats to Indian elections, ways consistent with the fundamental rights guaranteed under the Constitution. The present proposals, seen alongside the recent MHA notification activating the 2009 interception rules, take India closer to a Chinese model of censorship. Yes, online platforms are problematic and require fixes.
But driving through changes that undermine fundamental rights via a closed and secretive process is a harmful approach for all of us.
To us, the path to checking disinformation is first bringing in a comprehensive privacy law that puts the power and control of smartphones in the hands of ordinary Indians. This will help bring accountability to large data controllers, starting with online companies which target us with the advertising that political parties use.
We also need to focus on steps to support the Election Commission. But right now, today, we all need to push back and speak up to #SaveOurPrivacy and #RightToMeme.
(This was first published on the website of the Internet Freedom Foundation and has been republished with permission. This is an opinion piece and the views expressed are the author’s own. The Quint neither endorses nor is responsible for them.)