With the Lok Sabha Elections 2019 two months away, WhatsApp is expected to be as much of a political battlefield as the ground rallies. On Monday, Carl Woog, WhatsApp’s head of communications, admitted that political parties in India had misused the app during elections.
During a workshop held to explain the tools the messaging app will be deploying to tackle misinformation and abuse in the run-up to the elections, Woog said they had observed how their platform was used during the Karnataka elections.
Woog also emphasised that they have specified to political parties that the app “will be banning accounts that engage in automated or bulk behaviour”.
Importantly, reiterating WhatsApp’s commitment to privacy, Woog also clarified that the government’s recent demand to introduce traceability of its messages by breaking encryption was “not possible” and “not consistent with the strong privacy protections.”
The messaging app, on Monday, said it would be deploying machine-learning tools to fight election-related misinformation, and at the heart of its strategy is tracking automated or bulk messages and banning accounts it finds abusive.
WhatsApp also released a white paper detailing the strategies it has adopted to tackle the menace of fake news and abuse globally, including in India.
“We ban two million accounts a month globally,” said Matt Jones, lead software engineer on WhatsApp’s integrity team. He did not, however, specify how many of those accounts, on average, are from India – the largest market for the app, which currently has 1.5 billion active users globally.
Apart from designing software tools to detect abuse, Woog said that the company has also been speaking with different political parties over the last few months to discuss and explain the ways in which their platform should and should not be used.
Apart from reiterating that WhatsApp was primarily meant for personal messaging, it also made clear its stance on the government’s demand to introduce traceability of messages on its platform.
The Ministry of Electronics and IT had published draft amendments to its intermediary liability rules under section 79 of the Information Technology Act (2000). In exchange for immunity from liability for content posted by end users, the government wants intermediaries like WhatsApp to essentially break its end-to-end encryption and allow for content to be accessed.
Woog further added that the “proposed changes are overbroad and are not consistent with the strong privacy protections that are important to people everywhere. Not just in India but around the world.”
In response to The Quint’s question about how parties have responded to WhatsApp’s emphasis on privacy and encryption, Woog said he hoped the parties understood what WhatsApp was trying to explain. End-to-end encryption ensures that only the sender and the recipient can read the messages, and in doing so also guarantees the privacy and security of the message in transit and at rest.
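The guarantee Woog refers to can be illustrated with a small, hedged sketch. The example below uses the open-source PyNaCl library’s public-key `Box` to show how a message encrypted for one recipient cannot be read by anyone else; WhatsApp itself uses the far more sophisticated Signal protocol, so this is only a minimal illustration of the end-to-end principle, not its implementation.

```python
# Minimal illustration of end-to-end encryption using PyNaCl (pip install pynacl).
# This is NOT WhatsApp's Signal protocol; it only demonstrates the principle that
# a message encrypted for a recipient's key can be read by no one else.
from nacl.public import PrivateKey, Box

# Each party generates a keypair on their own device; private keys never leave it.
sender_key = PrivateKey.generate()
recipient_key = PrivateKey.generate()

# The sender encrypts using their private key and the recipient's public key.
sender_box = Box(sender_key, recipient_key.public_key)
ciphertext = sender_box.encrypt(b"Polling booth details for Sunday")

# Only the recipient, holding the matching private key, can decrypt.
recipient_box = Box(recipient_key, sender_key.public_key)
print(recipient_box.decrypt(ciphertext))  # b'Polling booth details for Sunday'

# Any server relaying the message sees only opaque bytes, which is why
# traceability via content inspection would require breaking the encryption.
```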
Among the major developments in WhatsApp’s quest to fight abuse and misinformation are its machine-learning tools. Jones said the company looks for specific signals around abusive content, and once those signals are triggered, it takes action by banning the user account responsible.
According to Jones, who gave a 30-minute presentation on WhatsApp’s abuse-fighting strategies, the company’s machine-learning model is built around three essential components and has been trained to track and detect abusive accounts.
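The presentation did not disclose which signals the model actually weighs, so the sketch below is purely hypothetical: it scores an account on a few plausible bulk-messaging signals (sending rate, share of forwards, recipient spread, account age) and bans it once a threshold is crossed, which is the general shape of the signal-triggered enforcement Jones described. The signal names, weights and threshold are all illustrative assumptions, not WhatsApp’s.

```python
# Hypothetical sketch of signal-based abuse detection; the real signals,
# weights and thresholds used by WhatsApp are not public.
from dataclasses import dataclass

@dataclass
class AccountSignals:
    messages_per_minute: float   # sustained sending rate
    forward_ratio: float         # share of messages that are forwards (0-1)
    distinct_recipients: int     # unique chats messaged in the last hour
    account_age_days: float      # newly registered accounts are higher risk

def abuse_score(s: AccountSignals) -> float:
    """Combine signals into a single score; weights are illustrative only."""
    score = 0.0
    score += 0.4 * min(s.messages_per_minute / 30.0, 1.0)
    score += 0.3 * s.forward_ratio
    score += 0.2 * min(s.distinct_recipients / 100.0, 1.0)
    score += 0.1 * (1.0 if s.account_age_days < 1 else 0.0)
    return score

def should_ban(s: AccountSignals, threshold: float = 0.7) -> bool:
    return abuse_score(s) >= threshold

# Example: a day-old account blasting forwards to many chats trips the threshold.
suspect = AccountSignals(messages_per_minute=45, forward_ratio=0.9,
                         distinct_recipients=120, account_age_days=0.5)
print(should_ban(suspect))  # True
```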
An app that was built to facilitate private messaging between two people has also been used to spread rumours and misinformation, resulting in violence as well as deaths due to lynching.
WhatsApp, in July 2018, had rolled out a number of safeguards in the aftermath of lynchings in Maharashtra and Assam to help users better identify possible misinformation.
It had created a ‘forwarded’ label to mark messages that have been forwarded. This was intended to help people know when a message they have received was not created by the person who sent it.
WhatsApp had also imposed a five-chat forwarding limit to stifle bulk forwards and removed the quick forward button next to media messages. The global limit, however, is 20 chats compared to five in India, Woog reiterated.
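A hedged sketch of how such a client-side restriction might work is shown below; the five-chat limit for India and the 20-chat global limit come from the article, and the ‘forwarded’ flag mirrors the label described above, but the function itself is an illustrative guess, not WhatsApp’s implementation.

```python
# Illustrative sketch of a regional forward limit plus a 'forwarded' label;
# the limits (5 in India, 20 globally) are from the article, the code is hypothetical.
FORWARD_LIMITS = {"IN": 5}       # region-specific overrides
DEFAULT_FORWARD_LIMIT = 20       # global default

def forward_message(message: dict, target_chats: list, region: str) -> list:
    """Forward a message to at most the permitted number of chats for the region."""
    limit = FORWARD_LIMITS.get(region, DEFAULT_FORWARD_LIMIT)
    if len(target_chats) > limit:
        raise ValueError(f"Can forward to at most {limit} chats in region {region}")
    # Mark the copy so recipients can see it was not authored by the sender.
    forwarded = dict(message, forwarded=True)
    return [(chat, forwarded) for chat in target_chats]

# Example: forwarding to six chats from an Indian account is rejected.
try:
    forward_message({"text": "Rally at 5 pm"}, [f"chat{i}" for i in range(6)], "IN")
except ValueError as err:
    print(err)
```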
In addition to all the tools that the app has come up with, it has also spoken with the major political parties to explain issues of privacy and discuss ways to tackle the menace of political bulk messaging. “We have discussed it and we are hoping that the parties understand that we are not a bulk messaging platform,” Woog told The Quint.