We know it already, and we didn’t need WhatsApp to tell us that Indians send more forwards – messages, photos and videos – than people anywhere else in the world.
With that in mind, and given the fake news and misinformation mess this has allowed to fester in India, WhatsApp is testing added friction in the process of forwarding messages, which it will introduce in its beta build.
WhatsApp is doing two key things to add friction
1. For media messages, it is removing the forward button that appears next to them. To forward a media message, users will now have to select the message, tap the forward icon in the top menu bar, and then forward it. This adds one extra step to the forwarding process.
2. It is restricting forwarding to five people/chats at a time. If you're forwarding a message, you can select only five people or groups from your contact list to forward it to. To forward it to 50 people, you'll have to forward the message 10 times, selecting five recipients each time. This adds a significant amount of friction.
From a product perspective, it's important to remember that these are very significant changes: apps look to reduce friction, since each additional step lowers the propensity to use a feature.
Some Thoughts on this Change
These are welcome moves from WhatsApp, but some concerns remain.
1. If it’s important to them, people will forward: The kind of messages that lead to mobs and lynchings – such as those about the kidnapping of children, or rapes attributed to a particular community or religion – are such that people will forward them, either by design or because they don’t realise they are false. The same applies to false medical advice, or claims that Rs 2,000 notes have GPS chips embedded. My point is: if a message seems important enough, that friction will not stop people from forwarding it.
2. It still doesn’t help bring accountability: If someone is fanning communal tension in India, it is illegal. WhatsApp’s changes do not create a path to identifying those who might be inciting violence. I know that one doesn’t design (or rule) for exceptions, but these exceptions are becoming the norm, and people are getting hurt.
My solution, as I’d mentioned, was around allowing people to choose whether a message is public or private; if the message is public, then it gets a unique ID, which means that it can be shut down across the platform, and can also, through a legal process, be linked back to the individual who sent it. That brings in accountability. More details here.
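To make the proposal concrete, here is a minimal sketch of how a public-message ID might work. Everything here is hypothetical – the names (`PublicMessage`, `make_public`, `can_deliver`), the fields, and the use of a content hash are my illustrative assumptions, not anything WhatsApp has built or described:

```python
# Hypothetical sketch of the "public message with a unique ID" idea.
# None of these names or structures correspond to a real WhatsApp API.
import hashlib
import uuid
from dataclasses import dataclass

@dataclass
class PublicMessage:
    message_id: str    # unique ID assigned when the sender marks the message public
    sender: str        # known to the platform; revealable only via a legal process
    content_hash: str  # lets the platform match and block every copy of the content

def make_public(sender: str, content: str) -> PublicMessage:
    """Assign a unique ID and content fingerprint to a public message."""
    return PublicMessage(
        message_id=uuid.uuid4().hex,
        sender=sender,
        content_hash=hashlib.sha256(content.encode()).hexdigest(),
    )

# IDs the platform has shut down across the network
blocked_ids: set[str] = set()

def can_deliver(msg: PublicMessage) -> bool:
    """Forwarding a blocked public message is refused platform-wide."""
    return msg.message_id not in blocked_ids
```

The point of the sketch: because the ID travels with every forward, one takedown decision stops all copies, and the original sender remains identifiable without exposing private one-to-one chats.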
3. It reduces the network effects of the spread: These changes do slow the spread, and force people to prioritise whom they send a message to, instead of broadcasting it to everyone. Now, instead of, say, one person forwarding it to 50 people, who then forward it to 50 people each, it’ll be one person sending it to five, who then send it to five each, and so on. Network effects still have an impact, but the speed and extent of the spread may be reduced.
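The arithmetic above can be sketched quickly. This is a deliberately simplified model – it assumes every recipient forwards the message exactly once and that audiences never overlap, so real-world reach would be smaller – but it shows how sharply the fan-out limit cuts potential spread:

```python
# Simplified spread model: every recipient forwards once, no overlap.
def total_reach(fan_out: int, hops: int) -> int:
    """People reached after `hops` rounds of forwarding."""
    return sum(fan_out ** h for h in range(1, hops + 1))

# Three rounds of forwarding, before and after the 5-chat limit
before = total_reach(50, 3)  # 50 + 2,500 + 125,000
after = total_reach(5, 3)    # 5 + 25 + 125

print(before)  # 127550
print(after)   # 155
```

Even under these generous assumptions, three hops at a fan-out of five reach a tiny fraction of what three hops at 50 would.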
What WhatsApp hasn’t said is how it will track the impact of these changes.
Not Enough
I wrote in 2014 that there is, in the case of all platforms, a regulatory gap between the responsibility, accountability and liability of platforms. The safe harbour provisions that limit intermediary liability give platforms the protection to scale. And that makes sense: platforms cannot be held liable and accountable for how people use them. However, that doesn’t mean that platforms don’t have a responsibility.
Most platforms work backwards: they start with a more open platform, with low friction, allow misuse – often in plain sight – and only start correcting for it when they feel there’s enough of a problem to tackle.
Lack of liability means they can allow illegal behaviour, unless someone spots it. At the scale WhatsApp operates, turning a blind eye – and let’s face it, WhatsApp messages were spread during the Muzaffarnagar riots in 2013 – is now having dangerous consequences.
Regulation will catch up, and it is up to WhatsApp to do much more than it is doing now. It’s the same problem Facebook faces post the Cambridge Analytica situation: if they don’t make very significant changes now, they will end up being regulated heavily, and their intermediary liability protections will be lost forever.
The problem for the Internet, and for the freedom that intermediary liability protections allow, is that the collateral damage could hurt even the more responsible actors.
(This article was first published on Medianama. Nikhil Pahwa is the founder of Medianama. This is an opinion piece and the views expressed above are the author's own. The Quint neither endorses nor is responsible for the same.)
(At The Quint, we question everything. Play an active role in shaping our journalism by becoming a member today.)