Despite the explosion of generative AI-powered art tools like Midjourney, Stable Diffusion, OpenAI's DALL-E, and others, artificial intelligence hasn't perfected making images just yet.
So, get ready to channel your inner Sherlock as we explore the most common telltale signs in AI-made images, so you can spot them better. We begin with that cliché of fact-checking: TAKE A CLOSER LOOK.
What do you do when none of these tips work? Sometimes, AI creates extremely realistic images without any 'tells'. That's when a simple search can help.
A reverse image search on the viral picture can often lead to the original source. A search can open up a lot of possibilities.
It might lead you to a sharper or better version of the image, which can then be used to search further for the source.
It can also direct you to similar images available on the internet.
It can also help in finding older posts carrying the same image and news reports (if any).
The search can be performed using Google Lens, or with InVID WeVerify, a Google Chrome extension.
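For readers who want to script this step, the search-by-URL pattern can be sketched in a few lines of Python. Note that the `lens.google.com/uploadbyurl` endpoint, the helper name, and the example image URL are assumptions for illustration, not an official API; Google may change or restrict this pattern at any time.

```python
from urllib.parse import quote

# Hypothetical helper: build a Google Lens reverse-image-search link
# for a publicly hosted image. The endpoint is an assumption based on
# the "search by image URL" pattern and is not a documented API.
def lens_search_url(image_url: str) -> str:
    # Percent-encode the whole image URL so it survives as a query value
    return "https://lens.google.com/uploadbyurl?url=" + quote(image_url, safe="")

# Example: open the returned link in a browser to see visually similar
# images, other pages carrying the same picture, and possible sources.
print(lens_search_url("https://example.com/viral-photo.jpg"))
```

Opening the printed link in a browser then surfaces the same results a manual Google Lens search would: similar images, other posts carrying the picture, and candidate original sources.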
Watch the video below where we explain in detail how to perform a reverse image search.
One should also rely on credible news sources and media reports to verify images going viral on social media platforms.
It should be noted that important events don't happen in isolation; if an image is claimed to be from such an event, there will almost invariably be news reports about it.
(At The Quint, we question everything. Play an active role in shaping our journalism by becoming a member today.)