A fake image apparently showing a blast near the Pentagon was widely shared on social media on Monday (22 May), triggering a brief fall in the stock market in the United States. According to Associated Press (AP), the artificial intelligence-created photograph was picked up by news outlets outside America and was even shared by some verified accounts, further fuelling the panic. However, the reports were soon debunked by the Arlington County Fire Department (ACFD). “Pentagon Force Protection Agency and the ACFD are aware of a social media report circulating online about an explosion near the Pentagon. There is NO explosion or incident taking place at or near the Pentagon reservation, and there is no immediate danger or hazards to the public,” it said in a tweet.
@PFPAOfficial and the ACFD are aware of a social media report circulating online about an explosion near the Pentagon. There is NO explosion or incident taking place at or near the Pentagon reservation, and there is no immediate danger or hazards to the public. pic.twitter.com/uznY0s7deL
— Arlington Fire & EMS (@ArlingtonVaFD) May 22, 2023
This is just the latest in a series of fake AI-generated images that have gone viral recently – remember the viral images of former United States president Donald Trump being ‘arrested’ or Russian leader Vladimir Putin behind bars. With such pictures becoming more common by the day, here’s how you can spot fakes.
Look closely
Although the technology is becoming increasingly accessible and sophisticated, AI-created images still have some shortfalls. If you are unsure whether an image is real, enlarge it. As Deutsche Welle (DW) noted, this can reveal inconsistencies and mistakes that may have gone unnoticed at first glance. You should also analyse the background for clues such as nearby landmarks, road signs and even weather conditions, reported Al Jazeera.
AI images typically have flaws.
Hany Farid, a computer science professor at the University of California, Berkeley, told AP in an email that AI-produced images typically have inconsistencies in the building, fence and surrounding area. In the case of the image of the fake blast at the Pentagon, Farid said, “Specifically, the grass and concrete fade into each other, the fence is irregular, there is a strange black pole that is protruding out of the front of the sidewalk but is also part of the fence. The windows in the building are inconsistent with photos of the Pentagon that you can find online.”
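For readers comfortable with a little scripting, the “enlarge it” step can be automated. The snippet below is a minimal sketch, assuming Python with the Pillow library; the file name and crop box are placeholders, not anything prescribed by the article or the experts quoted here.

```python
# A minimal sketch of the "enlarge and inspect" step, assuming Python with the
# Pillow library; the file name and crop coordinates are placeholders.
from PIL import Image

img = Image.open("suspect.jpg")  # placeholder path to the image being checked

# Crop a region of interest (left, upper, right, lower) -- for example a fence,
# a road sign, or the boundary where grass meets concrete.
region = img.crop((400, 300, 700, 500))

# Upscale the crop 4x with nearest-neighbour resampling so artefacts such as
# warped railings, melted text or inconsistent windows are not smoothed away.
zoomed = region.resize((region.width * 4, region.height * 4), Image.NEAREST)
zoomed.save("suspect_zoomed.png")
```

Nearest-neighbour resampling is chosen deliberately: smoother filters can blur exactly the kind of artefacts you are trying to spot.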
Verify the source
Try to find out the source of the image and check which account first uploaded the post. Take a look at their previous posts, who they follow and who follows them. You can also carry out a reverse image search by uploading the picture to tools such as Google Reverse Image Search, TinEye or Yandex to locate the original source, as per DW (a short scripted sketch of this step appears below). Moreover, in the case of a big news event, check reputable news outlets to confirm whether they have covered it. If something dubious has gone massively viral, you can search fact-checking websites to see if they have debunked the reports.
Be sceptical
If an image seems too good to be true, it probably is not real. Wael AbdAlmageed, a research associate professor of computer science at the University of Southern California, told Scientific American magazine: “We as human species sort of grow up thinking that seeing is believing”.
“That’s not true anymore. Seeing is not believing anymore.”
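The reverse-image-search step mentioned under “Verify the source” can also be scripted. The sketch below assumes Python and an image that is already hosted at a public URL (the example address is a placeholder); the query formats for Google Lens, TinEye and Yandex are the commonly used ones at the time of writing and the services may change them.

```python
# A minimal sketch that builds reverse-image-search links for a picture that is
# already hosted online; the image URL is a placeholder and the query formats
# are the commonly used ones, which the services may change at any time.
from urllib.parse import urlencode

image_url = "https://example.com/suspect.jpg"  # placeholder

lookups = {
    "Google Lens": "https://lens.google.com/uploadbyurl?" + urlencode({"url": image_url}),
    "TinEye": "https://tineye.com/search?" + urlencode({"url": image_url}),
    "Yandex Images": "https://yandex.com/images/search?" + urlencode({"rpt": "imageview", "url": image_url}),
}

# Open each link in a browser (or print them) and compare the earliest matches
# with the account that claims to have taken the picture.
for name, url in lookups.items():
    print(f"{name}: {url}")
```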
AI images often have errors when it comes to depicting body parts. A person in an AI-generated picture might have more or fewer fingers than normal, or comically large hands or feet. These images can also mangle teeth or produce unrealistic ears and other body parts. “There’s a structure to your hands. You have five fingers, not four or six or whatever. The [AI] models have trouble with that kind of structure, although the newer ones are getting better at it,” James O’Brien, a computer science professor at the University of California, Berkeley, was quoted as saying by Discover magazine. Another way to spot an AI image is to look for unnatural perfection. People with flawless, glossy skin are all too common in AI images. “The faces are too pure, the textiles that are shown are also too harmonious,” Andreas Dengel of the German Research Center for AI told DW.
AI photos have this weird glossy look to them that I can’t fully explain but I’m getting better at noticing it. https://t.co/oyFAvAJWzk
— ɪ ᴀᴍ ɴᴏᴛ ᴋᴀʏᴛʀᴀɴᴀᴅᴀ (@chaclobro) March 17, 2023
You should also pay attention to the eyes to spot deepfakes, as most AI-generated videos that imitate people have trouble blinking naturally, noted Al Jazeera (a rough sketch of one such blink check appears below).
Dangers of AI
While some AI images may be shared for entertainment, they still have the potential to spread propaganda and misinformation. Fake images and deepfakes can be used to commit crimes, influence events such as elections, harm someone’s reputation and undermine trust in democratic institutions and the media. AI tools like Midjourney, DALL-E, Stable Diffusion and DeepAI can easily produce realistic images that can deceive millions across the globe.
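Coming back to the blinking cue mentioned above: one common heuristic (not something the article itself prescribes) is the eye aspect ratio, which drops sharply whenever an eye closes. The sketch below assumes Python with NumPy and that you already have six (x, y) eye landmarks per video frame from some face-landmark detector; the threshold values are purely illustrative.

```python
# A minimal sketch of blink counting via the eye-aspect-ratio (EAR) heuristic.
# It assumes six (x, y) landmarks per eye for each frame, obtained from some
# face-landmark detector; the thresholds below are illustrative, not tuned.
import numpy as np

def eye_aspect_ratio(eye: np.ndarray) -> float:
    """eye: array of shape (6, 2) -- two corner and four eyelid landmarks."""
    vertical_1 = np.linalg.norm(eye[1] - eye[5])
    vertical_2 = np.linalg.norm(eye[2] - eye[4])
    horizontal = np.linalg.norm(eye[0] - eye[3])
    return (vertical_1 + vertical_2) / (2.0 * horizontal)

def count_blinks(ear_per_frame, closed_threshold=0.2, min_closed_frames=2):
    """Count blinks as runs of consecutive frames where the EAR stays below the threshold."""
    blinks, closed_run = 0, 0
    for ear in ear_per_frame:
        if ear < closed_threshold:
            closed_run += 1
        else:
            if closed_run >= min_closed_frames:
                blinks += 1
            closed_run = 0
    if closed_run >= min_closed_frames:
        blinks += 1
    return blinks
```

A subject in genuine footage typically blinks several times a minute; a face that never closes its eyes over a long clip is a reason to look closer.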
In the era of deep fake, AI generated images, it is going to be humanly impossible to discern what is fake & real.
— Kumar Manish (@kumarmanish9) March 31, 2023
Pope Francis in Balenciaga deepfake fooled millions. ⬇️ pic.twitter.com/EbZM0LdYED
In an email to AP, Chirag Shah, co-director of the Center for Responsibility in AI Systems & Experiences at the University of Washington in Seattle, warned that detecting fakes will not always remain easy. As AI technology advances, people will need to depend on “crowdsourcing and community vigilance to weed out bad information and arrive at the truth”. “Simply relying on detection tools or social media posts is not going to be enough,” Shah added. O’Brien cautioned, according to Discover magazine, that a future where “we can no longer tell the difference between real photos and AI-generated images” may not be far off.
With inputs from agencies

