Friday, October 25

Fake AI-generated image of explosion near US Pentagon goes viral

A fake image that appeared to show an explosion near the Pentagon briefly went viral on social media and left fact-checkers and the local fire service scrambling to counter the claim.

The image, which purported to show a large cloud of black smoke next to the US headquarters of the Department of Defence, appeared to have been created using AI technology.

It was first posted on Twitter on Monday morning and was quickly recirculated by verified, but fake, news accounts.

Department of Defence spokesperson Phillip Ventura told the Reuters news agency that the reports of an explosion were "false".

The local fire department, Arlington Fire, tweeted: "There is NO explosion or incident taking place at or near the Pentagon reservation, and there is no immediate danger or hazards to the public."

The image was described as "clearly AI generated" by Nick Waters, an investigator at the digital investigations group Bellingcat.

“Check out the frontage of the building, and the way the fence melds into the crowd barriers,” he wrote on Twitter.


He pointed out there were no other photographs, videos or eyewitness accounts of the supposed explosion.

However, that didn't stop Russian state-backed news channel RT tweeting about "reports of an explosion near the Pentagon".

Fake accounts pretending to be news outlets with paid-for blue ticks also retweeted the image.

The hoax is one of a number of AI-generated images that have made headlines recently.

A convincing deepfake of the Pope wearing a white Balenciaga puffer coat went viral, and the winner of the Sony World Photography Awards turned down the prize after admitting his entry was created by AI.

Content Source: news.sky.com