AI Images Sneak Into The News, Get Called Out as Fakes

A large wave crashes into a house on the beach.

As near-instantly rendered AI imagery becomes ever better and easier to generate, it’s finding its way everywhere, often disingenuously.

This has been the case many times in the last year, but one of the most recent examples was called out by none other than Fox Weather in the United States.


Shortly after the powerful category 4 Hurricane Idalia made landfall along the southeastern coast of the United States, a trove of supposed photos depicting gigantic waves smashing houses along a shoreline was posted to a Facebook page called “Outer Banks Photography” and to other social media platforms.

On the Instagram and Facebook accounts of the Outer Banks Photography group, the images were shared with the caption “Idalia is here” despite being completely fake and rendered by AI technology.

While anyone with experience in being bombarded by AI imagery should have been able to tell that the dramatic “photos” are fake with even a minor bit of scrutiny, a great many people apparently couldn’t.

On the Outer Banks social media pages, hundreds of comments piled up praising the supposed photographer behind the images and applauding their skills with a camera.

As the images went viral on both Facebook and Instagram, they received no shortage of attention, garnering over 23 million views and nearly 190,000 shares.

“That’s a lot,” noted Amy Freeze, the Fox Weather meteorologist who called the images out as fakes.

She added that hundreds of comments had been left by people complimenting the photographer: “They are incredible pictures. The only problem is, the images are not real; they are AI-generated.”

She also pointed out that this kind of AI fakery is dangerous because the National Weather Service itself often relies on social media to gather storm reports.

Having these accounts flooded with rendered, made-up imagery will make it harder for the public to take real warnings and photos seriously.

I’ll let you judge for yourself how realistic these particular images are, but at a quick glance it’s easy to see how they could fool the inexperienced, even though plenty of telltale signs give away their rendered nature.

A wave crashes over a beach with houses in the background; a large storm cloud is seen from a balcony overlooking the ocean; a large wave crashes into a house on the beach.

In the case of the Hurricane Idalia “photos,” they were credited to the “Outer Banks Photography” social media pages. The Outer Banks is a string of barrier islands hugging much of the North Carolina coastline, which also lay in the storm’s path as it moved up the coast.

The Outer Banks Photography page itself showcases a lot of AI-generated content among its posts and has many followers.

It’s evident from the comments that many of those followers don’t even realize they’re sometimes looking at invented imagery posing as real photography.

This is the case despite the Facebook page’s review section containing warnings from reviewers that viewers are often looking at fake images.

Outer Banks Photography itself apparently makes no effort to warn anyone that it’s posting what are essentially deepfakes, and according to one of the reviewers who called out its AI posts, the page even blocked her for posting a warning review.

AI imagery used to be very easy to spot thanks to the tendency of AI rendering platforms to produce misshapen body parts (especially hands) and to completely fumble letters, numbers, or any other text.

That, however, has already changed, and the results keep improving, especially since the latest versions of platforms like Midjourney and DALL-E were introduced.

As we previously covered, one cheeky online creator recently used the latest version of Midjourney to render remarkably realistic “photos” of famous political figures during very private sexual moments in hotel rooms.

Two pictures of a man kissing a woman in a hotel room.

The creator in question did this to point out the dangers of AI deepfakes for people who aren’t famous enough to easily discredit images showing them in compromising situations.

He was banned from Midjourney for his experiment, but that doesn’t change the fact that the platform and others like it can easily be used to create digital rivers of maliciously fake content.

The Outer Banks Photography incident is just one modest example of a problem that’s already building toward a storm of digital hurricane proportions.
