On May 10 this year, the strongest geomagnetic storm in more than two decades occurred, producing a Northern Lights display that reached parts of the world where it is rarely seen. I was lucky enough to hear about it while I still had plenty of time to get out with my camera gear, and it was a truly memorable evening.
The G5-rated geomagnetic storm was so strong that the aurora borealis could be seen even where I live in southern Britain, dancing in the night sky and lighting up the landscape with otherworldly hues. I wrote about my experiences that night, pitting a mirrorless camera against a smartphone to see which could take the best Northern Lights photos.
More recently, on October 11, there was another chance to see the aurora, but unfortunately on this occasion I only found out about it the following day. And yes, the images on my news and social feeds were spectacular.
The night sky had been clear, I was home with no particular plans, and I had learned a thing or two about photographing the lights from my first experience that I wanted to put into practice if I was ever lucky enough to see them again – so I was heartbroken to have missed it.
A few days after the event, as a steady stream of spectacular photos popping up in my social media feeds only added to my disappointment, Meta posted a sentiment on Threads that fell completely flat – and worse.
“POV: You Missed the Northern Lights IRL, So You Made It Yourself with Meta AI” read Meta’s post on Threads, alongside an AI-generated image of the aurora over famous landmarks including the Golden Gate Bridge (see the main image above). Cue a Meta roasting.
It’s okay to lose some
I was one of those people who missed the Northern Lights IRL. But even though I was disappointed to have missed it, especially after seeing so many incredible images from photographers all over the world, and from the UK in particular, I’m not going to recreate the experience of capturing the real thing, whether it’s with AI or with an old-school Photoshop fake.
For me, such events are mainly about being present and experiencing them in the moment. I echo the sentiment of my colleague Phil Berne, who gave up trying to capture April’s rare total solar eclipse in the US with an array of camera equipment in order to simply enjoy a once-in-a-lifetime event.
When I first witnessed the Northern Lights in May (you can see one of my photos above), the night had an otherworldly feel and the atmosphere was electric. I got distracted by taking photos and time-lapses, to my own detriment. I wish I had taken a few photos that I was artistically satisfied with, then put my camera away and spent more of my time simply taking it all in. I don’t need a photo to prove I was somewhere, especially if it limits my enjoyment of the moment.
On the most recent occasion I missed it, but that’s fine with me; you win some, you lose some. But if you listen to Meta, you need never lose anything, thanks to the magic of AI-powered photo editing. I don’t share that sentiment. FOMO is something Meta has peddled to the less secure among us ever since it was born as Facebook. Yet Meta’s post last week wasn’t just in bad taste – it also has a dark side.
It’s not okay to fake it
Look, I get it: Meta’s post on Threads was just a way to show off its new AI image generator’s skills. But the message it contained not only went down like a lead balloon, it also carried a sinister implication: that AI lets you pretend you were at an event.
It’s one thing to use photo editing tools creatively – although image manipulation has been a gray area since the advent of Photoshop – but spoofing elements of real events with AI editing and image generation? That’s not okay. If you’re going to use Meta AI’s tools to falsify reality, where does it end? Creating fake news (with potentially catastrophic consequences) is just one example of the dark side of AI.
AI-powered photo editors and image generators can be wonderful tools, helping you realize your creative vision with ease. On its own, adding the Northern Lights to photos actually seems like a good use of Meta AI, as you can get reasonable results (however scientifically inaccurate they may be). But sharing those images in a way that makes people believe you were there? That’s a no from me.
In the wrong hands – and there is no control over whose hands it ends up in – AI image generation can be all too convincing, to the point where we simply no longer know what is real. If Meta is actively promoting the misleading use of AI image generation when it should be leading the fight against it, what hope do we have?