Modern phone camera algorithms apply so much processing that this photo of the Moon is, in effect, fake.
This story, first spotted on Reddit, reveals the extent of the processing Samsung cameras apply to photos of the Moon, further blurring the line between real and fake images in the age of artificial intelligence (AI).
For years, Samsung phones with the “Space Zoom” feature have been known for their ability to take remarkably detailed pictures of the Moon. But a recent Reddit post demonstrated in stark terms how much computer processing the company’s smartphones perform and, given the evidence provided, the photos of the Moon taken by Samsung are fake.
What makes a photo fake? That is a thorny question in the age of digital cameras, Photoshop, Instagram filters, and more.
A comparative test of Moon images
The test run by Reddit user u/ibreakphotos was ingenious in its simplicity. They created a deliberately blurry photo of the Moon, displayed it on a computer screen, then photographed that on-screen image with a Samsung S23 Ultra.
As you can see below, the image on the screen showed no detail, yet the resulting picture was a clear, crisp “photograph” of the Moon. The S23 Ultra added details that were not there before. It did not upscale blurry pixels or recover seemingly lost data; it simply produced a new Moon, and a fake one.
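The key point of the test is that blurring is destructive: once detail has been averaged away, no processing can mathematically recover it, so anything that reappears must have been invented. A minimal pure-Python sketch of the idea (a naive box blur standing in for the Gaussian blur the Reddit user applied; names are illustrative):

```python
def box_blur(img, radius=2):
    """Naive box blur over a 2-D grayscale image (list of lists of ints).

    Each output pixel is the average of its neighborhood, so fine detail
    (high-frequency content) is irreversibly averaged away.
    """
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total, count = 0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += img[ny][nx]
                        count += 1
            out[y][x] = total // count
    return out

# A fine checkerboard is "maximum detail": pure black/white alternation.
detail = [[255 if (x + y) % 2 == 0 else 0 for x in range(8)] for y in range(8)]
blurred = box_blur(detail, radius=2)
# After blurring, every pixel is a mid-gray mixture: the original pattern
# cannot be reconstructed from these values alone.
```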
The controversy is not new. People have been asking questions about Samsung’s Moon photography ever since the company unveiled the 100x Space Zoom feature on the S20 Ultra in 2020. Some have accused the company of simply copying and pasting preset textures onto images of the Moon, but Samsung says the process is more involved than that.
From 2021…
In 2021, Input Mag published a long article about the “detailed fake photos of the Moon” taken by the Galaxy S21 Ultra. Samsung told the publication that “no image overlay or texture effects are applied when taking a photo,” but that the company uses AI to detect the presence of the Moon and “then offers a detail-enhancement function by reducing blur and noise.”
The more generous interpretation is that Samsung’s process captures blurry details in the original photograph and upscales them using artificial intelligence. This is a tried-and-tested technique, but it has its problems. As the Reddit user’s test shows, Samsung’s process is more intrusive than that: it doesn’t just sharpen blurry details, it creates them. It is at this point that most people will agree the resulting image is, for better or worse, fake.
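The distinction matters because conventional upscaling can only interpolate between pixels that already exist. A toy nearest-neighbor sketch (illustrative only, not Samsung’s pipeline) makes the limit concrete: every output value is copied from the input, so detail absent from the source can never appear in the result.

```python
def nearest_neighbor_upscale(img, factor):
    """Upscale a 2-D grayscale image (list of lists) by pixel repetition.

    Every output pixel is copied from an input pixel, so the result
    contains no information that was not already in the source.
    """
    return [[img[y // factor][x // factor]
             for x in range(len(img[0]) * factor)]
            for y in range(len(img) * factor)]

small = [[10, 20],
         [30, 40]]
big = nearest_neighbor_upscale(small, 2)
# big is 4x4, but its pixel values are exactly {10, 20, 30, 40}:
# the image is larger, not more detailed.
```

Smarter interpolation (bilinear, bicubic) smooths between those values, but the same principle holds; only a generative model can add plausible-looking detail that was never captured, which is exactly what the test suggests Samsung’s phones do.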
In this particular case, the Moon images captured by Samsung’s phone appear to be less the result of optical data than the product of computational processing. In other words, the output is more a generated image than a photo.