For years, Samsung phones that support “Space Zoom” have been known for their ability to take incredibly detailed photos of the Moon. But a recent Reddit post showed in no uncertain terms just how much computational processing the company is doing, and given the evidence provided, it looks like we should just go ahead and say it: Samsung’s Moon pictures are fake.
But what exactly does “fake” mean in this scenario? It’s a difficult question to answer, and one that will become increasingly important and complex as computational techniques are woven further into the photographic process. We can say with certainty that our understanding of what makes a photo fake will soon change, just as it has in the past to accommodate digital cameras, Photoshop, Instagram filters, and more. But for now, let’s stick with the case of Samsung and the Moon.
The test of Samsung phones devised by Reddit user u/ibreakphotos was ingenious in its simplicity. They created an intentionally blurry photo of the Moon, displayed it on a computer screen, and then photographed that screen with a Samsung S23 Ultra. As you can see below, the image on the screen shows no detail at all, but the resulting picture is a crisp, clear “photograph” of the Moon. The S23 Ultra added details that were simply not present before. There was no upscaling of blurry pixels or recovery of seemingly lost data. There was just a new Moon, a fake one.
Here is the blurry image of the Moon that was used:
A GIF of the photo taking process:
And the resulting “photograph”:
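What makes the test conclusive is that blurring and downscaling are information-destroying: the fine detail is gone from the signal, so nothing the camera could legitimately do would bring it back. A minimal sketch of that idea in NumPy, using a synthetic “moon” and a simple block-averaging blur (the data and the detail metric here are my own illustration, not anything from the Reddit test itself):

```python
import numpy as np

rng = np.random.default_rng(0)

# A synthetic "moon": a bright disc with fine-grained surface texture.
size = 128
y, x = np.mgrid[:size, :size]
disc = ((x - size / 2) ** 2 + (y - size / 2) ** 2) < (size / 3) ** 2
texture = rng.normal(0, 0.2, (size, size))  # stand-in for craters and maria
moon = disc * (0.8 + texture)

def downscale_upscale(img, factor):
    """Destroy fine detail: average factor x factor blocks, then stretch
    back to the original size with nearest-neighbour repetition."""
    h, w = img.shape
    small = img.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))
    return np.repeat(np.repeat(small, factor, axis=0), factor, axis=1)

blurry = downscale_upscale(moon, 8)

# Crude "detail energy": variance of the image around its block-averaged copy.
detail_before = np.var(moon - downscale_upscale(moon, 8))
detail_after = np.var(blurry - downscale_upscale(blurry, 8))

print(f"detail before: {detail_before:.4f}")  # texture is present
print(f"detail after:  {detail_after:.4f}")   # effectively zero: texture is gone
```

Any detail that appears in a “photo” of the blurry version therefore has to come from somewhere other than the scene.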
This is not a new controversy. People have been asking questions about Samsung’s Moon photography ever since the company introduced a 100x “Space Zoom” feature on its S20 Ultra in 2020. Some have accused the company of simply copying and pasting pre-stored textures onto images of the Moon to produce its photographs, but Samsung says the process is more complicated than that.
In 2021, Input Mag published a lengthy feature on the “fake detailed moon photos” taken by the Galaxy S21 Ultra. Samsung said in the post that “no image overlaying or texture effects are applied when taking a photo,” but that the company uses AI to detect the presence of the Moon and “then offers a detail enhancing function by reducing blurs and noises.”
The company later provided a bit more information in this blog post (translated from Korean by Google). But the core of the explanation, the description of the vital step that takes us from a photograph of a blurry Moon to a sharp Moon, is couched in obfuscating terms. Samsung simply says it uses a “detail enhancement engine function” to “effectively removes noise and maximizes moon detail to complete a bright, clear moon image” (emphasis added). What does that mean? We just don’t know.
A “detail enhancement engine feature” is to blame.
The generous interpretation is that Samsung’s process captures whatever blurry detail is present in the original photograph and then augments it with AI. This is an established technique that has its problems (see: the Xerox copiers that altered digits when compressing fuzzy originals), but I don’t think it would make the resulting photo fake. As the Reddit tests show, though, Samsung’s process is more intrusive than that: it doesn’t just sharpen blurry details, it creates them. At that point, I think most people would agree the resulting picture is, for better or worse, fake.
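One way to see the line being crossed: conventional sharpening, such as an unsharp mask, only amplifies contrast the sensor actually recorded, so fed a featureless image it returns a featureless image. A generative “detail enhancement” step has no such constraint. A toy illustration in NumPy (the data and helper names are mine, purely for illustration, not Samsung’s pipeline):

```python
import numpy as np

def box_blur(img, k=5):
    """Simple separable box blur with edge padding."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    kernel = np.ones(k) / k
    rows = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="valid"), 1, padded)
    return np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="valid"), 0, rows)

def unsharp_mask(img, amount=1.5):
    """Classic sharpening: add back the difference from a blurred copy.
    It can only boost detail that is already in the image."""
    return img + amount * (img - box_blur(img))

flat = np.full((32, 32), 0.5)        # a featureless grey "moon"
sharpened = unsharp_mask(flat)

# Sharpening a featureless image yields a featureless image.
print(np.allclose(sharpened, flat))  # True: no detail was invented
```

Samsung’s output fails this sanity check by construction: it produces crater detail from an input that contains none, which is what puts it on the “generated” side of the line.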
The difficulty here is that “fakeness” is a spectrum rather than a binary. (As are all the categories we use to divide the world.) For photography, the standard of “realness” is usually defined by the information received by an optical sensor: the light captured when you take the shot. You can then edit this information quite extensively, the way professional photographers tweak RAW images, adjusting color, exposure, contrast, and so on, and the end result is still not fake. In this particular case, though, the Moon images produced by the Samsung phone look less like the result of optical data and more like the product of a computational process. In other words: it’s a generated image rather than a photo.
Some may disagree with this definition, and that’s okay. The distinction will also become much harder to draw in the future. Ever since smartphone manufacturers began using computational techniques to overcome the limits of small camera sensors, the mix of “optically captured” and “software generated” data in their output has been shifting. We’re certainly headed for a future where techniques like Samsung’s “detail enhancement engine” become more commonplace and more widely applied. You could train such engines on all kinds of data: on the faces of your family and friends, so you never take a bad photo of them, or on famous landmarks, to improve your holiday snaps. In time, we’ll probably forget we ever called these images fake.
Samsung says “no image overlay or texture effect is applied when taking a photo”
But for now, Samsung’s Moon images stand out, and I think that’s because they’re a particularly convenient application for this kind of computational photography. For one thing, pictures of the Moon are visually pleasing. The Moon looks more or less the same in every image taken from Earth (ignoring libration and rotational differences), and while it has detail, it lacks depth. That makes AI enhancements relatively simple to add. And second: Moon photography is marketing catnip, because a) everyone knows phones take bad pictures of the Moon and b) everyone can test the feature for themselves. That has made it an easy way for Samsung to showcase the photographic abilities of its phones. Just check out this ad for the S23 Ultra, which zooms in on the Moon 11 seconds in:
It is this viral appeal that has landed the company in trouble. By failing to properly explain the feature, Samsung has allowed many people to mistake its AI-enhanced images for the product of a physics-defying optical zoom that couldn’t possibly fit inside a smartphone. That, in turn, has motivated others to debunk the images (because the tech world loves a scandal). Samsung doesn’t exactly claim its Moon shots are representative of all its zoomed-in photography, but a consumer would be forgiven for thinking they are, so it’s worth highlighting what’s really going on.
Ultimately, photography is changing, and our understanding of what constitutes a “real photo” will change with it. But for now, it seems fair to conclude that Samsung’s Moon photos are more fake than real. In a few years, that judgment will probably no longer hold. Samsung did not immediately respond to The Verge’s request for comment, but we’ll update this piece if the company gets back to us. In the meantime, if you want to take an unenhanced photo of the Moon with your Samsung device, just turn off the “Scene Optimizer” feature and prepare to capture a blurry circle in the sky.