title: Samsung caught faking zoom photos of the Moon
url: https://www.theverge.com/2023/3/13/23637401/samsung-fake-moon-photos-ai-galaxy-s21-s23-ultra
hash_url: e990536ed8
archive_date: 2024-01-13
og_image: https://cdn.vox-cdn.com/thumbor/-B56-pAq80Llm4zMVYQPa5UT6D4=/0x0:3000x2000/1200x628/filters:focal(1500x1000:1501x1001)/cdn.vox-cdn.com/uploads/chorus_asset/file/24500739/bXJOZgI.jpg
description: A Reddit post has revealed just how much post-processing the Galaxy S23’s camera applies when it detects it’s taking a photo of the Moon, inserting extra detail that isn’t present in reality.
favicon: https://www.theverge.com/icons/favicon_32x32.png
language: en_US
For years, Samsung “Space Zoom”-capable phones have been known for their ability to take incredibly detailed photos of the Moon. But a recent Reddit post showed in stark terms just how much computational processing the company is doing, and — given the evidence supplied — it feels like we should go ahead and say it: Samsung’s pictures of the Moon are fake.
But what exactly does “fake” mean in this scenario? It’s a tricky question to answer, and one that’s going to become increasingly important and complex as computational techniques are integrated further into the photographic process. We can say for certain that our understanding of what makes a photo fake will soon change, just as it has in the past to accommodate digital cameras, Photoshop, Instagram filters, and more. But for now, let’s stick with the case of Samsung and the Moon.
The test of Samsung’s phones conducted by Reddit user u/ibreakphotos was ingenious in its simplicity. They created an intentionally blurry photo of the Moon, displayed it on a computer screen, and then photographed that screen using a Samsung Galaxy S23 Ultra. As you can see below, the image on the screen showed no detail at all, but the resulting picture showed a crisp and clear “photograph” of the Moon. The S23 Ultra added details that simply weren’t present before. This wasn’t upscaling of blurry pixels or retrieval of seemingly lost data. There was just a new Moon — a fake one.
Here’s the blurry image of the Moon that was used:
A GIF of the photo-taking process:
And the resulting “photograph”:
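The blur step of this test is easy to reproduce yourself. Below is a minimal sketch in Python using the Pillow library; the source filename, target size, and blur radius are illustrative assumptions, not the exact values from the Reddit post.

```python
# Sketch of the "intentionally blurry Moon" step of the test, assuming
# Pillow is installed and a high-resolution Moon photo is saved locally.
# The filename, target size, and blur radius are illustrative assumptions.
from PIL import Image, ImageFilter

original = Image.open("moon.jpg")

# Downscale aggressively so fine crater detail is genuinely destroyed,
# then blur what remains. Any sharp detail in a phone photo of this image
# cannot have come from the image itself.
small = original.resize((170, 170), resample=Image.LANCZOS)
blurred = small.filter(ImageFilter.GaussianBlur(radius=4))

blurred.save("moon_blurred.png")
```

Display the result full-screen on a monitor, photograph it from across the room at full zoom, and compare the camera’s output against a file you know contains no detail.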
This is not a new controversy. People have been asking questions about Samsung’s Moon photography ever since the company unveiled a 100x “Space Zoom” feature in its S20 Ultra in 2020. Some have accused the company of simply copying and pasting prestored textures onto images of the Moon to produce its photographs, but Samsung says the process is more involved than that.
In 2021, Input Mag published a lengthy feature on the “fake detailed moon photos” taken by the Galaxy S21 Ultra. Samsung told the publication that “no image overlaying or texture effects are applied when taking a photo” but that the company uses AI to detect the Moon’s presence and “then offers a detail enhancing function by reducing blurs and noises.”
The company later offered a bit more information in this blog post (translated from Korean by Google). But the core of the explanation — the description of the vital step that takes us from a photograph of a blurry Moon to a sharp Moon — is dealt with in obfuscatory terms. Samsung simply says it uses a “detail improvement engine function” to “effectively remove noise and maximize the details of the moon to complete a bright and clear picture of the moon” (emphasis added). What does that mean? We simply don’t know.
The generous interpretation is that Samsung’s process captures blurry details in the original photograph and then upscales them using AI. This is an established technique that has its problems (see: Xerox copiers altering numbers when upscaling fuzzy originals), and I don’t think it would make the resulting photograph fake. But as the Reddit tests show, Samsung’s process is more intrusive than this: it doesn’t just improve the sharpness of blurry details — it creates them. It’s at this point that I think most people would agree the resulting image is, for better or worse, fake.
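The distinction is easy to demonstrate for yourself: conventional upscaling can only interpolate between the pixels it is given, so detail destroyed before capture stays destroyed. Here’s a quick round-trip experiment in Python that makes the point; the filename and sizes are arbitrary assumptions.

```python
# Round-trip an image through a heavy downscale and a bicubic upscale,
# then measure how far the result is from the original. Bicubic
# interpolation stands in for the "generous" reading of Samsung's
# pipeline; the filename and sizes are arbitrary assumptions.
from PIL import Image
import numpy as np

original = Image.open("moon.jpg").convert("L")

# Throw away most of the information, then try to get it back.
tiny = original.resize((170, 170), resample=Image.LANCZOS)
restored = tiny.resize(original.size, resample=Image.BICUBIC)

a = np.asarray(original, dtype=np.float64)
b = np.asarray(restored, dtype=np.float64)
rmse = np.sqrt(np.mean((a - b) ** 2))
print(f"RMSE between original and round-tripped image: {rmse:.1f}")
# The error never reaches zero: interpolation can only smooth between
# existing pixels. Crisp craters in the output must come from elsewhere.
```

If Samsung’s engine were merely upscaling, its output would be bound by the same limit; the Reddit test shows it isn’t.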
The difficulty here is that the concept of “fakeness” is a spectrum rather than a binary. (Like all the categories we use to divide the world.) For photography, the standard of “realness” is usually defined by the information received by an optical sensor: the light captured when you take the photo. You can then edit this information fairly extensively, the way professional photographers tweak RAW images, adjusting color, exposure, contrast, and so on, and the end result is still not fake. In this particular case, though, the Moon images captured by Samsung’s phone seem less the result of optical data and more the product of a computational process. In other words: it’s a generated image more than a photo.
Some may not agree with this definition, and that’s fine. Drawing this distinction is also going to become much trickier in the future. Ever since smartphone manufacturers started using computational techniques to overcome the limits of smartphones’ small camera sensors, the mix of “optically captured” and “software-generated” data in their output has been shifting. We’re certainly heading toward a future where techniques like Samsung’s “detail improvement engine” will become more common and more widely applied. You could train such engines on all sorts of data: the faces of your family and friends to make sure you never take a bad photo of them, or famous landmarks to improve your holiday snaps. In time, we’ll probably forget we ever called such images fake.
But for now, Samsung’s Moon imagery sticks out, and I think this is because it’s a particularly convenient application for this sort of computational photography. For a start, Moon photography is visually amenable. The Moon looks more or less the same in every picture taken from Earth (ignoring librations and rotational differences), and while it has detail, it doesn’t have depth. That makes AI enhancements relatively straightforward to add. And secondly, Moon photography is marketing catnip, because a) everyone knows phones take bad pictures of the Moon and b) everyone can test the feature for themselves. That’s made it an easy way for Samsung to illustrate the photographic prowess of its phones. Just check out this advert for the S23 Ultra, with a Moon zoom at the 11-second mark:
It’s this viral appeal that’s gotten the company into trouble. Without properly explaining the feature, Samsung has allowed many people to confuse its AI-improved images for a physics-defying optical zoom that cannot fit in a smartphone. In turn, that’s made others keen to debunk the images (because the tech world loves a scandal). Samsung doesn’t exactly claim its Moon shots are representative of all its zoom photography, but a consumer would be forgiven for thinking this, so it’s worth emphasizing what’s really going on.
Ultimately, photography is changing, and our understanding of what constitutes a “real photo” will change with it. But for the time being, it seems fair to conclude that Samsung’s Moon photographs are more fake than real; in a few years’ time, that may no longer hold. Samsung did not immediately respond to The Verge’s request for comment, but we’ll update this piece if the company gets back to us. In the meantime, if you’d like to take an unadulterated photo of the Moon using your Samsung device, just turn off the “Scene Optimizer” feature and get ready to snap a picture of a blurry circle in the sky.