
Modern smartphones shape how our memories appear, from subtle enhancements to entirely hallucinated facial features. The results may please us, but they can also alter our perception of reality, the BBC reports.
Have you ever tried to photograph the Moon with your phone? Unless you own a Samsung Galaxy, the attempt probably failed. Devices from this brand offer a "100x Space Zoom" feature that allows you to capture the Moon with astonishing clarity. However, it is worth noting that the Moon photos Samsung produces are, in effect, fakes.
One Reddit user demonstrated this by holding his Samsung up to a blurry image of the Moon on a computer screen. The phone effortlessly created a clear shot, rich with craters and shadows that were not present in the original image. Samsung calls this a "detail enhancement feature," but in fact, artificial intelligence has been trained to recognize the Moon and add missing details when the camera cannot capture them.
Not all smartphones have such an impressive feature by default. Nevertheless, regardless of the model, each press of the camera button triggers a series of algorithms and AI-based processing operations running in the background. These systems can perform trillions of operations before the image is saved in your gallery.
These technologies aim to create beautiful and often accurate photographs. However, in some cases, smartphones apply AI enhancements that can differ significantly from what you see with your own eyes. So next time you take a photo, consider: is your camera capturing reality or trying to change it?
“This is called computational photography,” says Ziv Attar, CEO of Glass Imaging, who was involved in developing the portrait mode for the iPhone. “Your phone does much more than just gather light hitting the sensors. It tries to predict how the image should look if the camera were better and recreates that for you,” he adds.
A Samsung representative says: “AI-based features are designed to enhance image quality while preserving its authenticity,” adding that “users can disable AI features according to their preferences.”
However, even when such features are turned off, algorithms continue to process your photos.
What happens when you press the shutter?
“When you press the ‘Capture’ button, your phone doesn’t take one shot, but usually between four and ten shots in good lighting,” explains Attar. The device combines these images to create one higher-quality frame. Some shots may be duplicates, while others may be focused on different parts of the scene.
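The multi-frame step Attar describes can be illustrated with a toy sketch. Everything here is an illustrative assumption (the frame count, noise level, and plain averaging), not any vendor's actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# A hypothetical "true" scene: a flat gray patch (pixel values in 0..1).
scene = np.full((64, 64), 0.5)

# The sensor adds random noise to every shot; the phone takes several.
def capture(scene, noise_sigma=0.1):
    return scene + rng.normal(0.0, noise_sigma, scene.shape)

shots = [capture(scene) for _ in range(8)]

# Averaging the aligned shots suppresses noise roughly by sqrt(N).
stacked = np.mean(shots, axis=0)

noise_single = np.std(shots[0] - scene)
noise_stacked = np.std(stacked - scene)
```

Real pipelines must also align the frames and reject motion-blurred ones, but the core idea is the same: many noisy samples of one scene average out to a cleaner frame than any single exposure.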
These basic processes eliminate flaws that an average person likely wouldn’t want to see. For example, noise reduction smooths the texture of the image, removing graininess. Color correction brings the photograph closer to what you see in real life. High Dynamic Range (HDR) is also used, which combines several shots taken under different lighting conditions to preserve details in both shadows and highlights. Your phone actively combats blurriness using a variety of methods.
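The HDR step can be sketched in the same spirit. This is a minimal exposure-fusion heuristic; the exposure ratios and the mid-gray weighting function are assumptions for illustration, not a real phone's algorithm:

```python
import numpy as np

# Two hypothetical exposures of one scene, clipped to the sensor range 0..1:
# the short exposure preserves highlights, the long one preserves shadows.
scene = np.linspace(0.05, 2.0, 256)          # true radiance, exceeds sensor range
short_exp = np.clip(scene * 0.5, 0.0, 1.0)   # darker shot, highlights intact
long_exp  = np.clip(scene * 2.0, 0.0, 1.0)   # brighter shot, shadows intact

# Weight each pixel by how far it is from clipping (well-exposed pixels,
# near mid-gray, get the highest weight).
def weight(img):
    return 1.0 - np.abs(2.0 * img - 1.0) + 1e-6

w_s, w_l = weight(short_exp), weight(long_exp)
fused = (w_s * short_exp + w_l * long_exp) / (w_s + w_l)
```

Where the long exposure blows out to pure white, the fused frame falls back on the short exposure and keeps gradation in the highlights; where the short exposure crushes the shadows, the long one takes over.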
For instance, the iPhone employs a Deep Fusion feature, based on AI trained on millions of images. These neural networks not only apply the processing methods described above but can also identify objects in the photograph, treating each one differently and altering individual pixels based on other images seen during training. “This is very high-level segmentation,” says Attar.
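The segmentation idea reduces to this: once regions are labeled, each gets its own processing. In the sketch below the mask is drawn by hand and the filters are toy stand-ins (real phones derive masks from neural networks and use far more sophisticated filters):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy image and a hypothetical segmentation mask (True = "subject").
image = rng.uniform(0.0, 1.0, (32, 32))
mask = np.zeros((32, 32), dtype=bool)
mask[8:24, 8:24] = True

# 3x3 box blur via shifted sums (a stand-in for a denoising filter).
def box_blur(img):
    out = np.zeros_like(img)
    padded = np.pad(img, 1, mode="edge")
    for dy in range(3):
        for dx in range(3):
            out += padded[dy:dy + 32, dx:dx + 32]
    return out / 9.0

blurred = box_blur(image)
sharpened = np.clip(image + (image - blurred), 0.0, 1.0)  # unsharp mask

# Per-region processing: smooth the background, sharpen the subject.
result = np.where(mask, sharpened, blurred)
```

The point is the `np.where` at the end: two different renderings of the same pixels, chosen per region, which is why a face and the sky behind it can receive completely different treatment in one photo.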
The result is sharp, high-quality photographs in good lighting. However, some critics and attentive amateur photographers argue that modern phones sometimes go overboard, producing images with unnatural, plastic textures or flat, watercolor-like effects. Devices remove flaws so aggressively that, if you zoom in on fine details, strange distortions in the form of AI hallucinations can appear. Some users are so frustrated with overly processed photos on new models that they revert to older versions or use a second phone solely for photography.
“At Apple, we have always aimed to help users capture the real thing so they can relive their memories,” says a company representative. “While we see great potential in AI, we also value the traditions of photography and believe it should be treated with care. We continue to focus on developing devices that create real and authentic photographs that look great and provide users with tools to personalize these shots as they wish.”
On the positive side, most edits could be done manually if one had enough experience and patience. Now, “instead of struggling with numerous settings, we have automation,” says Lev Manovich, a professor of digital culture and media at the City University of New York. “Some features that were previously available only to professionals are now accessible to amateurs.”
However, your phone often makes creative, and sometimes artistic, decisions about what you are shooting. Users may not even realize this, as on some devices, artificial intelligence does much more than just adjust settings.
“I believe smartphone manufacturers genuinely want photographs to reflect what users captured. They are not trying to create fake images,” says Rafal Mantiuk, a professor of graphics and displays at the University of Cambridge. “However, there is a lot of freedom for creativity in image processing. Each phone has its own style. Pixel has one style, Apple has another. It’s reminiscent of the work of different photographers.”
“This is pure hallucination”
In these discussions lies an implicit standard: the notion that a “real” photograph should look like a shot taken on film. That comparison may not be entirely accurate. Every camera has always involved some form of image processing. It is easy to hear the term “AI” and assume something negative. In reality, in many cases algorithms correct flaws inherent in the small lenses and sensors used in phone cameras.
However, some features may cross the line of acceptability.