Technological improvements are like going to the movies or seeing a magic show: you want to be impressed, but it works better when you don't see what's going on behind the scenes. You don't want to know if there's a trapdoor or strings holding people up as they float in the air, because that takes away some of the magic itself.
Apple often ends up here. The company's ethos is rooted in the desire to provide technology that feels magical and amazing to customers. With each passing year and with the release of each new device, Apple loves to show off new features, but some of the biggest technological breakthroughs happen on a level completely invisible to users.
In such cases, the company faces the difficult task of convincing customers just how advanced some of these technologies are without going into detail. And with the flood of AI features on the horizon, the company has a big task ahead of it if it wants to remain the best example of magical, invisible technology.
Screen designed for two people
The idea of invisible technology finally hit me when Apple showed off the new Ultra Retina XDR display for the iPad Pro. Not only does the display have two separate OLED panels stacked on top of each other, it also requires a carefully calibrated map of all the different brightness levels (which can vary greatly between OLED pixels) to ensure colors are displayed evenly. It's an enormous amount of work for an end result you'll hopefully never notice. ("Look how uniform all my reds are!" is something no one has ever said.)
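Apple hasn't published how this calibration works, but the general idea of per-pixel uniformity correction can be sketched in a few lines: store a measured correction factor for every pixel, then scale each requested value by that factor before it reaches the panel. Everything below (the function name, the map values) is a hypothetical illustration, not Apple's actual pipeline.

```python
import numpy as np

def apply_calibration(frame: np.ndarray, calibration_map: np.ndarray) -> np.ndarray:
    """Scale each pixel by its measured correction factor, clamped to 8-bit range."""
    corrected = frame.astype(np.float64) * calibration_map
    return np.clip(np.rint(corrected), 0, 255).astype(np.uint8)

# A flat gray frame: every pixel is asked to show the same value...
frame = np.full((2, 2), 200, dtype=np.uint8)

# ...but two pixels were measured as slightly off-target at the factory,
# so their correction factors boost or trim them back toward uniformity.
cal = np.array([[1.00, 1.00],
                [1.11, 0.95]])

print(apply_calibration(frame, cal))
```

The real task is vastly harder (millions of pixels, multiple brightness levels, two stacked panels), but the principle is the same: a big table of corrections applied invisibly on every frame.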
This display also requires an entirely new display controller built into Apple's M4 chip, and building a new feature into a system-on-a-chip is no simple task. A lot of time, energy and money has been spent building technology that ultimately only gets attention when something goes wrong.
Picture perfect
Perhaps the best example of Apple's invisible technology is a feature that has become the central attraction of smartphones: the camera. The amount of computational work required to take a “simple” photo is much greater than the average user realizes.
Analog cameras were, in principle, relatively simple: press the shutter button and the light entering through the lens exposed the light-sensitive film. You could change various aspects of the image with factors like the aperture and how long the shutter stayed open, but at the most basic level, the image the lens captured is what ended up on the film.
Contrast this with Apple's computational photography, where the camera often captures multiple images in quick succession and combines elements of them so that the result looks like what the eye sees. It all happens automatically and invisibly the moment you press the shutter, and you never notice it.
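One of the simplest forms of this multi-frame trick can be sketched in code: capture a burst of noisy frames and average them, which suppresses random sensor noise while preserving the scene. This is a toy illustration of the general principle, not Apple's actual (and far more sophisticated) pipeline.

```python
import numpy as np

def merge_burst(frames: list[np.ndarray]) -> np.ndarray:
    """Average a burst of same-size frames into one lower-noise image."""
    stack = np.stack([f.astype(np.float64) for f in frames])
    merged = stack.mean(axis=0)
    return np.clip(np.rint(merged), 0, 255).astype(np.uint8)

# Simulate a burst: a flat gray scene plus random sensor noise in each frame.
rng = np.random.default_rng(0)
scene = np.full((64, 64), 128.0)
burst = [np.clip(scene + rng.normal(0, 20, scene.shape), 0, 255).astype(np.uint8)
         for _ in range(8)]

merged = merge_burst(burst)

# Averaging N frames shrinks random noise by roughly sqrt(N).
print("single-frame noise:", np.std(burst[0].astype(float) - scene))
print("merged noise:      ", np.std(merged.astype(float) - scene))
```

Real computational photography goes much further, aligning frames, weighting exposures, and fusing them per-region, but even this toy version shows why one press of the button can involve many captures behind the scenes.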
But that's the goal: making beautiful photos look as easy as pressing a button. Even though Apple offers controls for things like exposure time, and even different simulated "lenses" on the iPhone 15 Pro, it would clearly prefer that you never have to touch them at all, and most users probably never do.
Quiet intelligence
So how does this square with artificial intelligence, which seems to be a contractual requirement for all technology these days?
Apple's platform updates this year are expected to focus prominently on artificial intelligence across all of its operating systems. While it's not yet clear exactly how the technology will be deployed, it's not hard to imagine that Apple wants it to be as seamless and transparent as possible. That's a challenge because, as much of today's AI shows us, the results are often far from invisible, or worse, invisible in a bad way. Apple certainly doesn't want examples of AI-generated artwork with the wrong number of fingers, or of Siri giving weird answers to questions about pizza.
But many of these problems are inherent in the nature of generative AI. It's unreasonable to expect that Apple has somehow addressed these flaws in the relatively short time the company has been developing these features. All of this tells me that while Apple may have ambitions to show off powerful features that leverage its AI prowess, those features may not be quite what we expect — or what the competition is flaunting.
Since Apple prioritizes invisible technology that "just works," I expect these AI features to look quite different from what we've seen from Google, Microsoft, and OpenAI. Don't expect bedtime stories, AI-powered search results, or a feature that lets you search your entire computer history. What Apple releases will be intended to blend in and disappear, giving you the information you need without drawing attention to itself, in the same way you press the shutter button and get the exact photo you thought you took.
Translated by Peter Ahrnstedt
This article originally appeared in our sister publication Macworld and was translated from English.