
Yet Another Apple Event, also known as an opportunity for The New York Times columnists to mock it, is in the rearview mirror. Unlike the Times, I was impressed that Apple created a machine (the Mac Studio with M1 Ultra) that is about the size of three hard drives and is 25 percent as powerful as IBM's Watson supercomputer. 21 teraflops for about $8000 — sounds like a pretty big deal to me.
None of the products announced by Apple's CEO came as a surprise — the show, as I said on Twitter, is "the video form of all the rumors and leaks that emerge a few days before the event itself." As expected, Apple announced a new iPad Air (with an M1 chip), an Alpine Green iPhone 13 Pro, and a new iPhone SE.
I can't stop thinking and obsessing about it. And that is despite the fact that I currently edit my photos on Apple's top-of-the-line Pro Display XDR. With 1600 nits of brightness and a 32-inch 6K screen, it allows me to get to the fine line between "black and white" and "color" that I seek in my artistic interpretation of reality. And despite that, I am feeling a little jealous of this new Studio Display, which costs between $1600 and $2300.
It is a mere 27-inch 5K Retina screen with only 14.7 million pixels and just 600 nits of brightness. The XDR (which costs about four times as much) runs circles around it as a screen. Aiden Fitzpatrick, who runs Reincubate, maker of the fantastic Camo app (which turns your iPhone into a webcam), points out in an email that "it's the same as the 2014 iMac 5K or 5K LG display." In other words, it is essentially a seven-year-old screen. "People will buy these screens because… well, the market is under-served and they feel XDRs are expensive. It's better than the crap screens in the market, but it's not a game-changer," he writes.
And yet, I keep coming back to it. Why?
Because sometimes, a computer display is not just a display — it is a peek into the future.
Apple has put an A13 Bionic — the chip that powered the iPhone 11 Pro three years ago (I wrote about it in Wired) — inside this display. Think about it: that much computing oomph now lives inside the display, in the process turning it into a device that does much more than serve as a dumb screen.
It has GPU power to make video on the screen appear more fluid. It has a neural engine to make Siri less dumb, make music sound more spatial, and help its microphones discern between voice and noise. It has enough machine-learning capability to morph the screen into something more personal — though I don't think any amount of chip power can make Siri understand my accent.
Suddenly you can see why the Studio Display has a 12-megapixel ultrawide camera, a three-microphone array, four woofers, and two high-performance tweeters. In other words, this is not only a high-end, high-resolution video-conferencing system; it is also as good as a HomePod. If you own an M1 Mac, it will do portrait mode. (Booyah!) Since it can get firmware upgrades, don't be surprised if portrait mode keeps improving over the life of this display. I am pretty sure the video quality will match that of a modern iPad.
And don't be surprised if you can AirPlay to it even when the computer and the screen are in sleep mode. Why? Because it is evident that it isn't relying on the Mac's internal chip. I wondered if the display was running iOS. No, it is "running a custom-built software system that takes advantage of the latest libraries across the ecosystem," a spokesperson told me.
Apple has taken the windfall of its smartphone gains — chips, sensors, speakers, and software — and turned the dumb display into something that is not only personal but can enable a future of computing that is more spatial and dimensional. More cameras, microphones, and sensors can turn the display into a spatial machine — enough to imagine three-dimensional interactions that work with Apple's rumored augmented-reality glasses. A smarter keyboard? A haptic trackpad or mouse? An Apple Watch? These are all inputs for a spatial computing environment.
After all, spatial computing takes cues from actual physical machines, people, objects, actions, and environments and brings them into a digital domain. It is about how we interact with information in the future.
I know I am dreaming a little, but if you continue to believe in Moore's Law, don't be surprised if today's Studio Ultra becomes the budget machine of the not-so-distant tomorrow. It is a machine that can handle multiple video streams, crunch many sensory inputs, and enable and display a new "reality" on the screen, powered by its discrete chip.
Updated: 7.20 am PST, March 10, 2022
March 9, 2022. San Francisco
- I use Reincubate's Camo with my iPhone 13 Pro in my home office. It is mounted on a desktop tripod and ball head, with some add-ons from Really Right Stuff.
- I also have a beta version of Opal Camera, which is spectacular. However, since it isn't available yet, I am not going to recommend it just yet.
Just found myself nodding to every point you made in this. I was SO excited about the new display. It’s been ordered and on the way. In typical Apple fashion the new keyboard and trackpad have already arrived. Seeing where the future goes with all these various chips in devices is really exciting and much more interesting to me than the pure raw computing power. Ubiquitous computing has been a buzz word for a long time, but you can really feel momentum building these days.
spot on, om—my first reaction, too—this is not just a better display—it is a high end experience engine for the remote first world we now live in!!!
Few people make tech advancements as accessible as Om.
Great article. Although I’ve ordered a pair of Studio Displays, I’m second-guessing whether I should have ordered an XDR instead. The Studio Displays (from what I understand) use the same panel as the 27” iMac, so no local dimming and a peak brightness of only 600 nits. Decisions, decisions.