As part of writing a review, I have been using Apple’s 2022 version of the 13-inch MacBook Air. I am not the first to say it — many others have said so before — but it is a great device. It is a great testament to Apple’s hardware excellence, with all that knowledge crammed into this thin sliver of engineering marvel. The new M2 chip, the longer battery life, the improved webcam, keyboard, and speakers — everything is of exceptional quality. Apple knows how to build premium hardware at scale.

Sadly, you can’t say the same about its software and services that rely on machine learning and augmented intelligence. The obvious deficiencies, whether on the Mac, the iPhone, or the iPad, are quite annoying. Take the Mac as an example. By now, the notification system should understand that showing notifications for events that have already happened is mere noise and a nuisance. And yet, you have to delete them manually. It should be obvious to any computer system, and any application, that the date and time have passed.

Don’t get me started on Siri, which feels like a kindergartner compared to the highly effective Alexa and Google Home. If you have an accent that is not “classic American” or “classic English,” Siri will never quite understand you. Much as I loved the HomePod, it could never play “Nitin Sawhney” when I asked Siri to play his new album.

Things on iOS are comically calamitous. We all know about “What the Duck,” and by now, we have decided to live with it. The spontaneous capitalization of words without reason, the amazingly incomprehensible autocorrect, and the inability to transcribe effectively all make you wonder what the point is of all those neural engines Apple’s hardware team keeps cramming into each new generation of its chips.

Many folks weighed in with their experiences and opinions when I tweeted out my observation. Ken Kocienda, who spent most of his working life at Apple and invented the iPhone’s autocorrect, noted in a reply: “To make good computing experiences for people, you have to understand computers and people, what people want to do with computers, and what new technology can do to make things better. This sounds obvious in theory, but it isn’t so easy in practice.”

It is obvious that Apple’s competitors — Google, Amazon, and Microsoft — have become much better at helping people with their computing experiences. Microsoft’s Outlook, for example, has become very effective at autocorrecting spelling and grammar and at learning my idiosyncratic idioms. Apple showed many improvements to the Mail app in its next macOS, Ventura, but they have been a bit of a letdown. For example, the “Follow Up” functionality in the macOS Ventura beta version of Mail is pretty hit or miss.

Together, these might seem like a bouquet of small annoyances, but they can have larger ramifications for the company. The future of hardware is not just hardware; it is constantly morphed and shaped by software and “augmented intelligence.” I pointed this out in my piece about Apple’s Studio Display. It is about adaptable and personalized hardware. AirPods, for example, could become more powerful and personal in the near future.

But all of that needs intelligent systems to “augment” what we need as humans. And nothing needs it more than Apple’s next big bet. The much-rumored mixed reality platform depends on glasses, phones, and access to networked information.

This platform eschews text entry for gestures and voice commands. For this new post-touch interface to work, the hardware has to be flawless, and the software experience has to be perfect. Imagine my “commands” getting a Siri-like response. I would return the glasses to the store and demand my money back. This is Apple’s real Achilles’ heel.

In a strange bit of irony, this piece started with Otter. I use that service to dictate notes, ideas, and random thoughts. It takes my voice and transcribes it, and by now, it is good enough to deal with my accent and my pronunciation. I drafted this piece in Google Docs, enhanced by Google’s intelligence, and then used Grammarly’s AI tool to make sure I got my grammar right and didn’t skip the commas. For all of this, I used the new 13-inch Apple MacBook Air, which, I don’t mind saying, is one helluva computer.

August 30, 2022. San Francisco.

Related Reading: The Hype — and Hope — of Artificial Intelligence/The New Yorker.

U.S. President Joe Biden has warned Americans that a cyberwar with Russia is likely, and that we should be prepared for the consequences and the havoc it could wreak on our society. It is common knowledge that many of our industries, corporations, and infrastructure services, such as the electrical grid, are weak and could fall victim to large-scale attacks. Even individual Americans have weak defenses on their computers.

“Given the administration’s stellar track record in predicting Russian moves in its attack on Ukraine, we should take this warning seriously,” wrote Richard Bennett, a writer and analyst focused on telecom and networks. “Cyberwar is a business conducted by firms and individual actors with a rapidly changing arsenal of software-based tools.”

The timing of the White House release was at best a coincidence and at worst curious, since it came on the same day Apple experienced a massive outage in its online services. To be very clear, what I am about to write is hypothetical, and I am putting it in my “what if” bucket.

An odd and somewhat crazy thought crossed my mind — what if the outage resulted from an attempt to compromise Apple’s crown jewels — its Keychain, its end-to-end encryption, and iMessage? Like many, I have had blind faith in Apple’s ability to protect my privacy and data. I am not alone; many folks in the government and corporate America share that faith. That makes the Keychain a single point of information-security failure that could affect a lot of people across America. Forget America — with hundreds of millions of iPhones in use, a breach would likely have a global impact.

If a cyberwar starts to unfold, password management services such as 1Password could find themselves under extreme pressure. Yes, they all have pretty blue-chip reputations and impeccable infrastructure, and it is crazy to even suggest as much. However, read the headlines just from today — a ransomware group is rumored to have accessed the database of Okta, which provides trusted authentication services to about 15,000 companies. The same ransomware group has claimed that it stole source code from Microsoft. If this is even fractionally true, then basically every company is vulnerable.

No matter how secure we might feel, at this point, our password defense is both our biggest strength and our biggest weakness.

PS: Bennett has some good advice: follow the news, check for patches, and update daily. Keep a local copy of your data, just in case you need to wipe your computer and restore it. That’s the best we can do for now.

March 22, 2022. San Francisco


The Pursuit of Productivity is a trap: “in a culture so focused on managing time, we have become subservient to it,” writes Lawrence Yeo. “By scheduling your day down to the last minute, you introduce an anxiety from managing your real-time progress to an imagined vision.”

The Joy of Physical Media: David Mitchell, a British comedian, points to the growing sales of physical media formats (including books) as a sign that digital (streaming) lacks a loving feeling. I am sadly in the “streaming is convenient” camp, though I tend to buy my music from Bandcamp to support the artists, not because I want a download taking up space on my hard drive.

We all still don’t understand Substack: Nathan Baschez, who was with Substack in its earliest days, explains why the newsletter platform differs from most other media startups. I kind of agree with Nathan.

I woke up this morning thinking about the new Apple Studio Display’s webcam hiccup. It has reaffirmed my belief that the camera — and by extension, the visual sensor — is becoming a key interface to information and how we interpret it. What the keyboard and mouse were to textual computing, visual (and other) sensors will be to the computing of the future.

An article in The New Yorker laments that smartphone photography is too algorithmic. Similar laments were made when William Eggleston started experimenting with color film. Since then, our everyday memories have been captured on color film, each generation better than the last. It is the same with computational photography — we started with the grainy photos from Nokia, BlackBerry, and the first iPhone. I remember the first iPhone and the photos that came off its puny sensor. We have already come so far in this journey. Writers need to overcome their nostalgia and think different — the camera isn’t just a camera. It is so much more!

As I said, it is the camera, stupid.

But back to the Studio Display camera problems. 

Looking beyond, the speed with which Apple can fix the problem by issuing a software upgrade will reaffirm what I wrote earlier about the advantage of putting “smarts” into previously dumb devices. Apple’s ability to take all the gains offered by its iPhone business and its scale gives the company a significant leg up in its ability to reinvent products. It will help it become the key player in the next evolution of computing — spatial computing, as it is colloquially known. Yes, sometimes a display is not just a display.

Speaking of the iPhone — it accounted for 37 percent of all 5G phones sold, according to data from Counterpoint Research. “The 5G smartphone penetration for North America and Western Europe reached 73% and 76% respectively,” the firm points out. Over 51 percent of phones shipped now are 5G phones, though that doesn’t necessarily mean you get to enjoy 5G speeds in the U.S.

OpenSignal data shows that the South Koreans have got their 5G zooming! In South Korea, average download speeds were 129.7 Mbps at the end of 2021, up from 52.4 Mbps at the start of 2019, before 5G. The U.S. is not in the top 25, even though more people there have 5G iPhones. Why? Because AT&T and Verizon are essentially shit when it comes to 5G.

FYI: I like to read a lot. When I find something interesting, I share it in my link blog. Think of it as "collected wisdom." You can visit it here. And if you want to see my photos, visit my photoblog. 

March 19, 2022. San Francisco

The endless quiet of the Elk Refuge in Jackson, Wyoming. Winters in this part of the world are my favorite, and I can’t wait to go back next year.

Due to unavoidable circumstances, I missed the visit this year. This photo is also a testament to the iPhone and its ability to capture amazing detail in RAW and then let me play around with it in Photoshop.

March 22, 2022. San Francisco

The reviews for Apple’s new Studio Display are in — and they aren’t kind. Some of them are brutal. I read, viewed, and heard what the reviewers had to say, and my conclusion: the display is fine, with a handful of misses. You can get a near-identical display from LG for about $300 less. However, Apple’s display is better built, and it is tightly integrated with Apple’s computers and ecosystem. With its Thunderbolt and USB-C ports, six speakers to pump out great sound, and three microphones to make calls with, the Apple Studio Display stands above the competition and is worth the premium.

Well, almost!

The much-ballyhooed 12-megapixel wide-angle camera for making video calls is turning out to be quite a dud. 

John Gruber writes that he finds it “crushingly disappointing. Image quality is astonishingly poor, and Center Stage is glitchy.” It was poor compared to the new iPad Air and the new iPhone SE, two devices that use the same camera as the Studio Display.

“It looks awful in good light and downright miserable in low light,” writes The Verge’s editor-in-chief, Nilay Patel. “I’ve tried it connected to the Mac Studio and on my MacBook Pro running macOS 12.3, and on both machines, it produces a grainy, noisy image with virtually no detail. I tried it in FaceTime, in Zoom, in Photo Booth, in QuickTime — you name it, it’s the same sad image quality.” The Wall Street Journal’s Joanna Stern sums it up well when she writes, “OK, the real surprise is, it isn’t a good webcam, especially after how much Apple talked it up.”

An Apple spokesperson told the WSJ that they “discovered an issue where the system isn’t behaving as expected. We’ll be making improvements in a future software update.” Remarkably, Apple didn’t have the processes in place to prevent a bad product experience from making it out the door. It should raise questions about the software capabilities of the camera and the embedded OS shipped with the displays.


As we humans know, becoming smart is a continuous process. You first go to school and then to college. You learn from books, from life itself, and from work experience. In other words, you keep getting smarter.

And since Apple has put an A13 Bionic chip in the Studio Display, a future software upgrade can make this “smart” display work better. I am sure Apple has already fixed it internally. But it is a good reminder of something I have always believed in — the ever-increasing importance of the camera — the third eye — in our modern society.

In a culture and society that is increasingly visual, the camera is the decisive factor when it comes to buying a phone. We want to look as good on our work Zoom calls as we do on our selfies. Video is the ultimate projection of self, and the camera is the way to get it done. 

More than the screens, storage, or speeds, the camera emotionally personalizes our devices. A camera is our tool to project our reality and our image to the world, but more importantly, to ourselves.

Apple, which has become a trillion-dollar company based on a straightforward concept — “it is the camera, stupid” — should never forget that simple reality.

March 17, 2022. San Francisco