Photo by Ussama Azam on Unsplash

It shouldn’t surprise anyone that tech layoffs have been on my mind, and I wrote a column for The Spectator to explain the why of these layoffs. An unprecedented boom in Silicon Valley, one that started with the once-in-a-generation convergence of three mega-trends (mobile, social, and cloud computing), has peaked. The boom began around 2010, and it has been bananas around here for the past decade or so. The FAANG-plus-Microsoft companies saw their combined revenues go from $196 billion to over $1.5 trillion. Let that sink in. Booming stocks helped create an environment of excess like never before.

The companies got into the business of what Paul Kedrosky calls “people hoarding.” The pandemic and the resulting growth revved up the hiring machine even more. The over-hiring of talent led to wage inflation, which has had a ripple effect across the entire technology ecosystem. Technology insiders are happy to tell non-tech companies to use data and automation as tools to plan their future. It is easier to preach than to practice.

Why does Google need close to 200,000 employees? Does Microsoft really need 225,000 people? Salesforce, until recently, had about 73,500 employees. Profitable as these companies have been, it is also clear that they have become sloppy and bloated. I don’t want to minimize the misfortune of those losing their jobs. A lot of the blame lies with the leaders of these companies, who were asleep at the wheel. The reality is that when it comes to business, companies have to appease their investors. And right now, those investors want to see companies become more efficient, especially as growth returns to normal levels.

If you are looking for one, the silver lining is that we will soon be in a new cycle, and a new set of hyped trends will converge and create opportunities. They might not emerge in 2023 or 2024, but they surely will. By then, the industry will have put these job cuts in the rearview mirror.

Read the full piece on The Spectator website!

February 2, 2023. San Francisco

I returned from a quick trip to London on Thanksgiving Day, thus missing the bonhomie of the weekend. While I did miss the slices of pie, it was good to spend the time watching The Silence of Water on PBS Masterpiece (via Amazon Prime). The Italian crime show is beautiful in its locations, cinematography, and acting. And despite having to follow the subtitles, it is worth binging.

The show was an excellent way to stay away from the incessant come-hither siren call of Black Friday — a disease that has also spread to the United Kingdom. I used the opportunity to stock up on memory cards, but that’s all. For the rest of America, despite the economic doldrums, it seems to be the season of shop till you drop. I call this the consumerism curse.

The long weekend was also a good time to reflect and read. 

What I am reading

Amazon was losing $10 billion a year on its Alexa business. Google, too, needs to learn how to make its voice-interface business profitable. And Apple’s Siri is not going anywhere, either. So what is the future of voice interfaces in this era of economic frugality?

Talking of Apple: it is becoming an ad company. On its blog, Proton, the privacy company, breaks down how Apple’s tracking works. I, for one, am disgusted by the direction Apple has taken. (Related: The golden noose around Apple’s neck.)

If you are struggling to make sense of FTX and Sam Bankman-Fried’s shenanigans, here is a very easy-to-understand explainer of how the whole con worked. Alex Tabarrok has done a good job, and it is worth a read.

On the other end of the spectrum is a breakdown of the FTX disaster by an accomplished finance professor who digs into the intricacies of the con.

Ken Kocienda, a former Apple user experience guru, breaks down the design and user experience challenges of Elon Musk’s proposed changes to Twitter’s verification systems. The whole piece is worth reading.

Given all the obsession with Twitter, we must remember that the new generation of Internet natives doesn’t care much about the platform or its peer Facebook. For them, it is all about YouTube and TikTok.

The A to Z of climate change by Elizabeth Kolbert is the most sobering piece I have read this weekend, and it is an important reminder of the existential threat we are facing as a collective. 

November 27, 2022. San Francisco

Captured in San Francisco on 27 Feb 2022 by Om Malik

I recently sat down to talk with my friend Howard Lindzon on his podcast Panic with Friends to discuss the future of technology. Howard has shared the show notes on his blog. I wanted to draw out three core themes I addressed in our conversation, and they are all interrelated.

I have a long-standing approach to understanding technologies and their impact holistically. I look at pure technologies, such as semiconductors and networks, and think about their impact on products, behavior, and change. At the same time, I look at our behaviors today and how they disrupt present technology ecosystems.

Much of my current and future enthusiasm stems from exciting work underway in the semiconductor world, with Apple’s M1 being the most visible example of the possibilities unlocked by cheap computing, cheap GPUs, and machine-learning capabilities. It is not just Apple — the entire semiconductor ecosystem is experiencing change.


Value (of technology), not valuations, matters most.

When we try to predict the future, we usually get it wrong, because we only have the present and past to use as references. For example, when we think about web3, we look for analogs. We ask, “What’s the new Twitter?” without ever wondering whether we even need a new Twitter. Or will there be something else that helps us replace it as a source of information? No one thought TikTok would be a competitor to Google Search, yet it is starting to become a threat.

My approach to thinking about the future is simple: always try to find the inherent value in a technology. It helps you take a longer view and embrace change. Take the COVID-19 pandemic, for example. The traditional view is that companies like DoorDash and Zoom got a COVID bump and their valuations went sky-high — then, poof, they came down to Earth when the world was ready to return to something more like 2019.

There’s just one problem: technology only moves in one direction. There is no going back. It is not as if Zoom lost its value along with valuations. Who wouldn’t rather talk to a screen than fly five hours for a meeting? No matter how much we want to use the past as a reference point for the future, we have to override our biases and go where the value takes us. And where it takes us isn’t always where we might think.

I’ve previously written about my optimism about technology and its impact on society; that hasn’t changed. But rather than try to wind vane the tech sector by looking at stocks and startup valuations, there’s a fundamentally better approach to gauging the future. We have to consider what we know about the foundations of tech. And to me, that’s even more exciting than trying to guess what will be the next Twitter.


Processing Into a 3D World

Apple’s M1 chip is a game-changer, even if consumers haven’t yet figured out why. Most think: “Oh, great—a faster computer. That’s neat.” But look at the value underneath it. The M1 chip puts about 25% of the power of IBM’s original Watson supercomputer at your fingertips. Yes, that Watson. Or, as my friend Michael Driscoll astutely points out, “The line between localhost and cloud is blurring.”

Apple’s M1 is a proxy for a new generation of chip technologies that will reshape our computing experiences. Apple’s approach to silicon combines CPU, GPU, AI, and memory into a single entity for a powerful bitches brew with preternatural capabilities. It is an outcome of the smartphone revolution. 

We’re used to our computing experience being flat. Today, we look at a flat screen and interact with data in two dimensions. Apps — for all their contribution to the mobile phone revolution — are still two-dimensional. But between the M1 and Moore’s Law, the pieces are moving into place to alter how we interact with data and information. We’re going 3D.

By 3D, I don’t necessarily mean AR or VR. Those are ideas we can already anticipate. I mean that our interaction with data itself will be in three dimensions. But we’re thinking about the future here. What about the ideas we aren’t anticipating?

I am excited because if I think about what the M1 can do today, imagine what it will do to our computing experience in half a decade. Or in ten years? Computers will easily handle inputs beyond keyboard strokes and mouse clicks, and they could (more accurately) use lidar, cameras, and microphones to create maps of our surroundings.

Cameras can interpret our gestures and facial expressions, and earphones with sensors can add more nuance to our gestures. Today, computers are merely augmenting our reality. It won’t be long before we have the processing power for holographic displays. Think Star Wars-type technology, not 2001: A Space Odyssey.

This technology sets the foundation for a new interaction layer between humans and our machines. Whenever I think about the future of technology, I try to imagine it from the perspective of the next generation of users. Kids today are growing up interacting with machines. They swipe, they tap, and they use gestures. They talk to Alexa or Siri. They’re already training for a new way of computing. Most of us haven’t noticed that, for them, the mouse and keyboard will not be as relevant as they are to us old fogies.


Authentication is the Value Store

If you have been a long-time reader, you know that I firmly believe that what technology giveth, technology taketh away. Technology is not without its consequences. The rise of powerful chips, coupled with new artificial intelligence approaches to software and services, can do both good and ill.

Think of what computers can already do: deep fakes, phishing scams, simulated voices, and so on. Future machines and software will make these even more realistic and thus more harmful. In a world of cloud computers powered by ever-more-powerful chips creating uncanny deep fakes, authenticating who you are will become paramount.

And this is where we will need the emergence of a new authentication layer, one that is more robust than whatever we have today. Regardless of what you might think of the web3 mania, it will help create a new approach to identity and authentication.

We need the authentication layer to distinguish between the artificial “us” and the real us. Mark Zuckerberg isn’t spending billions on the Metaverse for shits and giggles. The real value of Facebook will continue to be the “login,” which will eventually become identity verification — and that’s even more important than all the information they’ve gathered.

What’s one thing you’ve barely noticed about living in the mobile phone world? How often you “Log in with Facebook” or “Log in with Google” because it’s more convenient than setting up an account. There is a lot of value for whichever company makes authentication easy in this world.

What if Apple offers a Metamask-like product as an authentication system and, in exchange, charges a small subscription fee? I would happily pay for the convenience alone. Authentication and payments can be critical to a post-app-store world. Facebook, too, is hoping to ride the payments and authentication gravy train to the future.

Speaking of Facebook, let’s talk about the Metaverse. Today, we mock it as a cartoony version of a Zoom call. But what if it is far bigger than that? Take Facebook out of the equation, and you start to see that we have the rudimentary building blocks for the next version of the Internet. You can call it web3, but in reality, it will be the next version of the Internet.


No Longer Living in the “America-First” World

My last point is about the changing nature of the network itself. When we are trying to predict the future, those of us who live in America often have an America-centric worldview. This isn’t without good reason. There was a time when the majority of relevant tech consumers lived here. 

For as long as I can remember, American technology habits shaped the world. Today, the biggest user base doesn’t live in the US. A billion-plus Indians do things differently. Ditto for China. Russia. Africa. These are giant markets, capable of dooming any technology that attempts a one-size-fits-all approach.

Whether it is their adoption of drone deliveries, novel climate change ideas, or the revolutionizing of technologies (such as solar) through large-scale adoption, these big markets will define new behaviors, inspire new ideas, and spread technology-driven change. It might not be perfect or ideal, but we need change — especially in a world facing monumental challenges.


You can listen to my appearance on Panic with Friends with Howard Lindzon on Spotify, Apple, or here.

Here are links to my two previous appearances on Panic with Friends.  

March 31, 2020: The Pandemic Edition

July 2nd, 2020: Are we there yet?

Cursed by Information (Overload)

Photo by freestocks on Unsplash

Nicholas Carr, one of my favorite writers, penned an excellent rumination on the perils of information overload in his must-read book, The Shallows.

The stress that Google and other Internet companies place on the efficiency of information exchange as the key to intellectual progress is nothing new. It’s been, at least since the start of the Industrial Revolution, a common theme in the history of the mind. 

Carr shared this as his tribute to Leo Marx, whose work influenced Carr’s writing and thinking. Marx passed away at the age of 102 on March 8, 2022.

Read article on Nicholas Carr

I wrote a guest piece for my former colleague Stacey Higginbotham’s wonderful newsletter on the internet of things. It tackled the need and importance of trust, privacy, and security in this new age of computing.

A couple of ice ages ago, when I started writing about technology, personal computing was shorthand for computers used by enthusiasts. Such machines were eons away from becoming the personal computers that now sit on our desks and in our backpacks. In the post-PC age, personal computing means tablets and smartphones. After all, these always-on mobile devices are our constant companions. We are now glued to their screens, and more importantly, they are personalized to serve our every need.

Going forward, however, personal computing will become something else thanks to the growing number of connected devices — what readers of this newsletter affectionately call the Internet of Things. Indeed, in looking around my apartment recently I realized just how many of these devices had entered the most personal of my spaces: my home.

Read article on StaceyonIOT