I recently sat down to talk with my friend Howard Lindzon on his podcast, Panic with Friends, to discuss the future of technology. Howard has shared the show notes on his blog. I wanted to draw out three core themes I addressed in our conversation, and they are all interrelated.
I have a long-standing approach to holistically understanding technologies and their impact. I look at pure technologies such as semiconductors & networks and think about their impact on products, behavior, and change. At the same time, I look at our behaviors today and how they disrupt the present technology ecosystems.
Much of my current and future enthusiasm stems from exciting work underway in the semiconductor world, with Apple’s M1 being the most visible example of the possibilities unlocked by cheap computing, cheap GPU, and machine learning capabilities. It is not just Apple — the entire semiconductor ecosystem is experiencing change.
Value (of technology), not valuations, matters most.
When we try to predict the future, we usually get it wrong, because we only have the present and the past to use as references. For example, when we think about web3, we look for analogs. We ask, “What’s the new Twitter?” without ever wondering whether we even need a new Twitter, or whether something else will replace it as a source of information. No one thought TikTok would be a competitor to Google Search, yet it is starting to become a threat.
My approach to thinking about the future is simple: always try to find the inherent value in a technology. It helps you take a longer view and embrace change. Take the COVID-19 pandemic, for example. The traditional view is that companies like DoorDash and Zoom got a COVID bump, their valuations went sky-high, and then, poof, they came down to Earth when the world was ready to return to something more like 2019.
There’s just one problem: technology only moves in one direction. There is no going back. It is not as if Zoom lost its value along with valuations. Who wouldn’t rather talk to a screen than fly five hours for a meeting? No matter how much we want to use the past as a reference point for the future, we have to override our biases and go where the value takes us. And where it takes us isn’t always where we might think.
I’ve previously written about my optimism about technology and its impact on society; that hasn’t changed. But rather than using stocks and startup valuations as a weather vane for the tech sector, there’s a fundamentally better approach to gauging the future. We have to consider what we know about the foundations of tech. And to me, that’s even more exciting than trying to guess what the next Twitter will be.
Processing Into a 3D World
Apple’s M1 chip is a game-changer, even if consumers haven’t yet figured out why. Most think: “Oh, great—a faster computer. That’s neat.” But look at the value underneath it. The M1 chip puts about 25% of the power of IBM’s original Watson supercomputer at your fingertips. Yes, that Watson. Or, as my friend Michael Driscoll astutely points out, “The line between localhost and cloud is blurring.”
Apple’s M1 is a proxy for a new generation of chip technologies that will reshape our computing experiences. Apple’s approach to silicon combines CPU, GPU, AI, and memory into a single entity, a powerful witches’ brew with preternatural capabilities. It is an outcome of the smartphone revolution.
We’re used to our computing experience being flat. Today, we look at a flat screen and interact with data in two dimensions. Apps—for all their contribution to the mobile phone revolution—are still two-dimensional. But between the M1 and Moore’s Law, we’re moving into position to alter how we interact with data and information. We’re going 3D.
By 3D, I don’t necessarily mean AR or VR; those are ideas we can already anticipate. But we’re thinking about the future here: our interaction with data itself will be in three dimensions. What about the ideas we aren’t anticipating?
I am excited because if the M1 can do this today, imagine what it will do to our computing experience in half a decade. Or in ten years? Computers will easily handle inputs beyond keyboard strokes and mouse clicks; they will use lidar, cameras, and microphones to create ever more accurate maps of our surroundings.
Cameras will interpret our gestures and facial expressions, and earphones with sensors will add more nuance to those gestures. Computers will be augmenting our reality. It won’t be long before we have the processing power for holographic displays. Think Star Wars-type technology, not 2001: A Space Odyssey.
This technology sets the foundation for a new interaction layer between humans and our machines. Whenever I think about the future of technology, I try to imagine it from the perspective of the next generation of users. Kids today are growing up interacting with machines. They swipe, they tap, and they use gestures. They talk to Alexa or Siri. They’re already training for a new way of computing. Most of us haven’t noticed that, for them, the mouse and keyboard won’t be as relevant as they are to us old fogies.
Authentication is the Value Store
If you have been a long-time reader, you know that I firmly believe that what technology giveth, technology taketh away. Technology is not without its consequences. The rise of powerful chips, coupled with new artificial intelligence approaches to software and services, can do both good and ill.
Think of what computers can already do: deep fakes, phishing scams, simulated voices, etc. Future machines and software will make these even more realistic and thus more harmful. In a world of cloud computers powered by ever more powerful chips creating uncanny deep fakes, authenticating who you are will become paramount.
And this is where we will need the emergence of a new authentication layer, one more robust than anything we have today. Regardless of what you might think of the web3 mania, it will help create a new approach to identity and authentication.
We need the authentication layer to distinguish between the artificial “us” and the real us. Mark Zuckerberg isn’t spending billions on the Metaverse for shits and giggles. The real value of Facebook will continue to be the “login,” which will eventually become identity verification. That’s even more important than all the information they’ve gathered.
What’s one thing you’ve barely noticed about living in the mobile phone world? How often do you “Login with Facebook” or “Login with Google” because it’s more convenient than setting up an account? There is a lot of value in whichever company makes authentication easy in this world.
What if Apple offers a Metamask-like product as an authentication system and, in exchange, charges a small subscription fee? I would happily pay for the convenience alone. Authentication and payments will be critical to a post-app-store world. Facebook, too, is hoping to ride the payments and authentication gravy train into the future.
Speaking of Facebook, let’s talk about the Metaverse. Today, we mock it as a cartoony version of a Zoom call. But what if it is far bigger than that? Take Facebook out of the equation, and you start to see that we have the rudimentary building blocks for the next version of the Internet. You can call it web3, but in reality, it will simply be the next version of the Internet.
No Longer Living in the “America-First” World
My last point is about the changing nature of the network itself. When we are trying to predict the future, those of us who live in America often have an America-centric worldview. This isn’t without good reason. There was a time when the majority of relevant tech consumers lived here.
For as long as I can remember, American technology habits shaped the world. Today, the biggest user bases don’t live in the US. A billion-plus Indians do things differently. Ditto for China, Russia, and Africa. These are giant markets, capable of dooming any technology that attempts a one-size-fits-all approach.
Whether it is drone deliveries, novel climate change ideas, or the large-scale adoption of revolutionary technologies such as solar, these big markets will define new behaviors, inspire new ideas, and spread technology-driven change. It might not be perfect or ideal, but we need change, especially in a world facing monumental challenges.
Here are links to my two previous appearances on Panic with Friends.
The stress that Google and other Internet companies place on the efficiency of information exchange as the key to intellectual progress is nothing new. It’s been, at least since the start of the Industrial Revolution, a common theme in the history of the mind.
Carr shared this as his tribute to Leo Marx, whose work influenced Carr’s writing and thinking. Marx passed away at the age of 102 on March 8, 2022.
I wrote a guest piece for my former colleague Stacey Higginbotham’s wonderful newsletter on the internet of things. It tackled the need and importance of trust, privacy, and security in this new age of computing.
A couple of ice ages ago, when I started writing about technology, personal computing was shorthand for computers used by enthusiasts. Such machines were eons away from becoming the personal computers that now sit on our desks and in our backpacks. In the post-PC age, personal computing means tablets and smartphones. After all, these always-on mobile devices are our constant companions. We are now glued to their screens, and more importantly, they are personalized to serve our every need.
Going forward, however, personal computing will become something else thanks to the growing number of connected devices — what readers of this newsletter affectionately call the Internet of Things. Indeed, in looking around my apartment recently, I realized just how many of these devices had entered the most personal of my spaces: my home.