Earlier this morning, I read a piece about Wordle in the New York Times and decided to find more articles about the red-hot word game. Even Google has an easter egg that confirms it: Wordle is a cultural phenomenon.

But why?

Matt Baldwin, a social psychology professor at the University of Florida, offers six reasons why we Wordle:

  1. An “Aha” moment, even when you lose.
  2. A pleasurable relief from pandemic-related negativity.
  3. It is bingeproof.
  4. It is a shared experience. 
  5. An opportunity for social comparison.
  6. Everyone is doing it, and we want to be in the herd.

“Sharing it on Twitter is a way of saying like, ‘look at me, I’m also doing Wordle just like everyone else.’ That makes me a good group member. Shared experiences give a lot of meaning to life,” Baldwin noted in the article. “They help us orient toward what’s good, what’s meaningful and what’s worthwhile.”

Wordle, in many ways, is reminiscent of the “ice-bucket challenge,” when we forgot our differences and got behind a worthy cause.

January 24, 2022. San Francisco.

Ever since buying the new MacBook Pro, I have wondered about Thunderbolt 4.0 and how it impacts my old accessories. I want to make sure I take advantage of the latest technologies on offer in my MacBook Pro. And like everyone else, I find that the various connectivity options leave me scratching my head. During my quest for answers, I ended up on the blog of the accessory maker Satechi, which has a good breakdown that answers some of these questions as we transition to Thunderbolt 4.0.

In case you were wondering, Thunderbolt 4.0…

…is the most recent version of the Thunderbolt line by Intel. It comes with a range of benefits, including backward compatibility with USB Type-C and Thunderbolt 3.0. All the cables that are Thunderbolt 4.0 certified can work with everything, including USB 2.0, USB 3.1, USB 3.2, and USB 4.0.

So what is USB 4.0?

USB 4.0 is the latest specification or version of USB that’s housed within USB Type-C cables. It takes over from USB 3.2 and 3.0 and offers either 40 Gbps or 20 Gbps of data transfer speed. Just like Thunderbolt 4.0, it uses the same Type-C reversible and rounded connector. A Thunderbolt 4.0 cable can also be called a USB 4.0 cable, but the opposite is not true, because not all USB 4.0 cables are Thunderbolt 4.0 certified.

All these acronyms and standards are confusing, something veteran writer Glenn Fleishman highlighted in his excellent piece, “USBefuddled.”

USB 4.0 vs. Thunderbolt 4.0

What is the difference between USB 4.0 and Thunderbolt 4.0? How are they similar? Should you care? And what should you look for before buying devices that are certified for one or the other?

Both USB 4.0 and Thunderbolt 4.0 use the same USB Type-C connector, which is reversible and rounded. USB 4.0 is also based on the very same underlying protocol as Thunderbolt 4.0, and the two are tightly connected. All Thunderbolt 4.0 devices come with USB 4.0 support, which means that if you have a device with Thunderbolt 4.0 connectors, you can use USB 4.0 cables. However, not all devices with USB 4.0 connectors will be as powerful as the ones that are fully Thunderbolt 4.0 certified. But the good thing about USB 4.0 is that it’s cheaper than Thunderbolt 4.0.
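
If the asymmetry still seems confusing, here is a minimal sketch in Python (purely illustrative, with made-up names, not any official spec or vendor tool) that encodes the rule described above: every Thunderbolt 4.0 cable can stand in for a USB 4.0 cable, but a USB 4.0 cable is not necessarily Thunderbolt 4.0 certified.

    # Illustrative sketch only: hypothetical names encoding the compatibility
    # rules described above, not an official spec or vendor API.
    from dataclasses import dataclass

    @dataclass
    class Cable:
        supports_usb4: bool
        thunderbolt4_certified: bool

    def works_as_usb4(cable: Cable) -> bool:
        # Every Thunderbolt 4.0 cable can also be used as a USB 4.0 cable.
        return cable.supports_usb4 or cable.thunderbolt4_certified

    def works_as_thunderbolt4(cable: Cable) -> bool:
        # The reverse is not true: a plain USB 4.0 cable is not
        # necessarily Thunderbolt 4.0 certified.
        return cable.thunderbolt4_certified

    tb4 = Cable(supports_usb4=True, thunderbolt4_certified=True)
    usb4_only = Cable(supports_usb4=True, thunderbolt4_certified=False)

    print(works_as_usb4(tb4), works_as_thunderbolt4(tb4))              # True True
    print(works_as_usb4(usb4_only), works_as_thunderbolt4(usb4_only))  # True False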

The article was good enough that I considered buying their new Thunderbolt 4.0 dock. Sadly, it is sold out.

PS: More often than not, corporate blogs are bland, marketing fodder posing as content. The linked post, however, did a great job of helping me as a reader. It was also good marketing for the brand, and it made me want to try out their products.

Read article on Satechi Blog


With all the conversation about breaking free from big social platforms, owning your own digital identity, and being independent, I have been asking myself: how can all of us who have slowly become online performance artists ever be post-social?

***

For the past two decades, most of us have grown accustomed to the idea of being online, being connected, and being part of a larger collective. It might have started as a social network of friends, but the social Internet has since become a performative art. A decade ago, in an essay, “Now You, Starring in a Movie About You,” I pointed out that “in our 21st-century society, we all want to stand out and get attention.”

Today we have easy and free access to platforms that help spread the word about the movies of our lives — quickly. The Internet makes easy work of distribution. The concept of “followers” and “subscribers” is another way of saying “audience,” and by sharing carefully crafted words, a handful of shared links, and artistically snapped photos and videos, what we’re doing is essentially performing for this audience. We are all Lady Gaga — be it for one person or a million people.

A decade later, words like creator, influencer, and YouTuber are part of the everyday vernacular. Every tweet, every selfie is a chance to virtue-signal, an opportunity to market yourself as someone — pundit, guru, genius, or goofball.

There is no other way of putting it — we are addicted to the idea of an audience. When we go online, we are programmed to react to engagement triggers — likes, shares, retweets, hearts, and thumbs-ups. Social media and this addiction to an audience have hooked us on something even harder to give up once tasted: a constant feeling of self-importance.

We have all experienced those interactions where friends, colleagues, family members, and lovers got upset because we didn’t like their Facebook entries or Instagram photos fast enough. Or, god forbid, we missed the updates altogether! Social networks have weaponized this sense of self-importance.

***

The affliction is even more acute if you happen to live in the creative realm. We are now programmed to evaluate our creative work using metrics, and nothing illustrates this reality more than Instagram and its insidious hold on the photography community. In conversations, some of the most creative photographers dismiss their work because it didn’t get enough validation.

The idea of giving the invisible “others” so much influence over one’s work and creativity is baffling. It is not as if social platforms exist with our best interests at heart. They have a simple motivation — keep you addicted to the screen for as long as possible and thus create as many opportunities to sell you “advertising.” 

And yet, if people don’t like or heart your photo on the tiny screen of your phone, no matter how much creative energy you spent on it, you deem it worthless. You quickly forget the joy you experienced from the act of creativity. Instead, you are constantly seeking the approval of an audience. 

On the flip side, a photograph or a tweet that gets hundreds or thousands of likes makes you feel giddy, mainly because it reinforces your importance. We sadly forget that platforms don’t distinguish between your creation and a proverbial monkey selfie. 

To me, this is the real challenge of post-social reality. To live in this post-social future, one has to embrace ideas that are the antithesis of self-importance. After two decades of being trained by micro-dosing on dopamine, I am not sure we can!

January 12, 2022. San Francisco.

In the late 1990s, when mobile chip behemoth Qualcomm still qualified as an upstart, I started writing about the mobile Internet. I dreamed of a mobile broadband revolution. It was when Japan’s now-forgotten i-mode service enthralled the world. Imagining the future, I wrote enthusiastically about everyone — Ricochet, Nokia, BlackBerry (when it only made pagers), Treo, Palm, and Windows CE devices.

Intuitively, I knew that, much like when the (landline) phone network was decoupled from fixed connections, the always-on Internet, once set free from the fixed network, could profoundly impact society and its people. However, it was at the introduction of the iPhone in January 2007 that it slowly dawned on me that the world had changed. The future had arrived quietly, amidst a lot of skepticism. The magnitude of change was enough for me to overlook the launch of the Apple TV and to dismiss the transition to Intel processors. The iPhone, and how it would change the mobile landscape, was all that was on my mind. In my blog post that day, I wrote:

That also might be the epitaph of the PC era. And it is sweet irony that the company that sparked off the desktop computing revolution is the one announcing its passing. Dropping Computer from its name is a sure sign that Apple, from this point forward, is a consumer electronics company, a mobile handset maker – one that also makes computer hardware and software as well.

Apple is making the phone do all things a computer does – surf, email, browse, iChat, music, and watch videos. Nary a keyboard or mouse in sight, and everything running on OS X. While I am not suggesting that this replaces our notebooks or desktops for crucial productivity tasks, the iPhone (if it lives up to its hype) is at least going to decrease our dependence on it.

iPhone & the End of PC Era

It wasn’t until six months later, at WWDC, that I finally came to grips with what Apple had unleashed. Here is what I blogged:

  • A true web applications platform for the mobile
  • Break the Wireless Walled Gardens
  • Shift of control to the customers
  • Slow demise of subsidized, boring phones filled with bloatware
  • Keep it simple or else

Looking back, the iPhone delivered on all those fronts, and in the process, has changed the mobile landscape.

***

The applications — essentially web services sliced and diced in special wrappers — have become the dominant form of our interactions with the modern Internet. A generation of mobile natives who have never dealt with flip phones and other devices sold by large phone companies don’t quite realize how terrible the mobile experience used to be before the iPhone showed up in our hands.

Those were wireless walled gardens crammed with absolutely rotten apps, games, and everything from mobile backgrounds to ringtones. They were an opportunity for carriers to nickel-and-dime their customers and extract mafia-like fees from startups.

Today, we take the “app store” for granted, but getting whatever app you want, whenever you want it, wasn’t always possible. And despite Apple’s draconian and confusing policies around the App Store, we as end customers are free to download pretty much whatever apps we want.

“The iPhone is doing to the mobile world, what the browser did to the wireline world.”

Juniper Networks founder Pradeep Sindhu in an interview.  

***

“The iPhone changed the industry in two fundamental ways – decoupling applications from the network (operators) and the user interface (ease of use),” points out Chetan Sharma, a mobile industry veteran who runs an eponymous consulting group. Today, Apple and its 30 percent cut of the App Store come under criticism and legal challenges, but let’s not forget what life was like before the iPhone came along.

Think about it this way: before the iPhone, almost 90 percent of industry revenue went to the telecom operators, because they controlled pretty much every layer of the ecosystem. From spectrum to network to applications to devices, everything was controlled by the carriers.

In the US, for example, Verizon, Sprint, or AT&T decided whether GSM or CDMA would be the dominant networking protocol. They decided which operating systems and phones could be sold to their network customers, and which applications would be available. And, oh, everything was billed through their billing systems. That decoupling has reduced the carrier cut to somewhere between “20-30% depending on the geography,” Sharma points out.

***

In an article for Fast Company magazine, I pointed out that the iPhone (and its smartphone brethren) was part of an enormous change and brought about a new Victorian age.

Today, it’s the increasing mobility of “computing engines,” the marriage of microprocessors and Internet ubiquity, that is poised to reimagine our society. More than a billion people bought smartphones last year—or to put it differently, we added 1.2 billion nodes to what was already the largest network ever built. Networks—social, neural, physical, metaphorical—enable connectedness, and connectedness changes everything. Networks compress distance and time, that concentration speeds up life, and that, in turn, creates sociological and economic change. 

And this age was catalyzed by the iPhone and what it brought to our fingertips. As I wrote in an earlier article:

iPhone had this one magical quality — touch, the most human of all senses — that made it the most personal of all computers. Think about it — we shake hands to confirm our relationship. We touch and hug to show our love. We caress to tell someone we care. So when we touch that phone, we don’t just touch a device and its screen, we make it part of ourselves. The internet is not a strange, cold, uncomfortable, cluttered space. That touch is what turns an inanimate object from metal and plastic to an extension of ourselves.

Fifteen years later, we forget to appreciate how much the user interface and its simplicity changed the game, allowing application creativity to thrive and bringing many billions of dollars to application developers. In a world controlled by carriers and their walled gardens, every single application and service you use daily would either not exist or not have thrived.

Instagram, Uber, DoorDash, Dropbox, and Facebook are all beneficiaries of the device initially dismissed by everyone from Nokia to Blackberry to Palm executives. For me, it was love at first byte, and it still is the phone I am happy to use — warts and all.

Every once in a while, a revolutionary product comes along that changes everything.

Steve Jobs, when introducing the iPhone in 2007.

For once, Steve was underhyping what was to come!


My favorite articles (that I have written) about the iPhone.

Photos By Om. Somewhere on the Northern California coast, just near Oregon.

The new year has started rolling. And like everyone else, I found that the holiday-enforced slowdown and social media abstinence allowed me a lot of time to reflect on the year that was and on what is to come. 2021 was a significant improvement over the year before.

I got vaccinated and got my booster shots. And that allowed me to travel a bit more. I even undertook a once-in-a-lifetime trip to the edge of the planet. I got quite a few opportunities to pursue my photography.

I bought (too) many books, and to my surprise, I read quite a few of them. That said, I did go through another phase of ruthless elimination of physical things. However, the best part of 2021 was something I did for self-improvement: I started taking driving lessons and hope to have a driver’s license before spring rolls around.
I am looking forward to driving around California and going out to take more photographs.

However, now that I reflect on the year, I feel that one thing was missing: a sense of creative fulfillment. I have been asking myself why I felt that way. The answer is straightforward — in 2021, I didn’t do the one thing that allows me to thread the needle of life with time: writing down my reflections almost daily. I managed to survive the first lockdown in 2020 by actively blogging, getting everything inside my head out into words and posting them here.

In 2021, I wrote extensively in a private journal — about private stuff. And that was important for getting a better handle on my mental health. However, I don’t remember much about the year, especially about the technology, art, books, and other ephemera that add up to the texture of time. That documentation is part of how I keep track of my self-development. Even my photography veered away from visual documentation. Instead, I was drawn to the concept of minimal abstraction of landscapes, objects, and even people.

***

Photos By Om. Yosemite from up in the air. Made with iPhone 13 Pro Max.

Whatever the reasons, I suppose I cannot “not” blog and maintain a public journal of what’s on my mind. Blogging is my way of documenting and chronicling my life, and if anything, I should not shy away from it. With my transition to partner emeritus at True, I have more mental space to read and analyze in the year ahead. And for the first time in my life, I have the opportunity to be intellectually self-indulgent.

My interests have widened beyond the obvious set of technologies. For example, I am learning about climate change and its impact on everything from the design of cities to the management of resources to the human costs of change. I have returned to reading and digging deeper into materials (and related topics). Materials have led me to learn about new printing techniques, which I hope to experiment with in the years to come.

So what’s the plan? For starters, I have rearranged the website a tiny bit to make it easier for you to navigate and find things quickly. I want to encourage you to come to the blog more often. The new navigation bar is pretty self-explanatory — Journal is where I will journal. Some of my photos will show up in the journal, as will my writing on the art of photography.

Essays are my longer pieces that have taken some deliberation. Interviews are, well, interviews. And there is a small new section: Reads. I hope to share more “longer” articles worth reading and do it more often. Of course, I will give you my reasons why they are worth reading.

If you are signed up for my newsletter, you will continue to get my technology-centric writing. I will try and bundle these longer articles and send them along as well, in case you want to read them later.

January 3, 2022. San Francisco.

PS: If you would like to get my technology-centric writings delivered directly to your inbox, please consider using the sign-up form below.