I was walking around San Francisco when I came across this piece of street art. I don’t know the artist, but it was so beautiful that I had to capture it. It represents surveillance — the kind that is enabled by facial recognition.

Whether it is smart cities, or online services like Instagram, TikTok, and Facebook, or the apps we use to see a younger or older version of ourselves, we give them our likeness, and they follow us everywhere.

In case you were wondering, I used a Ricoh GR IIIx.

May 25, 2022. San Francisco

It has been over 18 months since I got off Instagram. And by doing so, I have managed to eliminate the popular influences on my work. It has been an excellent way to overcome the meaningless metrics that assign value to a personal creative effort.

I have been able to experiment more and create with relative freedom. This exploration has allowed me to narrow my focus and find a visual language. Photography, nevertheless, is an eternal journey of exploration, and the search continues.

That said, I do miss sharing my efforts with others. I love the feedback, including constructive criticism. How else will I become a professional amateur? And despite having my homestead on the Internet, I find myself not sharing photos. It is because of some of the shortcomings of blogging as a format.

As an experiment, I will try to share photos at least twice and up to five times a week. There will be no titles, just the date each photo was published. Unless otherwise stated, I use a Leica SL and an APO-Summicron-M 2/50 ASPH lens. Everything else doesn’t matter.

You can get the daily photoblog post delivered to your email inbox by signing up for my photos-only newsletter. Here is a link to scroll through my previously shared photos.

January 27, 2022. San Francisco


With all the conversation about breaking free from big social platforms, owning your own digital identity, and being independent, I have been asking myself: how can all of us who have slowly become online performance artists ever be post-social?

***

For the past two decades, most of us have grown accustomed to the idea of being online, being connected, and being part of a larger collective. It might have started as a social network of friends, but the social Internet has since become performance art. A decade ago, in an essay, “Now You, Starring in a Movie About You,” I pointed out that “In our 21st-century society, we all want to stand out and get attention.”

Today we have easy and free access to platforms that help spread the word about the movies of our lives — quickly. The Internet makes easy work of distribution. The concept of “followers” and “subscribers” is another way of saying “audience,” and by sharing carefully crafted words, a handful of shared links, and artistically snapped photos and videos, what we’re doing is essentially performing for this audience. We are all Lady Gaga — be it for one person or a million people.

A decade later, words like creator, influencer, and YouTuber are part of the everyday vernacular. Every tweet, every selfie is a chance to virtue signal, an opportunity to market yourself as someone — pundit, guru, genius, or goofball.

There is no other way of putting it — we are addicted to the idea of an audience. When we go online, we are programmed to react to engagement triggers — likes, shares, retweets, hearts, and thumbs-ups. Social media and this addiction to an audience have hooked us on something even harder to give up once tasted: a constant feeling of self-importance.

We have all experienced those interactions where friends, colleagues, family members, and lovers got upset because we didn’t like their Facebook entries or Instagram photos fast enough. Or, god forbid, you missed the updates! Social networks have weaponized this sense of self-importance. 

***

The affliction is even more acute if you happen to live in the creative realm. We are now programmed to evaluate our creative work using metrics, and nothing illustrates this reality more than Instagram and its insidious hold on the photography community. In conversations, some of the most creative photographers dismiss their work because it didn’t get enough validation.

The idea of giving the invisible “others” so much influence over one’s work and creativity is baffling. It is not as if social platforms exist with our best interests at heart. They have a simple motivation — keep you addicted to the screen for as long as possible and thus create as many opportunities to sell you “advertising.” 

And yet, if people don’t like or heart your photo on the tiny screen of your phone, no matter how much creative energy you spent on it, you deem it worthless. You quickly forget the joy you experienced from the act of creativity. Instead, you are constantly seeking the approval of an audience. 

On the flip side, a photograph or a tweet that gets hundreds or thousands of likes makes you feel giddy, mainly because it reinforces your importance. We sadly forget that platforms don’t distinguish between your creation and a proverbial monkey selfie. 

To me, this is the real challenge of post-social reality. To live in this post-social future, one has to embrace ideas that are the antithesis of self-importance. After two decades of being trained by micro-dosing on dopamine, I am not sure we can!

January 12, 2021. San Francisco   

In the late 1990s, when mobile chip behemoth Qualcomm still qualified as an upstart, I started writing about the mobile Internet. I dreamed of a mobile broadband revolution. It was when Japan’s now-forgotten iMode service enthralled the world. Imagining the future, I wrote enthusiastically about everyone — Ricochet, Nokia, Blackberry (when it only made pagers), Treo, Palm, and Windows CE devices.

Intuitively, I knew that, much like when the (landline) phone network was decoupled from fixed connections, the always-on Internet, once set free from the fixed network, could profoundly impact society and its people. However, it was at the introduction of the iPhone in January 2007 that it slowly dawned on me that the world had changed. The future had arrived quietly, amidst a lot of skepticism. The magnitude of the change was enough for me to overlook the launch of the Apple TV and to dismiss the transition to Intel processors. All that was on my mind was the iPhone and how it would change the mobile landscape. In my blog post that day, I wrote:

That also might be the epitaph of the PC era. And it is sweet irony that the company that sparked off the desktop computing revolution is the one announcing its passing. Dropping Computer from its name is a sure sign that Apple, from this point forward, is a consumer electronics company, a mobile handset maker – one that also makes computer hardware and software as well.

Apple is making the phone do all things a computer does – surf, email, browse, iChat, music, and watch videos. Nary a keyboard or mouse in sight, and everything running on OS-X. While I am not suggesting that this replaces our notebooks or desktops for crucial productivity tasks, the iPhone (if it lives up to its hype) is at least going to decrease our dependence on it.

iPhone & the End of PC Era

It wasn’t until six months later, at WWDC, that I finally came to grips with what Apple had unleashed. Here is what I blogged:

  • A true web applications platform for the mobile
  • Break the Wireless Walled Gardens
  • Shift of control to the customers
  • Slow demise of subsidized, boring phones filled with bloat ware
  • Keep it simple or else

Looking back, the iPhone delivered on all those fronts, and in the process, has changed the mobile landscape.

***

The applications — essentially web services sliced and diced in special wrappers — have become the dominant form of our interactions with the modern Internet. A generation of mobile natives who have never dealt with flip phones and other devices sold by large phone companies don’t quite realize how terrible the mobile experience used to be before the iPhone showed up in our hands.

These were wireless walled gardens crammed with absolutely rotten apps, games, and everything from mobile backgrounds to ringtones. They were an opportunity for carriers to nickel-and-dime their customers and extract mafia-like fees from startups.

Today, we take the “app store” for granted, but getting whatever app you want, whenever you want, wasn’t always possible. And despite Apple’s draconian and confusing policies around the App Store, we as end customers are free to download pretty much whatever apps we want.

“The iPhone is doing to the mobile world, what the browser did to the wireline world.”

Juniper Networks founder Pradeep Sindhu in an interview.  

***

“iPhone changed the industry in two fundamental ways – decoupling applications from the network (operators) and the user interface (ease of use),” points out Chetan Sharma, a mobile industry veteran who runs an eponymous consulting group. Today, Apple and its 30-percent cut of the App Store come under criticism and legal challenges, but let’s not forget what life was like before the iPhone came along.

Think about it this way: before the iPhone, almost 90 percent of the industry’s revenue used to go to the telecom operators because they pretty much controlled every layer of the ecosystem. From spectrum to network to applications to devices — everything was controlled by the carriers.

In the US, for example, Verizon, Sprint, or AT&T decided which networking protocol (GSM or CDMA) would be dominant. They decided which operating systems and phones could be sold to their network customers, and which applications would be available. And, oh, everything was billed through their billing systems. That decoupling has reduced the carrier cut to somewhere between “20-30% depending on the geography,” Sharma points out.

***

In an article for Fast Company magazine, I pointed out that the iPhone (and its smartphone brethren) was part of an enormous change and ushered in a new Victorian age.

Today, it’s the increasing mobility of “computing engines,” the marriage of microprocessors and Internet ubiquity, that is poised to reimagine our society. More than a billion people bought smartphones last year—or to put it differently, we added 1.2 billion nodes to what was already the largest network ever built. Networks—social, neural, physical, metaphorical—enable connectedness, and connectedness changes everything. Networks compress distance and time, that concentration speeds up life, and that, in turn, creates sociological and economic change. 

And this age was catalyzed by the iPhone and what it brought to our fingertips. As I wrote in an earlier article:

iPhone had this one magical quality — touch, the most human of all senses — that made it the most personal of all computers. Think about it — we shake hands to confirm our relationship. We touch and hug to show our love. We caress to tell someone we care. So when we touch that phone, we don’t just touch a device and its screen, we make it part of ourselves. The internet is not a strange, cold, uncomfortable, cluttered space. That touch is what turns an inanimate object from metal and plastic to an extension of ourselves.

Fifteen years later, we have forgotten to appreciate how much the user interface and its simplicity changed the game, allowing application creativity to thrive and bringing many billions of dollars to application developers. In a world still controlled by carriers and their walled gardens, the applications and services you use daily would either not exist or struggle to thrive.

Instagram, Uber, DoorDash, Dropbox, and Facebook are all beneficiaries of the device initially dismissed by everyone from Nokia to Blackberry to Palm executives. For me, it was love at first byte, and it still is the phone I am happy to use — warts and all.

Every once in a while, a revolutionary product comes along that changes everything.

Steve Jobs when introducing the iPhone in 2007.

For once, Steve was underhyping what was to come!


My favorite articles (I have written) about the iPhone.

GLASS’ Tom Watson

“We’re no longer a photosharing app,” Adam Mosseri, Head of Instagram, a division of Facebook. Let’s face it: everything Facebook touches eventually turns into an engagement honeypot behind which lies an algorithmic whirlpool designed to suck attention that can be packaged and eventually sold to advertisers. And that is why I am not surprised that …

Most people think of the Covid pandemic in binary terms. You are either for masks or against them. Vaccines or no vaccines. But in reality, the impact of this pandemic is not as straightforward. It is what I learned when reading this piece in Elle magazine.

This story isn’t about Covid; instead, it is the story of Megan Lundstrom, which is both heartbreaking and life-affirming. Lundstrom was a young girl from a small town in Colorado who signed up for Seeking Arrangement and became a commercial sex worker. She managed to quit, find a way to move forward, go to college, find a mentor, use her connections to get actual data, and offer insights into the world of sex work. Since then, she has helped an anti-trafficking not-for-profit organization.

“Recently, the team discovered an unsettling trend in several cities: an upsurge in SeekingArrangement usage, which Lundstrom says is a direct result of COVID-19,” the story notes. I suggest you read this piece, and you will find it very sobering.

Read article on Elle

Photo by Donald Giannatti on Unsplash
“The pre-Socratic Greek philosopher Parmenides taught that the only things that are real are things which never change… and the pre-Socratic Greek philosopher Heraclitus taught that everything changes. If you superimpose their two views, you get this result: Nothing is real.” ― Philip K. Dick 

You might have noticed that it has been awfully quiet here. I decided to take a “break” from reality and ended up staying as far away from the shackles of networked life as possible for as long as I could. I wanted to experience the kind of boredom that makes you come up with random and ludicrous ideas. The type that pushes you to jot down thoughts in a notebook, even if you can’t read your own scribbles. 

My disconnection allowed me to start considering what constitutes reality in our hyper-connected world. It is apparent that we no longer live in a what-you-see-is-what-you-get (WYSIWYG) kind of environment. Fact-based reality has become a figment of our imagination, or maybe we are beginning to realize that it was always so. “Reality exists in the human mind, and nowhere else,” George Orwell noted in 1984.

Much of today’s reality takes its cues from what we dubiously dubbed “reality” television. We all know that the Kardashians — like all reality show characters — are not really real, at least not as we know them. But they look and sound real enough, and they provide enough drama to provoke a real reaction. And this holds our attention, which can be sold to advertisers. 

A few days back, I watched Vanity Fair writer Nick Bilton’s documentary, “Fake Famous.” It is a great indictment of the artificial realities we all seem to live in, propped up by fake followers, bots, and machine-generated affirmations such as hearts, retweets, and likes.

The platforms encourage these falsehoods. As Bilton points out in the documentary, the social media companies turn a blind eye to these bogus, inflated metrics — after all, Wall Street rewards big numbers with big valuations. Of course, if he wanted anyone to watch his movie, Bilton also had to traffic in some artificiality. (Hint: It is highly watchable and recommended.)

“There is more than a hint of reality TV in Bilton’s social-experiment gambit,” writes Naomi Fry in her review of the film for The New Yorker. “The repackaging of individuals into a more commercial and skilled version of themselves reminded me of any number of shows, not least ‘America’s Next Top Model,’ with its makeovers and photo shoots.” 

The characters we follow on social media are essentially all Kardashians. The social platforms use the same highly crafted narratives to create a perception of reality — but unlike the television networks, they do it at hyper-scale. Even though I wrote about this way back in a 2011 essay, I am still surprised by the sheer magnitude of the simulacrum-generating machinery that surrounds us. I totally underestimated the human capacity for narcissism. 

***

The un-reality of our present is really a consequence of the exponential multiplication of realities. In the not-so-distant past, most of our societal constructs — political bodies, media entities, and the like — helped shape our collective reality, which is an extremely important thing for a society to have if it is to work in a linear fashion. The research (conducted by those more qualified than me) bears this out.

“Our culturally adapted way of life depends upon shared meanings and shared concepts and depends as well upon shared modes of discourse for negotiating differences in meaning and interpretation,” the late psychologist Jerome Bruner wrote in Acts of Meaning. “By following a set of rules governing interpersonal communication, people inadvertently modify their private, idiosyncratic conception of a state of affairs and reach a common understanding of that situation. As noted, these shared representations constitute the contents of a culture.”

I fear that we now live in a world with multiple — and multiplying — shared realities, rather than a collective one. Recently, I delved into the work of Gerald Echterhoff and E. Tory Higgins (the former hails from the University of Münster in Germany, and the latter from Columbia University), who have both spent a lot of time trying to understand the nuances of reality. In a 2018 paper, they wrote:

Shared reality is the experience of having in common with others inner states about the world. Inner states include the perceived relevance of something, as well as feelings, beliefs, or evaluations of something. The experience of having such inner states in common with others fosters the perceived truth of those inner states. Humans are profoundly motivated to create shared realities with others, and in so doing they fulfill their needs to have valid beliefs about the world and to connect with others.

The work of Echterhoff and Higgins shows that communicators tune their messages to their audience and, in turn, the audience’s response has an impact on the communicators. The result is a shared reality for that group.

A study by Singapore Management University’s School of Social Sciences researchers showed that, when each person in a group was “informed of the majority opinions and allowed to communicate with only a fixed number of individuals,” after multiple communications, “opinions began to become more alike among communicating participants (clustering).” Compared to email and other forms of communication, social media and its network effects act faster and reach much deeper, which can lead to more precise clustering.

If you want to manipulate others exclusively for your gain as a communicator, you can easily find some people to join you in the alternative reality you create. Being a pied piper is as easy as writing a tweet. 

Increasingly, algorithms — aided by memes, tags, and other simplistic tools — do the work for you, clustering people into what we call “filter bubbles.” And these bubbles have a way of metastasizing. In some cases, the result is relatively harmless and straightforward, like the rapid rise in popularity of avocado toast. On the other end of the spectrum, we have attacks on the U.S. Capitol. 

In his book, “Shared Reality: What Makes Us Strong and Tears Us Apart,” Higgins writes: 

“Our shared realities become the world we live in and know: Sharing is believing. And with tight networks of people talking just to each other, these shared realities are the DNA of our social bubbles … With no shared reality being created, the interaction is treated as meaningless. It is as if the other person doesn’t matter or doesn’t really exist. Given our strong, natural motivation to create shared reality with other humans, not doing so when interacting with members of an out-group is like treating them as being non human. We want to create a shared reality with members of our in-group but not with members of out-groups.”

Our eagerness to enter into these bubbles explains the rapidly escalating tribalism in our societies. The social networks have created schisms between us that make it more difficult for us to recognize each other as fellow humans — which, indeed, many “users” on our platforms are not.

***

Reality is not something we stumble into. It is deliberately created. Professor Bruner’s work showed that we are more likely to remember something when it is told as a narrative. When the story is good, the facts don’t matter. We tend to find comfort in the world of narratives, which is what enables them to snag our attention in the age of half-truths. Real or not, they have a way of becoming our reality.

In some cases, the rise of a new story or an alternative reality can be a good thing. In the past, only those who could afford to put forth narratives were able to create reality, giving those in power the ability to impose their perceptions onto others. Those who didn’t have the means to share their story didn’t get to write their history. Sadly, many of those stories have been lost. Now, with the rise of the influencer class and social platforms, power is about more than just wealth. Outsiders now at least appear to have the opportunity to have their voices heard. For example, the recent actions around GameStop stock opened an opportunity for alternative narratives about the stock market that were not controlled by its establishment.

Still, Silicon Valley has significant control over our social media platforms. And not surprisingly, Silicon Valley has become really good at creating stories. That’s how we get all of our bogus (and boring) origin stories, new financial instruments, and questionable new trends. However, we are rank amateurs when compared to politicians. Steve Jobs might have had a reality distortion field, but Washington‘s reality is permanently distorted. And the media who mock politicians and technologists for a living are no different. Everyone wants attention, and attention begets more attention — and more loyalty from those who crave it for themselves. 

***

So, I highly recommend taking a moment to step away from the network if and when you get the chance. Meet some actual humans. Write down some thoughts on actual paper. Reacquaint yourself with the real world, and remind yourself of your own realness. When you inevitably return, try to stay in that mode for as long as you possibly can, which — let’s be honest — can be harder than it sounds. There are many mirages to lure you off course and algorithms eager to impose alternative ways of being. As I make my slow reentry into my networked life, I am reminded of Tim Burton’s wise words: “One person’s craziness is another person’s reality.”

February 24, 2021, San Francisco

QVC 2.0

Instagram today announced some significant changes to its design — it added tabs for Reels (a TikTok clone) and Shopping in the new app — once again moving away from its core identity as a visual social network. It is now just Facebook 2.0, with fewer words and more photos. The new emphasis reminded me of …

What do Instagram & TikTok have to do with Asparagus?

Long before the pandemic made in-person coffee conversations a nostalgic memory, I was chatting with a friend about the increased frequency with which large technology companies copied their rivals. Microsoft was quick to imitate Slack with Teams. Instagram ripped off Snap Stories and brazenly acknowledged that in its initial announcement. And today, Facebook-owned IG announced …