Believe it or not, the harsh glare of scrutiny on the big technology companies has kept them honest, more or less. Realizing how much of their present and future business depends on people wanting to use their services, they work hard to protect data and privacy. (Like I said, more or less!) After all, our data is what they bundle and sell as services to their real customers: advertisers. In the case of Apple, the new marketing pitch is all about “privacy” and how the company is not collecting tracking data — a handy way to distinguish itself from Google and Facebook. I buy Apple products precisely for that stance on privacy, at least in the U.S. In short, it is a noble ideology that also helps Apple sell more gear with fat margins — not that there’s anything wrong with that.

The companies we should be worried about are the many smaller and mid-sized firms most of us have never heard of. Whether it is app developers surreptitiously selling information to third parties, data breaches at retailers (and their digital platforms), or data brokers with security systems that have more holes than Swiss cheese, these companies will continue to be the cause of most headaches in our digital lives. And they are the group more likely to take liberties with our data and privacy.

I began ruminating on this earlier in the week when I read this article about electric utilities resetting smart thermostats inside Houston homes in response to the rising demand for electricity caused by record-breaking heat. This story is a harbinger of our future: what at first seems like a minor convenience, even a good deal, becomes a major problem for those who don’t spend time reading the complicated terms of service documents — which is to say, just about everyone.

In this case, if customers signed up for an offering called “Smart Savers Texas” from a company called EnergyHub (which is owned by Alarm.com, a seller of security services), they could be entered into a sweepstakes. In exchange, they gave EnergyHub permission to control their thermostats during periods of peak or extreme demand.

This is yet another example of how, though we dread the future controlled by big technology companies, we will ultimately suffer most at the hands of what I call “non-technology” companies that now have access to our private data and control over our lives.  

And at the top of the list are companies that have always been hostile to their customers: telephone companies, electric utilities, insurance companies, for-profit hospital systems, big airlines, and other such organizations. They will only use “smart data” to amplify their past bad behavior. 

Dark patterns around offers like “Smart Savers Texas” make it virtually impossible for you and me to discern what we might be signing up for. After all, no one sifts through the pages-long terms of service agreements. And I certainly don’t mean to pick on this one company — this kind of opacity pervades the entire digital ecosystem.

Try getting out from under a contract at a health club or canceling your subscription to The New York Times. Good luck. In this digital age, these seemingly simple tasks have only gotten harder. I have been trying, without much success, to unsubscribe from emails from a publishing company for almost a decade. And this is neither the first nor the last time you are going to see “utilities” or other entities muck around with what you assume to be private spaces. 

What happened in Houston joins a rapidly growing list of incidents that give me pause about embracing the Smart Home, even though I was an early adopter. The safety of our “connected devices” is increasingly unclear. I have little trust in Amazon’s ability to police its digital shelves. What if the device we are buying is fake, or is surreptitiously sending data to an overseas destination? I don’t know if you remember (or even read) this scary story in Vice about seemingly innocuous apps that collect personal data and sell it to anyone willing to pay. And that is just the tip of the iceberg. It is increasingly important to pause and ask: is cheap really cheap, or is there a bigger price to pay in the long term?

Belatedly, and thankfully, Apple has introduced AppTrackingTransparency (ATT), which will force apps to seek permission to track us and our activity across apps. Deservedly, many have written about the impact of this on Facebook, but it goes beyond that one company. Still, as EFF points out, it doesn’t do enough. 

“It doesn’t do anything about ‘first-party’ tracking, or an app tracking your behavior on that app itself,” EFF writes on its blog. “ATT might also be prone to ‘notification fatigue’ if users become so accustomed to seeing it that they just click through it without considering the choice. And, just like any other tracker-blocking initiative, ATT may set off a new round in the cat-and-mouse game between trackers and those who wish to limit them.”
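To give a sense of what this looks like from the developer’s side, here is a minimal Swift sketch of the ATT permission request. It is a simplified illustration rather than code from any particular app, and the helper function name is my own; apps also supply a short usage-description string (the NSUserTrackingUsageDescription key) that appears inside the system prompt.

```swift
import AppTrackingTransparency

// Hypothetical helper: asks iOS to show the ATT prompt described above.
// Until the user taps "Allow," the app is not supposed to track them
// across other companies' apps and websites.
func askForTrackingPermission() {
    ATTrackingManager.requestTrackingAuthorization { status in
        switch status {
        case .authorized:
            print("User allowed cross-app tracking")
        case .denied, .restricted, .notDetermined:
            print("Tracking is off for this user")
        @unknown default:
            print("Unrecognized tracking status")
        }
    }
}
```

Notably, and in line with EFF’s criticism, nothing in that flow limits what an app does with the data it collects about your behavior inside the app itself.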

And that’s the challenge. The burden of protecting our digital sanctity falls on consumers, not on those who profit from our data. Even Apple’s efforts shift the workload to ordinary people, and many of us are simply not equipped to handle the cognitive load or don’t understand the impact.

For nearly a decade, I have raised questions about an individual’s rights regarding how their data is collected and used. In 2014, I naively wrote about something called “Terms of Trust,” in which companies would explain, in plain language instead of legalese, what they do with our data.

Seven years later, we are still stumbling through the fog — even in our own homes.

June 22, 2021, San Francisco


This week, Tim O’Reilly provided much-needed perspective in his essay “The End of Silicon Valley As We Know It.” If you can overlook the clickbait title, this essay is among the most valuable things you can read to understand our present and think about our future. While there has been much hoopla about people leaving Silicon Valley, new distributed-work philosophies, and other daily headlines, these are primarily distractions from a more profound change afoot in what we call Silicon Valley.

The Algorithmic Accountability Index: Ellery Roberts Biddle and Jie Zhang have created an accountability index for the algorithmic economy. They looked for companies’ answers to some fundamental questions about algorithms: How do you build and train them? What do they do? What standards guide these processes? An essential piece. 

How the race for autonomous cars started: We might be on the brink of a future in which we all zoom around in self-driving cars and other autonomous vehicles. It is easy to forget that, 16 years ago, autonomous driving was a chaotic dream. In his new book, Driven: The Race to Create the Autonomous Car, Alex Davies chronicles what brought us to this moment. Wired magazine recently ran an excerpt, and you should check it out.

Did tech save the world from a bigger meltdown? While we have read many articles about technology becoming a dominant force in our lives during the pandemic, this article in Foreign Policy asks (and answers) the question from a different angle. I liked the nuanced argument, which is why I recommend it for your weekend reading.

The cassette tape creator is dead: In time, what was once a disruptive technology becomes a part of our lives that we don’t even notice. One hundred billion units later, the cassette tape is one of those technologies. It kicked off the ability to personalize the curation of music; you can draw a straight line from those tapes to Spotify playlists. Lou Ottens, the engineer who created the cassette tape, died recently. Ottens also helped create the compact disc, which ultimately killed the cassette. His obituary is a reminder that very few people are fortunate enough to create technology that touches everyone’s lives.

Photo by Donald Giannatti on Unsplash
“The pre-Socratic Greek philosopher Parmenides taught that the only things that are real are things which never change… and the pre-Socratic Greek philosopher Heraclitus taught that everything changes. If you superimpose their two views, you get this result: Nothing is real.” ― Philip K. Dick 

You might have noticed that it has been awfully quiet here. I decided to take a “break” from reality and ended up staying as far away from the shackles of networked life as possible for as long as I could. I wanted to experience the kind of boredom that makes you come up with random and ludicrous ideas. The type that pushes you to jot down thoughts in a notebook, even if you can’t read your own scribbles. 

My disconnection allowed me to start considering what constitutes reality in our hyper-connected world. It is apparent that we no longer live in a what-you-see-is-what-you-get (WYSIWYG) kind of environment. Fact-based reality has become a figment of our imagination, or maybe we are beginning to realize that it was always so. “Reality exists in the human mind, and nowhere else,” George Orwell noted in 1984.

Much of today’s reality takes its cues from what we dubiously dubbed “reality” television. We all know that the Kardashians — like all reality show characters — are not really real, at least not as we know them. But they look and sound real enough, and they provide enough drama to provoke a real reaction. And this holds our attention, which can be sold to advertisers. 

A few days back, I watched Vanity Fair writer Nick Bilton’s documentary, “Fake Famous.” It is a great indictment of the artificial realities we all seem to live in, propped up by fake followers, bots, and machine-generated affirmations such as hearts, retweets, and likes.

The platforms encourage these falsehoods. As Bilton points out in the documentary, the social media companies turn a blind eye to these bogus, inflated metrics — after all, Wall Street rewards big numbers with big valuations. Of course, if he wanted anyone to watch his movie, Bilton also had to traffic in some artificiality. (Hint: It is highly watchable and recommended.)

“There is more than a hint of reality TV in Bilton’s social-experiment gambit,” writes Naomi Fry in her review of the film for The New Yorker. “The repackaging of individuals into a more commercial and skilled version of themselves reminded me of any number of shows, not least ‘America’s Next Top Model,’ with its makeovers and photo shoots.” 

The characters we follow on social media are essentially all Kardashians. The social platforms use the same highly crafted narratives to create a perception of reality — but unlike the television networks, they do it at hyper-scale. Even though I wrote about this way back in a 2011 essay, I am still surprised by the sheer magnitude of the simulacrum-generating machinery that surrounds us. I totally underestimated the human capacity for narcissism. 

***

The un-reality of our present is a consequence of the exponential multiplication of realities. In the not-so-distant past, most of our societal constructs — political bodies, media entities, and the like — helped shape our collective reality, something a society badly needs if it is to function coherently. The research (conducted by those far more qualified than I am) bears this out.

“Our culturally adapted way of life depends upon shared meanings and shared concepts and depends as well upon shared modes of discourse for negotiating differences in meaning and interpretation,” the late psychologist Jerome Bruner wrote in Acts of Meaning. “By following a set of rules governing interpersonal communication, people inadvertently modify their private, idiosyncratic conception of a state of affairs and reach a common understanding of that situation. As noted, these shared representations constitute the contents of a culture.”

I fear that we now live in a world with multiple — and multiplying — shared realities, rather than a collective one. Recently, I delved into the work of Gerald Echterhoff and E. Tory Higgins (the former hails from the University of Münster in Germany, the latter from Columbia University), who have spent a lot of time trying to understand the nuances of reality. In a 2018 paper, they wrote:

Shared reality is the experience of having in common with others inner states about the world. Inner states include the perceived relevance of something, as well as feelings, beliefs, or evaluations of something. The experience of having such inner states in common with others fosters the perceived truth of those inner states. Humans are profoundly motivated to create shared realities with others, and in so doing they fulfill their needs to have valid beliefs about the world and to connect with others.

The work of Echterhoff and Higgins shows that communicators tune their messages to their audience and that, in turn, the audience’s response has an impact on the communicators. The result is a shared reality for that group.

A study by researchers at Singapore Management University’s School of Social Sciences showed that, when each person in a group was “informed of the majority opinions and allowed to communicate with only a fixed number of individuals,” after multiple rounds of communication, “opinions began to become more alike among communicating participants (clustering).” Social media, with its network effects, acts faster and reaches deeper than older forms of communication such as email, which can lead to more precise clustering.

If you want to manipulate others exclusively for your gain as a communicator, you can easily find some people to join you in the alternative reality you create. Being a pied piper is as easy as writing a tweet. 

Increasingly, algorithms — aided by memes, tags, and other simplistic tools — do the work for you, clustering people into what we call “filter bubbles.” And these bubbles have a way of metastasizing. In some cases, the result is relatively harmless and straightforward, like the rapid rise in popularity of avocado toast. At the other end of the spectrum, we have the attack on the U.S. Capitol.

In his book, “Shared Reality: What Makes Us Strong and Tears Us Apart,” Higgins writes: 

“Our shared realities become the world we live in and know: Sharing is believing. And with tight networks of people talking just to each other, these shared realities are the DNA of our social bubbles … With no shared reality being created, the interaction is treated as meaningless. It is as if the other person doesn’t matter or doesn’t really exist. Given our strong, natural motivation to create shared reality with other humans, not doing so when interacting with members of an out-group is like treating them as being non human. We want to create a shared reality with members of our in-group but not with members of out-groups.”

Our eagerness to enter into these bubbles explains the rapidly escalating tribalism in our societies. The social networks have created schisms between us that make it more difficult for us to recognize each other as fellow humans — which, indeed, many “users” on our platforms are not.

***

Reality is not something we stumble into. It is deliberately created. Professor Bruner’s work showed that we are more likely to remember something when it is told as a narrative. When the story is good, the facts don’t matter. We tend to find comfort in the world of narratives, which is what enables them to snag our attention in the age of half-truths. Real or not, they have a way of becoming our reality.

In some cases, the rise of a new story or an alternative reality can be a good thing. In the past, only those who could afford to put forth narratives were able to create reality, giving those in power the ability to impose their perceptions on others. Those who didn’t have the means to share their story didn’t get to write their history; sadly, many of those stories have been lost. Now, with the rise of the influencer class and social platforms, power is no longer defined by wealth alone. Outsiders can now at least appear to have a chance of having their voices heard. For example, the recent frenzy around GameStop’s stock opened the door to alternative narratives about the stock market, ones not controlled by its establishment.

Still, Silicon Valley has significant control over our social media platforms. And not surprisingly, it has become really good at creating stories. That’s how we get all of our bogus (and boring) origin stories, new financial instruments, and questionable new trends. Yet we are rank amateurs compared to politicians. Steve Jobs might have had a reality distortion field, but Washington’s reality is permanently distorted. And the media who mock politicians and technologists for a living are no different. Everyone wants attention, and attention begets more attention — and more loyalty from those who crave it for themselves.

***

So, I highly recommend taking a moment to step away from the network if and when you get the chance. Meet some actual humans. Write down some thoughts on actual paper. Reacquaint yourself with the real world, and remind yourself of your own realness. When you inevitably return, try to hold on to that frame of mind for as long as you possibly can, which — let’s be honest — is harder than it sounds. There are many mirages to lure you off course and algorithms eager to impose alternative ways of being. As I make my slow reentry into networked life, I am reminded of Tim Burton’s wise words: “One person’s craziness is another person’s reality.”

February 24, 2021, San Francisco