Stephen Robles of AppleInsider invited me for a conversation on the AppleInsider podcast. It was a delightful chat about a range of topics, including the iPad Pro, the creator economy, social media platforms, and our deal with this devil called social media. I talk about my ongoing (and unending) affair with the Internet and why we always tend to underestimate its power — both on the upside and the downside.
I came across this opinion piece about the role of social media in the demise and subsequent rebirth of blogging, a topic not unfamiliar to readers of my blog. It credits Twitter for providing a platform that allows for interactions similar to those that distinguished early blogging communities. And at least in a superficial way, that’s not wrong, I guess. But there is a wide gulf between the impulses that drive social media and the “why” of blogging. And the author completely overlooks the latter in his eagerness to report that, after many bloggers were wiped out, some elements of the activity formerly known as blogging survived. (Fact check: classical blogging continues to flourish in all corners of the Internet.)
As I have noted a time or two, blogging and the behaviors it inspired were the genesis of many contemporary activities on the Internet. Yet, despite this, we still seem unable to fully appreciate what was at the heart of blogging — that thing that makes so many of us nostalgic for its heyday, even as we tweet until our thumbs ache. And this brings me to my long-standing quibble with the media establishment: why can’t they recognize significant changes until it is too late?
Marc Weidenbaum, a music enthusiast and founder of Disquiet.com, expertly captures the distinction between blogs and social. “Social media expects feedback (not just comments, but likes and follows),” he writes. “Blogs are you getting your ideas down; feedback is a byproduct, not a goal.” In other words, one is a performance for an audience, while the other is highly personal, though others may end up finding it interesting. Weidenbaum also admirably points out the difference between blogs and all the suddenly ubiquitous newsletters. “And newsletters = broadcasting,” he says. “Blogging is different.”
Bingo. By the way, for this exact reason, I recently decided to rethink the whole notion of my newsletter. I realized that it is just a way to get my words, as I wrote when I announced some recent changes, “from my computer to your inbox in order to spare you the trouble of coming to my website.”
The main reason media stalwarts couldn’t understand blogging is that they couldn’t see beyond their all-too-familiar containers and distribution mechanisms. They were too entrapped in their dogmas. The author of the opinion piece that kicked off this post offers up a telling account of his own transition from blogger to an employee at a legacy media company.
“A key lesson I learned from my new colleagues was that we couldn’t cater to our regular readers the way many classic blogs did,” he writes. “Our salaries were supported by advertising. To make the whole project financially viable, we needed a lot of readers. Practically speaking, that meant bringing in a lot of new readers.” In other words, the company couldn’t conceive of any game other than the one it was already playing.
This problem persists. Rethinking news requires a complete reconsideration of media, what it means, how it gets consumed, and how it gets distributed to those who want it. Even now, the media establishment is so stuck in text that they can’t fully see the extent to which we are transitioning to a world of primarily visual media.
For the future of media — including blogging — look to YouTube, Snap, TikTok, and Instagram. By the way, the content on these platforms is often created and engaged with in a spirit much more analogous to that of traditional blogs than anything you’re likely to see on Twitter. A whole generation has grown up with cameras — and front-facing cameras at that. Smartphones make it so much easier to create daily logs (What else are “stories” on Snap and Instagram?). The behaviors on these platforms will define the media consumption of the future. They are already reshaping the present.
Let’s take music journalism as an example. You are unlikely to stumble upon any new music through a traditional music magazine or even on many traditional music blogs. Instead, people are finding new musical acts on TikTok. “Mainstream music journalism is largely uninterested in promoting discovery, focusing instead on blanket coverage of superstars and seemingly endless traffic-grabbing lists — which may buoy an existing reader base, but often fails to capture newer, younger music fans,” reported (ironically) Rolling Stone. “Enter the upstart music blogs of TikTok.”
TikTok’s rise as a tastemaker for music (and culture) is just the evolution of (news) media away from the written word. Magazines, radio, and late-night television shows helped with music discovery before the social era. Blogs came next, with their human curation. Individuals acting as tastemakers and cultural deejays became a trend that only strengthened with YouTube, Facebook, Instagram, and Twitter. And TikTok is the newest evolution for a generation that lives at network speed.
And a generation growing up on the beat of the network wants their news in TikTok-style packaging. The future of media and news is a combination of visual, virtual, augmented, and metaverse realities. It is not a matter of if, but when. I am not saying that the traditional media formats won’t have a role — but they will have to compete with a different reality.
Back when media companies were making a mess of the blogging world, they were hamstrung by their failure to understand and appreciate the “why” of the activity they were seeking to replicate. As they slowly key into the world of visual media — and inevitably attempt to stuff it into their preexisting boxes — let’s hope they don’t make all the same mistakes again.
June 7, 2021. San Francisco
“Don’t pay any attention to what they write about you. Just measure it in inches.” ― Andy Warhol
Today, he would have measured everything in the number of tweets, retweets, shares, likes, and hearts. Sadly, this isn’t the first time I have felt this despair, and neither am I alone in it. The theater and theatrics of outrage are so loud that they are rendering the platforms nearly unusable.
It is becoming evident that facts, truth, reality, and happiness have no place in the social media world, where attention-at-any-cost is the only currency. Everything has become so loud. Reason is nowhere to be found, and neither is reasonableness. Hyperbole is the order of the day. All of it, so people pay attention to what you have to say.
It reminds me of the quip David Bowie made about Madonna and her need for attention. “That kind of clawing need to be the center of attention is not a pleasant place to be,” he said.
Have a wonderful evening, everyone!
PS: A day later, I can’t help but notice the hilarious irony: many media personalities, often at the center of attention themselves, are complaining about the lack of civility on the social platforms.
“The pre-Socratic Greek philosopher Parmenides taught that the only things that are real are things which never change… and the pre-Socratic Greek philosopher Heraclitus taught that everything changes. If you superimpose their two views, you get this result: Nothing is real.” ― Philip K. Dick
You might have noticed that it has been awfully quiet here. I decided to take a “break” from reality and ended up staying as far away from the shackles of networked life as possible for as long as I could. I wanted to experience the kind of boredom that makes you come up with random and ludicrous ideas. The type that pushes you to jot down thoughts in a notebook, even if you can’t read your own scribbles.
My disconnection allowed me to start considering what constitutes reality in our hyper-connected world. It is apparent that we no longer live in a what-you-see-is-what-you-get (WYSIWYG) kind of environment. Fact-based reality has become a figment of our imagination, or maybe we are beginning to realize that it was always so. “Reality exists in the human mind, and nowhere else,” George Orwell noted in 1984.
Much of today’s reality takes its cues from what we dubiously dubbed “reality” television. We all know that the Kardashians — like all reality show characters — are not really real, at least not as we know them. But they look and sound real enough, and they provide enough drama to provoke a real reaction. And this holds our attention, which can be sold to advertisers.
A few days back, I watched Vanity Fair writer Nick Bilton’s documentary, “Fake Famous.” It is a great indictment of the artificial realities we all seem to live in, propped up by fake followers, bots, and machine-generated affirmations such as hearts, retweets, and likes.
The platforms encourage these falsehoods. As Bilton points out in the documentary, the social media companies turn a blind eye to these bogus, inflated metrics — after all, Wall Street rewards big numbers with big valuations. Of course, if he wanted anyone to watch his movie, Bilton also had to traffic in some artificiality. (Hint: It is highly watchable and recommended.)
“There is more than a hint of reality TV in Bilton’s social-experiment gambit,” writes Naomi Fry in her review of the film for The New Yorker. “The repackaging of individuals into a more commercial and skilled version of themselves reminded me of any number of shows, not least ‘America’s Next Top Model,’ with its makeovers and photo shoots.”
The characters we follow on social media are essentially all Kardashians. The social platforms use the same highly crafted narratives to create a perception of reality — but unlike the television networks, they do it at hyper-scale. Even though I wrote about this way back in a 2011 essay, I am still surprised by the sheer magnitude of the simulacrum-generating machinery that surrounds us. I totally underestimated the human capacity for narcissism.
The un-reality of our present is really a consequence of the exponential multiplication of realities. In the not-so-distant past, most of our societal constructs — political bodies, media entities, and the like — helped shape our collective reality, which is an extremely important thing for a society to have if it is to work in a linear fashion. The research (conducted by those more qualified than me) bears this out.
“Our culturally adapted way of life depends upon shared meanings and shared concepts and depends as well upon shared modes of discourse for negotiating differences in meaning and interpretation,” the late psychologist Jerome Bruner wrote in Acts of Meaning. “By following a set of rules governing interpersonal communication, people inadvertently modify their private, idiosyncratic conception of a state of affairs and reach a common understanding of that situation. As noted, these shared representations constitute the contents of a culture.”
I fear that we now live in a world with multiple — and multiplying — shared realities, rather than a collective one. Recently, I delved into the work of Gerald Echterhoff and E. Tory Higgins (the former hails from the University of Münster in Germany, and the latter from Columbia University), who have both spent a lot of time trying to understand the nuances of reality. In a 2018 paper, they wrote:
Shared reality is the experience of having in common with others inner states about the world. Inner states include the perceived relevance of something, as well as feelings, beliefs, or evaluations of something. The experience of having such inner states in common with others fosters the perceived truth of those inner states. Humans are profoundly motivated to create shared realities with others, and in so doing they fulfill their needs to have valid beliefs about the world and to connect with others.
The work of Echterhoff and Higgins shows that communicators tune their messages to their audience, and in turn, the audience’s response has an impact on the communicators. The result is a shared reality for that group.
A study by researchers at Singapore Management University’s School of Social Sciences showed that, when each person in a group was “informed of the majority opinions and allowed to communicate with only a fixed number of individuals,” after multiple rounds of communication, “opinions began to become more alike among communicating participants (clustering).” Social media and its network effects act faster and reach much deeper than earlier forms of communication such as email, which can lead to even tighter clustering.
If you want to manipulate others exclusively for your gain as a communicator, you can easily find some people to join you in the alternative reality you create. Being a pied piper is as easy as writing a tweet.
Increasingly, algorithms — aided by memes, tags, and other simplistic tools — do the work for you, clustering people into what we call “filter bubbles.” And these bubbles have a way of metastasizing. In some cases, the result is relatively harmless and straightforward, like the rapid rise in popularity of avocado toast. On the other end of the spectrum, we have attacks on the U.S. Capitol.
“Our shared realities become the world we live in and know: Sharing is believing. And with tight networks of people talking just to each other, these shared realities are the DNA of our social bubbles … With no shared reality being created, the interaction is treated as meaningless. It is as if the other person doesn’t matter or doesn’t really exist. Given our strong, natural motivation to create shared reality with other humans, not doing so when interacting with members of an out-group is like treating them as being non-human. We want to create a shared reality with members of our in-group but not with members of out-groups.”
Our eagerness to enter into these bubbles explains the rapidly escalating tribalism in our societies. The social networks have created schisms between us that make it more difficult for us to recognize each other as fellow humans — which, indeed, many “users” on our platforms are not.
Reality is not something we stumble into. It is deliberately created. Professor Bruner’s work showed that we are more likely to remember something when it is told as a narrative. When the story is good, the facts don’t matter. We tend to find comfort in the world of narratives, which is what enables them to snag our attention in the age of half-truths. Real or not, they have a way of becoming our reality.
In some cases, the rise of a new story or an alternative reality can be a good thing. In the past, only those who could afford to put forth narratives were able to create reality, giving those in power the ability to impose their perceptions on others. Those who didn’t have the means to share their story didn’t get to write their history. Sadly, many of those stories have been lost. Now, with the rise of the influencer class and the social platforms, power is about more than just wealth. Outsiders now at least appear to have the opportunity to be heard. For example, the recent actions around the GameStop stock opened up alternative narratives about the stock market that were not controlled by its establishment.
Still, Silicon Valley has significant control over our social media platforms. And not surprisingly, Silicon Valley has become really good at creating stories. That’s how we get all of our bogus (and boring) origin stories, new financial instruments, and questionable new trends. However, we are rank amateurs when compared to politicians. Steve Jobs might have had a reality distortion field, but Washington’s reality is permanently distorted. And the media who mock politicians and technologists for a living are no different. Everyone wants attention, and attention begets more attention — and more loyalty from those who crave it for themselves.
So, I highly recommend taking a moment to step away from the network if and when you get the chance. Meet some actual humans. Write down some thoughts on actual paper. Reacquaint yourself with the real world, and remind yourself of your own realness. When you inevitably return, try to stay in that mode for as long as you possibly can, which — let’s be honest — can be harder than it sounds. There are many mirages to lure you off course and algorithms eager to impose alternative ways of being. As I make my slow reentry into my networked life, I am reminded of Tim Burton’s wise words: “One person’s craziness is another person’s reality.”
February 24, 2021, San Francisco
The actions of technology platforms such as Twitter, Facebook, Google’s YouTube, and others have generated a lot of debate. This is a pivotal moment for technology and its role in what constitutes speech in our post-Internet society. “At every level of the tech stack, corporations are placed in positions to make value judgments regarding the legitimacy of content, including who should have access, and when and how,” notes researcher Joan Donovan. Since this is an area of interest, I found this analysis informative and educational.