Nothing is more frustrating to me than YouTube, which decides my front page based on my likes. It seems I can’t have multiple interests (variables), and thus I must watch certain kinds of videos. In its infinite wisdom, Twitter believes that the only people whose content I want to consume are the ones whose content I like or share. And don’t get me started on online dating services; they could learn a thing or two from Sima Taparia.
And that is because the post-social world of today is starting to coalesce around variables that are less humanistic and more biased towards corporate goals. “We live in a world that demands categorization,” I recently read in a newsletter, Tiny Revolutions. “We have to do some self-definition so the world knows what to do with us, and so that we can bond with others who share our interests, values, and concerns.”
While the writer, Sara Campbell, might have been talking about an individual’s desire not to be categorized, her words accurately describe our post-social society’s reality, dilemma, and futility in a handful of lines.
Categorization is part of the human condition. Our brains use categories to make sense of the torrent of facts we experience. It is how we learn. As humans, we need categories to contextualize our world, and that includes each other. What matters more is the intent behind the categories.
Categories, as such, carry bias by intent. That bias allows us to ignore variables we don’t want to deal with and to place boundaries around a category. It matters because ignoring those variables lets us spend fewer cognitive resources. The bias itself is neither good nor bad; it is the intent that leads us in different directions. That intent determines which variables we focus on and which ones we ‘choose’ to ignore.
And a lot of that intent is determined by the human condition. For example, if you have grown up in a more traditional society, the category that defines you for most of your life is your lineage. The “intent” of that categorization is to find your place in the social hierarchy. Lineage isn’t a primary variable for Americans, but college and money are. That is why in more modern societies, such as America, the college you attend defines your place in society and the workplace.
Ever wondered why most conversations start with the question: what do you do? That question is not only reflective of our fading art of conversation, but also a way for us to define the variables and get quick context on a person. By doing so, we quickly assign a value metric to the recipient of our attention.
At best, in the pre-Internet world, categorization would rear its head in a social context, often giving us cues on how to engage with someone. An attractive single woman gets a different kind of attention from another woman than from a single man. Given the nature of modern consumerist society, it was no surprise that the emergence of databases allowed marketers to sort us into “buckets” of those who may or may not buy certain products. After all, the early use of computers was catalyzed by demands from governmental agencies and corporations that wanted to use data to create categories.
However, in our post-social society, these categories have become even more granular and have metastasized. Just take Facebook as an example. School, location, gender, relationships, and many more variables combine to create a profile of us that can be bundled no differently than the dastardly collateralized debt obligation (CDO).
And Facebook isn’t alone in using so many variables. From online dating services to online marketing to banking, most of these services feel both antiseptic and plastic. The data variables are what make up an algorithm whose sole job is categorization. At present, the algorithms are relatively simplistic. They lack the rationality and nuance that come from social science.
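To make the “relatively simplistic” point concrete, here is a toy sketch of what such an algorithm amounts to. Everything in it is hypothetical, not any real platform’s code: a handful of profile variables get collapsed into a single marketing bucket, and whatever doesn’t fit the chosen variables is simply ignored.

```python
# Toy sketch (hypothetical): reducing a person to a marketing "bucket"
# from a few profile variables, the way ad-targeting systems do at scale.
from dataclasses import dataclass, field


@dataclass
class Profile:
    age: int
    location: str
    interests: list = field(default_factory=list)  # often inferred from likes/shares


def categorize(p: Profile) -> str:
    """Collapse a whole person into one label that is easy to sell against.

    Real systems weight hundreds of variables, but the intent is the same:
    pick the variables that serve the business, ignore the rest.
    """
    if p.location == "San Francisco" and "startups" in p.interests:
        return "tech-early-adopter"
    if "vinyl" in p.interests and p.age < 35:
        return "urban-audiophile"
    # Everything that doesn't match the chosen variables falls into a
    # catch-all bucket -- the "ignored variables" the essay describes.
    return "general-audience"


person = Profile(age=29, location="San Francisco", interests=["startups", "coffee"])
print(categorize(person))  # one bucket stands in for a whole person
```

The sketch is deliberately crude, but that is the point: a rule cascade over a few corporate-chosen variables has no notion of the interests it never asked about.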
The bigger question is: what if all these data variables, picked by companies for their own needs, don’t define you or your interests? I suspect all of us will be trapped in a data prison, forced to live lives that an invisible black-box algorithm decides are good for us.
August 24, 2021, San Francisco