Facebook, Social Media & the Social Contract

Photo by Leio via https://unsplash.com/@leio

Last Monday, I attended the Milken Global Conference and participated in a panel on “Social Media and the Social Contract.” The panel was moderated by Willow Bay, dean of the University of Southern California’s Annenberg School for Communication and Journalism. My fellow panelists included esteemed folks such as Facebook co-founder Chris Hughes, Cheddar CEO John Steinberg, and Tristan Harris, co-founder of the Center for Humane Technology. VentureBeat published a report of the event (video here).

My opinions about Facebook and how it treats privacy as a yo-yo for the amusement of its real customers — the advertising community — are well documented on the blog (and elsewhere). So a lot of it might not seem new, but there are a few things I wanted to highlight and give more context.

First, why are we suddenly talking about Facebook, privacy and data breaches?

My answer to why is pretty simple: we are in between the past and the future. For the longest time, we've been governed by the rules and ideologies of the industrial era, when the world moved at a more human scale. Now we're heading into a world that runs at the speed of the network. As human beings, we're finding out that there are actors out there operating on networks in ways we lack deep insight into. As a technology optimist, I know the very same technology that is creating problems in the world and society will also produce solutions. But we have to keep asking the question: How do we manage the future? The answers aren't going to be easy, and they will take time.

And while it is easy to single out Facebook, let's not forget that other actors are influencing behaviors as well. Others are eroding privacy by the day. It's across the board. It's a social media problem, a digital media problem. Anyone who says it's just a Facebook problem should look at the scripts running on their own website. All the newspapers that run tracking scripts on their platforms are such hypocrites.

With that said, Facebook is the largest and most efficient network, and as a result, they are the most efficient at behavior modification. For me, Facebook is like genetically modified tobacco: it excels at social manipulation. It's done a great job of what it set out to do – not in 2004, but post-2008 – and as a result it is now the most efficient advertising platform in the history of humankind. You can see that in the stock price and in the earnings. The system is working as intended. There's nothing crazy about that, from a technology standpoint.

From a social and cultural standpoint, people are finally waking up to the fact that this is not good for us. I don't think people have widely realized just how bad it is, but they've realized it's not right. (And yet they continue to use Facebook, as much or more.) The way I think about the problem we have right now is to think about the problem we had in the past with tobacco. Tobacco has been around forever, but it wasn't as addictive until cigarette companies and others genetically modified it to be more so. Behavior modification in media has always been around, but Facebook took it to the next level.

People might not like the tobacco analogy, but when you have smart writers who should know better writing that they can't quit Facebook, what else do you call it? In many ways, making tobacco companies put statutory warnings on their products was a big step forward; we need to do the same with social media, data, and privacy. I have written about the notion of Terms of Trust. Instead of terms of service, companies big and small should be forced to write terms of trust. Rather than saying what they will do with our data, how about starting with what they will not do with our data?

Facebook as a platform owner failed because they allowed user data to leak through third-party apps. Like the Keystone Cops, they failed to do their job of protecting the information people were entrusting to them. But the question is not just about the data they are collecting now; it is also about what they are going to collect in the future. I think it is important for us to realize that we are starting to see a big shift in technology and how it influences society. If there is any regulation, it needs to address what these companies will be able to do in the future.

May 7, 2018, San Francisco
