Erik Torenberg, a partner at Village Global, recently tweeted that “startup investing’s weird bc it takes 10+ yrs to be great—not bc of skills gained but bc of feedback loops. You could invest for 3 yrs, be a beginner, go in a coma for 7 yrs, come back & be known as an ‘expert’, despite the same level of skill.” He then asked if there were examples of other fields where this could happen.
While he was focused on venture capital, I wonder if he was unknowingly getting at a more universal truth. Just because someone labels you an “expert” doesn’t mean you are one. People get a lot of credit these days for stumbling onto things that may very well have happened whether they had been standing there or not. In addition to luck and talent, it takes time to become actually good or great at something. It’s not so much the 10,000-hour theory that is popular these days; rather, it’s about learning the lessons that only time can reveal.
For much of my life, my elders — both at home and at work — have taught me that things take time. The reality is that, for most, no matter their pursuit, it takes many years and a lot of hard yards to achieve success. They may be brilliant and even exceptional, but only with time can they calibrate their processes and thinking such that they can frame the questions that lead to the most productive answers.
Take Anil Dash, who has been blogging as long as yours truly. I have always enjoyed his work, but I especially appreciate his recent writing. Why? Because he has put in the work and now has two decades worth of experience that allow him to judiciously put his anger about the injustices in technology to use. With surgical precision, he crafts opinion pieces that are blueprints for action. (His “20 Years of Blogging: What I’ve Learned” is worth a read.)
Dash has developed a framework that allows him to ask the right questions. And that is why he stands apart sharply from a growing cast of technology workers who are having what veteran developer and current Fender executive Ethan Kaplan calls “ex-post facto tech wokeness.”
Kaplan was specifically referring to a BuzzFeed piece about a Twitter developer who helped create the retweet button in 2009, but is currently having a moment of recrimination — as if he actually had something (good or bad) for which to take credit. As Kaplan accurately observed, “retweet was a semantic function way before it was a product function.”
“Before (Chris) Wetherell joined Twitter, people had to manually retweet each other — copying text, pasting it into a new compose window, typing ‘RT’ and the original tweeter’s handle, and hitting send,” BuzzFeed reminds us. “With the retweet button, Twitter wanted to build this behavior into its product.”
In other words, Twitter wanted to provide people with a way to do what they have always done. For better or worse, we have been passing along information to others since before we began painting on cave walls. Stories were told, and retold, and then retold. They became fables. In some cases, they led to wars. Lies spread, and heads were chopped off. Things got very bloody, indeed. And all without the retweet button.
Fast-forward to the turn of the millennium and the golden age of email: the volume of news stories, random gossip, and plain crazy stuff that got forwarded around in those days was truly staggering. My AOL email account was as cluttered as a home on Hoarders. Anyone remember those crazy email threads that were ruining our sanity till as recently as 2016? If “retweet was a loaded gun given to a four year old,” then the forward button on email was like giving a twitchy teenager a machete.
If you trace the history, it is clear that the retweet button was coming. The social platforms are geared toward keeping people locked into the game we call social media. (By the way, Elon was right about this too — we are all living in a simulation.) What would have been truly impressive is if someone had possessed the wherewithal to prevent it.
Just as the technology industry at large lacks empathy, it is also missing the anthropological gene, which is bizarre considering how much technology impacts humans and society. If we knew more about past behaviors or understood humans at a more social level (instead of just as a collection of data points), we could better appreciate how things can go wrong.
Before Twitter, Wetherell was at Google working on Google Reader. So he knew how information spread online. But he didn’t ask himself what would happen if these systems metastasized. My takeaway from the BuzzFeed piece — and maybe I am missing something — was that optimism blinded the Twitter team. They were swept away by their desire to grow and keep engagement up.
This is easy to do in the technology ecosystem, because there is scant regard for history of any kind, be it cultural or technological. This bias is not necessarily incorrect — it is impossible to invent the future if you aren’t predisposed toward doing so. It takes an insane amount of optimism and self-confidence to think you have an idea about what the future should look like. But it takes more than that to be the one who actually makes it happen.
“Tech history is poorly understood,” Dash writes. “As a result, many in tech don’t understand how tech can have negative impacts when they think of themselves as good people.”
This is such a crucial point. This lack of appreciation for all that has led up to the present moment prevents many in the industry from asking about the unintended consequences of technology and the products we create. You will only be able to make a meaningful improvement to society’s overall trajectory if you use the lessons of history to pose the right questions and make better decisions.
For that kind of expertise, you have to be fully awake. Of course, now people want to claim their “expert” status not for any visionary accomplishment, but for allowing time to run its course. That’s easy to do, as Erik pointed out on Twitter.
This first appeared in my weekly newsletter dated July 28, 2019. If you’d like to get it delivered to your inbox, just sign up here, and I will take care of the rest.