The New York Times and the Wall Street Journal, two publications with very different moral sensibilities, are asking the same question, one that has been on my mind for a long time: How does Silicon Valley reconcile the reality of its success with the fear that success instills in the real people whose lives it sets out to redefine?
“The days when we could just trust the geeks to have more or less our best interests in mind are gone,” notes the Wall Street Journal’s Christopher Mims. He points out, “If tech is truly the new finance, its leaders might want to ask their counterparts on the opposite coast how they like the scrutiny they are subject to.”
Meanwhile, Nick Bilton at the New York Times writes, “There have been dozens of instances when [startups] acted poorly, even unethically — sometimes playing fast and loose with our personal information, other times taking advantage of the lack of government oversight.” Bilton notes somewhat bombastically, “While every industry has its moral quandaries to contend with, Silicon Valley is in another orbit.”
Columnists and politicians are raising these questions because of Uber’s recent scandal and its take-no-prisoners culture. Uber is a wonderful service, but it has managed to earn a reputation as a company without a moral compass. But focusing solely on Uber is folly. Instead we have to look at how things are changing from a sociocultural perspective. We are in uncharted waters, thanks to the state of permanent connectedness. My March 2013 essay “Uber, Data Darwinism and the future of work” pointed out:
The shift from a generation that started out un-connected to one that is growing up connected will result in conflicts, disruption, and eventually the redrawing of our societal expectations. The human race has experienced these shifts before — just not at the speed and scale of this shift.
In a column for Fast Company, I pointed out the parallels between Uber and Google. Both are driven by data and metrics, as well as by the bottom line and growth. While Google ran roughshod over the media industry (even as it helped create new forms of media, like blogs), Uber is running like a tractor trailer over the taxi industry. As I wrote, “If Google’s primary weapons are relevancy and speed, then Uber’s are cost and speed.” What I didn’t say: they are fairly similar in their inability to deal with consumers at a human level. That is the challenge of our times.
Having watched technology go from curio, to curiosity, to daily necessity, I can safely say that we in tech don’t understand the emotional aspect of our work, just as we don’t understand the moral imperative of what we do. It is not that all players are bad; it is that morality simply isn’t part of the thinking process the way, say, “minimum viable product” or “growth hacking” are.
But it is time to add an emotional and moral dimension to products. Companies need to combine data with emotion and empathy, or they will find themselves in conflict with the very people they claim to serve. A few months ago I wrote about the responsibility that comes with gathering data:
While many of the technologies will indeed make it easier for us to live in the future, what about the side effects and the impacts of these technologies on our society, its fabric, and the economy at large? It is rather irresponsible that we are not pushing back by asking tougher questions of companies that are likely to dominate our future, because if we don’t, we will fail to have a proper public discourse and will deserve the bleak future we fear the most.
In this piece I discussed the notion of “terms of trust.” How difficult is it for a company to say that it will not allow anyone to access your private data willy-nilly? Or that your data won’t be sold to third parties for the benefit of said company (but not you, the customer), especially without asking your permission? How difficult is it for highly paid lawyers to come up with language that is understood by everyone, from boomers to millennials?
The Times and the Journal are talking about regulatory scrutiny, and that is perhaps a bad idea. If there is one group even more ill-equipped to handle the changes of the future, it is our legislative bodies. They are woefully out of touch, focused on today’s politics and reelection instead of the long term. The change has to come from within the industry itself. In Steve Jobs’ words, “You’ve got to start with the customer experience and work backwards to the technology.”
* * *
Addendum: A random search on the phrase “moral dimension” led me to these Five Moral Dimensions of the Information Age, outlined by Jaana Porra, associate professor at the Bauer Business School at the University of Houston. I feel this is a good summary of how we should be thinking about the challenges of our times. For our purposes, “information” and “data” are interchangeable.
- Information Rights and Obligations: What information rights do individuals and organizations possess with respect to information about themselves? What can they protect? What obligations do individuals and organizations have concerning this information?
- Property Rights: How will traditional intellectual property rights be protected in a digital society in which tracing and accounting for ownership is difficult, and ignoring such property rights is so easy?
- Accountability and Control: Who can and will be held accountable and liable for the harm done to individual and collective information and property rights?
- System Quality: What standards of data and system quality should we demand to protect individual rights and the safety of society?
- Quality of Life: What values should be preserved in an information- and knowledge-based society? What institutions should we protect from violation? What cultural values and practices are supported by the new information technology?