Eyephone, not iPhone, dummy!


FaceTime, aka video/selfie calls, imagined in the 1920s. Today it doesn’t look as crazy as it seemed a century ago. The iPhone of today can do many of the things this “eyephone” was predicted to do. I wonder if we as a society have the ability to astutely predict what might come in 50 years, forget 100. I am not so sure.

It is all about the good stuff

Betaworks’ John Borthwick, in a must-read essay on the state of the media, writes:

It seems like we have two opposing trends going on simultaneously. On one end of the curve rabid sharing is driving an attention cycle of seconds but on the other end people are reading and watching more. The social web has flattened web sites and made the home page irrelevant to many sites — simultaneously the shift to the phone/tablet and the mobile app internet is unbundling the web that we knew. The combination of these two trends is changing media and how we use and experience it. It’s a complex world we are creating. And beneath this somewhat toxic mix of speed sharing and skimming there is an undercurrent of longer form media use.

You can call it long form! You can call it features! You can call it whateverthehellyouwanttocallit! The fact is, storytelling and great writing always find an audience. The length of an article, a video or a song has nothing to do with its quality or how it makes you feel. It has been, and will always be, about the good stuff. It was true before the Internet. It was true before the mobile revolution, and it will be true as long as we humans retain a passion for stories.

Got cash? Here is some awesome stuff to buy

How about the Boon Island lighthouse for $12,000, or a 76-foot lighthouse in Maine for $30,000? And if you are feeling really flush, how about a whole village in Italy for about $340,000? If you end up buying one of these, invite friends to help refurbish it. When all that is done, how about inviting me over for a cup of tea? via + via

With Big Data Comes Big Responsibility

“You should presume that someday, we will be able to make machines that can reason, think and do things better than we can,” Google co-founder Sergey Brin said in a conversation with Khosla Ventures founder Vinod Khosla. To someone as smart as Brin, that comment is as normal as sipping on his super-green juice, but to someone who is not from this landmass we call Silicon Valley, or part of the tech set, that comment speaks to the futility of their future.

And more often than not, the reality of Silicon Valley giants, who are really the gatekeepers of the future, is increasingly in conflict with the reality of the real world. What heightens that conflict? The opaque and often tone-deaf responses from companies big and small.

Silicon Valley (both the idea and the landmass) means that we always try to live in the future. We imagine what the future looks like and then we try to build it. Sometimes that future delights us and we embrace it wholeheartedly, as with iPhones and Android-based smartphones. And sometimes that future seems so dystopian that society is scared and unnerved by the unknown.

That Uncanny Feeling

Facebook’s emotional experiments are an example of that future. Sara Watson, a fellow at the Berkman Center for Internet and Society, in an essay about data and advertising, brought up the 1970s concept of the Uncanny Valley, aka “the unsettling feeling some technology gives us.” Watson continues in her essay: “Technologies that are simultaneously familiar and alien evoke a sense of dread. In the same way, when our data doesn’t match our understanding of ourselves, the uncanny emerges.”

That uncanny feeling is what confronts us in Facebook’s algorithmic manipulation of emotions. It is not necessarily the experiment itself, but what the experiment portends: a future where machines manipulate our wants and our desires and preempt our needs and emotions. We are scared because we will lose the illusion that we are the ones making the decisions that run our lives. There is no coming back once we cross that threshold.

Facebook’s emotion-driven engagement experiments are a tiny glimpse of what really awaits us: a data-driven, algorithmic future where machines make decisions on our behalf and nudge us toward others. As I pointed out in my recent FastCompany magazine column, the new machine age is already underway, unseen by us. “It is not really just a human world,” said Sean Gourley, co-founder and CTO of Quid, who points out that our connected world is producing so much data that it is beyond human cognitive abilities, and machines are going to be part of making sense of it all.

So the real question is: what will we do, and what should we, the technology industry and we the people, do? From my perspective, we need to start with the raw material of this algorithmic future: data. Whether it is the billions of photos that carry a payload of emotions, relationships and location data, the status updates announcing a new arrival, the searches for discount Prada shoes or a look-up of a medical condition, there is someone somewhere vacuuming up our data droppings and turning them into fodder for their money machine.

For sale, our data

Forbes tells us that even seemingly benign apps like Google-owned Waze, Moovit or Strava are selling our activity and behavior data to someone somewhere. Sure, they aren’t selling any specific person’s information, but who is to say that they won’t in the future, or won’t use the collected data differently? I am actually amazed that cities are willing to trade data that affects their citizenry, such as photos from traffic cameras, to a privately owned company (in this case, Google) without so much as a debate. I am sure a new parking lot gets more attention from legislators.

Further down in the story, a Waze spokesperson remarked that the company can tell what speed you drove from point A to point B. What if it sells that data to an insurance company, which then uses that information to raise your premiums?

Did you know, when signing up for Strava, that the lovable cycling and running activity tracker shares real-time user data and sells it to municipalities for 80 cents a year? In what universe does it make sense for the company to do that without asking, and for a company spokesperson to blatantly admit to a Forbes reporter that the default is opt-in, a malaise popularized by Facebook? Because doing otherwise would mean actually explaining to people what they intend to do with all that personal information.

And to be honest, that is the crux of the problem: we, the citizens, don’t really know what these data-hoarding companies, big and small, are going to do with all the data about us in their databases. How does a big company like Google use the data that resides in its various databases (Nest, DropCam, Waze, Android, Google Maps, Google Mail and Google Search) in tandem?
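For illustration only, here is a minimal, hypothetical sketch of why “in tandem” is the worrying part. Nothing here reflects Google’s actual systems, schemas or identifiers; the point is simply that once records from separate services share a common account ID, merging them into a single profile takes only a few lines of code.

```python
# Hypothetical per-service records, each keyed by the same made-up account id.
nest = {"user42": {"home_occupied_hours": [22, 23, 0, 1, 2, 3, 4, 5, 6, 7]}}
waze = {"user42": {"usual_commute": "Noe Valley -> SoMa", "avg_speed_mph": 34}}
search = {"user42": {"recent_queries": ["knee pain when cycling", "discount prada shoes"]}}

def merge_profiles(*sources):
    """Fold records from separate services into one combined profile per account id."""
    combined = {}
    for source in sources:
        for account_id, record in source.items():
            combined.setdefault(account_id, {}).update(record)
    return combined

# One dictionary now describes the person's home habits, commute and worries.
print(merge_profiles(nest, waze, search)["user42"])
```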

A few weeks ago, while reading The New York Times interview with Google co-founder and CEO Larry Page, I kept hoping that the interviewer would dig deeper into Google’s stance on privacy and data gathering, and into what the company plans to do with all the information it is collecting about us. The what and the why of Google’s grand vision for the data it collects is an important issue, and it would be nice to know what Google really intends to do with it.

The reality is that with all this information out there about me, we should be having a talk about things such as our rights as citizens over that data. I am not saying let’s all go back to villages and caves; instead, why not have a conversation (one that is neither hysterical nor dismissive) about these issues of data, expectations of privacy and transparency?

When Facebook released its Home app, I was unsettled by it, mostly because it took away any notion of privacy. That post might just as well have been about Google or Amazon or Apple. Data from GPS sensors alone is enough to quickly deduce your home location, work location and sleep patterns. Add data from the Waze app or Google Maps, and Google can figure out what route you take. From accelerometer and gyroscope data, a company can deduce some physical ailments. New sensor processors can add even more human-like abilities to our phones.
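To make that concrete, here is a minimal, hypothetical sketch (not any company’s actual code) of how little it takes to turn raw, timestamped GPS fixes into a guess at where someone lives and works: snap the fixes to a coarse grid, and the most frequent overnight cell is almost certainly “home,” while the most frequent business-hours cell is almost certainly “work.”

```python
from collections import Counter

# Hypothetical location fixes: (hour_of_day, latitude, longitude).
# A real phone emits thousands of these a day from its GPS radio alone.
fixes = [
    (2, 37.7812, -122.4101), (3, 37.7812, -122.4102), (4, 37.7813, -122.4101),
    (23, 37.7812, -122.4101),
    (10, 37.7899, -122.3969), (11, 37.7898, -122.3970), (15, 37.7899, -122.3969),
]

def snap(lat, lon, precision=3):
    """Round coordinates to a coarse grid cell (roughly 100 m) so nearby fixes cluster."""
    return (round(lat, precision), round(lon, precision))

night = Counter(snap(lat, lon) for h, lat, lon in fixes if h >= 22 or h < 6)
work_hours = Counter(snap(lat, lon) for h, lat, lon in fixes if 9 <= h <= 17)

print("likely home:", night.most_common(1)[0][0])
print("likely work:", work_hours.most_common(1)[0][0])
```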

Look, I am actually delighted about the possibilities of what can happen with all that data and those sensors. I can’t wait for the future of better medicine to arrive. I also can’t wait for Google’s cars to become commonplace. What I don’t care for is that all these changes are happening with nary a thought about their impact on our society. If we as an industry are change agents and want to talk about an age of abundance 50 years from now, we can’t ignore that the next 50 years might mean a tear in our social fabric.

It is important for us to talk about the societal impact of what Google is doing, or what Facebook can do with all that data. If it can influence emotions (for increased engagement), can it compromise the political process? What’s more, Facebook has now built a facial recognition system that trumps that of the FBI. Think about that for a minute.

Can we trust these Medici of modern times to regulate themselves and do the right thing? How long before the pressure of Wall Street and its incessant quarterly demands pushes Facebook or Google to unthinkable places? These are issues of our times, something I first discussed in my posts about data Darwinism and its impact on society.

Automation Ahead

Automation of our society is going to cause displacement, no different from the mechanization of society in the past. There were no protections then, but a century later we should hopefully be smarter about dealing with the pending change. People look at Uber and the issues around it as specific to a single company. That is not true: drones, driverless cars, dynamic pricing of vital services and the privatization of vital civic services are all part of the change driven by automation and computer-driven efficiencies. Just as computers made corporations efficient (a euphemism for employing fewer people and making more money), our society is getting more “efficient,” thanks to the machines.

“There is an increasing realization of the pain brought about by all these changes, especially the number of industries being disrupted and the many jobs that have been lost and will never come back. We hope that, as in the past, new industries will give rise to exciting new jobs,” writes Irving Wladawsky-Berger, a veteran technologist who worked for IBM. “But, no one knows for sure. It’s important that we collaborate across disciplines – technology, business, social sciences, humanities – to better understand and anticipate where the journey might take us.”

And that is exactly the problem: no one really wants to take that humanistic approach. The hybridization of man and machine has begun in earnest. Google, Facebook and Amazon know that, and they are quite far ahead of the rest of the world. Take Facebook, for instance, which knows how to manipulate our emotions based on the information it surfaces for us. Or Amazon’s future ability to predict our commercial needs.

What about those new voice-processing chips inside smartphones that will constantly listen to what is happening in the world around us and help create magical experiences for you? What about the data collected from the phone’s other sensors? What are the rules around the privacy of that information? Who is making those rules?

John Foreman, a data scientist at MailChimp, pointed out in an eloquent essay that “humans are bad at discerning the value of their data” and that “personal data just appears out of nowhere, exhaust out of life’s tailpipe,” and thus we are willing to trade it away for things of far less value. Foreman’s argument points out the futility of it all: in the data age, we are trading away our freedoms for some minor gains.

In March 2013, in his keynote at Gigaom’s Structure Data conference, Quid’s Gourley estimated that it costs Facebook $1.20 a year to generate over $6 per year in revenue. We are willing to trade our data for less than what it costs to get a cup of coffee at Starbucks. “Our past data betrays our future actions, and rather than put us in a police state, corporations have realized that if they say just the right thing, we’ll put the chains on ourselves,” Foreman writes. “In the hands of machine learning models, we become nothing more than a ball of probabilistic mechanisms to be manipulated with carefully designed inputs that lead to anticipated output.”

Moral Imperative

Many of these technologies will indeed make it easier for us to live in the future, but what about their side effects and their impact on our society, its fabric and the economy at large? It is rather irresponsible that we are not pushing back and asking tougher questions of the companies that are likely to dominate our future, because if we don’t, we will fail to have a proper public discourse and will deserve the bleak future we fear the most.

The sad part is that the legislators and judicial bodies of our nations are woefully under-equipped to deal with the monumental change that we as a society are experiencing. In a way, I feel Silicon Valley and the companies that control the future need to step back, become self-accountable and develop a moral imperative. My good friend Reilly Brennan, a Stanford D.School professor, points out that it is all about consumer trust. Waze working with municipal groups should in theory be a good thing, but we are all highly skeptical and suspicious of the motives of data collectors.

Like I said, a lack of clarity around data intentions is to blame. And the only way I see to overcome that challenge is for companies themselves to come up with a clear, coherent and transparent approach to data. Instead of arcane Terms of Service, we need plain and simple Terms of Trust. To paraphrase Peter “Spiderman” Parker’s Uncle Ben: with big data comes big responsibility. The question is, will the gatekeepers of the future rise to the challenge?

Be willing to start again

One day, before Apple opened its first store in May 2001, Ron Johnson was riding with Steve Jobs to a weekly planning meeting about the store Johnson was charged with designing. Johnson told his boss: “Steve, I’ve been thinking. I think the store’s organized all wrong. We’ve organized it like a retail store, around products, but if Apple’s going to organize around activities like music and movies, well, the store should be organized around music, and movies, and things you do,” Johnson recalls. “And he looked at me and he said, ‘Do you know how big a change that is? I don’t have time to redesign the store.’” Then, 10 minutes later, at the meeting, Johnson recalls, “Steve walks in and the first thing, he looks at the group and he says, ‘Well, Ron thinks our store is all wrong.’” Jobs then added, “And he’s right, so I’m going to leave now, and Ron, you work with the team and design the store.” That lesson, of not doing it fast but doing it well, “carried through to so many things I’ve done,” Johnson says. “It’s not about speed to market; it’s really about doing your level best.”

The former Apple retail chief addressed Stanford’s Graduate School of Business students. It is a great talk, worth watching.

What I am reading today

Style, Story and Inspiration

“Fashion doesn’t save lives, it supports the soul. I don’t care if you buy clothes from me or if you go to a street market and buy a fake copy. The most important thing is that you are inspired by my designs. It’s important to love life, dream, be inspired, inspire others and become interested deeply in something. If you miss these you become a businessman of life. I don’t want to become a businessman. Style is not a question of how much money you have in your purse, it is a state of mind.” – Giambattista Valli

Valli is an Italian fashion designer who makes clothes for women, but his suggestion that one should “dress from the heart,” as if you are in love, is a good yardstick for men as well. I love his concept that style is a story, wherein even the most basic of outfits, or the things you own, tell your story. He says, “How can I think about tomorrow if I don’t know what was yesterday?” That is good advice not just for fashion and clothes, but also for how we design the future here in Silicon Valley.
