For the past week or so, our little corner of the Internet has been abuzz with news of mobile apps uploading iPhone address books without asking us — the iPhone (and address book) owners. It all started with Path, a much talked-about iPhone app that offers a very limited private social network. A programmer/blogger in Singapore discovered that the San Francisco startup, co-founded by early Facebook employee Dave Morin, was uploading address book data to its servers and storing it there.
In the wake of the ensuing outrage, the company changed its policies and is now asking for explicit permission. It also issued an apology and deleted the data uploaded to its servers. As hours (and then days) went by, it became obvious that Path wasn’t the only company or mobile app indulging in this behavior. Other apps, such as photo-sharing service Instagram, quickly changed their policies and issued software updates.
The whole issue seems to have boiled over with Nick Bilton’s piece in the New York Times — So many apologies, so much data mining. Today we have seen a lot of discussion, including angel investor Chris Dixon wondering why Apple is allowing this behavior and not forcing apps to ask for permission. These are all legitimate arguments and counterarguments. However, the most important question is: what do we learn from all this, and where do we go from here? What is the question we should be asking ourselves?
Do The Right Thing
Apologies, outrage, anger and blaming Apple for not being stricter all overlook one of the most important aspects — the moral imperative. As I sit here watching the white-hot arguments, I wonder why we aren’t talking about doing the right thing. Why do I bring this up? Today’s apps are inherently more social and thus, by extension, more human. The relationships on this social web are becoming less virtual and more real. In a sense, these apps have started to reflect our daily lives. As many have said before, we are the social web and the social web is us.
Dave Winer, in his post How industries react to crisis, writes:
It’s time to make this change in tech, once and for all. Your products are not toys, they are used seriously by real people. You need to show respect for your product, and that means respect for your users.
Our daily lives have many layers of trust built into them. There is an implicit social contract built on that trust. Doing business with your bank, dry cleaner, greengrocer and coffee shop is built on that trust. We are friends with others whom we trust. We work with people we trust. And that trust is what drives us to do the right thing. As my colleague Mathew Ingram wrote in his piece, Lessons from Path and Pinterest:
The lesson here is that for social apps, the trust of users is paramount, and the best way to maintain that trust is to be as open as possible about everything that is occurring, particularly if it involves a user’s personal data. Whatever you are doing with it may not seem like a big deal to you, but better to be open about it than have it revealed by someone else, at which point you look sneaky. As Craigslist founder Craig Newmark has put it, “Trust is the new black,” and it never goes out of style.
Social apps of today need to understand this concept of trust and of doing the right thing. Just because Apple doesn’t impose a “permission” prompt on apps doesn’t mean the apps shouldn’t ask. Just look at Marco Arment’s Instapaper. It uses address book access on the iPhone in a manner that respects his customers and their privacy, while still balancing the need to grow his business. Here is what he wrote:
When implementing these features, I felt like iOS had given me far too much access to Address Book without forcing a user prompt. It felt a bit dirty. Even though I was only accessing the data when a customer explicitly asked me to, I wanted to look at only what I needed to and get out of there as quickly as possible. I never even considered storing the data server-side or looking at more than I needed to. This, apparently, is not a common implementation courtesy.
Could he have grown faster had he followed the herd? Perhaps. What I am saying is that as the web becomes more social, we need to think about “people” first when designing software. Instead of asking for forgiveness, app makers have to start by being explicit about what they do or don’t do.
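For illustration only, here is a minimal sketch of what that explicitness might look like in code. It is hypothetical — not Instapaper’s actual implementation — and it uses the Contacts framework Apple shipped later, which does force a system prompt. The pattern is the point: the lookup runs only when the user asks for it, reads only the one field it needs, and never ships the raw address book off the device.

```swift
import Contacts

// Hypothetical sketch (not Instapaper's actual code): a user-initiated
// "find friends" lookup that asks for permission first, reads only the
// field it needs, and keeps the raw address book on the device.
func findFriendEmails(completion: @escaping ([String]) -> Void) {
    let store = CNContactStore()
    store.requestAccess(for: .contacts) { granted, _ in
        guard granted else {
            completion([])          // The user said no; respect that.
            return
        }
        // Fetch only email addresses, nothing else from the contact card.
        let keys = [CNContactEmailAddressesKey as CNKeyDescriptor]
        let request = CNContactFetchRequest(keysToFetch: keys)
        var emails: [String] = []
        try? store.enumerateContacts(with: request) { contact, _ in
            emails.append(contentsOf: contact.emailAddresses.map { $0.value as String })
        }
        // Match (or hash) locally; the plain list never leaves the phone.
        completion(emails)
    }
}
```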
As we look into the future, the web of today is not that of early adopters. It includes moms, grandmoms and others who may not be savvy about permissions and privacy. And that is why the imperative is on app developers to do the right thing. Is it hard? Perhaps. Will it slow down the growth of some of these companies? Maybe, maybe not — we won’t know until someone tries it. However, building trust and being upfront about how apps work, how our data is consumed, and how it relates to and impacts us can’t be a bad thing.
What is trust?
Trust is a tradeoff between different risk factors. Therefore it is seen differently depending on the person, their age, their experience with any of the risk factors and …. In general it’s more like an analog thing: something flips when any or all of the risk factors go over a personally acceptable level.
As for this risk, may I suggest that people who think it’s no big deal read up on tcpdump. Just because one doesn’t know about something, or thinks nothing can happen, doesn’t mean it won’t happen or that nobody can or will do it. Then think of “data finds data” and cross-correlation, and that I have a firewall with rules and data in a conventional DB connected to kernels, meaning it can run massively parallel (correlation)…
I’d like to know what other apps do this.
Yes, that’s where this discussion should be headed. Finding out the dimensions of the mess, and then cleaning it up. Quickly and deliberately, without any side-trips into the personalities of reporters and bloggers.
Agreed. I think it is an important enough issue that, instead of finger pointing, we start talking amongst ourselves and come up with best practices for this changing reality of social apps.
Funny, I never needed to have a discussion with my gardener about how he is not supposed to go through my mailbox while he is on the property.
This is a really old phenomenon. I had just acquired my iPhone 3G and downloaded the app Fring. To my consternation, I immediately found my whole address book sucked up by their app and sent to their servers.
I was really annoyed and even wrote an email to the late Steve Jobs’ public email address. Nothing happened.
Shortly after, the WSJ also wrote a piece on what information the apps were pulling, storing insecurely and relaying to their own servers. It was like party time for the apps, with no concern about who owns what data on the device.
It’s like you got a contractor into your house to clean the carpet and they proceeded to go through your mail.
Couldn’t agree more. Read our report from 2011 – Netpop | Connect: Trust is Social Currency. It’s free! http://www.netpopresearch.com/node/26713.