When it comes to my iPhone (or any smartphone, for that matter), my biggest frustration is when the phone switches between Wi-Fi and 3G networks and the data connection just hangs in a weird limbo. The same catatonic network state returns when switching between two Wi-Fi networks: instead of landing on the strongest network, you are stuck on one that is weak at best.
One would think we would have figured out this hand-off problem by now, right? Wrong. As more and more Wi-Fi networks come into our lives, the hand-off problems only get worse. A group of researchers at the Massachusetts Institute of Technology got so frustrated that they started working on solving the problem.
In doing so, they’ve come up with a set of new communication protocols that use information about a smartphone’s movement to improve handoffs. In experiments with these protocols, they cut the number of times portable devices needed to switch networks by 40 percent and improved throughput by 30 percent. The protocols bring about many other network improvements, but that’s not the story.
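The mechanics of the research aren’t the point of this post, but the core idea, letting a device’s own motion inform when a network switch is worthwhile, is easy to illustrate. Below is a tiny, hypothetical decision rule in Python: the faster the phone is moving, the larger the signal advantage a newly discovered network must offer, since a nearby access point may fall out of range moments after you join it. The function name and thresholds are my own illustrative assumptions, not the researchers’ actual protocol.

```python
def should_switch(current_rssi_dbm, candidate_rssi_dbm, speed_mps):
    """Toy movement-aware handoff rule (illustrative only, not the MIT protocol).

    The faster the device is moving, the bigger the signal-strength advantage
    (in dB) a candidate network must offer before switching is worth the cost.
    """
    required_margin_db = 3 + 2 * speed_mps  # demand more margin at higher speed
    return candidate_rssi_dbm - current_rssi_dbm > required_margin_db


# Standing still, a 6 dB stronger network is worth joining.
print(should_switch(-70, -64, speed_mps=0.0))   # True
# Walking briskly, that same 6 dB advantage no longer clears the bar.
print(should_switch(-70, -64, speed_mps=1.5))   # False
```

A real protocol would also weigh throughput history and the cost of switching, but even a crude rule like this shows how a bit of motion context can suppress the pointless back-and-forth that makes connections hang.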
The Sensory Overload
The real story is how these MIT researchers — graduate student Lenin Ravindranath, Professor Hari Balakrishnan, Associate Professor Sam Madden, and postdoctoral associate Calvin Newport, all of the Computer Science and Artificial Intelligence Laboratory — used data from mobile phone sensors such as GPS, accelerometers and gyroscopes to solve a real problem.
Balakrishnan jokes that the protocols came as a result of their own annoyances with the network problems, but he’s hopeful these protocols are going to be widely adopted by others.
To me, this use of sensors to build an application that solves a common problem offers a futuristic view of what mobile apps could do, and in the process, it could bring about a higher level of engagement. Sure, there are some apps — like some gaming apps on the iPad — that leverage the sensors on the device, but most apps today are still nowhere close to capitalizing on the capabilities of these devices.
So far, apps that use single-sensor inputs, such as the GPS, microphone or the camera, have generated tons of excitement. Now imagine many of these (and other) sensors working in tandem, and the experiences that could be created on top of this sensor mash-up.
In his research role, Balakrishnan had been involved in the Pothole Project, which essentially used sensor data from phones to figure out where the potholes in the Boston area were and plotted them on a map. That’s a clever use of mobile sensor data for building a web-based application. Now imagine taking that entire sensor input and making it part of an app experience.
Philippe Kahn, a veteran entrepreneur and co-founder of MotionX, described this sensor-enriched environment: “The motion-aware mobile platform is the new media.” His company uses variants of the principles articulated by Balakrishnan in apps such as MotionX-GPS and MotionX-GPS Drive.
Motion Magic
Balakrishnan doesn’t see why other applications couldn’t be built to decipher our common motions — walking, sitting, commuting — from data coming off various sensors. This activity layer built on top of sensors can provide much-needed context and, in the process, make apps more engaging and give them a touch of serendipity.
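To make that concrete, here is a minimal sketch, in plain Python, of what such an activity layer might look like: a toy classifier that labels a short window of accelerometer readings as sitting, commuting or walking based on how much the acceleration magnitude varies. The labels and thresholds are illustrative guesses on my part, not anything from Balakrishnan’s work or a shipping mobile API.

```python
import math
from statistics import mean, pvariance

def classify_motion(samples):
    """Toy activity classifier over a window of 3-axis accelerometer samples.

    `samples` is a list of (x, y, z) readings in g's; the thresholds below are
    illustrative guesses, not values from any published system.
    """
    # Magnitude of acceleration per sample; roughly 1.0 g when the device is at rest.
    magnitudes = [math.sqrt(x * x + y * y + z * z) for (x, y, z) in samples]
    avg, var = mean(magnitudes), pvariance(magnitudes)

    if var < 0.005 and abs(avg - 1.0) < 0.05:
        return "sitting"      # almost no movement energy
    if var < 0.05:
        return "commuting"    # low, steady vibration, e.g. riding in a vehicle
    return "walking"          # rhythmic, higher-energy movement


# A perfectly still phone reads roughly (0, 0, 1) g on every sample.
still_window = [(0.0, 0.01, 1.0)] * 100
print(classify_motion(still_window))  # -> "sitting"
```

An app with a label like this available continuously could, for instance, hold noisy notifications while you are driving or start a workout log when you begin walking, which is exactly the kind of context-aware serendipity described above.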
Jeff Jonas, an IBM researcher and one of the keynote speakers at our Structure Big Data conference, often says machines inside corporations need to understand the who-where-what-when-and-why in order to get a better grip on the explosion of data and benefit from it. The iPhones and iPads are no different.
The mobile phone is not made just for textual interactions; it is a device with visual and contextual capabilities similar to our own. To achieve that goal, app developers need to think differently and use sensor data inputs as a core building block of their overall user experience, just as they do with the data that comes from the social graph.
When we think of mobile phones, we need to stop thinking of them as computer-like devices, and instead, think of them as extensions of us. The mobile machine in our hands needs to understand what’s happening in our lives and factor those inputs into the experiences it offers. In a post last year, I asked the question: can mobile phones think?
If they don’t, they will soon become tools of interruption and thus annoyance, much like the irritation felt by the MIT researchers when bad network handoffs prevented them from getting their email.
What do you mean by hangs? How long is it in that state? I’m asking because I haven’t noticed a problem when my Palm Pre 2 switches between 3G and WiFi. Maybe it hangs for 1 or 2 seconds, such a short time that I wouldn’t notice it. Maybe it’s a problem with the AT&T and Verizon networks (I’m on Sprint). Or do you just not consider the Pre 2 a “smartphone”?
Smartphones aren’t the only devices with this irritation. There’s a public library in my neighborhood where I often work. My iPhone has no problem with the WiFi there; it’s ‘g’ only and connects to one of two 2.4 GHz ‘g’ channels that are relatively fast. The problem is that my MacBook always connects to an aging and woefully slow 5 GHz ‘a’ network, and there’s no way I can force it to connect to anything else.
WiFi utilities that let me choose which channel to join don’t work. Moving around to find a spot where the 5 GHz signal is weakest does no good. OS X is apparently hardwired to connect to a 5 GHz ‘a’ signal whenever one is available. That leaves me feeling like screaming, particularly when I have to wait for a slow file download. You’d think Apple, with its state-of-the-art wireless lab, could fix something like that.
And you’d also think they’d give us a mode that would let us see our actual choices, so we can select which WiFi channel to use when, as at this library, multiple channels share the same name. No, all I get is one choice that hides four channels. With effort, I can force OS X to show me two of the four channels, but without being given a clue as to which “spl” is which channel.
No, Apple seems intent on giving users a system that ‘just works’ based on rules that might have made sense years ago but doesn’t ‘just work’ now. Mac users need a way to make more intelligent choices. Right now, we can’t even make choices.
Agreed, and that is why I think what the MIT crew has done is very valuable; it would be great to see it roll into modern products.
On your comments about Apple and your woes with the 5 GHz signal, I am sorry you have to suffer.
For years I have had a weird thing going on with my iPhone 3G and iPhone 4 on the AT&T network. Sometimes in areas where lots of phones are in use, my 3G data rate will drop to near zero because of all the shared users. By manually switching to 2G (turning 3G off), I always get a faster rate. It seems to me the network, not me, should switch the phone to 2G when 3G is overwhelmed.
The same thing happens here (UK) as well. I do what you do and turn 3G off when I’m at sporting events and the like.
+1 to that. Of course, you could switch to Verizon, but their network has its own issues as well 🙂
I left a comment on your post at http://www.iakttakelser.com/2011/04/why-apps-need-some-sense-and.html
I’m stunned that you described our aim so clearly, namely: “The mobile phone is not made just for textual interactions; it is a device with visual and contextual capabilities similar to our own.”
In our billing apps we are using GPS, search, camera and gestures. That’s four senses, all to reduce the need for textual interaction.
Great article!
Thanks and good luck with your apps as well. Clearly thinking in the right direction.
…And soon we can dispense with literacy entirely! Oh boy!
I was having a similar discussion with a friend yesterday. Why should I need to manually turn on a GPS logging app on my iPhone when it should be able to pick up signals that tell it I am driving or cycling and automagically switch on its logging function. Like a personalized Shazam for movement tracking.
Looks like it’s coming – hopefully in future revisions of iPhone and Android devices. Or so I have heard.
“When we think of mobile phones, we need to stop thinking of them as computer-like devices, and instead, think of them as extensions of us.”
very very well said.
“When we think of mobile phones, we need to stop thinking of them as computer-like devices, and instead, think of them as extensions of us.”
very well said indeed.