Updated. Earlier today, our little corner of the web (aka the blogosphere) lit up with rumors that Apple (s AAPL) was buying Nuance (s NUAN), a Burlington, Mass.-based company that makes voice recognition software. The reports were based on comments made by Apple co-founder Steve Wozniak. Some of our friends were quick to dismiss those rumors, and rightfully so. Update: Woz himself admitted that he got the story wrong.
That said, the sentiment behind those rumors — Apple buying Nuance — makes perfect sense. There are three reasons why I think so.
- Apple typically buys companies with deep technology and a leadership position in their market.
- Apple’s software development kits presently lack voice recognition capabilities.
- After multi-touch, voice is generally viewed as part of any future mobile user experience.
In comparison to Apple, both Microsoft (s MSFT) and Google (s GOOG) have their own in-house voice recognition technologies. Remember, Microsoft bought TellMe for about $800 million back in March 2007. Google used now-defunct services such as 1-800-GOOG-411 to build its own voice recognition technologies. Both Google and Microsoft have been quite public about their desire to make voice a key part of their mobile OS experience.
At an event earlier this year, Google showed off some voice-based applications that gave us a glimpse into its ambitions. One of these apps, Voice Search for Android, was essentially a voice-to-mobile interface that allows you to perform about a dozen actions, including search, text messaging, looking up music online and playing it back through your favorite service, writing emails, looking up locations, and placing calls to those locations.
The voice-to-mobile interface means that Google can now offer a virtually unlimited amount of computing on its mobile devices, because it makes tapping the cloud more effective. Google can do this because it has built its own voice recognition technologies.
Apple, which bought Siri in April 2010, can build a similar compute-in-the-cloud experience and use it to pack more punch into its platform. Apple already hinted at this last year. Of course, it could license voice recognition technologies, or it could just go out and buy Nuance, the dominant player in the world of voice recognition technology.
Nuance owns numerous (and, more important, foundational) patents that give it a massive leg up over rivals. The company has been pretty aggressive in defending those patents, and it makes perfect sense for Apple to own those patents along with the underlying voice recognition technology. Furthermore, Apple would be getting access to technologies that are fairly mature and already in use by many iOS app developers.
Owning a major share of speech-related technologies would allow Apple to dominate what GigaOM Pro analyst Dr. Phil Hendrix likes to call the "speech as a platform" (SaaP) market. In his research report, "How Speech Technologies Will Transform the Mobile Use," Dr. Hendrix points out two main reasons why speech's time has come on mobile devices:
- Mobile devices now have more powerful processors and expanding memory, so new mobile devices are capable of running speech-enabled apps that just 2-3 years ago required a PC.
- The most sophisticated speech technologies require enormous computing power, and ubiquitous wireless connectivity now makes it possible to handle those complex tasks in the cloud.
Now that I’ve laid out a case for Apple to buy Nuance, will it happen? I haven’t the foggiest, but one thing working against the deal is the price — Nuance has a market capitalization of $5.22 billion. In order to buy the company, Apple would have to pay a premium of somewhere between 30 and 50 percent over the closing price, which works out to roughly $6.8 billion to $7.8 billion in total. That’s assuming none of the other companies — Google and Microsoft included — makes a play of its own! Nuance would be one expensive purchase for Apple, even with its big cash hoard.
There are other aspects of Nuance’s business that Apple would need to deal with. Nuance is a big player in desktop speech recognition software, medical transcription, and the speech-to-text business for carriers and handset makers. There’s no way Apple wants to get its hands dirty handling those non-core businesses. Of course, Apple could buy Nuance and then spin out those product lines by selling them to a private equity group. In other words, this deal could be distracting and messy — something Steve Jobs might abhor.
Still, if there’s one company Apple should be looking at, Nuance could be the one.
17 thoughts on “Why Apple Should Buy Nuance”
Fair points, but there are two speech players with equivalent or perhaps better technology, a good-enough market position, and a much lower price tag than rolling up Nuance 😉
Steve, want to share some names here? 😉
Maybe Apple can fix the disaster that Paperport has become?
Om, sure, why not 😉
Hi Om: I appreciate your sentiment, but I don’t think an Apple acquisition of Nuance is in the cards. I’m assuming that Steve is referring to Loquendo (an R&D business unit of Telecom Italia), but I’m not sure who the second one might be (though there are other subsidiaries of larger companies that might be spun off).
Don’t forget that Apple and Nuance already have a pretty good business relationship, with Nuance “powering” both Siri and Voice Control on the iPhone. Apple could quickly add APIs for “native” voice in the iPhone SDK based on the tight integration of both recognition and text-to-speech software from Nuance.
I don’t think either company wants an exclusive relationship at this point. Apple has a bunch of patents of its own to support a multi-modal user interface, including multi-touch. Ditto for Nuance surrounding predictive texting and speech recognition. As you correctly note, there’s quite a bit of common interest here to be leveraged. But Nuance is also working closely with IBM to define the next generation of mobile interface that includes speech. How Big Blue and Apple would work together is an issue as well.
It’s ironic that all the speculation started with a poor choice of words by Steve Wozniak. We could all dream of more “speechable moments” on the iPhone thanks to the acquisition of Nuance but, as you noted, divesting or shuttering the unwanted elements, which is more than half the business right now, will probably preclude an acquisition.
Thanks for the insight. It looks like speech technologies are among the key technologies that belong in the core of the OS/platform, and if that is indeed the case, Apple should think about buying Nuance and making its patents and technology part of the Apple SDK — sort of like location-based services.
On the IBM/Nuance connection, well Apple has a history of working with IBM in the past. 🙂
honestly … we aren’t socially at the point yet where high levels of voice recognition/transcription make sense on mobile devices
one of the prime uses of mobile devices is the privacy, you can use them anywhere without people really noticing … note the massive increase in text and other data, and the corresponding decrease in MOU
in order to effectively use voice, the user would have to speak out loud exactly what they want done in a public setting … sure, it’ll be useful in your car or in private, but then it requires the user to master both a verbal and non-verbal skill set …
the other issue is that true voice recognition is also semantic recognition … which is not just recognizing the wording but the intent of the query … if you look at how people converse, everyone gives instructions in a different way; will everyone learn a single voice method? … i personally doubt it
voice queries tend to be much more free-flowing than text ones … the limitation on the results returned is an issue … try asking for “best place to eat in vancouver” … and having a program verbally list 100 results a la google … the results must be absolutely concise … thus semantic recognition not just of the intent of the query but also of the best result
you just have to look no further than your favorite IVR for the pitfalls …
crappy voice recognition is another reason why iPhones are inferior to Androids
crappy virtual keyboard is another reason why Androids are inferior to iPhones 😉
PS: I use a BlackBerry, so for me neither matters as much.
Recognition != understanding
Understanding requires context, and context includes more than a monotone command — after all, one can program with punch cards too.
In other words, the machine has to be situation-aware to make sense of a command and apply the right action. Has anybody really solved that? IBM has some nice research and even commercial products in that area, but no combined effort. Apple?
Pardon me Om, but wouldn’t this be as bizarre as Intel acquiring McAfee?
Speech recognition technologies span more than mobile or Mac-based computing. If Apple does acquire Nuance, the latter’s speech processing innovations will probably be limited to Apple-based platforms or apps only. That would be too bad, because although Nuance would increase adoption by sheer numbers, I don’t personally see Apple working with banks to incorporate speech recognition technologies into their ATMs, or pursuing gesture-speech detection possibilities, voice-based authentication services, etc.
Needless to say, future roadmaps would be highly tailored to Apple’s requirements, and Apple couldn’t care less about any speech applications other than the ones it needs.
What about the voice recognition that Apple has built into its OS since the ’90s (at least)? Is it not good enough?