Imagine this. The baby is sleeping upstairs. One of those monitors in her room plays her noises down to the kitchen. The parents can hear her thrash and gurgle. But those sounds are in the background. More prominent is a computer voice that announces: "The baby wet her diaper at 1:23. She's been awake for four minutes." She cries. Is it time to nurse her already? No, the computer says. Her stomach hurts.
I picked up this idea from Smart Machines: IBM's Watson and the Era of Cognitive Computing, co-written by John Kelly, the director of IBM Research, and Steve Hamm, my friend and former colleague at BusinessWeek.
I wrote a book, Final Jeopardy, about the 2011 version of Watson. So you might think I would find this material familiar. But Smart Machines is a very useful, concise and engaging guide to the future of computing--which is also the future of knowledge, sensing, decision-making and discovery. I read it in about two hours. It led me from employment opportunities for Watson to the frontiers of Big Data and the physics of new computing. It's hard to summarize the future of cognitive computing, but these two sentences come pretty close: "In the programmable-computing era, people have to adapt to the way computers work. In the cognitive era, computers will adapt to people."
Returning briefly to the baby example, the idea is that apps will eventually be able to crunch enough data to decode a baby's noises and, effectively, put words in her mouth. Of course, not all babies make the same noises. I imagine that the program will come with a standard template, and that parents will have ways to correct the machine's early mistakes, helping it customize its analysis for each baby. And as those fixes make their way to the cloud service, it will grow more sophisticated, just like Google Voice or Siri.
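To make that template-plus-correction idea concrete, here is a toy sketch of the loop: a generic template maps cry "signatures" to meanings, and each parental correction overrides the template for that baby. Every name and category below is a hypothetical illustration, not anything from the book or any real product.

```python
# Generic template shipped with the app (hypothetical categories).
GENERIC_TEMPLATE = {
    "short-burst": "hungry",
    "low-rumble": "gassy",
    "rising-wail": "tired",
}

class CryInterpreter:
    """Toy model: generic guesses, customized by parental corrections."""

    def __init__(self):
        # Per-baby corrections start empty; the template is the fallback.
        self.corrections = {}

    def interpret(self, signature):
        # A recorded correction beats the generic template's guess.
        if signature in self.corrections:
            return self.corrections[signature]
        return GENERIC_TEMPLATE.get(signature, "unknown")

    def correct(self, signature, meaning):
        # A parent fixes an early mistake, customizing the model.
        self.corrections[signature] = meaning

interpreter = CryInterpreter()
print(interpreter.interpret("low-rumble"))   # generic guess: "gassy"
interpreter.correct("low-rumble", "hungry")  # parent overrides the guess
print(interpreter.interpret("low-rumble"))   # customized answer: "hungry"
```

In a real service, those per-baby corrections would presumably flow back to the cloud, where aggregated fixes refine the generic template for everyone, which is the learning loop the paragraph above describes.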
A similar challenge, I imagine, will be to interpret the noises and gestures of animals and, in effect, get them to talk to us as well. This animal analysis could probably benefit from smell sensors, another innovation previewed in the book. Such sensors could pick up molecules of the chemicals that signal an animal's fear, confusion, hunger and sexual drive.
Do we want a machine announcing that Rover is hungry or horny or needs to go out for one reason or another? That could be too much information. But the marketplace will iron out those issues. For now, we at least know from a very good IBM book that the technology is en route.