IBM Shows Off Machines That Can Dance — and Sense the Sadness and Anger in Your Dating Profile — at GTC  

by Brian Caulfield

You may talk about taking long walks on the beach and beginning a new chapter in your life. But everyone can see through you. And so can at least one machine out there.

Rob High, IBM fellow, vice president and chief technology officer for Watson, showed how IBM’s Watson cognitive computing technology can tease out the sadness and anger in a particularly hackneyed dating profile — or teach a robot to playfully dance in response to a teasing question — during a keynote address Wednesday at our GPU Technology Conference.

It’s just the latest example of how IBM’s Watson technology continues to amaze. Five years ago, Watson leaped into the public’s consciousness when it grabbed a $1 million prize on Jeopardy!, competing against a pair of the TV quiz show’s top past winners.

“I don’t know what a machine would do with $1 million,” quipped NVIDIA’s Ian Buck, our vice president of accelerated computing, as he introduced High to a crowd of more than 4,000 people. “But from a reinforcement learning point of view, I’d say that’s pretty good reinforcement.”

Just What the Doctor Ordered

Since that victory, IBM has turned the capabilities that amazed the world during a television quiz show into an array of services relied on by doctors, lawyers, marketers and others to glean insights by analyzing large volumes of data.

IBM’s Rob High explains how Watson is learning how to better understand the humans it interacts with.

All these are examples of machines coming to think, learn and react in more human-like ways, a concept High calls “cognitive computing.” It’s a skill that will be in ever greater demand as the amount of data in the world around us grows.

Right now, 2.5 exabytes of data are being produced every day — the equivalent of 650 billion Harry Potter books. By the year 2020, the world’s total store of data is projected to reach an unfathomable 44 zettabytes, more than 17,000 times today’s daily output.
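Assuming decimal SI units (10^18 bytes per exabyte, 10^21 per zettabyte), the comparison works out roughly as follows; the book-size figure implied by the Harry Potter equivalence is derived here, not stated in the article:

```python
# Sanity-check the data-volume figures cited above.
EB = 10**18  # bytes in an exabyte (decimal SI)
ZB = 10**21  # bytes in a zettabyte (decimal SI)

daily_data = 2.5 * EB   # data produced per day today
total_2020 = 44 * ZB    # projected total data in the world by 2020

# Implied size of one "Harry Potter book" in this comparison.
book_size_mb = daily_data / 650e9 / 1e6
print(f"{book_size_mb:.1f} MB per book")  # ≈ 3.8 MB

# How many times today's daily output fits into the projected 2020 total.
fold = total_2020 / daily_data
print(f"{fold:,.0f}-fold")  # 17,600-fold
```

So the 44-zettabyte projection corresponds to roughly 17,600 days’ worth of data at today’s production rate.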

That kind of information overload is already a pressing problem in medicine, High explained. A typical physician has just five hours a month to devote to reading the latest research, but by some estimates keeping up would take as much as 160 hours of reading a month.

“This isn’t about the academic exercise of trying to recreate the human mind,” High said. “We’re trying to be inspired by the human mind, so compute systems can understand information and bring it to your fingertips.”

To bring humans the information we need, however, machines will need to interact with us more the way we interact with each other, High explained.

More Power Needed

Doing so is incredibly compute-intensive. High sees such cognitive computing tasks soon gobbling up a majority of the world’s computing capacity. “We need more speed, we need more computing power,” High said.

To that end, IBM announced late last year that its Watson cognitive computing platform has added NVIDIA Tesla K80 GPU accelerators. As part of the platform, GPUs enhance Watson’s natural language processing capabilities and other key applications.

GPUs have been particularly helpful during ingestion, the most compute-intensive stage of cognitive computing, during which Watson takes in information and categorizes it for future use. Using GPUs has sped up training by 8.5x, High said.

It all comes together, High explained, as we create a new generation of machines that are better able to interact with — and learn from — humans.

“We think this is very promising as a potential area for advancing cognitive computing,” High said. “You could argue there’s a natural connection between artificial intelligence and the embodiment of that through robotics.”