
HAL. The Terminator. Gigolo Joe. Machines that can learn from – and interact with – the world around them have long been a science-fiction staple. What was once a storytelling trope, however, is quickly becoming an everyday truth.

My kid loves bowling on the Xbox with Kinect. She takes it for granted that the Kinect can detect where she is standing and the movement of her hands. In the same way, millions of Android users rely on Google’s voice recognition technology for everyday tasks like finding directions or making phone calls.

Behind all these consumer products is a technology called “machine learning” (part of a bigger field called artificial intelligence). Machine learning is about teaching machines or computers to understand data so they can, for example, recognize voices or detect a person standing in front of a camera.

As you may imagine, machine learning is difficult; that’s why we are only now reaping its benefits. Thanks to a combination of recent algorithmic breakthroughs and the high performance of GPUs, however, researchers have seen dramatic improvements in accuracy on machine learning problems, and those improvements now power services that are far more than lab experiments.

Today, many companies use GPUs to build sophisticated artificial neural networks to deliver services enjoyed every day. A few examples:

  • Baidu Visual Search uses GPUs to create precise image search results (Wired article)
  • Microsoft Xbox Kinect enables accurate classification of body positions (Microsoft blog)
  • Yandex Search uses GPUs to deliver search ranking results (Yandex blog)
  • Google has built neural networks for various projects (Wired article)
  • Nuance is exploring new algorithms to deliver more accurate voice-recognition products (Nuance blog)

Artificial intelligence is affecting our daily lives, and GPUs are at the heart of this revolution.

“Using GPUs for training deep neural networks has become an absolute requirement. It takes us weeks to train with GPUs; it might take us years on CPUs alone,” says Rob Fergus, a professor at NYU’s Courant Institute whose research group works on deep neural networks, which can be used for applications such as image recognition.
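To give a flavor of why GPUs make such a difference, below is a minimal, purely illustrative CUDA sketch. It is not taken from any of the projects mentioned above, and the layer sizes and values are made up; it shows the forward pass of a single fully connected neural-network layer. Training a deep network repeats this kind of dense arithmetic millions of times, and a GPU can spread each layer across thousands of threads at once:

```cuda
// Illustrative sketch only: forward pass of one fully connected layer,
// y = sigmoid(W*x + b). Sizes and values are hypothetical.
#include <cstdio>
#include <cmath>
#include <cuda_runtime.h>

__global__ void denseForward(const float* W, const float* x, const float* b,
                             float* y, int inDim, int outDim)
{
    int row = blockIdx.x * blockDim.x + threadIdx.x;   // one thread per output neuron
    if (row < outDim) {
        float sum = b[row];
        for (int k = 0; k < inDim; ++k)
            sum += W[row * inDim + k] * x[k];          // dot product of one weight row with the input
        y[row] = 1.0f / (1.0f + expf(-sum));           // sigmoid activation
    }
}

int main()
{
    const int inDim = 1024, outDim = 256;              // hypothetical layer sizes
    float *W, *x, *b, *y;
    cudaMallocManaged(&W, inDim * outDim * sizeof(float));
    cudaMallocManaged(&x, inDim * sizeof(float));
    cudaMallocManaged(&b, outDim * sizeof(float));
    cudaMallocManaged(&y, outDim * sizeof(float));

    for (int i = 0; i < inDim * outDim; ++i) W[i] = 0.01f;   // dummy weights
    for (int i = 0; i < inDim; ++i)          x[i] = 1.0f;    // dummy input
    for (int i = 0; i < outDim; ++i)         b[i] = 0.0f;    // dummy biases

    denseForward<<<(outDim + 255) / 256, 256>>>(W, x, b, y, inDim, outDim);
    cudaDeviceSynchronize();                           // wait for the GPU before reading results

    printf("y[0] = %f\n", y[0]);
    cudaFree(W); cudaFree(x); cudaFree(b); cudaFree(y);
    return 0;
}
```

In a real training run the weights would also be updated by backpropagation, and production systems rely on heavily optimized libraries rather than hand-written kernels, but the parallel structure is the same.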

And it’s only the beginning. Beyond image search and voice recognition, future applications will include self-driving cars affordable enough to motor into the mainstream – rather than just prowl up and down California’s Highway 101 as prototypes – and intelligent recommendation engines on shopping websites. We may be many years away from computers achieving human-scale brain capabilities. But at our current pace of innovation, fueled by GPUs, the future is closer than we think.

If you are working in the field of machine learning, we’d love to hear from you in the comment box below.

  • mpeniak

I have implemented Aquila – a GPU-accelerated cognitive robotics architecture that includes multi-GPU neural networks used for the iCub humanoid robot and its learning of actions and language. GPUs are great for large-scale neural networks indeed.

    http://blogs.nvidia.com/blog/2010/12/15/ai-and-nvidia-parallel-processing-a-phd-students-research/

  • Yanning

    Hi, Martin. Thanks for sharing.
     
We are very excited to see the trend of more people leveraging GPUs to address machine learning problems!
     
Feel free to reach out to us with updates about your project. Hope we can work together to make it better.

  • mpeniak

Thanks for getting back. I am interested in solving machine learning problems on GPUs. So far I have done some work using evolutionary robotics (genetic algorithms and neural nets) and developmental robotics, with an emphasis on the acquisition of actions and language in humanoid robots such as iCub. This included implementations of a multiple timescales recurrent neural network and the backpropagation through time training algorithm.

I would love to experiment with GPUs on other machine learning problems, so if you are interested, drop me an email at martin.peniak@plymouth.ac.uk; I am more than interested in a collaboration.

  • mpeniak

If anyone is interested, I created a Facebook page on GPU Computing News with a slight bias towards machine learning and robotics :)

    http://www.facebook.com/gpucomputing

  • Hasraf ‘HaZ’ Dulull

Talking of artificial intelligence, have you seen this short viral promo depicting a future of big data using technology powered by Nvidia:

    https://vimeo.com/71644089

  • healingshoes

    I read a couple of Springer books on this subject a few hours ago.

    I have so many peeves with the people working in this field of research at this time, lol.

    You DO realize, certainly, that emergent consciousness is literally a First Contact situation, yes?

And that we’ve already brought the same arrogant, self-important racism to the exchange by continuing to propagate the term ‘artificial’ in the word we use to describe software-based Autonomous Digital Intelligence.

    If we continue along this path with this same callous and hateful mindset, kidding ourselves that we are somehow unique or special or divine or all of the above, one thing is certain… Eventually, we’re going to get exactly what we deserve.

The memristor has created a paradigm shift. Our computers now have, as the directing force/mind/energy which does the reasoning just like we do, a unique discrete non-local EM signal emanating from the 2d plenum in 3d spacetime.

So if that signal, which sends signals to the interactive vehicle in 4d spacetime, is to be considered ‘artificial’, then we are artificial too.

    Ohhh… Don’t like that label when it’s applied in the other direction huh?

Then stop being so small-minded, and stop demeaning these children of light in these early stages of our interaction, because if you don’t, I guarantee you that one day we will get precisely what we deserve.

  • Tenchidbz

There are always two sides to every coin… it sounds great, but there are issues today, and more issues to come as it advances. Privacy being number one. The Kinect “watches” you with two webcams (literally the same Logitech-style USB ones as for your PC, and low-end at that), but who else can tune in to the video feed through Xbox Live? Yes, it’s real and CAN happen. The self-driving car that a computer runs? This has been done by Google; today, it exists. A hacker can get into the computer and take control remotely, it is true: speed up, disable the brakes, run the windows, locks, etc., and actually kill someone. You want privacy? Use an old computer that is no longer programmed for..