Lending a Helping Hand: Jules Anh Tuan Nguyen on Building a Neuroprosthetic

by Clarissa Garza

With deep learning, amputees can now control their prosthetics by simply thinking through the motion.

Jules Anh Tuan Nguyen spoke with NVIDIA AI Podcast host Noah Kravitz about his efforts to allow amputees to control their prosthetic limbs — right down to the finger motions — with their minds.

Using neural decoders and deep learning, the system lets users control just about anything digital with their thoughts, from playing video games to playing a piano.

Nguyen is a postdoctoral researcher in the biomedical engineering department at the University of Minnesota. His work with his team is detailed in a paper titled “A Portable, Self-Contained Neuroprosthetic Hand with Deep Learning-Based Finger Control.”

Key Points From This Episode:

  • Nguyen and his team created an AI-based system that uses electrodes implanted in the arm to translate electrical signals from the nerves into commands for the appropriate arm, hand and finger movements, with everything built into the arm itself (see the illustrative decoder sketch after this list).
  • The two main objectives of the system are to make the neural interface wireless and to optimize the AI engine and neural decoder to consume less power — enough for a person to use it for at least eight hours a day before having to recharge it.
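
The episode stays at a high level, but the decoding step in the first point above, turning windows of nerve signals into finger-level commands, maps naturally onto a small sequence model. The PyTorch sketch below is purely illustrative: the NerveSignalDecoder name, channel count, window length and five-finger output are assumptions rather than the architecture from the paper, and the closing quantization call is just one generic way to chase the power budget mentioned in the second point, not necessarily the team's approach.

```python
# Hypothetical sketch of a deep learning neural decoder: it maps a short window
# of multi-channel nerve recordings to per-finger movement commands. The class
# name, channel count, window length, layer sizes and five-finger output are
# illustrative assumptions, not the architecture described in the paper.
import torch
import torch.nn as nn


class NerveSignalDecoder(nn.Module):
    def __init__(self, n_channels: int = 16, hidden_size: int = 128, n_fingers: int = 5):
        super().__init__()
        # A recurrent layer summarizes the time course of each signal window.
        self.rnn = nn.GRU(input_size=n_channels, hidden_size=hidden_size, batch_first=True)
        # A linear head maps that summary to one command value per finger.
        self.head = nn.Linear(hidden_size, n_fingers)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time_steps, n_channels) window of nerve-signal samples
        _, h = self.rnn(x)
        # h[-1]: (batch, hidden_size), the final hidden state of the last layer
        return torch.sigmoid(self.head(h[-1]))  # per-finger flexion in [0, 1]


decoder = NerveSignalDecoder()

# Decode one 100-sample, 16-channel window into five finger commands.
window = torch.randn(1, 100, 16)
print(decoder(window).shape)  # torch.Size([1, 5])

# Post-training dynamic quantization, shown only as one generic way to shrink
# compute and power per inference, not as the method the team actually used.
quantized = torch.ao.quantization.quantize_dynamic(
    decoder, {nn.Linear, nn.GRU}, dtype=torch.qint8
)
print(quantized(window).shape)  # same interface, int8 weights
```

In the self-contained setup the episode describes, a model along these lines would run on the arm's embedded hardware and refresh the finger commands continuously as new signal windows arrive.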

Tweetables:

“To make the amputee move and feel just like a real hand, we have to establish a neural connection for the amputee to move their finger and feel it just like a missing hand.” — Jules Anh Tuan Nguyen [7:24]

“The idea behind it can extend to many things. You can control virtual reality. You can control a robot, a drone — the possibility is endless. With this nerve interface and AI neural decoder, suddenly you can manipulate things with your mind.” — Jules Anh Tuan Nguyen [22:07]

You Might Also Like:

AI for Hobbyists: DIYers Use Deep Learning to Shoo Cats, Harass Ants

Robots recklessly driving cheap electric kiddie cars. Autonomous machines shining lasers at ants — and spraying water at bewildered cats — for the amusement of cackling grandchildren. Listen in to hear NVIDIA engineer Bob Bond and Make: Magazine Executive Editor Mike Senese explain how they’re entertaining with deep learning.

A USB Port for Your Body? Startup Uses AI to Connect Medical Devices to Nervous System

Think of it as a USB port for your body. Emil Hewage is the co-founder and CEO at Cambridge Bio-Augmentation Systems, a neural engineering startup. The U.K. startup is building interfaces that use AI to help plug medical devices into our nervous systems.

Behind the Scenes at NeurIPS With NVIDIA and Caltech's Anima Anandkumar

Anima Anandkumar, NVIDIA's director of machine learning research and Bren Professor at Caltech's CMS Department, talks about NeurIPS and discusses the transition from supervised to unsupervised and self-supervised learning, which she views as the key to next-generation AI.

Tune in to the AI Podcast

Get the AI Podcast through iTunes, Google Podcasts, Google Play, Castbox, DoggCatcher, Overcast, PlayerFM, Pocket Casts, Podbay, PodBean, PodCruncher, PodKicker, Soundcloud, Spotify, Stitcher and TuneIn. If your favorite isn’t listed here, drop us a note.
Make the AI Podcast Better

Have a few minutes to spare? Fill out this listener survey. Your answers will help us make a better podcast.