Sleep deprivation. Rushed meals. Caffeinated cookies. Cramped quarters. And utter chaos. Princeton’s twice-yearly HackPrinceton engineering gathering is our kind of fun.
So we couldn’t resist putting our Jetson TK1 development kit – with its parallel processing and computer vision capabilities – into the fray. The event draws hundreds of students, not just from Princeton but from as far afield as MIT, the University of North Carolina, the University of Waterloo and Columbia.
The result was amazing. Three Princeton students, led by sophomore Ethan Gordon, used Jetson and OpenCV – an open-source real-time computer vision library – to build a system able to interpret sign language. They grabbed the gathering’s “Best Hardware Hack” award and a $1,000 prize.
It’s a feat that might have been farfetched a few years ago. But it was made possible by the 192-core Tegra K1 system-on-a-chip that powers Jetson – the first mobile embedded computer to feature NVIDIA’s powerful Kepler GPU, the same architecture used by some of the world’s most powerful supercomputers.
Response times, needless to say, were snappy. Techniques like GPU-accelerated edge detection and least-squares matching allowed the team’s 250 lines of code to quickly translate hand gestures into their corresponding alphabet characters. Gordon was able to keep his code lean by drawing on the Tegra-optimized OpenCV packages that NVIDIA fine-tunes for fast performance.
To build your own version, you can find the project source code for “ASLTegra” on Ethan’s GitHub page.
You can read more about his winning hack in the Daily Princetonian and Deaf Weekly. To learn more about the Jetson Embedded Platform, or the Jetson TK1 DevKit, check out: https://developer.nvidia.com/embedded-computing.