Imagine all the data that goes into making a CAT scan useful. Now imagine that your cardiologist can’t access it because he’s wearing a blindfold.
That’s more or less the state of things when it comes to big data. We can compute the answers to big data questions on a GPU, but we have a much harder time turning the output into something that makes sense to our brains. It’s even harder to make that output interactive in real time, something that helps us see what the data is trying to tell us.
Now, with GPUs in the data center, you can compute and visualize the answers to life’s big data questions. And you can do it without having to move massive amounts of data around the network to get it to individual workstations.
This capability comes thanks to NVIDIA GRID technology, which brings graphics capabilities to a data center environment. And, since every data center now sits on top of a virtualization layer, hypervisor support from industry leaders is key. One more name can now be added to the list of innovators backing NVIDIA GRID, as Red Hat plans to add support for the technology in an upcoming release.
Using Kernel-based Virtual Machine (KVM) technology, you will be able to assign a GPU directly to a virtual machine for computing (Tesla) or visualization (Quadro and GRID).
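Under KVM, this kind of direct assignment is typically done through PCI passthrough in the virtual machine's libvirt definition. As a rough sketch only (the PCI address below is a placeholder for your own GPU's address, and the exact syntax may differ in the Red Hat release that ships this support), handing a GPU to a guest looks something like adding a `hostdev` entry to the domain XML:

```xml
<!-- Sketch of a libvirt PCI passthrough entry; the bus/slot values
     are placeholders and must match the GPU's actual PCI address. -->
<hostdev mode='subsystem' type='pci' managed='yes'>
  <source>
    <address domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
  </source>
</hostdev>
```

The GPU's PCI address can be found with `lspci`, and `managed='yes'` asks libvirt to detach the device from its host driver before starting the guest and reattach it afterward.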
With an NVIDIA GRID-enabled node right next to an NVIDIA Tesla-enabled compute cluster, you can keep the data in the data center and visualize answers there. It’s like having a super-smart brain (Tesla) and perfect 20/20 vision (GRID and Quadro) all accessing the same data. And no more blindfolds.
Stop by the NVIDIA (613) or Red Hat (3613) booths at SC13 to see a demo of this new capability in action.