Moore’s Law. Number theory. Quantum mechanics.
In the mind of Epic Games CEO Tim Sweeney, these ideas are connected. And they’ll all affect how visual computing evolves in the decades ahead.
The good news: whether Moore’s Law as we know it keeps moving forward or not, great things are coming. And NVIDIA is well positioned to help lead the way as virtual reality and augmented reality experiences become commonplace.
“NVIDIA is doing a damn good job right now,” Sweeney said, addressing a crowd of several hundred engineers, and hundreds more online, who had gathered to hear him speak today at NTECH, our annual internal engineering conference at our Silicon Valley campus.
Sweeney cited our NVIDIA GeForce GTX TITAN X GPU as an example of how advances in computing power are delivering immersive, real-time experiences that rival the visual quality of movies. “What TITAN X is able to deliver is really revolutionary and game changing,” he said.
Sweeney is one of the pioneers of the modern gaming industry. He founded Epic Games in 1991 while a student at the University of Maryland. And the company’s Unreal series of first-person shooters, which debuted in 1998, includes seminal titles in modern computer gaming.
In addition to its own games, Epic licenses its Unreal Engine technology to other developers. Unreal Engine has underpinned scores of notable games over the past two decades. That puts Sweeney at the intersection of key trends in visual computing and gaming.
Powering Visual Experiences for the Next 4 Billion Customers
Sweeney sees an explosion in demand for new experiences coming, as we find ways to put more visual computing power to work more efficiently. Put to artful use, technology able to render 9 million pixels per eye will be enough to satisfy the human mind’s desire for rich, immersive images.
“When you reduce the hardware to the size of your Oakley sunglasses and make it easy to carry around as a smartphone, that’s an experience that’s going to appeal to 4 billion people,” Sweeney said.
And we’ll get there whether or not engineers can keep creating processors that wring more work out of each watt of electricity. If Moore’s Law stalls, Sweeney argued, advances in software and hardware engineering will let developers pick up the “slack” by eliminating inefficiencies in the way images are created and presented.
That’s what Sweeney called the “Malthusian” computing scenario. But he also noted many reasons for optimism. Sweeney cited prospects for advances in the way we understand the universe, as researchers in fields as diverse as quantum mechanics and mathematical logic work toward fundamental principles that explain the world around us.
Understanding these principles could drive unprecedented breakthroughs. Even if the odds of a world-changing technology emerging from any one field are small, Sweeney argued, a deeper understanding of how our universe works improves the chances of major new technologies emerging from areas such as quantum computing and superconductivity.
Increases in computing power will play a role. Thomas Edison’s engineers tried thousands of different materials before finding one suitable to serve as the filament in a commercially viable electric light bulb, Sweeney noted. Ever more powerful computers will let researchers arrange atoms in trillions of possible combinations in the search for breakthrough materials, he added.
“The next 30 years are going to be far more interesting than the last 30 years,” Sweeney said. “And the folks in this room are the most likely folks to lead the hardware side of VR and augmented reality.”
The future, in other words, will be epic.