Over the past two weeks, I had the opportunity to attend two conferences: the Intel Developers Forum (IDF) and the GPU Technology Conference (GTC). The contrast between them was readily apparent, beyond just the venue. IDF is a mature conference focused on the relatively stable ecosystem of Intel processors and support chips. GTC is a brand new event, focused on the rapidly growing ecosystem of GPUs and GPU Compute.
IDF – Tick, Tock, Tick, Tock, Zzzzzzz
At IDF, things just seemed to be going through the motions. To me, there was a lack of any real excitement, because much of the “new” technology was already well known and the changes were largely incremental. The most interesting new technologies were Light Peak for 10-100Gb optical interfaces and embedded Atom chips. At IDF, you could still feel the cold winds of recession, with lower attendance and a lack of interesting start-ups. In fact, even AMD’s usual counter-IDF presence seemed more muted this year. Intel filled some of the space with its own booths at the expo, and a large contingent of Intel employees filled the keynotes.
Part of the problem is that with its “tick-tock” processor and process roadmap, Intel has become about as exciting as watching the clock it appears to want to emulate. Oh look, there’s an Intel executive holding a wafer, again. A lot of stories leading up to IDF were written about the recent departure of Intel’s Pat Gelsinger. Intel’s announcements were pretty much preordained: mobile and server Nehalems, a SIX core processor coming next year (shocked and amazed), Moblin and an application store for Atom aimed at developers and OEMs (but wait, wasn’t the advantage of the x86 instruction set that you didn’t need special tools and could develop across all platforms?), and a lethargic Larrabee demo that seemingly no Intel executive wanted to talk about.
GTC – And the new computing model alarm goes off!
On the other hand, GTC was bursting at the seams, with overflowing rooms and a real, nearly audible buzz. The show exceeded its registration targets, and sign-ups had to be closed two weeks before the conference started. Expo traffic was very good, and we had companies renewing on the spot for next year (there will be a GTC 2010).
At GTC, the conference really showed what the future can be – with pervasive 3D (stereoscopic) entertainment, improved cancer diagnostics, augmented reality and virtual shopping, and amazing supercomputers being built. We even managed to inadvertently provide a little controversy – well, at least people actually cared whether there was a real Fermi chip under that heat sink and running the demo.
There was an exciting mix of entrepreneurs, students, professors, developers, VCs, and artists, all interacting and creating a new future (one analyst called the atmosphere “electric”). The stories from GTC are about NVIDIA creating a new computing market based around GPU Compute.
The brand new Fermi architecture was designed to be both a breakthrough for compute and a category-leading graphics processor. The architecture is best described by the white papers NVIDIA sponsored from leading microprocessor analysts, along with another in-depth report by Real World Tech. At 3 billion transistors, this chip is massive and represents a new vector in computer design.