Last Friday we celebrated the US launch of our newest GPUs, the GeForce GTX 480 and GTX 470, at PAX East in Boston. We picked PAX East as the venue because it’s a show for gamers, by gamers. Having built these cards with passionate PC gamers in mind, it was really the only option.

Since launch, we’ve been getting great feedback from you on all that the GTX 480/470 have to offer. With them, you can “crank up” your next-gen PC games, from advanced tessellation engines and 480 compute cores to new ray-tracing technologies and, of course, 3D Vision Surround support. (You may have seen shots of our presentation at PAX that projected across three giant screens – each 80’ in diameter – with 3D stereo demos of Battlefield: Bad Company 2, World of Warcraft, and Metro 2033. It was spectacular. Video of our keynote is below.)

We wanted to let you know that we’ve also heard your concerns about the GTX 480 with respect to power and heat. When you build a high-performance GPU like the GTX 480, it will consume a lot of power to enable the performance and features I listed above. It was a tradeoff for us, but we wanted it to be fast. The chip is designed to run at high temperature, so there is no effect on quality or longevity. We think the tradeoff is right.

The GF100 architecture is great, and we think it’s the right one for the next generation of gaming. The GTX 480 is the performance leader, with the GTX 470 being a great combination of performance and price.

As always, we hope you enjoy our new products; let us know what you think. We built them for you.

  • Uzair

    Drew, is there a chance of having a GTX 495? Wouldn’t heat be a problem? I respect what you said – at least we get more performance if there is a lot of heat – but I just can’t see a GTX 495 coming out, which is what I’m hoping for. Dual-GPU cards always produce much more heat.
    In a month, I will be getting the Asus G73-JH, which has an HD 5870. I would love to wait for nVidia to have a notebook GPU, as they have the features of tomorrow’s games (I’ll be keeping the notebook for years to come), but will the mobile GPU produce this much heat?

  • Uzair

    Henry, are drivers going to solve this problem?

  • Drew Henry

    Yep, this is a bug. Will be fixed shortly.

  • http://www.overclockingwiki.org Jebo_4jc

    Mr. Henry -
    Thanks for taking the time to write – and respond – to the community. We hear you loud and clear.
    My thoughts:
    1. First and foremost, I am eagerly waiting for developers to take advantage of this hardware we have! I have an i7 rig with 3x GTX 275 because I run Folding@home. If I didn’t run F@H, however, the sad truth is that even a single GTX 275 or 285 can max out nearly every game coming out, because many of the engines are designed with consoles in mind. PC devs need to push the envelope more. It would swing some console game purchases over to the PC side if the PC offered a drastically better experience.
    2. The GTX480 and 470 don’t offer enough performance in games to justify their price premium over the Radeon 5800 series options. Price cuts or rebates would help this out a lot.
    3. As a current SLI owner, I am very anxious to see the Surround capability enabled on my GTX275s. Is this coming soon?
    Thanks again.

  • patrick

    Kyle didn’t activate DX11 DOF. :)
    There is no reason for AMD/ATI or Intel (in the future) to support CUDA.

  • Betacentury

    Yeah, 450W TDP?

  • Cherub

    I’m sorry, but I don’t think we want to play Unigine Heaven or Stone Giant on our tricked-out rigs all day long.
    The only game that’s out of the ordinary in terms of performance increase is Metro 2033 and – oh surprise – it’s a TWIMTBP game…
    Though it’s still not playable when you crank everything up.
    Do you really think that all game developers will code their engines in a way that runs on the edge of being unplayable with a GTX 480, whereas when you’ve got a 5870 or 470 you’re SOL? I don’t think so; they will take it down a notch or two to target a wide array of cards – not even talking about the 5870 and 470, but cards further below.

  • http://profile.typepad.com/dihapus Dihapus Hilang

    Damn, I miss my first nVidia card, a GF7200 256MB.
    Ha ha ha.
    I’m waiting for a low-end Fermi 420 or 430, and I hope the cost will be low too.
    I heard from my friend that OpenGL 4 can do tessellation on Windows XP.
    Is that right?
    That would mean some DX11 (Win7) features can run on the Windows XP OS.

  • http://masterbytez.de meltman

    Thank you for every video card in the past. I love your work, from the TNT2 through the GeForce4 Ti to the 6800 and 8800 Ultra – everything. I hear many bad things about your new GTX 470/480 models and don’t know whether to buy one or not! But I will count on you and what you say and buy the 470, because I never, ever want to have an ATI card.
    For the future, I wish you good luck and hope you can fix many problems and lead the video scene like in the past :)

  • tpi2010

    I have had many Nvidia cards over the years in different rigs: started out with a GeForce 2 MX400, had an FX5200 and a 6200 on a spare computer, a 7600GS on my main one, and now a factory overclocked 8800GT. I also have several ATI cards. And from all models and both brands only one X700 card from ATI has failed on me. Apart from that, I’ve had no complaints. So, I’m not a fanboy of either side, I just look at the market and try to get the best value for money.
    But, truth be told, I’ve grown very skeptical of Nvidia’s behaviour towards gamers. Nvidia has clearly stated a few months ago that they are focusing a little away from games, which is no good indication, but worse than that, they now have their PR department saying that “We want tessellation perf to rock since we deeply believe that PC games should have the same geometric complexity as movies”.
    Yes, and now that you do want that, the industry will finally get it. But ATI has had tessellation support since 2001, and yet where was Nvidia all this time? And we had to wait for Nvidia.
    Where was Nvidia when DX10.1 came out ? Where was Nvidia when Ubisoft removed DX10.1 support from Assassin’s Creed ? And again we had to wait for Nvidia, because without Nvidia the gaming studios don’t move forward.
    And where are Nvidia’s ethics when you rename your 8800GTs to 9800GTs (while some 9800GTs were still 65nm parts and others 55nm), and then to GTS 240? And do the same to the 9800GTX+, which became the GTS 250, thereby confusing your customers?
    And now you say a performance card is designed to run hot? And be power inefficient? The card idles at twice the power draw of the ATI models, despite dropping the core clock threefold. Is that by design too? Is the fact that the chip has 512 cores, but you can’t get fully enabled parts out to the street in numbers, because you didn’t adapt to the manufacturing process properly? Please. You just missed a perfect opportunity to stay quiet and work on the next generation. Because calling the GTX 480 and GTX 470 “next generation” is six months late. We are already in the next generation, making it the “current generation”, and, unfortunately, Nvidia doesn’t yet have a single customer with one of their “current generation” cards.
    You have a lot of homework to do Nvidia. I wish you well, not because of any fanboyism of either side, but mainly because you have provided me with countless hours of both work and fun and I appreciate that, and also because competition is good for everyone. That said, I surely won’t do you any favors, as I’m not expecting anybody to do you any favors when you screw up. Especially when you ridiculously pretend everything’s fine. It’s a lesson I hope you learn for the future.
    Take care.

  • Drew Henry

    Hi and thanks for posting. 3 x GTX275 … Cool! My rig is 2x GTX285, so you have me beat. We will definitely support 3D Vision Surround on your rig. Support should come at the same time as for GTX 480/470, which is about a month from now.
    Thanks also for supporting F@H. Please check out the F@H for Stephanie contest at bjorn3d.com. Good cause.
    We’re trying hard with the Devs. I believe we have to give them a platform to build these next gen games on … or they won’t do it and we’ll just keep getting DX9 console ports. Tessellation is key, imo.

  • PeterB

    I don’t trust these cards, so I’m sticking with ATI for this generation. With massive power consumption and massive heat output vs. ATI, I’d rather just steer clear. I don’t believe a chip can be designed to last just as long at a high temperature as at a low temperature. It’s a comforting idea, but it flies in the face of everything I know about silicon manufacture and operation. As we all know, Fermi has been plagued with design issues since the get-go. Now the company tells us it’s been well designed, and we shouldn’t worry?
    I don’t buy it, and I won’t buy it. It’s marketing spin. Better luck next time around, guys.

  • John Mann

    Just curious, but what do you think “ALL In-Game Settings” means? “We enabled everything but THIS and didn’t mention it”? If they say they had all in-game options enabled, they were enabled – including your precious DoF.

  • John Mann

    Also, they don’t have to support CUDA if they don’t want to, but should they ever decide to do so: IT’S FREE, NO COST!

  • Shin0bi272

    “We wanted to let you know that we’ve also heard your concerns about GTX 480 with respect to power and heat. When you build a high performance GPU like the GTX 480 it will consume a lot of power to enable the performance and features I listed above.”
    Orly? Try checking the power and heat on the AMD 5870, a card that’s a whopping 5-10% slower than yours but uses HALF the power. Care to rephrase your carefully thought-out BS statement now? Oh wait, I forgot about PhysX … apparently so did the rest of the gaming community. With Havok Cloth in the works and AMD pushing their open-source physics engine “Bullet”, you have no excuse to put out a card with these specs. Dump the GPGPU idea and give us more FPS at a lower temp. In short: FPS or GTFO!

  • Bachir

    Do we gamers have any hope that the next-gen PlayStation 4 GPU will be designed by Nvidia??? That company deserves 1000000% a new cooperation with Sony for the eagerly awaited PS4. I’ve done some recent research on the subject and found something interesting: an RSX2 chip based on the GTX 480 architecture, codenamed “Fermi”.
    So I hope you all share your opinions, because it is worth the discussion.

  • Brian Blessed’s Beard

    Sorry Drew, but all of your points are pointless (not in the eyes of NVIDIA fanboys, maybe – to them, as Joseph Heller so wonderfully put it, NVIDIA has “a thousand points of light”). ATI’s Evergreen refresh – or next-gen if we’re lucky – should beat even a 28nm, 512-shader Fermi card in performance (I’d be willing to bet money on it, and no, tessellation-orientated non-game demos do not count), and will probably do so without running dangerously close to shut-down temp (unless AMD/ATI are aiming to lose money), and it will be out this calendar year (unless TSMC run into problems with their 28nm node, and past 40nm issues don’t exactly inspire confidence).
    For me there are simply too few good reasons to buy either a GTX 480 or a 470. Maybe if the 470 was cheaper and had launched 6 months ago. But it isn’t and it didn’t. If I were to buy a new card now, and I may do if I need DX11 compatibility sooner rather than later, I’d get a HD 5850, which is £100 cheaper than a GTX 470, and overclock the living hell out of it.
    ‘Til then my GTX 260, which won’t hit 105C this coming summer unlike the GTX 480 (and Drew, I’d be willing to bet money on that also), will suffice.

  • stefanodisabatino

    Fermi is a fantastic GPU with a C-like language… but it’s TOO HOT, and has fewer active CUDA cores than the original design. I hope that next time there will be a better, fully functional product :-))!

  • no thanks

    Single versus dual GPU is a red herring, used to paper over the real comparison points. Given a watt budget (e.g. < 300 watts to be a valid PCIe card), a dollar budget, etc., which is the best graphics card product? Whether that is two chips or one chip on the card is completely irrelevant. Nvidia has done, and will do, two-chip implementations just as ATI has.
    Multiple cards are yet another point of differentiation, with more tradeoffs (more slots/space/power required, greater cost, etc.). Frankly, it’s a disappointment that the GTX 480/470 can’t drive 3 monitors from a single card.
    Given your juvenile comment, it’s perhaps pointless to explain further.

  • maxx

    What about a multi-core card? I hope it’s not far off.

  • Momo

    Fabulous card indeed, but I’m sure you’re desperate to get down to 28nm. One thing we can say: if this had been released as a 28nm part with all 512 cores enabled (the intended chip), along with digital PWM to regulate everything better, the tech press, no doubt, would have hailed it as another G80/8800GTX-type milestone for Nvidia.

  • UltimateGTR

    I’m waiting for GTX 495 with not less than 2048MB video RAM!

  • Aaron D

    Is nvidia going to release a less expensive, less powerful version of these cards anytime soon? I want a DX11 card, but I don’t need that much power and I would prefer not to buy from ATI again because my current setup is riddled with problems.

  • Drew Henry

    Yes we are. I can’t comment on time frame just yet, but we’ll have new GPUs available at a bunch of different price points. All are based on our new arch designed for next gen games like Metro 2033.
    Thanks for asking.

  • Rochak Gautam

    Hey nvidia crew!!
    Since the new Fermi is beating the HD 5870, why don’t you work on a dual-GPU card of the Fermi architecture to sweep aside “the king of graphics”, the ATI Radeon HD 5970, and take the throne?
    Work on the dual-GPU GTX 495 – with 4 GB of graphics memory and liquid cooling recommended!!!! HAHAHAHAHAHA….LOL!!!! :-)

  • Ven

    Too bad that 3D Vision doesn’t work as expected in so many games. You mentioned Battlefield: Bad Company 2. It’s pretty much unplayable with 3D on.

  • http://profile.typepad.com/andrewfear Andrew Fear

    Hi Ven
    Can you tell me more about 3D Vision not working with Bad Company 2? I play it almost every day with a GTX 285 on a 3D Vision Acer 1920×1080 LCD and it’s very playable for me. What GPU and settings are you using?

  • http://www.dbdclan.co.uk Bornprouduk

    If you want to drive a Ferrari, you go NVidia.
    If you want to drive a Skoda, then go ATI.
    It’s a poor man’s alternative, just like AMD vs. Intel.
    I spent too many hours years ago trying to make games work with ATI. Face facts: the future is 3D NVidia.
    My GTX 480 is on order with Scan and I can’t wait.
    Forget the heat – get a bigger case and fill it with blue LED fans. Looks good and stays cool…..
    Drew, great launch. Loved all the hype, and I want a t-shirt: Crank It Up.

  • Matt Leshman

    nVIDIA RULES!!!!!

  • Zimma

    Hey Nvidia!
    I have my 480 on pre-order – a nicely timed release to coincide with the demise of my two 8800 Ultras, which until now have given great performance with pretty much all games!
    I ordered on the day of release, but the negativity surrounding Fermi has given me second thoughts about my order. As I live in the UK, ambient temps won’t be too much of an issue, and I’ve read reviews saying the card is noisy – that doesn’t really bother me, as my rig is gaming orientated; if I want less noise, I’ll get a PS3. One thing I do need, though, is a reliable card that will stand the test of time. Can you reassure me that this card is not a release to cater to timelines and is in fact a reliable, well-thought-out card that will not see a newer and better version within a year?

  • Candyman

    As for the GTX 495 and the famous TDP issue… you’re forgetting one little detail.
    The GTX 275 had a 220W TDP, and the GTX 295 (290W TDP) was 2× a 275 with clocks of 460. Right?
    So… the GTX 470 has a TDP of 225W… why do you people think that 2× an underclocked 470 will not be possible?
    And as for people bragging about ATI offering more for less…
    Does ATI have CUDA? PhysX? The best drivers in the market?
    TWIMTBP titles which take advantage of specific features of the hardware?
    Hmm… don’t think so.
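    The back-of-envelope argument above can be sketched numerically. This is a rough model only, assuming dynamic power scales roughly as frequency × voltage²; the clock and voltage ratios below are illustrative assumptions, not actual GTX 470 figures.

```python
# Rough dynamic-power scaling model: P is proportional to f * V^2.
# Illustrative only; real TDP also includes leakage and board power.

def scaled_tdp(single_gpu_tdp_watts, clock_ratio, voltage_ratio):
    """Estimate one GPU's power after down-clocking and down-volting."""
    return single_gpu_tdp_watts * clock_ratio * voltage_ratio ** 2

# Hypothetical dual-GPU card built from two GTX 470-class chips (225 W each),
# down-clocked to 80% and down-volted to 90% (assumed numbers):
per_gpu = scaled_tdp(225, 0.80, 0.90)   # roughly 146 W per GPU
dual = 2 * per_gpu                      # roughly 292 W, inside a 300 W budget
print(round(per_gpu, 1), round(dual, 1))
```

    On this kind of estimate, a down-clocked dual-470 board lands under the 300 W ceiling, which is the commenter’s point about how the GTX 295 was built from two GTX 275 chips.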

  • http://www.facebook.com/profile.php?id=1549787987 Zeek The Geek

    Hello Drew, I have a few simple questions to ask. What is the estimated release date of a GTX 480 with 512 cores? Is Nvidia going to leave the GTX 480s with 480 cores and create a GTX 485 with 512 cores instead? If that’s the case, I would see flaws in it, since the GTX 480s are already $50 overpriced compared to ATI’s 5870 by price and performance standards. I know very well that creating your cards is cheaper than it is for ATI: since Nvidia cards have fewer stream processors and lower clock speeds than ATI cards, the manufacturing cost is that much less. On top of which, you guys sell your cards with a much larger markup than ATI does. 5-15% doesn’t justify $100 more pricing on a GTX 480, especially since Nvidia could easily create extreme cash flow by selling their GTX 480s at $450.
    The amount of money spent creating Fermi was certainly mind-boggling. However, Nvidia cannot forget that it will lose out on potential business when it charges nearly 100% or more markup on the manufacturing of its cards. ATI has a stunning markup of nearly 40% on cards which perform within a marginal tolerance for their price.
    It is my suggestion that the defective products should have started out at $420 to compete with the current 5870s and steal the spotlight, and then, having fixed the 512-stream-processor situation, the price should change to a reasonable $450. This is an excellent strategy if Nvidia wants to hog the enthusiast spotlight; let ATI have the mainstream cards. Nvidia can’t afford to make a mistake like this; it’ll definitely come back to haunt Nvidia if they do not re-strategize the pricing for the GTX 480. The GTX 470 is perfectly placed in pricing; I could not argue with the position in which it was placed for cost.
    I’m not an ATI fan, since I’ve had nothing but bad luck with ATI cards. I’ve had a Radeon 9550 (outdated too quickly) and a Radeon X1300 (the card failed after 3 weeks; I got a new card and it failed within 3 weeks). I got a 5870 to test DX11, and my issues with that card were driver-related: ATI drivers have always been riddled with really bad bugs. Any antialiasing would cause horrible lag, and any game would lag for the first 5 minutes before the lag went away.
    My Nvidia cards: the 8800GT was the first graphics card I enjoyed and loved having; in fact I’m using it right now, for a reason explained later. I bought it because it was cheaper, decent in performance, and had the next-gen GPU in it. This was an awesome card; however, the fan noise was that of a dryer. This single-slot card also got really hot both at idle and in gameplay – I have managed to reach 101C with it on a hot day. This was bad, but oh well. My second Nvidia card was a mac-daddy GTX 295; this card was utterly amazing in every aspect, along with my 8800GT. As long as the game had SLI support, I would enjoy max possible settings and max anti-aliasing – this card was a beast. However, I sold it a few weeks ago to buy a GTX 480 when it came out. But sadly I’m going to wait a while until prices on it drop; it’s outrageous to think that the GTX 480, which currently yields less performance than the GTX 295, should cost the same. As well, the GTX 295 is much more expensive to make than the GTX 480 will ever be.
    My only proposal is that Nvidia drop the MSRP $80 on the 480-stream-processor models, afterwards raising it to $450 on the 512-stream-processor models. That will be the only way I’ll buy the card as it is now. I can guarantee that this is what the majority of people will be thinking as well.
    Drew, if you can pull some strings and let me be heard throughout Nvidia, I would appreciate it with the utmost sincerity. If I could be contacted I would appreciate it; I’d love to have a one-on-one chat, or a one-with-many chat, with people from Nvidia.
    I’m currently a college student, on track for two associate degrees: Computer Forensics and Computer Information Systems. I spend my free time researching and studying hardware and specifications. I guess you could say it’s a hobby of mine.

  • http://www.x-3dfx.com Obi-Wan Kenobi

    That is exactly what I am hoping for: dual GTX 470 cores with the RAM of the GTX 480. This could be a good card that will take the performance crown from the HD 5970, and to be honest, even a GTX 480 has the power to keep up with or beat that card when rendering DX11 the right way ;)
    The GF100-375-A3 is by far the most powerful GPU on the face of the earth, and if supported properly, as the Stone Giant DX11 demo showed us, this GPU will pulverize anything the competitor puts against it.
    That has been proven, and here’s the funniest part: it has been proven with beta drivers. In time these Fermi GPUs will only get better and faster. Many seem to forget that and write new cards off as if they will never improve… they are just missing the point of VGA card engineering.

  • http://profile.typepad.com/rainiergascon Rainier Gascon

    When will the economy/midrange versions of the 400 series be released? Will they consume the same power and produce the same amount of heat?
    I’m curious because I am from the Philippines, and we all know it’s a tropical country.
    I want to buy one. How much will it cost? Will there be a 256-bit card?

  • http://profile.typepad.com/guptaaditya Aditya Gupta

    The GTX 480 and 470 are good performers, but I don’t understand why these cards consume so much power and generate so much heat. This is the first time that ATI GPUs are beating NVIDIA’s GPUs; ATI GPUs are beating the GTX 4xx in many benchmarks.
    Are there any chances that NVIDIA can beat ATI with a GTX 485 or 475? I am still hoping for something good. The heat generated by GTX 4xx cards can affect the condition of other components too.
    Uhmm, NVIDIA could bundle a GPU cooler, which may help to keep the temperature down.
    What else? All my hopes are shattered, but I still hope for something good from NVIDIA in the coming months.

  • http://www.youtube.com/strikerhiryu striker_gt

    Nvidia Fermi fail! :D

  • dan

    The 480 is a monster, no doubt, although I personally prefer the 5870 at the moment because the configuration I am running likes the ATI side, and ATI always seems to be cheaper for some reason. Another concern is heat: I live in Tucson, AZ, USA, and when it’s 120 or 130 outside, the AC is screaming just to get the inside temp to 90 or 95! I am running air, so PC parts cook! Also, being fiscally challenged, I tend to want my PC parts to last a while! My 5870 idles at room temp and never goes above 70 fully loaded (with 600+ CFM of case ventilation and a second power supply for the case fans)! At this point I don’t know if I could run a 480 for any length of time before it overheated.

  • http://profile.typepad.com/6p0120a4fec334970b Drew Henry

    Sounds like you know something about cases. That’s good. Put a GTX 480 in it and you’ll have no problems. Best DX11 gaming, 3D Vision (you gotta try this!), plus all the other features of our GPU. Enjoy.