by Drew Henry

Last Friday we celebrated the US launch of our newest GPUs, the GeForce GTX 480 and GTX 470, at PAX East in Boston. We picked PAX East as the venue because it’s a show for gamers, by gamers. Having built these cards with passionate PC gamers in mind, it was really the only option.

Since launch, we’ve been getting great feedback from you on all that the GTX 480/470 have to offer. With them, you can “crank up” your next-gen PC games, with everything from advanced tessellation engines and 480 compute cores to new ray-tracing technologies and, of course, 3D Vision Surround support. (You may have seen shots of our presentation at PAX, projected across three giant screens – each 80’ in diameter – with 3D stereo demos of Battlefield: Bad Company 2, World of Warcraft, and Metro 2033. It was spectacular. Video of our keynote is below.)

We wanted to let you know that we’ve also heard your concerns about the GTX 480 with respect to power and heat. When you build a high-performance GPU like the GTX 480, it will consume a lot of power to enable the performance and features I listed above. It was a tradeoff for us, but we wanted it to be fast. The chip is designed to run at high temperature, so there is no effect on quality or longevity. We think the tradeoff is right.

The GF100 architecture is great, and we think it’s the right one for the next generation of gaming. The GTX 480 is the performance leader, and the GTX 470 offers a great combination of performance and price.

As always, we hope you enjoy our new products. Let us know what you think. We built them for you.

Comments

  • sabbathius

I’m waiting for the future GTX 495 ^^

  • RaymanDK

I hate my GTX 295. I’m going after the GTX 480.
    Release date in Denmark?

  • Schneider

Why didn’t you say anything about how HOT it gets, 95°C or more? That’s not good -.- I see that as the biggest problem you’ve got, my dear NVIDIA. Most people will go for ATI cards, which run cooler and draw less power. I’ve had NVIDIA cards for a long time, but now I’m disappointed with this card just because it gets too HOT. I don’t care about the power draw.

  • zcx

    Don’t hold your breath.

  • Efe Selcuk

    I sure hope Drew (or at least someone with some say in nVidia) reads the comments that will surely follow this blog post. I’ve ended up writing quite a bit.
I’ve been with nVidia for the past decade. My brother built his desktop way back when with the Ti 4200, I bought a prefab with a 5950 Ultra, my last budget build had an 8600 GTS in it, and I upgraded to the GTX 275 last year. I am in no way a fanboy; nVidia has simply treated me very well. If I had made that last decision a few months later, after the price hike, it would’ve definitely been the 4890: almost identical performance for ballpark $100 less.
    I recently built a new high-end rig (Core i7 and all), but I waited out on dropping the money on a 5800 series card. I knew nVidia’s new cards were on the way, and I was excited and willing to wait it out; I expected a lot out of them.
Now that they’re out in the open, I have to say I’m a little shaken. In many cases, the performance of the cards is not where I would’ve hoped it would be (the general consensus seems to be a 5-10% increase in performance over their ATI counterparts; I see even that failing in many cases). It seems like the effort that nVidia put into the cards gave them lots of potential, but most of it is wasted.
“The future of PC gaming” is right in the title of this post, and that’s what these cards have been built for. Nvidia has a strong lead over ATI in compute and tessellation performance now, that’s obvious; however, that will only prove useful if and when developers decide to put in the extra effort to take advantage of those technologies. Nvidia is gambling right now; it has already given ATI a half-year lead in the DX11 market, and it’s pushing cards that won’t be fully utilized until who-knows-when (there’s no telling when these technologies will be more widely integrated into the gaming market). What will it do in the meantime? ATI is already on its way to producing its 5000-series refresh, and this time it knows the competition’s performance.
I was hoping for the GTX 400s to do the same thing that the GTX 200s did: give nVidia back the high-end performance throne. ATI is not only competitive with its counterparts, but it still has the 5970 for the enthusiast performance crown (don’t forget Eyefinity!). I think nVidia made a mistake in putting so much focus into compute and tessellation performance; it would’ve been smarter to produce cards with similar die sizes (crappy wafer yields, anyone?), faster raw performance with tessellation/compute as a secondary objective, and more competitive pricing. It wouldn’t have been a bad option to create a separate chip for the Tesla cards, one that focused on compute performance while the GeForce cards focused on the rest.
I still have faith. Maybe nVidia will work wonders with the drivers and produce the performance we were waiting for. Maybe it has something awesome brewing deep within its labs. Or maybe my fears will embody themselves, and nVidia is crossing its fingers and hoping for its tessellation/compute performance to give it the market share later on. If so, ATI will provide me with my pair of cards.
    That was quite the rant; I wasn’t planning on writing that much when I decided to comment on Drew’s post. I suppose I’m passionate about this sort of thing, and I really hope nVidia doesn’t lose me after all this time.

  • Max

Nvidia, it’s good to know your response to our biggest concern these past several days. I love what you guys have been doing for the PC game industry and I’m thankful for that. The GTX 480 has awesome features that an enthusiast gamer needs now and in the future. It’s funny that I was a little disappointed at first, because many DX9-10 games were tested in those reviews and the card wasn’t much faster in them, but in DX11 it does a very good job, which made it worth my wait for a DX11 card. It consumes a lot of power, but I agree with you that it was a tradeoff; anyway, some extra power doesn’t change my power bill much at all. The problem is the heat. You said the GPU was designed to work at high temperatures, but it may go up to 10 degrees higher in my country’s hot environment, which could break the card. I want to hear your explanation about this, or I will have to wait for the next Fermi card that runs a lot cooler; in that case, please release them soon. Thank you, keep up your good work, and always be the leader with the newest technologies for PC entertainment.

  • Kyle

You pretty much said exactly what I would have. I have 9800 GTX SLI and was holding out for the 480, and after much deliberation over whether to just suck it up and move to red… I’ve decided to wait till the next 4xx.
    <3 nvidia


Max, I agree with you. I appreciate the overall quality of the Fermi architecture. A good analogy would be a car company trying to sell a car based on horsepower alone. Horsepower is nice, but what about dynamic traction control? What about leather seats? What about a really nice suspension for a smoother ride? Sunroof? GPS? A good car is not graded just by raw horsepower but by the overall luxury and quality that goes into the build. For $100 more, to have all the luxuries of the Fermi architecture, someone would be crazy not to purchase the GTX 480. I think it’s better value for my money.

  • Efe Selcuk

You and Max both make valid points. However, this late into the DX11 game, I would’ve really appreciated cards with raw power to match or beat the 5000 cards. The compute/tessellation performance is ridiculously awesome, I won’t deny that. But looking at current games, the extra features aren’t utilized enough to justify the price tag, and I don’t see that changing until maybe late 2010. I don’t know the nitty-gritty details of GPU engineering, but it would’ve been nice to also see cards with further cut-down tessellation/compute hardware at lower price points; then the lineup could at least be competitive price-wise.
You used the car analogy, which somewhat fits. However, when one buys a car, most, if not all, features can be taken advantage of right away. Leather seats provide the comfort, GPS provides the navigation, etc. Tessellation and compute power in GPUs aren’t fully utilized yet, and it hurts to wait half a year after ATI’s releases to see cards that’ll be amazing… in the future.

  • Bryan W

I love this card; the DirectX 11 demos are awesome! Imagine what they could do with an Nvidia GeForce 495… it’s off the hook! I have my GeForce 9800 GTX and I’m looking forward to the GeForce 4xx series.
Release date for Singapore??? >.< V

  • Bharadwaj Chandramouli

I second Efe Selcuk’s perspective…
Seriously, now it’s all up to the developers to back nVidia on its big gamble on tessellation/compute…
Hope they do…
nVidia FTW!!!

  • Cagan

I watched the video, all right. It’s a bunch of PR. If you guys actually cared about gamers, then you wouldn’t sell gamers Tesla cards as GeForce. The GF100-based GeForce GTX 480 and 470 have lots of unnecessary parts inside the chip that have nothing to do with gaming, yet the chips heat up to 95°C because of the extra billion transistors that gamers don’t need. You don’t care about gamers; you care about making money. My feelings about this company would have softened if the GTX 480 and 470 were priced competitively, but I said to myself “the same old nvidia” after seeing the street prices. The 470 costs more than the 5870, and the 480 has the same price label as the 5970 here in Turkey, yet the 5870 and 5970 outperform their competitors at the same price. Only stupid brand-loyal people pay more and get less.

  • Drew Henry

Wow! You had a lot to say. 🙂 Thanks for taking the time. We knew a few years ago that it was time to build a new GPU architecture. Some of today’s games (with our first set of drivers) show it off better than others. Believe me, as we tune this baby, she’s just going to go faster on many games. But we did build it thinking about the future, less so the past. We want tessellation perf to rock since we deeply believe that PC games should have the same geometric complexity as movies (think of the cool characters in the Pirates of the Caribbean movies). Explosions should be physics-based, not animated. Clothing should act like fabric, not rubber. And finally, you should be able to feel like you’re inside the game … 3D Stereo! Time to “Crank Up” games!
    The guys I had on stage with me have that same passion about making PC games great. They are the developers trying to make a game that is different from what you see on consoles (that are pretty old now). The graphics in most console games look alike now. I think the PC can do better. We built GTX 480 and 470 with that in mind.
    Stay passionate!

  • Drew Henry

    We build all our products to work in extremely hot and extremely cold environments. The GTX 480 is no different.
    Enjoy it!

  • Drew Henry

Hope you’ve seen all our tech demos, Bryan. Tessellation is a great addition to games, and our demos of hair, water, and grass show it off.
    Don’t know exactly when our partners will have the board in Singapore, but we’re building a lot of them now so hopefully very soon. BTW, I love the Pepper Crab in your country!

  • John Mann

Bah, PR crap. Where are the 512 SPs? Why are you NOT doing a B1 run when it is clearly called for? Why have you screwed this up so badly? Seriously, one should not have to invest in a window AC unit, or in soundproofing for their case, just because they bought your GTX 4xx cards. The GTX 480’s performance DOES NOT warrant a price tag $100+ more than the 5870 it beats by 15% on average. They are way too hot, way too late, and way too noisy.
I have been a long-time fan of Nvidia, to the point of defending you on message boards until being banned for a period of time. There is no way in HELL I can seriously tell someone to buy the 480. Do what you should do: rework it, do a B1 stepping, get the heat down, and GIVE US THOSE 512 SPs, you sorry sacks.

  • Drew Henry

    We’re building a lot of them right now, so hopefully it hits Denmark fast. BTW, you can download our new Design Garage ray tracing app and it will work on your GTX 295. Check it out!

  • Efe Selcuk

Cagan, of course it’s PR; that’s not significant in any way when looking at their cards. The problem was how they decided to approach GF100. You mention that they come with “lots of unnecessary parts inside chip” that don’t pertain to gamers, and an extra billion transistors that we don’t need; however, you are mistaken. As I mentioned earlier, the chip was designed to heavily push tessellation and compute performance, and they have definitely succeeded; as for the transistors, they are what let the card rip ATI to shreds when those advanced features are used. If you look at any proper benchmarks where tessellation/compute power is stressed (AnandTech has dedicated portions of their review to them), the GTX 470/480 kills ATI. Elsewhere, however, most benchmarks show little gain, equal performance, or lower performance compared to ATI, because most games don’t take advantage of those features. The higher price tag comes from the giant die size those extra features require and from low yields at TSMC. The value of these chips will show itself in certain cases now, and more commonly in the future.
    P.S. Where are you from? Ankara myself, currently in the States 🙂

  • Efe Selcuk

    I suppose I wouldn’t take as aggressive of a stance, but I understand the frustration. Skim through my giant rant if you want my opinion. The cards are great for what they were made for, just not utilized yet.

  • Drew Henry

    Thanks. Let’s all keep pushing on the developers. They listen to their customers as much as we do. They will do the right thing.

  • Drew Henry

    We decided not to do the 512 version at this time, John. Might in the future. Thanks for being a long time fan. I hope we can change your opinion as more and more people get to experience the product.

  • Ryan Bridgeman

I’m currently content with my GTX 280 but have been keeping a close eye on the NVIDIA blogs. I’m not in need of a new card at the moment, but I’m certainly interested in the nicely equipped 400 series.
To some of the other posts: NVIDIA knows what they’re doing. They might have made some tradeoffs, but they made them to preserve other strengths of these cards. I’m certainly not disappointed with what I’ve seen so far. Keep up the great work.

  • Efe Selcuk

    Thank you for reading and replying Drew. I’m proud of nVidia for taking the initiative and building something fantastic, and I hope I didn’t seem to demean that in my rant.
    Do you remember what the GTX 260/280 did when they were first released? They destroyed ATI’s top cards at the time; when ATI fired back, a back-and-forth battle started with nVidia finally ending up with both competitive cards and the performance crown around the middle of last year. That’s what I wanted to see with the GTX 400s, especially after waiting so long from ATI’s releases. Unless the drivers are vastly improved within the next month or two, it’s hard to justify the new cards when I’m able to buy a Crossfire setup that’ll beat the 480 for not much more.
    I’m giddy for games to resemble your vision. But it’s going to take a while for developers to jump on your dreamboat. Until then, I’m tempted to go for raw performance at a good price.

  • John Mann

I have a feeling we will never see a 512 part this generation. Kyle has confirmed you guys canned a B1 run; you are insane for not doing it. I also seriously doubt the 480 will have a very long life span, the 470 only slightly longer.
All I can say is the midrange stuff had better make up for this massive screw-up.

  • Drew Henry

    Thanks. We try. 🙂

  • blackpawn

I’ve been eagerly awaiting this card and am super excited! CUDA is great, and I can’t wait to see what performance gains the Fermi architecture gives. Benchmarks on existing games may be a bit misleading, because this architecture should unlock whole new algorithms and techniques. Hooray for innovation! And if this part runs hot enough, I can turn down my heater. 😀

  • Ryan Bridgeman

Compared to the GTX 280, when the fan is running at full speed in a game like Crysis or BFBC2, what’s the dB difference? That’s not to say the 280 is loud. Plus, when it’s in the case, the sound is dampened a bit as well. Thoughts?

  • Sven Martienus

I currently have a GTX 285 (i7, 6GB, SSD, etc.) and have been a long-time Nvidia fan (since the GeForce2 GTS); I only went Red when Nvidia released its 5xxx series 🙁
From what I’ve seen so far, the DX11 performance is impressive, but on “older” games it’s somewhat meh…
With Crysis and Warhead I thought it would fly, crush, nuke the s**t out of ATI, but it’s actually slower by a frame!
Same goes for BF:BC2 and the Stalker series; it’s as if there are driver issues, or that’s just the best it can do.
Heat, on the other hand, is a big issue for me. I live in Holland, where it can get quite hot during the summer, and I’m using a Corsair 800D, which isn’t known for its amazing airflow.
So I’m trying to come up with a solution for better airflow. Even if you guys designed the card to run at 96°C, in my case that will be 105°C, and in the summer it will hit 115°C, I’m sure of it.
I wanted to SLI the GTX 480, but from the reviews I’ve seen that’s not an option anymore. Maybe I’ll try to water-cool it, but that’s new to me, and it will cost me around €300-400 to do it right.
So, with the card included, I’d need to spend up to €900 just to get the card to run cooler >.< I’ve recently upgraded my rig and already spent €2200… Really don’t know what to do now… 🙁
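The temperature worry in this comment is simple arithmetic. As a rough sketch, using the commenter's own assumed deltas (these are guesses from the comment, not measured data or NVIDIA specifications):

```python
# Back-of-envelope GPU temperature estimate. Every number here is an
# assumption taken from the comment above, not a specification.
DESIGN_TEMP_C = 96    # load temperature quoted in reviews (assumed)
CASE_DELTA_C = 9      # penalty for a poorly ventilated case (assumed)
SUMMER_DELTA_C = 10   # extra ambient heat in a hot summer (assumed)

in_case_temp = DESIGN_TEMP_C + CASE_DELTA_C    # temp inside the case
summer_temp = in_case_temp + SUMMER_DELTA_C    # worst case in summer
print(in_case_temp, summer_temp)               # 105 115
```

Whether those deltas are realistic depends entirely on the specific case and climate; the point is only that small additive penalties stack up quickly on a part that already runs near its limit.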

  • Mauro

Hello Drew, it’s really a nice thing that you spend some time answering fans’ questions about the new product; that’s a +1 for you and for Nvidia, of course.
I won’t talk about the temperature and power issues; they’ve already been discussed, and in my opinion they’re a problem but not the most relevant one. The real problem is price.
The question is this: if I’m an enthusiast with lots of money to spend on gaming, on a three-monitor system with 3D goggles, the GTX 480 is NO DOUBT my card. But if I’m a normal gamer with €300-400 to spend on a video card (and I think that’s a lot of money for something you replace every two years or less), and I use, like 99% of ALL PC gamers, just one screen, why should I buy the GTX 480/470 instead of ATI’s 5970/5870, which cost the same but deliver more gaming performance?
For the price of a 470 today you can buy the 5870, and for just €20 more than the 480 you can have ATI’s dual-GPU 5970, and both ATI cards are faster than the Nvidia ones, which of course have a lot of interesting technologies, but technologies that in my opinion matter only to that small proportion of gamers called enthusiasts.
I think Nvidia has put a lot of effort into very, very interesting tech like 3D Surround, which is GREAT, but I think you’ve forgotten about the average gamer, that vast majority that does not have the money to buy three 100 Hz monitors, two GTX 480s (because you can attach only two monitors to each card, not three), and an 800W+ PSU.
Right now, that vast majority is choosing the ATI 5850 or 5870, and I can’t see why they should buy a GTX 470, which costs the same as the 5870 but has less gaming performance.
Sorry for the long post, and congrats on your job 🙂

  • Cagan

Southeast, Mersin. How did you end up in the States?
I’m aware of their good tessellation and compute performance. But they’re only good for scientific applications such as Folding@home and synthetic benchmarks like Unigine Heaven. In today’s DX11 games with tessellation enabled, there’s nothing mighty about the GTX 480/470. It’s a clever thing (in a positive way; it’s a compliment to Nvidia) to have the CUDA cores do the tessellation work, but I guess it loses efficiency when the CUDA cores are needed for other tasks.

  • Hanif Roshan

I’ve been Nvidia blood for the past 13 years. I still have all the GPUs going back to the Riva 128. Love Nvidia! Getting my 480 on the 14th of this month! I hope you guys release better drivers; I need Crysis to be above 30 fps at any cost!

  • Amorphous

I think you’ll sell yourself short if you just jump at whatever has great performance on today’s titles.
The GTX 480 is built around DirectX 11’s main new feature: tessellation. It’s a mixed bag against ATI’s solutions in today’s titles, but are you really buying a high-end, next-gen GPU to play yesterday’s news? Or do you want to crank up tomorrow’s titles?
This slide tells the story:
The GTX 480 is 2.03x to 6.5x faster than the HD 5870 at DX11 tessellation.

  • Cheshyr

What is the recommended power supply size for running the various SLI configurations of the GTX 400 series?
I’m incredibly excited about the GTX 400 series products, but I’m having power supply concerns. Maybe I’m ignorant of how this works, but I thought there was a limit on how much power a video card is allowed to draw… something like 300W per card. Tom’s Hardware is posting numbers well in excess of 400W per card. While part of me questions the accuracy of their numbers, the other part is more concerned with practicalities… I just bought an 850W power supply. After seeing these numbers, it sounds like that’s not big enough to run even 2-way SLI 480s. It also makes it sound like there’s no power supply big enough to run a 3-way SLI GTX 480 rig. Are these numbers being misrepresented, am I misinterpreting them, or is it a little of both?
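The power-budget arithmetic behind this comment can be sketched out. The wattages below are illustrative assumptions for the sake of the example (a per-card figure in the 250 W range and a flat allowance for the rest of the system), not official specifications, and the 80% loading rule is just one common rule of thumb:

```python
def system_draw_watts(num_gpus, gpu_watts=250, rest_of_system_watts=150):
    """Worst-case sustained draw for the whole system (assumed figures)."""
    return num_gpus * gpu_watts + rest_of_system_watts

def psu_has_headroom(psu_rating_watts, num_gpus, max_load_fraction=0.8):
    """True if the estimated draw stays under a chosen fraction
    (here 80%) of the PSU's rated capacity."""
    return system_draw_watts(num_gpus) <= psu_rating_watts * max_load_fraction

# An 850 W unit against hypothetical 2-way and 3-way SLI loads:
print(system_draw_watts(2), psu_has_headroom(850, 2))  # 650 True
print(system_draw_watts(3), psu_has_headroom(850, 3))  # 900 False
```

Plug in different per-card numbers (e.g. the 400 W figures from the review in question) and the verdict flips, which is exactly why the source of the measurement matters; Drew's reply below this thread addresses that point.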

  • Johannes

    I’ll pre-order my GTX 480 today, so I have a question regarding the cooling of the card in my case:
As far as I understand, the fan of the card sucks in air from the side and blows it out at the front (the I/O panel). In my case, I have a fan pointed directly at the back end of the card. Should it push fresh air toward the back of the GTX or pull hot air away from it? Which will be more effective?
    I also plan to mount an additional fan at the side of my case, blowing fresh air at the heatsink of the card.

  • Mike

    Sorry, I’m not convinced Nvidia’s for gamers anymore.
First, they blocked PPU support in the latest CUDA, and blocked PhysX if a non-Nvidia card was present in the system, under false excuses (if you work around the block, it runs perfectly fine, so there’s no technical issue; hence Nvidia is lying, big time). Then they released a driver that fried their own cards. Well done on testing…
Now they’re releasing a supposedly godly card that barely beats the PREVIOUS generation from their competition… and they’re celebrating it as a big event? Oh, look how good we are; it took us this long to build something that beats our competitor’s FORMER best card, while their current flagship still beats ours by far! I’m not sure what Nvidia is so happy about. For the past couple of years they seem to have made mistake after mistake, which is why I went back to ATI a couple of months ago, when I realised Nvidia was far behind, and happy about it. Time to wake up, boys. Maybe it’s time for Nvidia to retire from graphics cards. Stick to PhysX add-on cards and motherboard chipsets. Actually, no, don’t do motherboard chipsets; I only have bad memories of those too.

  • okeefe58

I for one applaud you, Nvidia, for leading the way in technological advancement. The software companies will catch up, and it will be amazing, THANKS TO YOU. I find it extraordinary that this much technology can be had at these prices at all (your chip alone = four i7s). Keep doing exactly what you’re doing.

  • VWXYZadam

Hi RaymanDK
Or rather, hi RaymanDK :b Good to meet other Danes here on teh interwebz!
But let’s keep it English for the sake of others ^^
I’ve been looking around, and it seems the major outlets in Denmark first get the cards in stock May 1 🙁

  • Moopy

Both ATI and Nvidia have great products, but the main and most important thing that sets Nvidia apart is the drivers. ATI drivers are buggy, especially on 64-bit OSes, and I would never again waste $500 on a product that, although it has great hardware capabilities, creates blue screens and Windows Aero crashes, with warning bubbles constantly popping up and interrupting your gaming. And I refuse to go to a 32-bit OS just to have the latest ATI product, which may or may not be the better one at the time.
Thank you, Nvidia, for your constant customer support; service makes up a product’s worth just as much as its performance. Keep it up!

  • Klaus Schuster

Dear Nvidia, in my history of PC gaming I have had only Nvidia cards: GF256, GTS2, GF4 Ti 4200, GF 6800 GT, GF 7800 GT, and now a GTS 250.
For the immediate future, I want to buy GTX 480 SLI with the GTS 250 for PhysX, for great 3D Vision performance on multiple screens.
Bring out the 512-CUDA-core GF100 and I’ll buy it immediately hehehehe.
The GF100 is bigger, hotter, and a great watt-eater, but it is the fastest single GPU in the world; performance is close to the 5970 at 1080p, and in minimum fps (the frame rate never drops below 30) it is the best. Great work!
The HD 5xxx series??? Yes, they’re fun cards, but if you want to play Metro 2033 with tessellation on, you simply can’t do it… DX11 and tessellation run best on the Nvidia GF100 architecture.
For the future, cut down the temperature with a new copper cooling system ^^

  • neliz

    Time to crank up the Heat eh?

  • Alistair

I’m not fussed about the heat, but, at least in the UK, nVidia’s cards have shown no outstanding performance increase over competing cards in reviews and are priced ‘competitively’, meaning they’re no better or worse a deal on price vs. performance. BUT the increased fan noise and power consumption are clearly going to sway buyers towards the ATI cards (which now have six-way multi-screen support).
nVidia, drop the price of the GTX 480 by £50 and the GTX 470 by £30 and you’ve got yourselves a no-brainer; right now you’re going to lose a lot of sales over a couple of pounds.

  • fan

Max: “It’s funny that I was a little disappointed at first because many DX9-10 games were tested in those reviews and the card wasn’t much faster in those games, but then in DX11, it does a very good job, which is worth my wait for a DX11 card.” The GF100 is not much faster in DX11 than in the same games in DX10; in percentage terms it is mostly the same difference. Overall there is no DX11 advantage (in real-world situations) that isn’t there in DX9/DX10 too.

  • kadir

Efe Selcuk: you say that the new technologies inside Fermi will show themselves in future applications when they support them. Just think about that: why would we spend $780 (tax included, in Turkey) on a video card that right now is inferior to cheaper counterparts, and then wait a year or two for applications that will really take advantage of tessellation, etc.? Besides, a new generation of video cards will most probably arrive while we wait. In computer technology, the future is too long a time to wait for. I don’t need what I won’t use, and I won’t spend my hard-earned money on a dream.

  • Michael

I’d like you to tell me how you figure Unigine Heaven is a synthetic benchmark.
Synthetic benchmarks are benchmarks like SiSoft Sandra, WEI, etc. Heaven shows a REAL game engine.
Not only that, Heaven is VERY FORGIVING to GPUs. Remember that when Unigine is implemented in an actual game, you have to think of all the constant hard-disk activity and all the AI processing that has to be done by the CPU… I can’t comprehend why you think it’s a synthetic benchmark.
I’m not a huge fan of Fermi, but it can do a lot more than what you brand as “scientific” or “synthetic benchmark” applications. It can play most games with relatively decent performance (with the added downside of heat, noise, and power).
Credit where credit is due. It’s a good card. But flawed.
I must give KUDOS to nVidia for bringing such a rich feature set to this new GPU, congratulations! For enthusiasts, I find this card attractive; however, for gamers (and deluded people plagued by fanboyism) this card probably won’t be as alluring.
That said, I’m not an nVidia fanboy. I was, but recent activity by both nVidia and ATI has caused me not to be biased towards either. My next GPU will be a 5970 (with my current GTX 285 as a PhysX card).

  • drakishar

I have had Nvidia cards since the TNT2. That was in early ’99 or 2000.
Anyway, in the last 10 years I used only one ATI card, a Radeon 9800 Pro, when Nvidia launched the dust-buster GeForce 5800. Held the Radeon for a year or so, then jumped to a 6800 GT, then an 8800 GTS.
Now I have to upgrade.
I also don’t want to spend more than $300 to be able to play games at 1920x1080 with 2xAA and such.
I was waiting for Nvidia’s 4xx series in the hope that the 470 would be $300 in stores. However, seeing the reviews and the temps & noise a 470 creates, it’s ……. I don’t feel comfortable when my PC generates noise close to my 3.2 FSI 300-horsepower Audi engine.
Regarding the drivers, to be honest I expect a redo of the Aquamark fiasco drivers from back in 2003-2004; you know very well what Nvidia did back then just to stay on top of the Radeons.
So let’s recap:
I have an 8800 GTS and I want to upgrade. I am 100% sure that in the next 2 years fewer than 10% of major non-shooter game titles will use tessellation.
– max budget: $300 USD with VAT
– what options do I have?
– a last-generation GPU
– well, only ATI has a last-generation card in this price range
Why do you force me to buy ATI?

  • Sotos

The GTX 480 is a FAIL !!
a) Power consumption is high.
b) Heat is pretty bad.
c) Noise is bad as well; the 480 sounds like a jet engine under load.
Also, the competition (ATI 5970) still has a faster product.
Sorry, Nvidia, but your solution reminds me of the old, bad FX 5800 days…

  • Robert Johnson

I wouldn’t mind the heat if your fan ran quietly. The design of the fan solution is absolutely terrible. It is too loud and reminds me of the FX 5800 disaster. Scrap the current cooling fan and find something far quieter.

  • Datsun

I believe GF100 is a great GPU. I did not see anything wrong with its architecture. Unfortunately, the process node on which GF100 was fabricated did not meet expectations. I think nVIDIA should try GLOBALFOUNDRIES for its future GPUs.
The NVIDIA Fermi architecture has real potential for general-purpose GPU-based accelerators. Whatever the main CPU’s performance, I believe that paired with a customized chipset from nVIDIA it could boost many HPC applications. Unfortunately, since Drew Henry has said that he does not believe AMD’s processors will be competitive and that their market share will decline forever, nVIDIA will not develop its own chipset for the AMD Opteron, especially the newest 6100 series. I think Drew Henry does not appreciate how many variables there are in the computing-solutions market; one of them is the chipset. Since nVIDIA has made single-chip server chipsets for the AMD Opteron before, I think any improvement over AMD’s current vanilla chipsets would be welcomed by everyone, especially OEMs and users.

  • Andreas

Yes, PLEASE come to Denmark as fast as possible! xD We always get things late :-/ … Hope it’s coming before my birthday, April 30, 2010 🙂

  • Flanker

Been playing since the C64, through the 486s, Pentiums, etc., with 3D cards from Voodoo and Glide to OpenGL and DirectX. Since then only two seriously credible corporations have remained, ATI and nVidia. Their competition sure makes the industry leap forward, but at what price? In some games the user of the “red” card suffers because the game carries the “green” gaming slogan, and vice versa, which causes a lot of problems at times. Why this artificial blocking of the competition? Does it serve gamers? Hardly.
As a gamer I want to play all my games without wondering whether a title is branded for the “green” or “red” team and whether I might have problems using my existing card. No! I buy a game with my hard-earned dineros, go home to install it, and want to PLAY! Not take part in the competition between these two companies. And it seems this situation will only get worse in the future.
Both companies have released great products that suit gaming more than adequately now and in the near future. From my point of view, the HD 5870 currently in my rig beats the crap out of the GTX 480, at least when I look at the money I would have to invest to get the new card working versus the performance gain over my current card: the card itself (€500 or even more), a new PSU (€200 for a good one), possibly a new, better-ventilated case (€150-200), and for sure a better cooling solution (€100 or more) for the card.
Now let’s see what I had to change on my current rig to install the ATI back in November: nothing! I just bought the card for €375 and slammed it in. It has been running flawlessly since, cool and quiet, with more than adequate performance in the games I play: flight simulators and the occasional RTS or FPS. So quite a difference there when looking at the overall picture. No mumbo jumbo or technical gimmick convinces me to go the “green” way after this. I am an enthusiast player, but I’m not made of money, and my parents aren’t funding me anymore 😉
Long rant, but hey, everyone is entitled to their opinion. Both companies provide a lot of power to a gamer, but the mileage varies on how efficiently it is actually delivered 😉
Happy Easter, all…

  • Drew Henry

    The Tom’s Hardware charts you are looking at are total system power (not GPU power) running an app called Furmark. No one that builds HW likes this app because it’s completely unrealistic for how anyone uses a PC. You’ll see comments about this in their review. All our GPUs meet the power requirements for PCI Express.
    Power supplies vary in performance, which is why we have SLI certified supplies that we have tested ourselves. There is a lot of crap out there. We post the supplies we test to SLIzone. GTX 480 certified supplies will be listed soon.

  • Drew Henry

    Thanks Moopy. They’ll appreciate hearing this from you. Our driver team is the best.

  • Drew Henry

    Metro 2033 is a great game, Klaus. It’s full of next gen effects that make it so much better than playing on a console. Enjoy.

  • Drew Henry

    I have to disagree here. There are games that have a little DX11 sprinkled into what is really a console port, and there are games that are embracing it. We rock on the games that embrace it (like Metro 2033). Also check out what the guys at Bitsquid are doing (former GRIN guys who did Terminator: Salvation). Their Stone Giant demo also shows what their next title will look like. Amazing.

  • Drew Henry

    The GF100 architecture powers all our new GPUs. We have more coming at all price points throughout the remainder of this year.

  • N0_R3M0RS3

    How mature were the drivers that were released with the review cards is what I want to know. Were they beta? Were they designed specifically FOR the Fermi architecture, or did they just support the Fermi cards?
    I think the 480 is a step forward for gaming, personally. For too long we’ve been stuck in the DX9 quagmire imposed by the consoles; it’s time we REALLY pushed the envelope of what gaming can, and should, be like. In my opinion the 10-15% performance increase over the 5870 in already-released games is just a bonus; the real value in the cards is for the games and applications of tomorrow, as more and more of them take far better advantage of the parallel processing power of GPUs. ATI’s cards, while not horrible in the slightest (the 5870 is still a beast, no matter which way you look at it), just don’t support the true heart of DX11 as well as the Fermi cards do. The only gamble is whether or not the devs out there can actually separate themselves from the consoles long enough to implement things such as tessellation.
    Being a tech-whore, I will be picking up the 480, if for nothing else than that it’s the fastest and therefore I must have it. But if nVidia and developers can work together to really push DX11 features into games properly, then the Fermi cards might just be the best thing to have.

  • Bob

    Crank that sheet up. Just remember, boys and girls, it’s the other company that has flickering in BFBC2.
    I can tell you first-hand it isn’t happening here.
    Why? Because we are the green team. We are going to crank that sheet up. Your KDR will be even worse. So you will either wise up and jump ship, or cry even more than you already have in the past.
    Because drivers will not save you. It didn’t work yesterday (BF2) and it’s not working today (BFBC2).
    You flicker, you’re mine…

  • Drew Henry

    I played BC2 in stereo (3DVision) at the GeForce LAN at PAX. It was great. Got my a$$ kicked by EA’s BC2 product manager, though. 🙂

  • wehan vd westhuizen

    When will the GTX 400 series be in South Africa?

  • Drew Henry

    I can’t say exactly since our partners bring it in, but we are shipping to all our board partners in the coming days, so hopefully very soon.

  • ceebitt

    I also have to manage my budget, like drakishar, because if I decide to do a graphics upgrade I do it at a specific time for more than one PC.
    So is there any list available which shows the release dates of the various cards, including the technical specifications, just for pre-planning?
    So anyway, GF100… Crank that shit up!!! (if it’s not getting so hot) :-))

  • Arthur

    I hope that the GTX485 is going to be better.
    The whole Internet says that Fermi is bad:
    high-priced and hot. But I hope that they will fix it with the GTX485 like they did with the 280 -> 285.
    I hope the GTX285 will have: 512 SPs, lower heat, and less power consumption.

  • Arthur

    Oops, I meant the GTX485. Sorry, my bad.

  • atik kurniawan

    Great product nonetheless. Nvidia just has to encourage developers to implement their distinctive features in DX11 games. And the heat: lower it, please, just to please those noisy customers.

  • Arthur

    I can’t wait for Fermi2.

  • Sykologic

    Just wanted to say that I really think the wait for Fermi that so many people had to go through was not really worth it. I myself was planning on getting a 5770 until I realized that I don’t need that much power for my crappy screens right now. I decided to order an MSI 1GB GTS250 and a new power supply. I hope it lasts a while, especially since my Evga 7800GT can still get “ok” framerates in BC2. But my point is, why did people have to wait roughly half a year for something overpriced, overheated, and with only 5-10% more performance than the competition? I know you have stated that the GTX 4xx series is meant to run at these high temperatures, but many enthusiasts care deeply about their case temps and power bills. Nevertheless, good job on the world’s fastest GPU. Hopefully the temperatures won’t stop you from making a dual-GPU card. Otherwise, we all know you will be moving on to a 28nm process soon enough =)

  • babygogo

    I have previously owned 8800 GTS SLI, 8800 Ultra tri-SLI, and tri-SLI GTX 280 setups, and was always fond of the performance advantage over the competition and the driver support. I like Nvidia’s performance-oriented products, and every time I go for water cooling, so heat and noise are not an issue. I know I represent a minority of enthusiasts here, but I know Nvidia focuses a lot on enthusiasts even though we are outnumbered.
    My question is: my understanding is that the 512 CUDA cores were cut down by 1 cluster to 480 (due to yield and heat issues, I imagine). Has this been done by laser-cutting them off or via a BIOS patch? In the case of the latter, could an enthusiast who is able to cool the card properly enable the disabled cluster for extra performance?
    Thank you for hearing me out.

  • gsg

    Fermi is a new FX 5800…
    Epic fail

  • cyrix133

    Fermi is the BEST !!!!!!!!

  • John Mann

    What, Nvidia? Couldn’t take the heat/noise of my post so you removed all record of it? Very funny.
    Someone should not have to invest in a dedicated AC unit or a noise-dampening case or materials, EVER! You need to do a B1 stepping to improve yields and get the heat and power under control. The 480 IS a salvage part.

  • john mann

    Dude, I would freaking hope the 5970 is faster in most cases, IT IS A FREAKING DUAL-GPU CARD! Why do you insist on comparing it to a single-GPU card? That is stupid and nuts.

  • John Mann

    Not hardly. Unlike the FX5800, Fermi beats its intended single-GPU counterparts from ATI. The only things it has in common with the FX5800 are noise, heat, and lateness. Fermi DOES beat the 5870 and 5850. The FX5800 could barely beat a 9500 Pro.

  • NVLover

    After the reviews, many gamers cancelled their preorders; in my store there were 20 preorders, now we have only 6. The new GTX 480/470 are great, but the price? Cut prices by 50 bucks at least on the 480. Oh, and one more thing: the future of gaming = consoles, sorry mates… you’d better make some elegant graphics chip for the PS4, but not Fermi, not this power-hungry and hot FAILURE. Oh, and one more thing!!
    Intel should buy Nvidia… and fire the team behind Fermi…

  • AffricaKing

    It is too hot in Africa for the new NVidia GTX series! Sorry, mate!

  • EasterBunnyPissed

    I think there will be NO Easter bonus for NV employees.

  • Someone

    “The chip is designed to run at high temperature so there is no effect on quality or longevity.”
    Wow. You just keep telling that to yourself, fanboys. What about the PCB and all the components on it? What about Fermi running just shy of the 105C throttling point under load? A bit of dust and a hot summer day and you’re going to be cursing.
    You WILL see a bunch of dead, warrantied 480s within a year. Guaranteed.

  • Someone

    Enjoy your monstrously hot leafblower, fanboy.
    You probably bought a 5700 Ultra and actually believed it was better than a Radeon 9800.

  • Someone

    I’m sure those 480 SLI certified power supplies will run on plutonium.

  • Someone

    Yeah, because Nvidia didn’t release a driver that literally fried a bunch of cards by screwing up fan control!
    Oh, wait…

  • Matt Leshman

    Recognizing that I am a fan of Nvidia, and always will be, I was eagerly awaiting the release of Fermi.
    But at launch, my friends (formerly Nvidia fanboys, now ATi users) tried to pull down my expectations. They said that while benchmarks often show its superior performance over the ATi cards, it was doomed by its excessive power use and heat. I was told Fermi would burn out because of its high temperatures.
    Do I still believe in nVIDIA?
    Do I still believe in Fermi?
    I’m a gamer, but I also work with video editing and motion graphics. And I know that CUDA can give me everything I need, not just to play but also to work.
    But what about the life of the board?
    Does it die playing? Die working?
    I want to believe that Fermi will not fail.
    I want to believe in nVIDIA.

  • Manny

    This decision to open yourselves up to untold criticism and bashing is a brave one indeed! I don’t think in all my years as a gamer I have seen SO much flaming and trolling on the various gamer forums as I have just before, and especially AFTER, a GPU launch.
    My wife and I are also stockholders and gamers. She is using a GTS640 from eVGA and I a BFG GTX260/216 Maxcore 55.
    As long-time stockholders we have, over the last two years, been somewhat disappointed in NVDA’s financial performance. We bought in several times, increasing our stake in the company.
    My personal, fairly objective take as it all stands right now (after reading all the reviews from nearly a dozen and a half+ sites, the many, many leaks and hundreds of often sordid rumours, the many threads on B3D over the last 6+ months, XS, [H], Anand, Techreport, xBIT, Guru3D, etc., etc.)…
    is that AMD/ATI has this round in the bag. Unless NVidia’s software engineers can pull the proverbial rabbit out of the hat, this soft launch/paper launch takes me back to the R600 days. This is not, as some have made it out to be, your second FX debacle, but it reeks of the infamous R600: very late, very hot running, and only acceptable performance against the ‘enemy’, more so when one thinks about the MASSIVE difference in die sizes and performance per mm2 versus the RV870. Or worse yet the price: from anecdotal info all around the web, on many a gaming site in the last week, most seem to think the GTX480 is too pricey versus the 5870 and its many derivatives. One other thing I worry about is rumours of NVDA repeating the many damning steps that 3dfx took before ultimately committing hara-kiri, like alienating AIBs.
    Lots of rumours abound about BFG and XFX not being happy campers.
    I am very curious about the TDP as officially stated… Why? Well, many a review and website are getting widely varied numbers with the 480. We see Guru3D with seemingly fine thermals and noise, and Kyle with jet-engine-type noises, and wide variances in temp readings all over. As we both use multiple LCDs (24" and above), we have both read reports of 90C temps in 2D mode? Is this true? If so, will this be addressed in hopefully very-near-future drivers? Kyle said he did not run into this issue, but others have.
    One thing I really want to know about, never mind the lasered-off 32 CUDA cores in the 480, is the not-often-spoken-of but tantalizing rumour of the extra (64!) missing TMUs… More than a few with an inside track on these things say they are there on Fermi and also ‘turned off’… because of heat? I can only guess yes.
    I feel that this chip was overly aggressive for 40nm, especially on a node that has been messed up as badly as TSMC’s has been over the last year+.
    I mean, 3.2 billion transistors?! Too bad TSMC abandoned 32nm. Is it 3 billion or 3.2 billion? Kyle’s GPU-Z shots show it as 3.2 billion.
    All that said, we are in for two eVGA 480GTXs come May. Sad that 28nm is SOOO far off.
    We use all of our systems to FOLD as well as game, so we are both looking forward to seeing how many more PPDs we can crank out with Fermi under the hood. As for heat, we both watercool, so it’s not a biggie, but something we hope to see addressed soon.
    Here’s to hoping NVDA’s engineers are holding up under the strain. I can only imagine the incredible amount of blood, sweat and tears that went into this launch.
    PS: I am also hoping to get some candid answers on my above comments and questions if at all possible, Drew, and hope the weekend treats you and yours well 🙂

  • Doug

    You say these cards are built to run at high temps. But silicon is silicon, so the same critical temp of 105 degrees still applies. That’s not a lot of ambient-temperature headroom.


    Efe, I hear and sympathize with you; however, I would like to bring something to your attention. To wait until later this year is not so bad when you think of what’s at stake. If nVidia did not put these sweet ahead-of-the-curve features in Fermi, then none of the developers would design games to take advantage of tessellation/ray tracing/3D stereoscopic/etc. So it’s a question of which came first, the chicken or the egg? nVidia has taken on the responsibility of being the leader in advancing the videogame experience. Without nVidia investing this much “Vision” we would be stuck at DX9/10-quality games in a frame-rate competition. I support nVidia because they support the future. But I love AMD as well. So for $100 more, and about 9 months to give developers a good chance to catch up, I’m happy to support them.

  • Petrus Laine

    So how come the card, which you list with a 250W TDP, is consuming around 30-40W more power (both in games and in apps like Furmark) than a card with a 294W TDP from the competitor, especially when it’s known that certain applications can take that 294W-TDP card to near or even a bit over its TDP?
    In fact, a site measuring only the consumption of the card itself lists the GTX480 consuming 320W at max, that’s 70W over your listed TDP, and the application isn’t even run at its heaviest settings in that test.

  • Dương Nguyễn

    After reading all the stuff above, the comments and replies, and also going around some computer stores near my location (PCCG, CPL, Scorptec), I’ve got some questions in my mind:
    You guys said that the GTX 470 (and the 475 in the future) is a combination of price/performance. But when I read some reviews from Guru3D, AnandTech and others, I found that if I bought one for a new rig, I’d just be throwing another $100 away for the same performance, rather than buying the 5850.
    Also, for gamers with a tight-arse budget ($300 or less for the VGA card), what will Nvidia have? GTX 460 and GTS 450, maybe? But don’t tell me it’s “the way it’s meant to be renamed”, like the GTS 250 (a 9800GTX+, no more). The chance of the GTX 285/275 being renamed and sold as the GTX 460/450 seems, in my opinion, a bit high.
    It seems the GTX 400 series is designed for servers/workstations with a really loose budget, not for gamers like me 😛
    I’ve been a customer of Nvidia for over 9 years, since I got my first rig with a Riva TNT to play CS and all the games of that time. And all I got back for trusting Nvidia seems a bit… disappointing.
    P.S.: sorry for my English if there are any grammar errors.
    Edit: paid a visit to PCCG today and they said that the GTX 470 is $499, while the HIS 5870 with aftermarket cooling is $509. Phoney.

  • patrick

    Built them for us? We are just gamers, and you built us a card that’s designed for the HPC and professional markets.
    Fast tessellation? … Look at Metro 2033: in Full HD at max settings, that game is not playable with one GTX 480! Is this what gamers want? Building SLI setups with huge power consumption? I don’t think so.
    What is NVIDIA’s strategy, buying games with CUDA and PhysX effects? This is wrong. We just want a superior, silent card with a low price, low power consumption and fast performance. ATI did just that, and I’m going to buy a Radeon… for the first time in my life.

  • darmar

    What about temps in a dual-monitor configuration???
    Almost 90°C at idle… no excuse for that…

  • Dan

    Hey, are there any chances of a GTX 460 being released?

  • Dan

    “that barely beats the PREVIOUS generation of their competition”
    The 5xxx is the current, not the previous, generation of ATI.
    Also, the GTX 4xx is made for future DX11 games (tessellation). Once game developers start to implement those technologies, ATi won’t stand a chance, as their DX11/tessellation performance is weak compared to the GTX 4xx; see Metro 2033.
    Also, comparing the single-GPU 480 to ATi’s flagship, a DUAL-GPU card, is pathetic.
    We will see the outcome once nvidia pulls out their dual-GPU 4xx card.
    Also, you’re simply **** when you say nvidia was behind in past years. Have you been living under a rock?
    The GTX 200 series was FAR superior to ATi’s 4xxx.

  • Klaus Schuster

    Yes, we are gamers. I am a gamer and I want to play my favorite games with all their features.
    If you want to play Metro 2033 with tessellation, PhysX and MSAA 4x, you simply can’t on ATI GPUs (at 1920x1200 the HD5970’s framerate drops under 3 fps average, while the GTX480 gets 25 fps average). ATI had six months of advantage, and one of the first DX11 games doesn’t run with the new features. It’s incredible, but it’s true. Look for benchmarks around the net and see.
    In Metro 2033 at 1680x1050 with MSAA 4x, tessellation on and AF 16x, the GTX 480 provides 35 fps while the HD 5870 provides 18 fps. In BFBC2 the minimum fps for the Radeon is 10 vs. 30 for the GTX 480; in AvP DX11 the minimum fps for the Radeon is 35 vs. 52 for the GTX480. Radeons are eco, but the fps drops significantly and games lag or become unplayable. Why should I buy a GPU that doesn’t work with the new features or games? The answer is simple: buy Nvidia GPUs and you get more features and more power with the new technology.
    DX10/9 games run well on old cards, and the new ones are the same. They don’t count.

  • patrick

    Look at the TechReport test:
    in Full HD at max settings the GTX 480 gets 22 FPS. Is this playable?
    I don’t care about PhysX and CUDA; these are just proprietary platforms… irrelevant features for me.

  • patrick

    What 90°C? 😮 Can I get a link for that?

  • Ray Tracey

    Design Garage is a fantastic tech demo! Realtime ray tracing at high resolution is finally a reality, and I want to say kudos to Nvidia and the OptiX people for making such forward-looking technology.

  • no thanks

    Frankly, the compute performance is disappointing, since full double-precision support has been deliberately degraded; GF100 should run compute on doubles at 1/2 the throughput of singles, but instead it is far lower.
    Differentiating the part based on memory size and support for ECC was expected for the Tesla products. However, losing the full double-precision throughput is a disappointment for development and hobbyist purposes. The chips obviously have it and it has been fused off, so it doesn’t save anything for the gamers (not even energy usage, since they’d not be using it). If one is worried about gamers, one would have to wonder about the memory hierarchy, which they are definitely paying for, and I imagine not getting the benefit of (unless they are into ray tracing).
    Oh, and since the last generation of Tesla parts was based on the Tesla architecture, the naming made some sense. New Tesla parts based on Fermi are getting a bit silly.

  • John Mann

    Try looking at a site that actually played the game using MAX playable settings.
    1080p is very playable with the 480. CUDA is now a free-to-use platform. If AMD/ATI wanted to, they could enlist Nvidia to help them write a CUDA code base that would work on the Radeons. However, if ATI wanted to use PhysX, they WOULD have to pay a fee. See, the difference is CUDA is free; PhysX needs a license.

  • Cherub

    For me, Nvidia missed the mark with this generation.
    Yes, the GTX 480 is the fastest single GPU on the market, but the package doesn’t add up when you look at the heat, noise and power consumption. Even the 470 is quite bad in this regard.
    I waited for Nvidia’s answer to the HD5 series, but now that it’s out I’m going to skip this generation entirely; the performance increase ranges from underwhelming to OK against the competitor’s cards. Especially in DX11 games with high resolutions, tessellation, AA and all details cranked up, neither ATI nor Nvidia can provide playable framerates (>30 min fps).

  • Cherub

    Also, what I forgot to say:
    multi-GPU sucks on both vendors.
    Don’t praise it as if it were the answer to all questions just because your bar on the graphs looks so big.
    What ATI and Nvidia don’t tell you about are the micro-stutters and the input lag, which make the gaming experience worse than on a single-GPU card.
    Yes, Crossfire is worse than SLI, but that doesn’t make SLI any better.

  • psiphor

    A few thoughts…
    1. It’s all well and good saying that these cards are built for the future but let’s face it, unless and until those pesky next-gen consoles catch up with DX11/12 or whatever is present at the time of PS4/Xbox 720 etc., most developers won’t bother developing PC-specific games with all the extra bells and whistles our superior PCs and these GPUs offer. Also, we’ve already seen the recent breed of DX11 GPUs take a fair framerate hit whn running DX11 features in compatible games. The DX11 hardware’s still 1st Gen after all so by the time fully fledged DX11 games are out, these current 2009/10 cards will struggle to run them smoothly. They don’t exactly breeze through the Unigene Heaven 2 benchmark do they.
    2. Raytracing – From the Nvidia demos shown so far, the ‘interactive raytracing’ means fairly static car models to show off a la showroom style. I think we’re a few years off having GPUs with the grunt to push full on raytracing in-game with the framerates we would want.
    3. TSMC and GloblFoundries have both announced they’ve shifted focus to 28nm fab and so I’d expect both Nvidia and AMD to have GPUs built around this tech later this year/early next, so that alone will improve yield per wafer, performace, heat dissipation and power consumption. Hopefully, this will lead to cheaper, faster, cooler GPUs soon. It would be a bit foolish to buy any GPU now knowing these leaner versions are in the pipeline.
    4. I think Nvidia got caught on the back foot when AMD revealed Eyefinity. Sure, 3D Vision Surround is a step up, but to need 2 Nvidia cards to enable 3 monitors seems overkill and certianly over-expensive. Both Nvidia and AMD need to put pressure on manufaturers to produce bezeless monitors (I believe this is possible with LED) as those bars between screens just get in the way, and only a tiny fraction of us can afford/acommodate 3 projectors.
    5. Please would Nvidia start pushing the DX11 love into some RTS games. FPS games get all the attention, but I’d like to see physics, tessellation, advanced AI in different genres. Speak with Relic for example and let’s have Company of Heroes 2 in DX11 glory.
    6. In Nvidia’s experience of working with game developers, are developers starting now to actually develop with multi-core CPUs and massively threaded GPUs in mind? Also, I heard from your September GDC that the Fermi achitecture natively supports C++. Is this something game devs can capitalise on or is that more a feature for the scientific and general programming community?
    7. Any performance figures on Folding@Home for GTX 480/70?
    8. Reviews to date seem to be comparing the GTX 480 with the Radeon HD 5870s, but surely given the price structure of the Green and Red team’s offerings, it would be more apppropriate to compare the GTX 470 with the 5870 being closer in price to each other.

  • Manny

    darmar said…
    [q]what about temps in dual monitor configuration???
    almost 90°C in idle…. no excuse for that…..[/q]
    04/02/2010 at 04:24 AM
    patrick said in reply to darmar…
    [q]What 90°C? 😮 Can I get a link for that?[/q]
    An Nvidia rep has stated on at least two occasions, THIS WEEK, that this (very high temps in dual-monitor mode) will be addressed in the next ForceWare update. Kyle @ [H]ardOCP has said HE did NOT run into this problem with his multi-monitor/SLI setup and his GTX 480s/470s.
    As well, the ONE tester/website that said they had this ‘issue’ was doing something else that he did not mention; look at his VRAM use (987MB+). WTH!??
    He claims he was only browsing/doing basic 2D stuff on the PC, but the data he put forth refutes that, and soundly.
    Kyle has done an UPDATE on the sound profile of single and dual 480GTXs; I suggest reading it, and -listening- -to- -it-. These cards are hardly loud!
    The card in REAL-WORLD use is NO worse than a 260/275/285/295GTX/4890.
    Don’t believe me? WATCH THE VIDEO REVIEW YOURSELVES…:
    The cards are NOT anywhere near as loud as all the ATI bandwagoneers are making them out to be.
    Happy Easter weekend, all; that goes out to both ATI and NVDA fans!

  • Manny

    psiphor said…
    [q]7. Any performance figures on Folding@Home for GTX 480/70?[/q]
    Have you read the AnandTech review?
    The Fermi chips wipe the floor in OpenCL/compute/CUDA/C++/Folding against the GT2xx/RV8xx series chips, and soundly at that.
    Tessellation performance, now and going forward, looks to be incredible.

  • Uzair

    Drew, is there a chance of a GTX 495? Wouldn’t heat be a problem? I respect what you said (at least we get more performance if there is a lot of heat), but I just can’t see a GTX 495 coming out, which is what I’m hoping for. Dual-GPU cards always produce much more heat.
    In a month, I will be getting the Asus G73-JH, which has an HD5870. I would love to wait for nVidia to have a GPU for notebooks, as they have the features of tomorrow’s games (I’ll be keeping the notebook for years to come), but will the mobile GPU produce this much heat?

  • Uzair

    Henry, are the drivers going to solve this problem?

  • Drew Henry

    Yep, this is a bug. Will be fixed shortly.

  • Jebo_4jc

    Mr. Henry –
    Thanks for taking the time to write – and respond – to the community. We hear you loud and clear.
    My thoughts:
    1. First and foremost, I am eagerly waiting for developers to take advantage of this hardware we have! I have an i7 rig with 3x GTX275 because I run Folding@home. If I didn’t run FAH, however, the sad truth is that even a single GTX275 or 285 can max out nearly every game coming out, because many of the engines are designed with consoles in mind. PC devs need to push the envelope more. It would swing some console game purchases over to the PC side if the PC offered a drastically better experience.
    2. The GTX480 and 470 don’t offer enough performance in games to justify their price premium over the Radeon 5800 series options. Price cuts or rebates would help this out a lot.
    3. As a current SLI owner, I am very anxious to see the Surround capability enabled on my GTX275s. Is this coming soon?
    Thanks again.

  • patrick

    Kyle didn’t activate DX11 DOF. 🙂
    There is no reason for AMD/ATI or Intel (in the future) to support CUDA.

  • Betacentury

    Yeah, 450W TDP?

  • Cherub

    I’m sorry, but I don’t think we want to play Unigine Heaven or Stone Giant all day long, even on my tricked-out rig.
    The only game that’s out of the ordinary in terms of performance increase is Metro 2033 and, oh surprise, it’s a TWIMTBP game…
    Though it’s still not playable when you crank everything up.
    Do you really think that game developers will code their engines so they run on the edge of being unplayable with a GTX 480, whereas if you’ve got a 5870 or 470 you’re SOL? I don’t think so; they will take it down a notch or two to target a wide array of cards, not even talking about the 5870 and 470 but further below.

  • Dihapus Hilang

    Damn, I miss my first nvidia card, a GF7200 256MB.
    Ha ha ha.
    I’m waiting for the low-end Fermi 420 or 430;
    I hope the cost will be low too.
    Hi, I hear from my friend that OpenGL 4 can do tessellation on Windows XP.
    Is that right?
    That would mean some DX11 (Win7) features can be done on Windows XP.

  • meltman

    Thank you for every video card in the past. I love your work, from the TNT2 through the GeForce4 Ti to the 6800 and 8800 Ultra, everything. I hear many bad things about your new GTX 470/480 models and don’t know whether to buy one or not! But I will count on you and what you say and buy the 470, because I never ever want to have an ATI card.
    For the future I wish you good luck and hope you can fix many problems and lead the video scene like in the past 🙂

  • tpi2010

    I have had many Nvidia cards over the years in different rigs: I started out with a GeForce 2 MX400, had an FX5200 and a 6200 in a spare computer, a 7600GS in my main one, and now a factory-overclocked 8800GT. I also have several ATI cards. And of all those models, from both brands, only one X700 card from ATI has failed on me. Apart from that, I’ve had no complaints. So I’m not a fanboy of either side; I just look at the market and try to get the best value for money.
    But, truth be told, I’ve grown very skeptical of Nvidia’s behaviour towards gamers. Nvidia clearly stated a few months ago that they are shifting focus a little away from games, which is no good indication, but worse than that, they now have their PR department saying that “We want tessellation perf to rock since we deeply believe that PC games should have the same geometric complexity as movies”.
    Yes, and now that you want that, the industry will finally get it. But ATI has had tessellation support since 2001, so where was Nvidia all this time? And we had to wait for Nvidia.
    Where was Nvidia when DX10.1 came out? Where was Nvidia when Ubisoft removed DX10.1 support from Assassin’s Creed? And again we had to wait for Nvidia, because without Nvidia the gaming studios don’t move forward.
    And where were Nvidia’s ethics when you renamed your 8800GTs to 9800GTs (while some 9800GTs were still 65nm parts and others 55nm), and then to the GTS240? And did the same to the 9800GTX+, which became the GTS250, thereby confusing your customers?
    And now you say a performance card is designed to run hot? And to be power-inefficient? The card idles at twice the power draw of the ATI models, despite dropping its core MHz threefold. Is that by design too? Is the fact that the chip has 512 cores, but you can’t ship it in numbers with that many cores enabled, because you didn’t adapt to the manufacturing process properly? Please. You just missed a perfect opportunity to stay quiet and work on the next generation. Because calling the GTX 480 and GTX 470 next generation, six months late, doesn’t wash. We are already in the next generation, making it the “current generation”, and, unfortunately, Nvidia doesn’t yet have a single customer with one of their “current generation” cards.
    You have a lot of homework to do, Nvidia. I wish you well, not out of any fanboyism for either side, but mainly because you have provided me with countless hours of both work and fun and I appreciate that, and also because competition is good for everyone. That said, I surely won’t do you any favors, as I’m not expecting anybody to do you any favors when you screw up. Especially when you ridiculously pretend everything’s fine. It’s a lesson I hope you learn for the future.
    Take care.

  • Drew Henry

    Hi and thanks for posting. 3 x GTX275 … Cool! My rig is 2x GTX285, so you have me beat. We will definitely support 3D Vision Surround on your rig. Support should come at the same time as for GTX 480/470, which is about a month from now.
    Thanks also for supporting F@H. Please check out the F@H for Stephanie contest. Good cause.
    We’re trying hard with the Devs. I believe we have to give them a platform to build these next gen games on … or they won’t do it and we’ll just keep getting DX9 console ports. Tessellation is key, imo.

  • PeterB

    I don’t trust these cards, so I’m sticking with ATI for this generation. With massive power consumption and massive heat output vs. ATI, I’d rather just steer clear. I don’t believe a chip can be designed to last just as long at a high temperature as at a low temperature. It’s a comforting idea, but it flies in the face of everything I know about silicon manufacture and operation. As we all know, Fermi has been plagued with design issues since the get-go. Now the company tells us it’s been well designed, and we shouldn’t worry?
    I don’t buy it, and I won’t buy it. It’s marketing spin. Better luck next time around, guys.

  • John Mann

    Just curious, but what do you think “ALL In Game Settings” means? We enabled everything but THIS and didn’t mention it? If they say they had all in-game options enabled, then they were enabled, including your precious DoF.

  • John Mann

    Also, they don’t have to support CUDA if they don’t want to, but should they ever decide to do so: IT’S FREE, NO COST!

  • Shin0bi272

    “We wanted to let you know that we’ve also heard your concerns about GTX 480 with respect to power and heat. When you build a high performance GPU like the GTX 480 it will consume a lot of power to enable the performance and features I listed above.”
    Orly? Try checking the power and heat on the AMD 5870, a card that’s a whopping 5-10% slower than yours but uses HALF the power. Care to rephrase your carefully thought out BS statement now? Oh wait, I forgot about PhysX … apparently so did the rest of the gaming community. With Havok cloth in the works and AMD pushing their open source physics engine “Bullet”, you have no excuse to put out a card with these specs. Dump the GPGPU idea and give us more fps at a lower temp. In short: FPS or GTFO!

  • Bachir

    Do we gamers have any hope that the next-gen PlayStation 4 GPU will be designed by Nvidia??? That company deserves 1000000% a new partnership with Sony for the eagerly awaited PS4. i’ve done some recent research on the subject and found something interesting: an rsx2 chip based on the GTX 480 architecture, codenamed “Fermi”.
    So i hope you all share your opinions, because it is worth the discussion.

  • Brian Blessed’s Beard

    Sorry Drew, but all of your points are pointless (not in the eyes of NVIDIA fanboys, maybe; to them, as Joseph Heller so wonderfully put it, NVIDIA has “a thousand points of light”). ATI’s Evergreen refresh, or next-gen if we’re lucky, should beat even a 28nm 512-shader Fermi card in performance (I’d be willing to bet money on it, and no, tessellation-orientated non-game demos do not count) and will probably do so without running dangerously close to shut-down temp (unless AMD/ATI are aiming to lose money), and it will be out this calendar year (unless TSMC run into problems with their 28nm node, and past 40nm issues don’t exactly inspire confidence).
    For me there are simply too few good reasons to buy either a GTX 480 or a 470. Maybe if the 470 were cheaper and had launched 6 months ago. But it isn’t and it didn’t. If I were to buy a new card now, and I may do if I need DX11 compatibility sooner rather than later, I’d get an HD 5850, which is £100 cheaper than a GTX 470, and overclock the living hell out of it.
    ‘Til then my GTX 260, which won’t hit 105C this coming summer, unlike the GTX 480 (and Drew, I’d be willing to bet money on that also), will suffice.

  • epiclulz


  • stefanodisabatino

    FERMI is a fantastic GPU with a C-like language… but TOO HOT, and with fewer CUDA cores active than in the original design… i hope that next time there will be a better, fully functional product :-))!

  • no thanks

    Single versus dual GPU is a red herring, used to paper over the real comparison points. Given a watt budget (e.g. < 300 watts to be a valid PCIe card), a $ budget, etc., which is the best graphics card product? Whether that is 2 chips or 1 chip on the card is completely irrelevant. Nvidia has done, and will do, 2-chip implementations just as ATI has. Multiple cards are yet another point of differentiation, with more tradeoffs (more slots/space/power required, greater cost, etc.). Frankly, it’s a disappointment the GTX 480/470 can’t drive 3 monitors from a single card. Given your juvenile comment it’s perhaps pointless to explain further.

  • maxx

    what about a multi-core card? i hope it’s not far off

  • Momo

    Fabulous card indeed, but I’m sure you’re desperate to get down to 28nm. What we can say is that if this had been released as a 28nm part with all 512 cores enabled (the intended chip), along with digital PWM to regulate everything better, the tech press, no doubt, would have hailed it as another G80/8800GTX-type milestone for Nvidia.

  • UltimateGTR

    I’m waiting for GTX 495 with not less than 2048MB video RAM!

  • Aaron D

    Is nvidia going to release a less expensive, less powerful version of these cards anytime soon? I want a DX11 card, but I don’t need that much power and I would prefer not to buy from ATI again because my current setup is riddled with problems.

  • Drew Henry

    Yes we are. I can’t comment on time frame just yet, but we’ll have new GPUs available at a bunch of different price points. All are based on our new arch designed for next gen games like Metro 2033.
    Thanks for asking.

  • Rochak Gautam

    Hey nvidia Crew!!
    Since the new Fermi is getting rid of the HD 5870, why don’t you work on a dual-GPU card on the Fermi architecture to sweep away “the king of graphics”, the ATI Radeon HD 5970, and take the throne from the king.
    Work on the dual-GPU GTX 495, with 4 gb of graphics memory. Liquid cooling recommended!!!! HAHAHAHAHAHA….LOL!!!! 🙂

  • Ven

    Too bad that 3D Vision doesn’t work as expected in so many games. You mentioned Battlefield: Bad Company 2. It’s pretty much unplayable with 3D on.

  • Andrew Fear

    Hi Ven
    Can you tell me more about 3D Vision not working with Bad Company 2? I play it almost every day with a GTX 285 on a 3D Vision Acer 1920×1080 LCD and it’s very playable for me. What GPU and settings are you using?

  • Bornprouduk

    If you want to drive a Ferrari, you go NVidia.
    If you want to drive a Skoda, then go ATI.
    it’s a poor man’s alternative, just like AMD vs. Intel.
    I spent too many hours years ago trying to make games work with ATI. face facts, the future is 3D NVidia.
    My GTX480 is on order with Scan and I can’t wait.
    forget the heat: get a bigger case and fill it with blue led fans. looks good and stays cool…..
    Drew, great launch, loved all the hype, and i want a “Crank it up” tshirt

  • Matt Leshman

    nVIDIA RULES!!!!!

  • Zimma

    Hey Nvidia!
    I have my 480 on pre-order, a nicely timed release to coincide with the demise of my two 8800 Ultras, which until now have given great performance with pretty much all games!
    I ordered on the day of release, but the negativity surrounding the fermi has given me second thoughts about my order. As I live in the UK, ambient temps won’t be too much of an issue, and I’ve read reviews saying the card is noisy. That doesn’t really bother me, as my rig is gaming orientated; if I want less noise, I’ll get a PS3. One thing I do need though is a reliable card that will stand the test of time. Can you reassure me that this card is not a release made to meet timelines, and is in fact a reliable, well-thought-out card that will not see a newer and better version within a year?

  • Candyman

    As for the GTX495 and the famous TDP issue… you’re forgetting one little detail.
    The GTX 275 had a 220w TDP, and the GTX295 (290w TDP) was 2×275 with clocks of 460. Right?
    So… the GTX 470 has a TDP of 225w… why do you people think that 2×470 underclocked will not be possible?
    And as for people bragging about ati offering more for less…
    Does ATI have CUDA? PhysX? The best drivers in the market?
    TWIMTBP titles which take advantage of specific features of the hardware?
    Hmm…. don’t think so..

  • Zeek The Geek

    Hello Drew, I have a few simple questions to ask. What is the estimated release date of a GTX 480 with 512 cores? Or is Nvidia going to leave the GTX 480 with 480 cores and create a GTX 485 with 512 cores instead? If that’s the case I would see flaws in that, since the GTX 480 is already $50 overpriced compared to ATI’s 5870 by price and performance standards. I know very well that your cards are cheaper to build than ATI’s; since Nvidia cards have fewer stream processors and lower clock speeds than ATi cards, the manufacturing cost is that much less. On top of which, you guys sell your cards at a much larger markup than ATI does. 5-15% more performance doesn’t justify $100 more pricing on a GTX 480, especially since Nvidia could easily create an extreme cash flow by selling its GTX 480s at $450.
    The amount of money spent creating Fermi was certainly mind-boggling. However, Nvidia cannot forget that it will lose out on potential business when it charges nearly 100% or more markup over the manufacturing cost of its cards. ATi has a stunning near-40% markup on its cards, which perform within a marginal tolerance for their price.
    It is my suggestion that the defective products should have started out at $420 to compete with the current 5870s and steal the spotlight, and then, once the 512-stream-processor situation was fixed, the price should change to a reasonable $450. This is an excellent strategy if Nvidia wants to hog the enthusiast spotlight; let ATI have the mainstream cards. Nvidia can’t afford to make a mistake like this; it’ll definitely come back to haunt Nvidia if they do not re-strategize the pricing for the GTX 480. The GTX 470 is perfectly placed in pricing; I could not argue with the position in which it was placed for cost.
    I’m not an ATI fan, since I’ve had nothing but bad luck with ATI cards. I’ve had a Radeon 9550: outdated too quickly. A Radeon x1300: the card failed after 3 weeks, and the replacement also failed within 3 weeks. I got a 5870 to test DX11, and my issue with that card was poor drivers; ATI drivers have always been riddled with really bad bugs. Any antialiasing would cause horrible lag, and any game would lag for the first 5 minutes before it went away.
    As for my Nvidia cards: the 8800GT was the first graphics card I enjoyed and loved having; in fact I’m using it right now, for a reason explained later. I bought that card because it was cheaper, decent in performance, and had the next-gen GPU in it. This was an awesome card; however, the fan noise was that of a dryer. This single-slot card also got really hot in both idle and gameplay; I have managed to reach 101C with it on a hot day. This was bad, but oh well. My second Nvidia card was a mac daddy GTX 295, and this card was utterly amazing in every aspect, along with my 8800GT. As long as the game had SLI support I would enjoy max possible settings and max anti-aliasing; this card was a beast. However, I sold it a few weeks ago to buy a GTX 480 when it came out. But sadly I’m going to wait a while until prices on it drop; it’s outrageous to think that the GTX 480, which currently yields less performance than the GTX 295, should cost the same. As well, the GTX 295 is much more expensive to make than the GTX 480 will ever be.
    My only proposal is that Nvidia drop the MSRP $80 on the 480-stream-processor models, and afterwards raise it to $450 on the 512-stream-processor models. That is the only way I’ll buy the card as it is now. I can guarantee that this is what the majority of people will be thinking as well.
    Drew, if you can pull some strings and let me be heard throughout Nvidia, I would appreciate it with the utmost sincerity. If I could be contacted I would appreciate it; I’d love to have a one-on-one chat, or a group chat with people from Nvidia.
    I’m currently a college student, on track for two associate degrees: Computer Forensics and Computer Information Systems. I spend my free time researching and studying hardware and specifications. I guess you could say it’s a hobby of mine.

  • Obi-Wan Kenobi

    That is exactly what I am hoping for: dual GTX 470 cores with the RAM of the GTX 480. This could be a good card that would take the performance crown from the HD 5970, and to be honest even a GTX 480 has the power to keep up with or even beat that card when rendering DX11 the right way 😉
    The GF100-375-A3 is by far the most powerful GPU on the face of the earth, and if supported properly, as the Stone Giant DX11 demo showed us, this GPU will pulverize anything the competitor puts against it.
    That has been proven, and here’s the funniest part: it has been proven with beta drivers. In time these fermi GPUs will only get better and faster. many seem to forget that and write new cards off as if they will never improve… they are just missing the point of VGA card engineering.

  • Rainier Gascon

    when will the economy/midrange versions of the 400 series be released? will they consume the same power and produce the same amount of heat?
    i’m curious; i am from the Philippines and we all know it’s a tropical country.
    I want to buy one. how much will it cost? will there be a 256-bit vcard?

  • Aditya Gupta

    The GTX 480 and 470 are good performers, but i don’t understand why these cards consume so much power and generate so much heat. This is the 1st time that ATI GPUs are beating NVIDIA’s GPUs; ATI GPUs are beating the GTX 4XX in many benchmarks.
    Are there any chances that NVIDIA can beat ATI with a GTX 485 or 475? I am still hoping for something good. The heat generated by GTX 4XX cards can affect the condition of other components too.
    Uhmm, NVIDIA could bundle a GPU cooler, which may help to keep the temperature down.
    What else? All my hopes are shattered, but I still hope for something good from NVIDIA in the coming months.

  • striker_gt

    Nvidia Fermi fail! 😀

  • dan

    the 480 is a monster, no doubt, although I personally prefer the 5870 at the moment because the configuration I am running likes the ATI side, and ATI always seems to be cheaper for some reason. Another concern is heat: I live in Tucson, AZ, USA, and when it’s 120 or 130 outside, the AC is screaming just to get the inside temp to 90 or 95! I am running air cooling, so PC parts cook! Also, being fiscally challenged, I tend to want my PC parts to last a while! My 5870 idles at room temp and never goes above 70 fully loaded [with 600 CFM+ case ventilation!] and a second power supply for the case fans! At this point I don’t know if I could run a 480 for any length of time before it overheated.

  • Drew Henry

    Sounds like you know something about cases. That’s good. Put a GTX 480 in it and you’ll have no problems. Best DX11 gaming, 3D Vision (you gotta try this!), plus all the other features of our GPU. Enjoy.