by Nick Stam

PC gaming enthusiasts understand that image quality (IQ) is a critical part of the PC gaming experience. They frequently upgrade their GPUs to play the latest games at high frame rates, while also dialing up the display resolution and graphical IQ effects to make their games both look and play great. Image quality matters, and if it did not, we’d all be playing at 1024×768 with no AA!

Important Benchmarking Issues and Questionable Optimizations
We are writing this blog post to bring broader attention to some very important image quality findings recently uncovered by top technology Web sites including ComputerBase, PC Games Hardware, TweakPC, and 3DCenter.org. They all found that changes introduced in AMD’s Catalyst 10.10 default driver settings increased performance and decreased image quality. These changes in AMD’s default settings do not permit a fair apples-to-apples comparison against NVIDIA’s default driver settings. NVIDIA GPUs provide higher image quality at default driver settings, which means comparative AMD vs. NVIDIA testing methods need to be adjusted to compensate for the image quality differences.

What Editors Discovered
Getting directly to the point, major German tech Web sites ComputerBase and PC Games Hardware (PCGH) both report that they must use the “High” Catalyst AI texture filtering setting for AMD 6000 series GPUs, instead of the default “Quality” setting, in order to provide image quality that comes close to NVIDIA’s default texture filtering setting. 3DCenter.org has a similar story, as does TweakPC. The behavior was verified in many game scenarios. According to ComputerBase, AMD obtains up to a 10% performance advantage by lowering its default texture filtering quality.

AMD’s optimizations weren’t limited to the Radeon 6800 series. According to the review sites, AMD also lowered the default AF quality of the HD 5800 series when using the Catalyst 10.10 drivers, such that users must disable Catalyst AI altogether to get default image quality closer to NVIDIA’s “default” driver settings.

Going forward, ComputerBase and PCGH both said they would test AMD 6800 series boards with Cat AI set to “High”, not the default “Quality” mode, and they would disable Cat AI entirely for 5800 series boards (based on their findings, other 5000 series boards do not appear to be affected by the driver change).

Filter Tester Observations
Readers can observe AMD GPU texture shimmering very visibly in videos posted at TweakPC. The popular Filter Tester application from 3DCenter.org was used with its “ground2” texture (located in the Program Files/3DCenter Filter Tester/Textures directory), with texture movement parameters set to -0.7 in both the X and Y directions and 16xAF enabled. Each video shows the split-screen rendering mode of the Filter Tester application: the GPU under test is on the left side, and the “perfect” software-based ALU rendering is on the right side. (Playing the videos with Firefox or Google Chrome is recommended.) NVIDIA GPU anisotropic quality was also tested and more closely resembles the perfect ALU software-based filtering. Problems with AMD AF filtering are best seen when the textures are in motion, not in static AF tests, so the “texture movement” settings need to be turned on in the Filter Tester. In our own testing with Filter Tester using similar parameters, we have seen that the newly released Catalyst 10.11 driver has the same texture shimmering problems on the HD 5870. Cat 10.11 does not work with HD 6000 series boards as of this writing.
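Why the shimmering shows up only in motion can be sketched with a toy one-dimensional model (illustrative Python of the underlying aliasing effect; the stripe texture, tap counts, and scroll step are our own invented parameters, not taken from the Filter Tester or any driver): a point sample of a texture with detail finer than a pixel flips from frame to frame as the texture scrolls, while a properly filtered sample stays stable.

```python
import math

# Toy 1-D model of texture shimmering. A stripe pattern much finer than
# a pixel is sampled on a fixed 8-pixel grid while the texture scrolls.
# One point sample per pixel (under-filtering) flips between frames;
# averaging many sub-samples per pixel (a crude box filter) stays stable.

def stripe(u):
    """High-frequency stripe texture: roughly 4 stripe periods per pixel."""
    return 1.0 if math.sin(u * 200.0) > 0 else 0.0

def render(offset, filtered, pixels=8, taps=64):
    out = []
    for x in range(pixels):
        u = x / pixels + offset
        if filtered:
            # Average `taps` sub-samples spread over one pixel's footprint.
            s = sum(stripe(u + i / (pixels * taps)) for i in range(taps)) / taps
        else:
            s = stripe(u)  # single point sample: aliases badly
        out.append(s)
    return out

# Scroll the texture slightly between "frames" (akin to the Filter
# Tester's texture-movement setting) and compare successive frames.
shift = 0.007
point_a, point_b = render(0.0, False), render(shift, False)
box_a, box_b = render(0.0, True), render(shift, True)

print(max(abs(p - q) for p, q in zip(point_a, point_b)))        # 1.0: some pixels flip fully
print(max(abs(p - q) for p, q in zip(box_a, box_b)) < 0.2)      # True: filtered output barely moves
```

A static screenshot captures only one frame of either sequence, which is why still-image AF comparisons hide the problem that the videos make obvious.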

AF Tester Observations
ComputerBase also says that AMD drivers appear to treat games differently than the popular “AF Tester” (anisotropic filtering) benchmark tool from 3DCenter.org. They indicate that lower quality anisotropic filtering is used in actual games, but higher quality anisotropic filtering is displayed when the AF Tester tool is detected and run. Essentially, the anisotropic filtering quality highlighted by the AF Tester tool on AMD GPUs is not indicative of the lower quality of anisotropic filtering seen in real games on AMD GPUs.

NVIDIA’s own driver team has verified specific behaviors in AMD’s drivers that tend to affect certain anisotropic testing tools. Specifically, AMD drivers appear to disable texture filtering optimizations when smaller window sizes are detected, like those the AF Tester tool uses, and enable the optimizations for larger window sizes. The definition of “larger” and “smaller” varies depending on the API and hardware used. For example, with DX10 and 68xx boards, they appear to disable the optimizations at window sizes smaller than 500 pixels on a side. For DX9 apps like the AF Tester, the limit is higher, on the order of 1000 pixels per side. Our driver team also noticed that the optimizations are more aggressive on RV840/940 than on RV870, with optimizations performed across a larger range of LODs for the RV840/940.
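For illustration only, the gating behavior described above could be modeled as follows (the per-side thresholds are the ones reported; the function and its logic are our hypothetical reconstruction of the observed behavior, not actual driver code):

```python
# Hypothetical model of the reported behavior: filtering optimizations
# are enabled only when the render window exceeds an API-dependent
# threshold, so small tool windows (like AF Tester's) get full quality.

REPORTED_THRESHOLDS = {"dx9": 1000, "dx10": 500}  # pixels per side, per the reports

def optimizations_enabled(api, width, height):
    """True if the lower-quality filtering path would be active."""
    limit = REPORTED_THRESHOLDS[api]
    return width >= limit and height >= limit

# A small DX9 tool window would be rendered at full quality...
print(optimizations_enabled("dx9", 512, 512))     # False
# ...while a typical full-screen DX10 game would get the optimized path.
print(optimizations_enabled("dx10", 1920, 1080))  # True
```

The practical consequence is the one the sites describe: any AF analysis tool running in a small window measures a different filtering path than the one games actually use.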

FP16 Render Observations
In addition to the above recent findings, for months AMD had been performing a background optimization for certain DX9 applications in which FP16 render targets are demoted to R11G11B10 render targets, which are half the size and less accurate. When this was recently exposed publicly, AMD finally provided a user-visible control panel setting to enable or disable the behavior, but the demotion is enabled by default. Reviewers and users testing DX9 applications such as Need for Speed: Shift or Dawn of War 2 should uncheck the “Enable Surface Format Optimization” checkbox in the Catalyst AI settings area of the AMD control panel to turn off FP16 demotion when conducting comparative performance testing.
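To see why demotion helps performance, a quick sketch of the arithmetic (our own back-of-the-envelope numbers, using the standard format sizes: FP16 RGBA, i.e. R16G16B16A16_FLOAT, is 64 bits per pixel, while R11G11B10_FLOAT is 32):

```python
# Memory footprint of one 1920x1080 render target in each format.
def target_bytes(bits_per_pixel, width=1920, height=1080):
    return bits_per_pixel * width * height // 8

fp16_rgba = target_bytes(64)  # R16G16B16A16_FLOAT: four 16-bit floats
r11g11b10 = target_bytes(32)  # R11G11B10_FLOAT: 11+11+10 bits, no alpha

print(fp16_rgba / (1024 * 1024))  # 15.8203125 (MB per target)
print(fp16_rgba // r11g11b10)     # 2: the demoted target is half the size

# Precision drops too: FP16 carries a 10-bit mantissa per channel,
# while the 11-bit channels have 6 mantissa bits and the 10-bit
# channel only 5 (and none of them store a sign bit).
```

Halving render target size halves the bandwidth spent reading and writing it, which is where the performance gain comes from; the cost is the reduced precision and the lost alpha channel.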

A Long and Winding Road
For those with long memories, NVIDIA learned some hard lessons with some GeForce FX and 3DMark03 optimizations gone bad, and vowed never again to perform any optimizations that could compromise image quality. During that time, the industry agreed that any optimization that improved performance but did not alter IQ was a valid “optimization”, and any optimization that improved performance but lowered IQ without letting the user know was a “cheat”. Special-casing of testing tools should also be considered a “cheat”.

Both NVIDIA and AMD provide various control panel knobs to tune and tweak image quality parameters, but there are some important differences. NVIDIA strives to deliver excellent IQ at default control panel settings, while also ensuring the user experiences the image quality intended by the game developer. NVIDIA will not hide optimizations that trade off image quality to obtain faster frame rates. Similarly, with each new driver release, NVIDIA will not reduce the quality of default IQ settings, unlike what appears to be happening with our competitor, per the stories recently published.

We are glad that multiple top tech sites have published their comparative IQ findings. If NVIDIA published such information on our own, without third-party validation, much of the review and technical community might simply ignore it. A key goal of this blog is not to point out cheats or “false optimizations” in our competitor’s drivers; rather, it is to get everyone to take a closer look at AMD’s image quality in games, and to test our products fairly against AMD products. We also want people to beware of using certain anisotropic testing tools with AMD boards, as you will not get image quality results that correspond with game behavior.

AMD promotes “no compromise” enthusiast graphics, but it seems multiple reviewers beg to differ.

We have had internal discussions about whether we should abandon our policy of not reducing image quality behind your back, as AMD is doing. We believe our customers would rather we focus our resources on maximizing performance and providing an awesome, immersive gaming experience without compromising image quality, than engage in a race to the IQ gutter with AMD.

We’re interested to know what you think here in the comments or on the NVIDIA forums.


  • http://profile.typepad.com/joeeklund Joe Eklund

    Just another reason I’m glad I go with nVidia! I have always found your guys’ drivers LOADS better than ATI/AMD graphics cards. You guys optimize with every new driver it seems, so that every time I download it it’s like a free upgrade to my graphics card! 🙂

  • lulz

    nvidia 4 lyfe!!!11!11!!!


    “Nick Stam
    Technical Marketing Director”
    Good job.

  • v1rus 2.2


  • Nick Stam @ NVIDIA

    Thanks guys above Virus 2.2 😉
    Virus – we are just pointing to multiple unbiased major Websites who conducted the testing. This is not NVIDIA-generated data, though of course we verified their findings too.

  • F@H guy

    um wouldnt this be a bit biased?
    i mean, this is coming from AMD’s MAIN COMPEDITOR. you would expect nvidia to say good things about their products.
    and besides, the only reason why i have GTX275 and dual 8800GTs is because i fold. if i didnt fold i would buy a 2 6870s and OWN.
    i’d listen more if this was from a more unbiased source…

  • mapel110

Do you ever read before saying “biased”? 4 German Web sites report about this issue. They have done research on it and shown videos that prove ATI is cheating.

  • BizSAR

    Yes. Folks, please read the blog post in its entirety before commenting. You’ll see Nvidia is simply forwarding data from 3rd-party sites (with references) for the most part. Some “dirty pool” going on there.

  • gato81

Pointing fingers at AMD won’t get Nvidia a better reputation. Good work will. GF100 is a great chip, but it was pushed out green, and we are dealing with unfinished drivers, heat and noise problems. Spending money for minimal changes. I read some reviews where the GTX 480 vs. the GTX 580 at the same core clock get the same frames per second!

  • peknikolic@gmail.com

I have an AMD card, but I noticed that images look more natural on Nvidia cards. ATI has very sharp objects in their images; it looks like they are not blurry where they need to be, and their shadows are not right. My next card is Nvidia.

  • Rollo

    This isn’t a “fan” or “bias” issue, this is a problem with AMD putting out drivers that take us back to the “brilinear” days.
    We’ve had angle invariant AF on both NVIDIA and ATi cards for a while now, and this looks like a step backwards for AMD. I’ve ordered a 6870 for myself to check this out, but the videos at TweakPC are pretty convincing.

  • High IQ

    This is why I always choose Nvidia. Nvidia is the only one that will give you the best performance and the best image quality.
    ATI, now known as AMD has always been cheating with the AF quality.

  • herp

    I don’t know. I think that giving the user options to sacrifice IQ for perf is perfectly valid. Default settings are questionable, however, and there’s two very different ways to interpret that – against nV’s perspective and for it.

  • The Truth Hurts

    [QUOTE]um wouldnt this be a bit biased?
    i mean, this is coming from AMD’s MAIN COMPEDITOR. you would expect nvidia to say good things about their products.[/QUOTE]
    Well, of course they wouldn’t publish something promoting AMD over NVIDIA, that’s just silly. However, just because it favours NVIDIA doesn’t make it untrue. There is nothing wrong with bias if it is based on fact.
    Just because it’s biased, doesn’t mean it is a lie, or that it is libel or defamation.
    You can prefer one thing over another, you can point out flaws, tell people something sucks and doesn’t work properly all that you want provided it is based on truth.
So biased it may be, but if there wasn’t any truth to it, it couldn’t be posted.

  • derp

    Why is your driver team poking around in proprietary, closed-source drivers? I doubt it’s very legal.

  • joey

Is there a blog somewhere about the number of cheats Nvidia has committed? That would be interesting too!!

  • http://alienbabeltech.com Greg Stanford

    “We have had internal discussions as to whether we should forego our position to not reduce image quality behind your back as AMD is doing. We believe our customers would rather we focus our resources to maximize performance and provide an awesome, immersive gaming experience without compromising image quality, than engage in a race to the IQ gutter with AMD.”
    The day you do reduce IQ behind our backs again is the day I will cease to purchase and support nvidia graphic cards! Optimize all you like, but make sure the end user can control the optimizations, don’t force them on by default and all will be fine.

  • Jon

The GF 7-series IQ was a debacle, I think we can all agree on that, and Nvidia finally learned its lesson.
Since then, the GF 8-series IQ has been top notch, while AMD’s went into a downward spiral like a roller coaster.
The only thing you (NV) need to improve in upcoming cards is AF angle dependency; a lot of clueless reviewers still harp on the fact that it is on AMD’s cards (while ignoring the whole IQ end result, which is superior on NV).

  • Jon

    Forgot to say: Please keep the High Quality AF-option, best thing since sliced bread!

  • Franpa

Nvidia makes optimizations that compromise image quality, but they are exposed as obvious settings in the Nvidia Control Panel.
ATI/AMD integrated them into “presets” with no manual control over such optimizations other than disabling the presets entirely or using the highest quality preset (depending on the video card and driver).

  • BlackOmega

HAHAH what a load of rubbish. And it’s not like Nvidia hasn’t done the exact same thing.
Way back when, the 175.19 driver had nice color saturation and excellent overall quality. Then fast forward to the 180.xx driver and, lo and behold, it looks bad.
It was plainly obvious to me that Nvidia sacrificed image quality for FPS, as in games where I used to get 50 FPS I was getting 100, but it looked bad (all washed out).
I actually preferred the older driver, and reverted back to it even though I took almost a 50% FPS hit.
Also, I would NEVER TRUST what a competitor says about their competition’s product. It’s like asking Intel about AMD’s inner workings.

  • AMD fanboys can’t read

    Looks like the AMD fanboys can’t READ, BlackOmega clearly is a good example. The truth hurts, so they can’t accept the truth and are in denial.
    It’s 4 major German review sites that reported the AMD cheats first and AF shenanigans.

  • Max

AMD has no right to change how a game is supposed to look.
And we should have been told that AMD enables a setting in the CCC that changes the image quality; that’s lame, because users should be the ones to decide whether to turn it on or not for their games. AMD had no right to turn it on by default, and if they want to turn it on by default, they have to let users know instead of hiding that setting in the CCC.

  • John Smith

    Two notes:
You missed another German site – HT4U.de. They are doing a great job investigating the image quality of graphics cards. And after the 6800 launch they made videos to show the problem in “The Witcher”: http://ht4u.net/reviews/2010/amd_radeon_hd_6800_reloaded/index11.php
And you should link to these forum posts, because you can download videos from different cards which show the image quality of the anisotropic filter in the game “Drakensang”:
    #1 – X1800XT, 5770 (Juniper), GF100 ( i think it was a GTX460): http://www.forum-3dcenter.org/vbulletin/showpost.php?p=8383575&postcount=1704
    #2 – 6850: http://www.forum-3dcenter.org/vbulletin/showpost.php?p=8395327&postcount=1746

  • Gregory Berry

Oh, grow up, jesus christ. Each company has their own faults, EVEN Nvidia. It’s just personal preference.

  • israel

    what a butthurt.
instead of wasting time bashing amd, why don’t you do a fix for Sniper: Ghost Warrior? the shitty driver crashes every 10 minutes. i tried to find a way to reset the driver without rebooting; so far, nothing.
it’s a great game, but it’s a shame that you can’t even play it.
    driver ver: 260.99, card 8800gtx
    here is home page:

  • BiS

    who cares, just buy the best bang for buck. Atm ati, with 560 prolly nvidia, and with 460 was nvidia…
    Or just spend 500€ on the top one of your fav. brand

  • http://www.suckysucky.pl Kaczorq

    It is…not! ATi’s trying to scr3w you guys! NVidia is the leader of quality!

  • TB

    Those sites are “sponsored” by NVIDIA Germany.

  • Diceman

Btw Nick, it would be nice if you guys fixed Transparency Multisampling on GT200 and lower cards. Since Release 256, Transparency Multisampling is simply an alpha test which does not benefit from increasing the FSMSAA level,
while in Release 197, enabling Transparency Multisampling improves in quality as you increase the FSMSAA mode.
The fact that this “bug” only occurs on G200b and lower, while working properly on GF100+, is an eyebrow raiser.

  • arie

    NVIDIA KEEP DOING GREAT JOB !!! ATI = ugly attitude

  • James

    If NVIDIA consider IQ above everything else, why is their driver default ‘Quality’ instead of ‘High Quality’? All AMD have done is change their driver default from ‘High Quality’ to ‘Quality’ like NVIDIA already do.

  • Dreamer

    LOL. Talk about the pot calling the kettle black. Yes, Nvidia, your hands are certainly clean in the IQ department………

  • MK

Well, if ATI decided to lower their image quality to Nvidia’s image quality, what’s the problem??? They just want to compare apples to apples!…
Ohh, and do you guys remember the many times Nvidia got caught cheating; the latest 169.xx drivers cheat in the Crysis demo… again…
Hmmm, your marketing department is weak in their “SPIN STORIES” skills…

  • Diceman

Geforce 7 had the same AF technique as Geforce 6; they were attempting some interesting type of angle-independent AF and ended up with the dreaded flower.

  • Diceman

not when AMD’s High Quality setting is the equivalent of nvidia’s Performance setting!

  • Diceman

Reverse engineering is legal and legitimate as long as no proprietary code or techniques are implemented in an application created by nvidia.
There is nothing illegal about knowing how software works.
Using what you learn to create an application without obtaining permission, however… not so legal.
    Then again, americanism has put such a price on knowledge which is despicable.
    …imagine if cavemen put a levy on other tribes putting spears together >.>

  • Diceman

You have no basis for anything you are saying here.
Nvidia’s AF quality at Quality mode is better than AMD’s at High Quality.
In fact, having just checked, Nvidia’s AF quality at Performance mode is better than AMD’s at High Quality.
so choke on that.

  • A Wise Man

I cannot afford a decent computer, let alone a decent video card… But when I do acquire the funds I will be choosing Nvidia. I always would have, and this only solidifies my choice.

  • LoL

lol, if it’s true then prove it on the battlefield,
not just by ranting about it on your own blog; that’s just childish.
it’s funny how you guys keep saying: according to this, according to that…
oh yeah, speaking about quality, i had a radeon x1550 once, and then bought a 9600gt later. I noticed that image quality on the x1550 was better than on the 9600gt even though it’s outclassed. Not to mention both were at their maximum image settings and enhancement.
disappointed by that, now i’m using a 5870, and if you guys really meant it, prove it and i’ll gladly cross the border once more to use your products.

  • allrpg

Here’s my take. I do not care about who increases performance by decreasing image quality slightly. It may be AMD today; it can be you tomorrow.
I’m really pissed off with both of your bullshit blogs and articles and cat fights. Not everybody on earth is a graphics card enthusiast. Otherwise both of you would have stopped producing lower/middle-end graphics cards. But that isn’t the case. Lower-end/middle-end cards are still on the market and in fact they sell better than higher-end cards.
As long as you keep your cards’ prices accessible and cheap, I don’t mind who does what. Average gamers are playing on 19″ to 22″ screens; a middle-level card is more than sufficient. Who needs $400+ cards?

  • nVidiots are green with envy

So nVidia cheats first, and AMD is expected to bend over;
but when AMD lowers their IQ to match nVidia’s to make it an even playing field, their MARKETING DIRECTOR cries and moans?
image quality is the same? lol, cry more nvidia. keep selling and spinning your product so that your loyalists buy it, no matter how awful it is.

  • HyperionZ

The truth hurts?? This would matter if it couldn’t be configured or enabled in Catalyst. Nvidia did it with AA in Batman: Arkham Asylum, so green fanboys had better not talk about cheating. Nvidia should instead worry about putting out an efficient card and stop releasing nonsense that is nothing but brute force; more watts, more heat = Nvidia, for whom only the color is green or ecological.

  • lkashl

In all honesty, and from an unbiased perspective, the differences in IQ are so minuscule that they are hardly, if at all, perceptible in real-time gameplay. Pretty sure most consumers would prefer the ‘invalid’ optimisations.
That said, there is warrant in NVIDIA’s claims that for benchmarking, the cards should be compared at identical IQ so as to offer pure and actual performance statistics.

  • Rosco

Depends what you want CPU-wise. If you want SLI and AMD, you have limited options; AMD and CrossFire, no problem; if you want Intel, CrossFire and SLI both work. I prefer Nvidia GPUs and AMD CPUs. Bleh, dilemma.

  • hano

I really am an Nvidia fan. However, I couldn’t find a good AM3 motherboard that supported SLI. I wanted to use AMD processors with SLI, but I was out of luck, so I had to go for CrossFire.

  • X37

Damn cheaters. Who can support such a company? I will never buy an ATI card again.

  • Diceman

    you’re a fool.
    the epic fail of amd does not appear on still shots, only on motion shots.
    then again butthurt amd fags don’t tend to read past the first line of any article bringing to light failures in their favourite hardware.

  • trollololl

    why use old games and not recent games? look at MW2 and say again..

  • Diceman

shimmering is easily noticed in gameplay if you have 20/20 vision.

  • Diceman

    DO NOT COMPARE STILL IMAGES, the problem is mostly evident in moving gfx and gameplay.

  • AMDschummelt

I think you should link the HT4U test and this thread too.
There are a lot of video comparisons from engaged users that show the differences between AMD and NV.

  • pieter3dnow

In the not too distant past, Nvidia managed to create drivers which, upon every revision, were 100 3DMarks faster for a good while.
Benchmarks are worthless as a whole, especially when your competitor buys developers to cripple eye candy on AMD/ATI cards.

  • Kapeeteeleest_Peeg

    I won’t buy nVidia again…
‘BumpGate’ cost me a fortune replacing a load of 8800-series cards/laptops which died prematurely.
The ‘fan-disabling driver’ a couple of years back cost me a bit as well.
And the cherry on the cake was the Fermi presentation, where the card was a mock-up made from a cut-up circuit board with the heatsink held on by wood screws.
    I don’t care if nVidia products appear better.
    They ‘lost’ me a long time ago and trust is such a difficult thing to get back.

  • i loled

I’m not buying this blog from the same company that rebranded the G92 chip like there was no tomorrow, calling it new cards, and the same company that waved a fake card at the crowd saying, “Ladies and gentlemen, this is Fermi!”

  • lilly

    ROTFL about this blog post!
    The cow is saying cuckold to the donkey!!!

  • salman

yes, these are reviews from German sites, and we can’t understand what is written. and why only German sites? if there was any truth in it we would have seen more reviews. so i would rather read unbiased reviews than this crap. just cuz ati released dx11 before nvidia; it’s the same as the xbox360 releasing before the ps3, and the ps3 is still nowhere near its breakeven

  • sza

It is not the first time ATI or NVIDIA has done this, and it won’t make a difference in my decision to buy a new ATI graphics card. If you guys give me a better card for less money and less power consumption, I will be happy to buy NVIDIA.

  • Davidson

nVidia inside is enough to make me avoid a computer completely. If it uses an nVidia graphics card, I will look elsewhere and buy something different. If you don’t understand this comment then let me put it this way: if you buy nVidia then you deserve what you get, crappy cards sold by a company that is very willing to mislead consumers, take their money, and basically tell them to screw off when they discover that they have been lied to.
    Buy nVidia? (spits on the ground) “Not likely.”

  • boo

    pics or it didn’t happen

  • Poopman

Depends what you regard as a cheat, lol. I think nvidia did this with Crysis and heralded it as better performing on nvidia cards.

  • GTZ

All I had to do was set Catalyst’s mipmapping quality to max again and I’m all fine; same performance and IQ as before.
I only used an nVidia product once, a Geforce 6000 series, and it sadly blew up due to a heat issue. I have never had any issue with ATI’s.

  • Nick Stam @ NVIDIA

@Boo – we are not posting pics. You can see pics and videos at the sites linked in the story, which did the actual testing we referenced. This blog does not present “NV content”; it points to 3rd-party content.
A key goal is to get fair apples-to-apples benchmark comparisons. Another goal would be to get AMD to stop making these perf-enhancing image quality reductions at default settings.
NV was guilty of some of these same types of things in the past, as many point out and as I noted in the blog post, but we no longer do this behind the backs of users. Any image quality optimizations are exposed in the control panel – user-selectable, as Franpa points out.
Diceman, I’ll talk to our driver guys about the TrMSAA issues.
John Smith – you are right about HT4U.de. I should have mentioned them too, and they found similar IQ issues earlier than many other sites.

  • NVidia should keep traps shut…

    Oh noes, NVidia would never do anything like this… never, ever… well, maybe this one time… at band camp…
    You know what they say about glass houses and stones.

  • andreas

    This is just pathetic of nvidia te even write. u only put urself in the dirt. AMD is bigger than u guys. and u try to force AMD to change their default settings? Its like a Car maker trying to force the other Car Maker to use the same parts and quality on parts…
    Sry but this is rly childish.
    im not a AMD fanboy, ive had several nvidia cards aswell

  • nigglet

Oh wow, the internet is full of retards. FP16 render demotion: Nvidia does the same thing, but you cannot disable it. On AMD you could, by disabling Cat AI.
Please don’t throw stones in a glass house, nvidia. Also, what happened to your 5xx cards and FurMark, eh?
    //Death to this FUDWAR!

  • Bima Sylirian

GeForce cards lose more FPS than Radeon cards when image quality is set to maximum settings in the NVIDIA Control Panel or Catalyst Control Center.
I don’t see any reason to say AMD is cheating. It’s about driver settings; you can simply adjust them. Is adjusting image quality in the driver difficult? Well… if it is just too difficult, then Nick Stam and all the NVIDIA fanboys here are a bunch of morons.
Trying to find a way to attack the competitor, huh?
I did use a GeForce card. I found the image quality and performance fine. I don’t have anything to complain about regarding the cards, except the company that creates them.

  • Diceman

    because the american sites believe anything they are force fed,… you’ve seen their government no doubt?

  • Diceman

feel free to commit seppuku

  • Diceman

    lol, yeah because that DEMO based bs hasn’t been refuted and canned 100x before

  • Diceman

um, no. ATI still has a negative LOD bias regardless of the AI setting. This causes shimmering.

  • Diceman

Thanks Nick, i’m sure most people wouldn’t notice it, but in the games and applications i use it is easy to see that, unlike Release 197 and older drivers, the TrMSAA setting is not taking on the FSMSAA mode.
    this is the best example i have
    http://image.svijethardvera.com/images/smb6432xsnvcptrmsaa2.jpg – This is 32xS which has an 8xFSMSAA component, i also tested the 8x and 16xQ modes and got those jaggies on the fence when TrMSAA is enabled from the NVCP.
    http://i53.tinypic.com/jaawie.jpg – This is what i noticed when i opened the profile in nvidia inspector
    I then switched the Supersampling to off and enabled the Multisampling setting. NVCP now displays that TrAA is using a custom setting
    This is my results with 32xS, 8x and 16xQ
    i then returned to 197.45 and took this with nvcp enabled 16xQ/8x and TrMSAA enabled from the NVCP
    i had a friend also reproduce my findings on 197.45 with his 260, he was not willing to upgrade to release 256+ though i’ve had another friend confirm that TrMSAA enables SuperSampling – AA_MODE_REPLAY_MODE_ALPHA_TEST when enabling TrMSAA from the nvcp.
    So from this i concluded that the Transparency MSAA in Release 256 and higher is not functioning in the same way on older cards as it did in release 197, as the default exposed settings were no longer benefitting from increased FSMSAA settings.
i was thinking it might be doing this for now because there’s a bug which makes trees transparent in CSS when using it atm.

  • Diceman

That was blown out of proportion at the time; nvidia was not cheating, and if Futuremark had a clue they would’ve talked with nvidia more about the situation instead of releasing a patch which artificially deflated and crippled the Cg recompiler that was being used to translate the code into an optimized format for NV3X hardware.
Then again, i wouldn’t expect Kyle or Futuremark to have had much of an idea of how the Cg recompiler worked, as nvidia themselves did not document it enough, which Jen has admitted on numerous occasions.

  • AMD fanbois: quit crying

    Too many AMD fanbois who can’t stomach the truth from 3rd party review sites?
    Where is George Ou? He would probably have a blast posting about this.
    It’s hysterically funny how AMD fanbois live in “da nile”
    Damage control at AMD HQ. Sirens, Foghorns, and Fanbois….oh, my!

  • Dani

I had 3 NVIDIA cards -> all of them failed in ~1 year because they ran too hot (screen artifacts, then not working at all after some time); they were repaired by the vendor, worked for 1 month, then failed again. Should I buy a 4th one?
Got my first ATI card (a 5870) when it appeared -> very satisfied with the drivers, image quality, power consumption, performance and FPS, which is more than enough in every game, and a lot cheaper.
Do you think people care about 1 FPS when the difference is 200 vs. 199?

  • Jon

    >im not a AMD fanboy, ive had several nvidia cards aswell
    I’d invest in a dictionary instead of a GPU-upgrade next time.

  • Dreamer

    Yeah, Nvidia never cheats….They make mistakes *cough*.

  • Lans

    “From our testing it’s clear that ATI’s implementation FP16 Demotion does affect image quality in games as challenged by NVIDIA. However, the extent and prevalence of it is not universal – from four games, only one showed any real visible influence. On the other side of the coin are performance improvements, which are plentiful in two games from four: boosting performance by 17 per cent at high resolutions, and a still-impressive 10 per cent at lower resolutions.
    Ultimately the decision to run FP16 Demotion has always been up to the end-user, as the use of Catalyst A.I has been optional since its first inclusion in Catalyst drivers – now many years ago – and as it appears predominantly beneficial or benign in the majority of cases, we’d suggest you leave it enabled.
    We also can’t help but wonder why NVIDIA attempted to make such an issue over this rendering technique; one that they wholeheartedly support and simultaneously condemn. In the two titles that their hack functioned with we saw no image quality nor performance loss, and rather the opposite: a performance boost. Why haven’t NVIDIA included support for FP16 Demotion in their drivers until now for specific titles? Why choose now to kick up a fuss?”
    “NVIDIA have just announced their new GeForce 260.63 beta driver, which along with support for the GTS450 card we’ve grabbed for use in testing also unlocks higher performance in a multitude of gaming titles – though unlike ATI Catalyst, the GeForce driver doesn’t come with FP16 Demotion baked in. Support for FP16 Demotion has been surreptitiously ‘hacked’ by NVIDIA in what they’ve cheekily called the “AMDDemotionHack”, a file that enables Demotion in two more titles.”

  • Vanakatherock

    It doesn’t quite matter if your graphics cards crash just die a year after ownership. I’ve owned two Nvidia cards and both did just that. I got fed up, got an entirely new computer that had a standard ATI graphics card. Used it for a year before upgrading to an HIS Radeon 5670 to compliment my new 22″ Dell Flat panel. Best decision I’ve ever made.

  • Steve Trollo

    U MAD?

  • Trajan Long

    ATI sucks rocks! Nvidia owns the professional market, where accuracy and precision drivers rule!! ATI is for amateurs too cheap to get the best.

  • Dreamer

    ATI sucks and rocks? Thanks for the informative post! LOL! BTW, didn’t the GTX 470 (an enthusiast card) drop $100 to compete with the 6870 (a mainstream card)? LOL!

  • Trajan Long

    Typical ATI fanboi! Thanks for making my exact point, ROTFLOL! Whining about a hundred bucks! Currently the best single GPU is the 580, by a huge margin. SLI 580 crushes everything. In the professional market, Quadro owns 85%. In the supercomputing market, two of the top three fastest supercomputers in the WORLD use Tesla.

  • Dreamer

    No, you idiot! The GTX 470 is an Nvidia card! It has decreased in price to compete with cheaper ATI products! Go on Newegg and look at all the deals Nvidia is now providing to consumers! The GTX 470 went from $350 to $220 (with rebates) in less than 7 months! So how in the heck is AMD "cheaper"? Please elaborate.

  • Trajan Long

    Written like a liberal. Which fact don’t you want to acknowledge? Best single GPU- 580, best dual GPU solution, 580 SLI, best professional video card Quadro 6000(Nvidia has 80+% of the pro market for a reason), best supercomputing GPU solution Tesla, found in two of the top three fastest computers on the planet. Saving 100 bucks with a coupon is not my department. I’m only interested in the best. Image quality, driver team, CUDA are also important factors for me at this time. Is this situation subject to change? Of course. That is why I said AT THIS TIME I PREFER NVIDIA.

  • David Tan

    Another nvidia/intel user here. I stay far away from AMD products; I find them a bit of a cheat. Intel feels much 'snappier' than any AMD proc; I'd take an E5200 over a Phenom 955 any day. Same goes for Nvidia GPUs: Nvidia cards take a reasonable FPS hit during explosions and action scenes, whereas my HD 5870 (which I sold) would drop from 50 fps to 20 fps in Crysis, complete BS. The FPS is never there when you need it on ATI cards, whereas my GTX 260 played Crysis at 35 fps and dropped to a reasonable 25 fps. This is my personal experience with my own cards. I'd take a GTX 460 over an HD 5870 any day.
    Honestly, I knew Nvidia's graphics quality was better than ATI's even before reading this. That's not even considering that Nvidia gameplay is so much superior with PhysX. I read forums where the community is feeling bad for the underdog (AMD), and I say they're just in denial. Keep up the good work Nvidia! 🙂

  • Dreamer

    Written like a liberal? What are you talking about? You stated AMD (formerly ATI) owners are too "cheap" to get the "best", which is ironic since most of Nvidia's desktop cards are just as cheap as AMD's; more so in some cases. Of course the 580 is currently the most expensive single-GPU card; it was just released a few weeks ago! AMD hasn't even released their top-tier cards yet! And since when has a high price been the basis for purchasing a video card? I thought the focus was on getting the best deal? You obviously don't own a supercomputer.
    Furthermore, the top-tier cards in both the FirePro and Quadro lines cost over $3000. If someone can spend $3000+ on a card, why would cost be a factor? If the Quadro is better, they'd just get the Quadro, right? I'm not getting your point. Also, though it may be hard to accept, some people will choose the FirePro for its multi-monitor support.
    However, I'll let you get back to your Nvidia worship and condemnation of "Liberals…"

  • ElNad

    That’s kind of a cheap shot.
    It sound like “Hey everybody, look, 4 very obscure cheap GERMAN websites found something about drivers and that means we are THE BEST!!!! wooohoooo”. It seems childish from a respectable and serious company as nVidia.
    I saw a lot of youtube videos comparing image quality from a GTX 480 against a 5970 at the beginning of the year and almost all the tests showed that the ATI image quality was better in very high resolution. Before saying that AMD sucks and cheats in tests, maybe remember how the GF100 is such a crappy GPU. GF104 is a lot better, but AMD had such a better product than you for almost 8 months with the AMD 5000 series.
    I’m not a fanboy and I’m looking to buy a Gigabyte GTX 460 dual fan because it’s the more quiet 460 out there, but that kind of childish post don’t help your cause. Please be adult.

  • Evil. Troll

    *yawn* Another excuse from Nvidia for why they are losing (lost) the desktop graphics market. Saying AMD's going behind backs and cheating to "trick" consumers into "thinking" that their graphics card really produces what they are seeing on the screen. *sigh* What's next, Nvidia? Quit whining and be competitive on price points already; quit charging your Apple-logo fee on stuff that should have been out 6 months ago. Kudos to AMD's development team.

  • Nick Stam @ NVIDIA

    Andreas – our #1 goal is for users and reviewers to properly compare our products using similar image quality when running benchmarks.

  • Nick Stam @ NVIDIA

    ElNad – These are not obscure cheap German sites. A few are some of the largest tech sites in Germany, and others are highly technical. Again, as I posted above to “Andreas” our #1 goal is for users and reviewers to properly compare our products using similar image quality when running benchmarks with current drivers and current products.
    Of course I’m going to tell you that the GF100 is not a crappy GPU. Actually, it’s a pretty amazing GPU with all the features and perf it brings to the table. The distributed out-of-order geometry design was very hard to engineer – thus substantially contributing to the lateness. But no doubt it needs good cooling in a good case, and consumes some serious power when pushed hard. Will be interesting to see the TDP of our competitors upcoming high-end 40nm GPUs.
    Glad to see you’re looking at the Gigabyte 460 board!

  • Trajan Long

    Another typically stupid reply that doesn’t refute the facts, whining about money.

  • Dreamer

    Are you so delusional that you can’t discern between “facts” and “opinion”, friend? Please, lay off the secret sauce.

  • Nick Stam @ NVIDIA

    I am now testing this myself, and have alerted our top driver AA guys and driver product management. I will run the older R197.45 and the latest R260.99 drivers on a GT200/GTX 280 board. I am using the Nvidia Inspector tool like you. Back soon…

  • Diceman

    I beg to differ. In all the performance comparisons I have made and seen performed elsewhere, Nvidia's MSAA and AF have a lower hit compared to AMD's; this may just be a result of higher framebuffer bandwidth, though.

  • Diceman

    Cards rarely fail because of heat. In the majority of cases the chip itself was already prone to failure, and the cause was more likely unstable current rather than heat. You got a bad lot, nothing more, nothing less.

  • Diceman

    Not only are these high-profile sites, they were also some of the places getting Fermi information well before other sites.

  • Diceman

    Nvidia still holds the highest share of dedicated graphics cards, professional and gamer combined.
    There's a reason professionals avoid FireGL…

  • http://www.ivart.org Ivica

    For me, nVidia is like a German car, and AMD like a Japanese one. Sometimes Japanese cars look nice and run faster, but in the end, the class, performance and stability are not in a Toyota but in a BMW 🙂

  • Spede

    This is the only thing you Nvidia PR suckers can do. I bet you've never done any "optimizations" yourselves :rolleyes:. At least AMD is keeping their reputation respectable, unlike you with your Batman AA lock, the Assassin's Creed DX10.1 support removal and numerous other things you've done. Intel is like a model citizen compared to you.

  • Spede

    What is really funny is that you guys complain about AMD's AF quality when they have angle-independent filtering and you don't. Yours looks like a stop sign.

  • Diceman

    of course you cannot disable what doesn’t exist.

  • Diceman

    Because angle-independent AF really helped the 5870 not have crappy AF… http://img402.imageshack.us/img402/4872/d3daftester13a201002051r.png oh, I guess it didn't help at all.
    I really don't like brilinear.

  • nikola

    nvidia > amd

  • Trajan Long

    ATI and their idiot fanbois whine like scalded monkeys! Super!

  • Wayland

    Nvidia NEEDS to FIX the OpenGL problems with GTX 480 cards! They are slower than the older cards, and I purchased the high-end GTX 480 because it was supposed to be faster… it's NOT. Many people are finding the same issues. PLEASE GUYS, FIX THIS PROBLEM! I am trying to use this card for DVE work with ivsEdits. Thanks!

  • Diceman

    “Many people are finding the same issues..”
    It depends what you use the card for; it's no secret that if you are using a GeForce card for work that should be done on a Quadro, things are slower…

  • anonymouse

    When you guys play games this low, you must be scared of something.
    Watch out, the 6970 is coming soon.
    And then the 6990 will own you early next year!

  • havok/ opencl/ 78°Celcius/ XFX

    hey nvidia, u r going to suffer until directx 13 launches (november 2014)

  • Mar

    Well, I've had a similar experience with Nvidia cards. I've got an 8600M GT in my laptop and it's hot enough to make a toddy by placing a small glass of rum right next to the fan exhaust. Fortunately the piece is still running, but for how long?
    I used a 6800 Ultra card in my previous desktop build, and after half a year it started showing graphical artifacts when running Need for Speed. Mind you, the case was well ventilated.
    A year ago I built a new computer using a Radeon 5850 and it's never once given me any problems of this kind. This is my own scenario and probably differs a lot from other users' experiences. What it boils down to is this: as long as I've been using Nvidia cards, I've had issues with overheating and graphical artifacts. When I switched to ATI/AMD, those issues went away.
    To top it off, you guys (Nvidia) made fools of yourselves with the whole Fermi launch, and now you're trying to get a cheap shot at the competition by posting slander? I understand competition is fierce, but what about making "honesty" and "commitment" part of your repertoire instead of simply flexing your muscles and going "RAARGH, BIGGEST GPU WINS!"…

  • Shibu

    I was wondering why you guys were so late to find this out. I have felt this difference since the 5000 series launched. Even on review sites (Guru3D, AnandTech, etc.), when they post screenshots of games there is a glaring difference in image quality. Whatever it may be, NVIDIA's graphics cards had the better image quality overall. And as Joe Eklund said, NVIDIA's drivers are the best and no-nonsense.

  • rudy

    nvidia is very pathetic….

  • TwilightVampire

    Remember, during the GeForce FX days you guys were guilty of this too. In my opinion, nVidia has no room to talk on this subject.
    Most PC gaming enthusiasts re-adjust IQ settings to their liking anyway. Defaults mean little to nothing. And official benchmarking is done with the lowest IQ settings for every FPS or 3DMark point possible. Just as it was a moot point in 2003 when nVidia was guilty of this, it's still a moot point today when AMD is guilty of it.

  • Play3r

    Has this been tested with the older 5000 or even 4000 series, or only with the 6000 series? Testing on two very recently released graphics cards could just mean they are ironing out the bugs in the drivers. It happens for both sides every time they release something new; there are always driver issues. And do the test with 10.11 and see.

  • connta

    Great, I have slightly worse image quality (which I can correct in the driver settings, and guess what, Quality vs. High Quality makes for a 2-5 fps difference and no real IQ difference as far as my eyes can see)…
    On the other hand, I do not have a power-guzzling card and a heat nightmare. My southbridge used to overheat from Nvidia cards… yeah, the SB overheats and my comp crashes. Can you set that to "cooler" in the driver settings (since I can set High Quality) without making my comp sound like a jet? No, sir. That is why you fail; a complete, "rounded" product is important. Very important.
    And all these sites are German. How come no one else found and published this? I don't see anything on so many other respectable sites… for one, guru3d.com said that there is no real difference in IQ, and you know what, moving from a GTX 260 to a 6870 I didn't see one either. A bit fishy, my kind sir…

  • Diceman

    Guru3D is well known for sugar-coating reviews because Hilbert likes to keep receiving review samples.
    Hilbert's review only took still shots, and the problem is not visible in stills.

  • Diceman

    don’t post if you didn’t read.

  • Diceman

    I'll stop you right there.
    The Nvidia Cg compiler did use lower precision in 3DMark, but there was NEVER a glaringly obvious decrease in quality.

  • Diceman

    I've tested from Release 257 to the CUDA development driver 263.06; all of these set the supersampling alpha-test setting, which does not benefit from a higher FSMSAA mode.
    I was contemplating moving to a GTX 500 series card eventually, but I didn't want part of the reason to be getting TrMSAA back without needing 3rd-party tools xD

  • Diceman

    It would be libel, not slander. And it's not libel anyway, because it's verifiably correct.

  • Diceman

    Also, gj with your 5800 series, with its grey screens and abhorrent driver resets due to improper per-vendor implementation.
    I can really see AMD holds the reins over non-reference designs.

  • boo@boo.net

    Pot calling the kettle black, eh Nvidia? You have done the same thing yourselves so many times that it is not even funny. Less QQ please, and more focus on making good cards.

  • neliz

    NVIDIA caught cheating in HAWX 2 benchmarks:
    AA IQ is lowered to raise the performance. Dear NVIDIA, maybe you should focus on making better products instead of trying to paint the competition black?

  • Jon

    Please stop posting, you make yourself look like a tool.
    “In a nutshell, the HawX application requests the highest possible AA “sample quality” at a particular AA level from our driver. Without our driver fix, the game would be running 16xCSAA instead of standard 4xAA when you select 4xAA in-game. It runs the proper 4xAA with the driver fix. You defeat the fix by changing the .exe name, causing it to run at 16xCSAA.”

  • http://www.old-school-spiele.de Malte

    I’m happy, that I bought a GTX460 from Gigabyte instead of a HD6850 or HD6870 …
    Regards from Germany

  • Psycoz
  • Nick Stam @ NVIDIA

    Hi Psycoz.
    Please see this story where I explain the HawX issue:

    Also, Hilbert at Guru3D showed the perf gains that concern us from AMD stepping down default IQ. Diceman is right that it's most obvious in motion, not still shots. Even though Hilbert doesn't think the actual default IQ degradation is a real big deal IQ-wise (which we disagree with, given all the people who used to argue that we needed to improve our default IQ and not have any texture shimmer; now all of a sudden it's acceptable to the community for AMD default settings to have texture shimmer?), he does say this in conclusion:

    “We urge and recommend AMD/ATI to disable the optimization by default in future driver releases and deal with the performance loss, as in the end everything is about objectivity, and when you lose consumer trust, which (as little as it is) has been endangered, that in the end is going to do more harm than good. The drop of 3-4 FPS on average is much more acceptable than getting a reputation as a company that compromises on image quality. And sure, it raises other questions: does ATI compromise on other things as well? See, the costs already outweigh the benefits. So the morally right thing for AMD/ATI to do is to make the High Quality setting the standard default. But again, here we have to acknowledge that it remains a hard-to-recognize and hard-to-detect series of optimizations, but it is there and it can be detected.”

  • Monkey Boy

    What? You guys were deliberately caught cheating in the past. It was blatantly obvious! Whereas in AMD's case, not so much. In fact, in the linked Guru3D article, the writer even states that they are incapable of discerning a real difference between the two IQ levels.

  • Monkey Boy

    Also, why are some people saying they have to change the IQ for Nvidia cards with the recent driver releases? Here, I'll provide a quote and exact link.

    “I just sold my 5870 a couple weeks ago for a 580 GTX.

    Yes it's true, I had to set my options from Quality to High Quality in the ATI drivers… this is something I have known since the 1900xt days…

    Why isn't this in the article as well?

    Seemed biased to me.

    And this is coming from someone who just switched from an AMD product to Nvidia… ”


  • bore_u2_death

    Quoted directly from the article published on http://www.guru3d.com comparing AMD and nVIDIA image quality.

    “The optimization however allows ATI to gain 6% up to 10% performance at very little image quality cost. And I chose the words ‘very little’ here very wisely. The bigger issue and topic at hand is that, no matter what, image quality should be 100% similar between the graphics card vendors, for reasons of objectivity.

    So what does that optimization look like in real-world gaming? Well, it's really hard to find, actually.

    We seriously had a hard time finding an application where the optimizations show well. So mind you, the above example is catered to show the image quality anomaly.

    This is Mass Effect, with 16xAF and trilinear filtering enabled in the configuration file. We are in the spaceship and have positioned ourselves on a surface area where the optimization should show really well. Yet we have a more complex scene with nice textures and lots of colors, much less bland than the simple example shown previously. This time, just a Radeon HD 6850 with the optimization on and off, and a GeForce GTX 580.

    We have as hard a time spotting differences as you do, and while making the screenshots we increased the gamma settings to 50% and applied a resolution of 2560×1600 to try to make it more visible.

    Do you spot the difference? Probably not. That is the rule we live by here at Guru3D: if you cannot see it without blowing up the image or altering gamma settings and whatnot, it's not a cheat. And sure, we know… this game title is not a perfect example; it is, however, a good real-world example.”

    The ultimate indicator of image quality is what the end user experiences while using an AMD or nVIDIA graphics card. If the image quality difference is not discernible during normal use, then Catalyst 10.11 is not harmful to image quality.

    Just like the quality settings in the nVIDIA Control Panel, the Radeon HD 5000/6000 series GPUs have their Texture Filtering setting at Quality by default. The end user can adjust this to his or her liking at any time. AMD has optimized performance for its most current GPUs at no expense to image quality, and has made its products more competitive at no loss.

    Most importantly, if you have to deliberately expose a slight difference in IQ by using Mass Effect, a game from over three years ago, increasing the resolution to 1600p, artificially boosting the brightness beyond what a normal player would use, and then magnifying the pixels by zooming in Photoshop, then the image quality loss experienced in that scenario is INVALID.

    The claims nVIDIA makes in this article are groundless, and invalid in some cases.

  • psycho_biscuit

    Groundless? I don't think I'd call them that. It may not be much of a difference, but it is still a difference. When I'm playing a game and see something like that happening, I'm immediately pulled out of the immersion. The fact that they purposefully do this is kind of insulting.

    You may need to set NVIDIA's card from Quality to High Quality as well, but does their Quality setting skimp on anything? I like the idea of having a default setting that is not the highest a card can perform at, if you don't want it running that hard. Of course most people will turn it up if they need more fps.

    However, if ATI's cards have less quality at the default setting, what does that say about their High Quality setting? Do they skimp on IQ there as well, so they can say their High Quality setting runs faster? Losing trust is extremely detrimental in the eyes of the consumer. This, along with all the crashing of games I have with my ATI card currently because of known issues with certain game engines (even with correct drivers), means that I won't be buying from them anymore.

    I've never had a problem with an NVIDIA card. The only ATI card I've ever owned has frustrated me to no end. If they're skimping on quality, that's the nail in the coffin. Give me better, stable quality over a few fps any day.

  • Monkey Boy

    This is my first ATI card in 10 years, and I've yet to experience a single problem with it. No one is saying AMD is "skimping" on IQ, since no one can definitively prove it. As the Guru3D article states, it's damn near impossible. Anyone who says they can clearly discern a difference is a blatant liar. The question is, what's the difference between Nvidia's Quality setting and their High Quality setting? That needs to be asked as well. I want someone to compare and contrast the two quality settings from both companies.

  • bore_u2_death

    I applaud AMD for having increased the performance of its graphics cards with more effective optimizations.

    I challenge nVIDIA to demonstrate a valid scenario where AMD's newest Cat 10.11 driver compromises image quality. A valid scenario is one that can be found during regular gameplay, not one where old games, high resolutions, certain angles, artificially high brightness, and zoom combine to let people SPLIT HAIRS.

    nVIDIA's claims have NOT been substantiated in any way with real-world gameplay. They are GROUNDLESS until proven otherwise.

  • bore_u2_death

    And just as a reminder: if anyone is so uncomfortable with a microscopic loss in image quality, one that has (as of yet) not been proven detrimental to the end user, that they would rather forfeit a 6-10% increase in performance, they don't have to stand for it.

    Open Catalyst Control Center and set the Texture Filtering option to High Quality, which, just like on nVIDIA, will disable all performance optimizations.


  • Skreea Muhammad

    Thanks to all the shenanigans NV PR pulls, this owner of a large LAN game center (we have over 200 gaming machines, with a mix of NV and ATI cards, and spend over $20,000 yearly on GPUs alone) thinks it's finally time we rid our systems of Nvidia products and go full-on with AMD. At the very least we can retire a few air conditioners and save a ton of electricity in the process. We cannot support a corporation that acts like children.

    In our 7 years of operation we have mostly used NV products, thanks to their superior build quality and frequent driver updates. However, the past few years have had NV staggering… they've dropped the ball on both build quality and driver updates. That bumpgate fiasco? Yeah, that hurt. With the number of machines we have, any issue is catastrophic. As our systems are leased, any issue causes downtime and loss of profits while we wait for replacements.

    When many of our cards started artifacting, some blowing capacitors, among other things, each one took a machine down and left us with lost profits while the system was out. Out of the hundreds of AMD cards we've leased, only a handful have failed. The same cannot be said for Nvidia: most of our G80/G92 cards have failed (70%+) with the symptoms listed above. Note that I did not mention Fermi, and I will not need to, as that chip's issues are well documented on most credible hardware sites.

    We have stuck with Nvidia through the good and the bad. However, it is time to say goodbye. NV isn't interested in fixing any of its own problems, and instead resorting to rallying up unknown, non-English sites to point dirty fingers at AMD is silly.

    Farewell and goodbye.

  • Sharky

    I read all the comments and it's really scary how ignorant people can be. The AMD AF optimization is not something you can barely see; it's a very noticeable IQ degradation. You barely see it in screenshots, which is why the Guru3D article is ridiculous. But you can see it during gameplay.

    Watch these videos:


  • Monkey Boy

    Like Nvidia's visible IQ degradation in Trackmania? Nvidia on the left, AMD on the right. Look at the sign and beams in the back, particularly the fat circle.


    Here’s the original screen, double click to zoom in:


  • bore_u2_death

    The longstanding rule has been that there should be no IQ difference between the Quality and High Quality AF settings, and both companies agreed that their Quality settings would have no optimizations that degrade IQ. So if A = B and B = C, then A = C.

    To find out what all this IQ "degradation" means to the end user, I conducted a poll in the HardOCP community, asking Radeon HD 5000 and 6000 owners for their opinions. Read on and find out their subjective opinions. As you will see, it doesn't amount to much.


  • Sharky

    The videos in the link I posted above show that the Radeons have the shimmering problems even at HQ (which NVidia cards don't). So the Q vs. HQ comparison does not really matter; even at the best HQ settings the AMD cards have vast IQ issues.

  • http://www.pcpowerplay.com.au Unco

    “And as Joe Eklund said, NVIDIAs drivers are the best and no nonsense.”

    It was only a few months ago that NVIDIA drivers were killing GPUs due to a dodgy fan controller. People have such short memories these days.

  • http://www.pcpowerplay.com.au Unco

    “I’ve never had a problem with an NVIDIA card. The only ATI card I’ve ever owned has frustrated me to no end. If they’re skimping out on quality, that’s the nail in the coffin. Give me better, stable quality over a few fps any day.”

    Better, stable quality? Did you not see the GTX 480 reviews at launch? Give me a break!

  • Sharky

    “Did you not see the GTX 480 reviews on launch? ”

    Please give a link to a GTX 480 review where they say a word about instability or image quality problems.

    You have none? Case closed.

  • Psycoz

    AMD Catalyst 10.12

    High Quality AF Mode is the new default setting for the 6xxx series
    New Catalyst Control Center with Explorer-style menu and shortcuts
    AMD branding
    Support for the 6900 series