The New NVIDIA TITAN X: The Ultimate. Period.

by Matt Wuebbling

It began with a bet.

Brian Kelleher, our top hardware engineer, bet our CEO, Jen-Hsun Huang, we could get more than 10 teraflops of computing performance from a single chip. Jen-Hsun thought that was crazy.

Well, we did it. The result is crazy. And, as of today, Jen-Hsun now owes Brian a dollar.

The new NVIDIA TITAN X, introduced today, based on our new Pascal GPU architecture, is the biggest GPU ever built. It has a record-breaking 3,584 CUDA cores.

We said our GTX 1080 delivers an “irresponsible amount of performance.” It was a bit reckless. But this is even more reckless.

So forget words. Here are its numbers:

  • 11 TFLOPS FP32
  • 44 TOPS INT8 (new deep learning inferencing instruction)
  • 12B transistors
  • 3,584 CUDA cores at 1.53GHz (versus 3,072 cores at 1.08GHz in previous TITAN X)
  • Up to 60% faster performance than previous TITAN X
  • High performance engineering for maximum overclocking
  • 12 GB of GDDR5X memory (480 GB/s)
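That FP32 figure falls straight out of the core count and clock: each CUDA core can retire one fused multiply-add (two floating-point operations) per cycle, so peak throughput is cores × clock × 2. A quick back-of-the-envelope check in Python, using only the numbers listed above (an illustrative calculation, not an official one):

```python
# Peak FP32 throughput = CUDA cores * clock (GHz) * 2 ops per cycle (FMA).
# The product is in GFLOPS; divide by 1000 for TFLOPS.
new_tflops = 3584 * 1.53 * 2 / 1000   # new TITAN X
old_tflops = 3072 * 1.08 * 2 / 1000   # previous TITAN X

print(f"New TITAN X: {new_tflops:.1f} TFLOPS FP32")  # ~11.0
print(f"Old TITAN X: {old_tflops:.1f} TFLOPS FP32")  # ~6.6
```

The same arithmetic suggests where the “up to 60% faster” claim comes from: the peak-FLOPS ratio is about 1.65, though real-world gains depend on the workload.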

Did we go too far? Your call. Just don’t call us crazy, or you might owe Brian a dollar, too.

TITAN X will be available Aug. 2 for $1,200, direct from NVIDIA in North America and Europe, and from select system builders. It is coming soon to Asia.

For more on TITAN X, see “A TITAN for a Titan: NVIDIA CEO Jen-Hsun Huang Presents New TITAN X to Baidu’s Andrew Ng.”

Similar Stories

  • Blockchains

    Taking price gouging to an extreme, I see.
    Good job guys.

  • Troy

    Titan X? Why not Titan P or Titan X2.

  • Douglas Boehme

    It’s called supply and demand. Your tears clearly can’t pay for the card so you might stop whining. I’ll be buying 2 when they come on sale.

  • Slipperyfetus

    $1200 and it isn’t even using HBM2? Yeah, pass.

  • Robert Mahon

    Very happy with my existing Titan X. But this… what are the power reqs like? With Pascal, probably very reasonable. This might be a fantastic card to SLI to get monstrous 4K gaming quality.


  • snee mgee

    Will there be as big of a shortage with the Titan as there was with the 1080? If I sell things to get it, it will be very frustrating to have it sell out within minutes.

  • ThisOtherGuy

    You know, I bought a Titan X last year, but seeing the EVEN MORE unnecessary price tag this year, I might pass. It’s one thing if it has HBM or some other “new” feature, but it’s an entirely different thing if it’s just last year’s Titan clocked 250 MHz faster.

  • uziwooshan

    If it had hbm memory I might have jumped the ship from the red team. Guess it’s vega for me then…

  • dellers

    $1200. The dollar has become incredibly expensive lately, so this is definitely not going to sell well outside the US. In any healthy market you wouldn’t increase the price 20% when the currency is rising. It will probably end up in a bad spiral, where these cards don’t sell that well, leading to even higher prices next time. A couple of generations more and Titans will cost a month’s wage.

  • dellers

    Pretty sure I read 300W somewhere, without overclocking. Not that reasonable.

  • bviktor

    Great name. Will be a breeze to look them up in webshops. Facepalm.

  • Mohadib Paul

    nVidia please stop the abusive pricing. We beg you long time………..

  • Tyrone /RIP Impala/

    Won’t surprise me if they release a 1080 Ti for a lower price & better performance a few weeks later

  • get mad about skins

    That’s not called supply and demand lol. Supply and demand is where a limited amount of items is introduced, such as old classic cars: low supply and high demand, that’s why they are expensive. There will be a lot of Titan X’s produced. I think you meant target market, so the market for this product would be people who buy high-end products.

  • Brian

    Yup, me too, time to retire my old Titan Xs

  • Brian

    It’s 60% faster than the old Titan X. Even if we SLI our old Titan Xs, the best-case scenario is we get only a 50% boost. I think it’s more than worth it

  • Brian

    Goodbye my Titan X, hello my Titan X.
    Weird, like getting a new girlfriend with the same name. Either way, I’m on board. RELEASE THE KRAGUL!!!

  • Anthony McCann

    it’s not price gouging, the Titan X is a luxury item for enthusiasts. why even complain about the price, you know what the name Titan implies by now surely?

  • Vega June

    May I ask you as a low peasant, what will you do with 2 new Titan Xs? Do you have like a monstrous 6*4k display setup?

  • Martin Parker

    So it is going to be called exactly the same as the previous Titan card. What an incredibly stupid idea. So you go to Amazon to order a Titan X and now you have to be *really* careful to make sure you get the right generation of the card. It doesn’t even have a different amount of memory FFS! What dumbass came up with that idea?

  • Martin Parker

    Why does the only single precision (SP) 11 TFlops speed get mentioned? What is the double precision (DP) speed? Is DP capable of 1/2 SP like Pascal is meant to deliver or is it 1/24 the speed of SP like the previous generation?

  • Jarix

    the price is crazy…oooops

  • awesome_farts

    What is hbm2?

  • Jarix

    High Bandwidth Memory (HBM2.0) – THE FUTURE (2017)

  • dsr07mm

    Just wait 1080Ti and let history repeat.

  • sluflyer06

    enjoy 2nd or 3rd place.

  • ThisOtherGuy

    I don’t think I could justify 5 teraflops fewer of compute over the Pro Duo for a $300 price difference.

  • FinnishSpartan

    I have an HTC Vive and would like to max out the graphics which at the moment is only possible with 2x TITAN X.

  • Vega June

    Oh, may I ask on what games and/or interactive apps?

  • hurin

    It’s marketed towards people with F you money. Even if you got a 3 monitor setup, it makes more financial sense to buy a 1080 now and wait for Volta.

  • Allie

    not a few weeks later
    more like a few months later

  • Allie

    a guy in nvidia who be like we just released 1080 now what for titan ???

  • Marcos

    You can find the correct version just looking at the price. Really, there’s no way you can miss that.

  • dellers

    When the Titan goes $200 up in price there’s no way the 1080 Ti won’t be significantly more expensive than the 980 Ti as well. At this rate next-gen Ti’s will probably be $10k, which quite frankly is a ridiculous price for a consumer card – especially considering that a lot of countries are struggling right now thanks to weak local currencies and a strong dollar.

  • ThisOtherGuy

    I don’t think you really understand how Pascal does its DP. While it does execute one DP operation across two SP registers, that comes with quite a bit of overhead.

  • Bryan Daly

    Titan P? For real? P? I mean I know it’s Pascal. Let me just say this:
    I just went to the bathroom for a titan pee.
    Titan X2 is considerably better, but the problem there is it may suggest a dual-GPU card like the Titan Z or GTX 590/690. AMD already uses X2 to denote just that. Maybe they could go with Roman numerals, “Titan XII,” but that’s not much better. People tend to drop Roman numerals for a digit anyway.

  • Matthew Morek

    Because clearly you don’t ever read specs or descriptions of items you buy online for over $1,000. Somebody, please take this gentleman’s wallet and put it in a safety deposit box, as he does not seem to understand how one should approach purchasing things online.

  • Scott

    Funny, all the people who parrot ‘supply and demand’ while never even thinking about it. Supply has nothing to do with anything. This card is an inelastic good; there are no substitutes. The demand at $1000 would be relatively the same as the demand at $1200, so that’s why they can get away with it. This would not be the case if AMD had offerings in this range. Demand would drop sharply the more the price is increased in that case.

  • uziwooshan

    Maybe in DX11 but surely not in DX12, where GCN is used to its real potential. Is there a problem in looking towards future tech? I currently use 2 R9 290s for 4K gaming and they develop ~9.3 TFLOPS together. Of course DX11 is only acceptable @4K, by which I mean ultra without AA. But the HBM2 bandwidth will help a lot more at higher resolutions than a 384-bit bus. So far Polaris has proved to work great as long as you don’t have an antique mobo. The gain in DX11 performance of the GTX 1060 is proportional to the price increase, so two successful products both. Meh, I’ve blabbered long enough, but this should skip some further replies.

  • Rob Ainscough

    Wonder how long it will take EK to come out with a full coverage water cooling block for this? Anyway, I’m on the notify list and will buy two when available … but I’m guessing Aug 2nd will be more like Oct/Nov before they are actually “in stock”. Anyone want to buy my old Titan X’s for $600 each?

  • Mo

    Dude get the Titan X. That Titan X is old and doesn’t play well with 8K.

  • sluflyer06

    the only place AMD has any real advantage since pascal came out is Vulkan, Nvidia has regained their lost ground in DX12 now and maybe that will matter if Vulkan ever gains popularity, right now its just 2 games.

  • YoYo-Pete

    They were tentatively going to stick with the 3-digit nomenclature and use GTX X80/X70 for the new cards, but then just ended up doing 1080/1070

  • Sovos

    Yes, and I’m sure there’s no one on the internet who will set the price of an old card to the price of a new card in the hopes that the buyer makes that assumption.

  • chizow

    Since Nvidia loves its 5 Wonders of Pascal campaign, here’s my take on this card. 🙂

    5 “Blunders” of GP102 Pascal:

    1) Only 12GB VRAM, sidegrade from original Titan X
    2) GDDR5X, not HBM2.
    3) Not full fat chip, 3584 on this chip vs 3840 fully enabled.
    4) 20% higher price than all other single-GPU Titans
    5) Marketed as Prosumer card again to justify premium

    To prospective buyers: remember the OG Titan looked great at first but suffered from many of these same deficiencies, and early adopters were later burned big time not once, not twice, but thrice with the release of the 780, 780 Ti and Titan Black obsoleting the original in less than a year from launch.

    Obviously I’m addicted to performance and Nvidia is the only option here, but there comes a time to say no, and now is that time 🙂

  • Mexor

    1) and 4) probably have to do with cutting into the market for NVIDIA’s $5000 Tesla cards because 5) is real, not just a marketing ploy. Note that this card was released at a deep learning gathering, not a gaming gathering. Don’t be so solipsistic.

    Are you sure about 3)? And even if true, so what? It’s a very large chip on a new process, it seems standard practice, not blundering.

    As far as 2), I guess that’s a price/performance trade-off decision relating not just to this card (the Titan X) but whatever other cards might be released using the GP102 GPU. For gaming, the memory bandwidth is presumably plenty. For compute usage, I am sure they would like more bandwidth, but perhaps, again, differentiation with the more expensive Tesla offerings might come into play here (though presumably the Tesla M40 will get a Pascal successor and if GP102 is used for that part then it too would be lacking the bandwidth offered by HBM2 unless NVIDIA is willing to make two versions of the GPU, one on CoWoS with an HBM2 memory controller and one on a standard process with a GDDR5X memory controller).

  • Martin Parker

    Don’t be an idiot all your life. Take a 10 minute break. Yes, of course I read the specs, but it doesn’t seem unreasonable to ask NVidia to differentiate 2 cards which are so similar in specs. Same name, same RAM, even similar looks. As I said you do have to be really careful to make sure you got the right one. I don’t know of any manufacturer who has used exactly the same name for 2 different cards, so why start now?

  • Erick Sitter

    Lol, Jay thought he’d be using 2 1080s, will he be using 2 Titans now? ^O^

  • Jonathan Leack

    The best GPU in the world. Great R&D Nvidia!

  • Jonathan Leack

    Yeah ok.

  • Jonathan Leack

    You have no idea how expensive developing graphics cards like this is. If they were only getting a $100 profit per sale, they would never recover their costs and investment.

    You’ll understand one day.

  • awesome_farts

    Thx lol

  • Martin Parker

    Oh really? There’s several of the old ones on ebay selling for > $1200. Seems there is a way you can miss that …

  • Mexor

    Please stop misusing terms.

  • Andrew

    People who buy high-end products. A small market. Low supply. The ratio of fixed costs (engineering, factory setup, QA training) to variable costs is much higher per card sold, which dictates a higher selling price if NVidia wants to break even.

    So you’re both right. They have a specific high-price target market, which is smaller, therefore there are *extra* costs associated with the low volume via supply/demand.

  • Mexor

    If the demand at $1000 is the same as the demand at $1200 then the card should probably be priced far higher than $1200.

  • Matthew Morek

    First of all, there’s only minimal visual similarity between the old one and the new one, largely in shape and size; the overall look and feel is different.

    Secondly, it’s easy to add a year of release to a product title in brackets, ensuring customers know what they are purchasing (e.g.: [mid-2016]).

    Clearly, NVIDIA wanted to preserve the TITAN brand name instead of endlessly fighting with gibberish codes, especially since there can only ever be one TITAN X. Every product gets its own model name and number regardless; I don’t see why there would be an exception in this case (e.g.: EVGA 12G-P4-2990-KR).

    Sure, an ambiguous name might be an issue, but it’s highly likely they will retire the current TITAN X cards shortly after the new one launches, making it easier for merchants to mark or discount them appropriately.

    There’s always a way. Besides, it’s just an announcement, without any concrete model numbers mentioned (or memory configs), so there’s no point arguing about something that isn’t yet available.

  • Kevin Price

    When do you think the evga sc’d edition’s will follow?

  • acrid56

    True. Cars work the same way- Altima, Golf, Porsche 911.

  • uziwooshan

    Gained ground due to strong HW, but DX12 favors the clustered system AMD uses whereas DX11 favors the system used by nVidia. You can read the technical details; they are too long to get into here. Just think that the Fury X had 8.6 TFLOPS on a 28nm chip, so the 1080 is not a big deal in terms of raw performance. DX12 will gain popularity, and async as well. Since GCN is fully async, it will benefit from it. And yes, devs will use tech like async due to the GCN architecture inside consoles. If you read long enough and in detail you’ll see that as new titles arise, nVidia will play catch-up. After all, it has always been like this: each dominates a period. Voodoo once dominated both ATI and nVidia in terms of performance. And they did it big time.

  • tyrionlannister

    Thanks a lot for refusing to talk about this until now. I contacted your support last month to help with my purchasing decision. I wanted to decide whether to purchase the 1080s or wait for an updated Titan. I was just asking if there was one planned, and as a second question, whether it would be released within the next year if it was planned.

    Your response:

    “We understand that many end users will always have interest in information on future NVIDIA products or technology. Unfortunately, we cannot discuss any future products or technology, or confirm whether we are or are not working on such devices.”

    So, I bought a pair of 1080s. I won’t be buying the Titan.


    You will have to sell your kidneys or your first-born in Oz for that … I might wait for the next-gen or might upgrade my CPU + MoBo and get a 3rd 980-Ti-STRIX … would be most likely faster than 2x Titan’s and cheaper as well …

  • Sebastian Paulus

    Add to that the 100% price increase over the 1080 for at most a 20% increase in performance, or even lower depending on its overclockability.
    Price per performance of the Titan X is abysmal.

  • FinnishSpartan

    GTA5, Skyrim and Fallout 4 for examples. I like to run mods too.

  • Matthew Karlsson

    Really Nvidia?

    I mean, I knew this was coming, and was planning for it, but I just received my 1080 last week, and haven’t even received the water blocks I ordered from EK in Slovenia yet… Now I am going to have to turn around and sell these at a loss.

    If you were going to launch the Titan X so soon, couldn’t you just have announced them together? Geez. Talk about stabbing your best customers in the back…

  • TheDarkSlay3r

    But can it run Crysis?

  • Dirk Broer

    11 TFLOPS FP32, nice, but how many TFLOPS FP64?

  • PublicStaticVoidMain

    titan pee FTW

  • chizow

    1 and 4 are marketing justifications to raise prices for those who are too myopic to see the truth of it. The original Titan X had literally ZERO “Prosumer” capabilities, was aimed strictly at gamers, and guess what? It was the most successful Titan X of all time. Removed link but search “kitguru Titan X sales”.

    Not to mention, there were 3 generations of Nvidia GeForce cards preceding the original Titan that still maintained most of their Prosumer capabilities (DP) before Nvidia started artificially neutering them into oblivion starting with Kepler, the same generation they hatched this Prosumer nonsense to begin with to justify a Titan-ic price increase.

    Yes, 3 is true. 3840 in its fully enabled form. Removed link but search “Nvidia blog P100 3840”

    And please stop making excuses that simply don’t hold any water historically lol. It’s a cut chip at an ultra premium, no need to sugar coat it. The 8800 GTX, GTX 280, GTX 580, GTX 780 Ti, Titan Black and Titan X are all examples of flagship chips that were uncut, full-fat chips from Nvidia. The very large chips on a new process that are cut back are typically reserved for Tesla or Quadro, where TDP matters more than raw performance.

    2) Is still a cut-back feature on a card that should be premium, with not only lower-spec memory but also lower memory density than expected.

    In summary, Nvidia is charging a 20% premium for a cut-down chip with the same VRAM capacity as its predecessor, while only offering a 40% best-case improvement over the already overpriced GTX 1080. I’m as big an Nvidia fan as any, but there’s simply no room to be a sycophant for products that simply don’t make the grade when you’re actually in the market to buy such products.

    Links removed due to page moderation trigger for links

  • chizow

    Agreed, price perf is always abysmal on titan level products, but you would at the very least expect all the bells and whistles and options from the start when you see that massive sticker price. The fact nvidia is holding back just shows there’s room for something much better. No thanks!

  • Vega June

    Neat-o, too bad I’m not really into VR, because I can barely afford a 1060, and I’m not really familiar with its requirements/possibilities… Anyway good luck!

  • Daniel Hinkle

    Sli support?

  • Russian Spy

    Titan Y

  • Russian Spy

    What if you’re wrong and they do release it here in a couple of weeks?

  • Mexor

    “1 and 4 are marketing justifications to raise prices for those who
    are too myopic to see the truth of it.”

    No. You don’t understand the market. You think it’s just a gaming card with a gimmick. It’s not. Search for “inside the gpu clusters that power baidu’s neural networks” and read the nextplatform article. Search for “Baidu Eyes Deep Learning Strategy in Wake of New GPU Options” for more discussion on the topic. Is it a coincidence that Jen-Hsun Huang presented the Titan X to Baidu Chief Scientist Andrew Ng and not to a game developer? I think not. (Search for “A TITAN for a Titan: NVIDIA CEO Jen-Hsun Huang Presents New TITAN X to Baidu’s Andrew Ng”.)

    “Yes, 3 is true. 3840 in its fully enabled form. Removed link but search “Nvidia blog P100 3840″”

    Yes I know how the GP100 is configured. But it’s pure speculation at this point to assume GP102 is configured with six GP104-type GPCs. It probably is, but it’s not definite.

    “And please stop making excuses that simply don’t hold any water historically lol, its a cut down chip at an ultra premium no need to sugar coat it.”

    The GTX 780 was a cut down GK110 and was NVIDIA’s flagship card for six months. Besides, what’s it matter to you if NVIDIA makes a 600 mm^2 die with 10% of the SMs disabled or a ~540 mm^2 die with all SMs enabled? It’s a silly arguing point.

    “2) Is still a cut back feature on a card that should be premium, with not only lower spec memory but also lower memory density than expected.”

    As far as gaming is concerned, the Titan X does not benefit from HBM 2 except for power requirements. The Titan X has 24% more FLOPS and 50% more memory bandwidth than the GTX 1080 (and 40% more cores with 50% more memory bandwidth in situations where that is the more relevant comparison). It’s not going to be memory bandwidth limited. As far as professional usage is concerned, it is exactly a cut-back specification. It’s meant to be, as I was saying in my previous message.

    “In summary, Nvidia is charging 20% premium for a cut down chip with the same VRAM capacity of its predecessory, while only offering 40% best-case improvement over the already overpriced GTX 1080.”

    First of all, comparing the card in one aspect to the Maxwell Titan X and in an entirely different aspect to the GTX 1080 to conclude the card doesn’t offer much is a logical fallacy. One cannot combine the Maxwell Titan X and the GTX 1080 into one card that has the RAM of the Titan X and the processing power of the GTX 1080 (in addition, you conveniently left out RAM bandwidth, which is not included in either of your cited cards and would be necessary to achieve the performance of the Pascal Titan X).

    But, perhaps more importantly, your claim that the Pascal Titan X is offering a 40% best-case improvement over the GTX 1080 is only true from a gaming perspective. The Pascal Titan X, however, is not a card that exists only in the realm of gaming. The GTX 1080 is not a good candidate for professional machine learning. It neither has enough RAM nor enough memory bandwidth. (It also has restricted FP16 performance, but it’s unknown at this point whether the Titan X does, too. My speculation is that the Titan X is fully FP16x2 enabled.) Therefore the GTX 1080 can be priced without concern for how it affects the way potential Tesla customers choose to spend their resources.

    The Titan X has enough RAM and enough memory bandwidth to be useful for machine learning. With it, NVIDIA must be careful not to incentivize professional machine learning companies into spending more money on interconnect hardware and developing software and algorithms to wire together many lower-margin Titan X’s rather than spending that money buying higher-margin Tesla cards from NVIDIA. Therefore, to protect their margins, NVIDIA must demand a larger margin on the Titan X to make it worth their while to provide the option should those companies be inclined to go that route with their algorithms.
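    The ratios thrown around in this thread can be sanity-checked in a few lines. A rough sketch, assuming the commonly published GTX 1080 figures (roughly 8.9 TFLOPS, 320 GB/s, 2,560 cores, which are not stated in this post) against the TITAN X numbers from the announcement:

```python
# Spec-sheet ratios: Pascal TITAN X vs GTX 1080.
# TITAN X figures come from the announcement; GTX 1080 figures
# (8.9 TFLOPS, 320 GB/s, 2560 cores) are assumed published values.
titan_x = {"tflops": 11.0, "bandwidth_gbs": 480, "cores": 3584}
gtx_1080 = {"tflops": 8.9, "bandwidth_gbs": 320, "cores": 2560}

def percent_more(a: float, b: float) -> int:
    """Percent advantage of a over b, rounded to a whole percent."""
    return round((a / b - 1) * 100)

print(percent_more(titan_x["tflops"], gtx_1080["tflops"]), "% more FLOPS")          # 24
print(percent_more(titan_x["bandwidth_gbs"], gtx_1080["bandwidth_gbs"]), "% more bandwidth")  # 50
print(percent_more(titan_x["cores"], gtx_1080["cores"]), "% more cores")            # 40
```

    Those outputs match the 24% / 50% / 40% figures quoted in the comment above.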

  • David Minster

    I have one reservation about this card: Microsoft. For the last month or so, Windows 10 only allows ‘bad’ (their label, not mine) drivers written or provided by Microsoft itself. I did get the native Nvidia drivers to work on my 1080 at the cost of almost breaking W10. So it is their drivers or no one else’s for now, a seemingly common problem as a quick Google will reveal. Also, the driver available is not the latest Nvidia driver, and GeForce Experience won’t work on the latest iteration of Windows. So, my question is: what is the point of getting the best video card ever if it isn’t supported by the operating system? A system that makes it very vanilla instead of the raging monster it should be.

  • LDKM Tech

    It should be GTX Titan Colossus or GTX Colossus.

    Titan –> Colossus —> Behemoth

  • LDKM Tech

    hence why they said: “We said our GTX 1080 delivers an “irresponsible amount of performance.” It was a bit reckless. But this is even more reckless.”

    You’re right for saying it’s people with excess amounts of money.

  • chizow

    lol the amount of apologist nonsense mixed in with blatant ignorance of the facts I’ve presented is appalling. Feel free to pay more for less, then act gobsmacked when Nvidia pulls a Kepler all over again and releases an upgraded Titan or even Geforce card in 6-8 months lol.

  • Mexor

    You didn’t respond to my points, you just made unsupported blanket statements. You probably didn’t even read the articles I sent you to in order to understand where my “ignorance” stems from.

    NVIDIA almost certainly are going to release a card about as fast as the new Titan X for gaming purposes, but much less attractive for professional usage, in six months or so. If you want a cheaper, high-performing card for gaming, then wait for it. Without the professional market this Titan X probably wouldn’t be released at this time at all.

  • Blockchains

    That’s why nVidia’s profit margins have been increasing for years. Because they’re barely making any money off of these cards – got it.

  • Blockchains

    Yes, thanks to idiots like yourself we have sky-high hardware prices. I still remember when you could get dual-GPU cards like the GTX 590 for $700 USD. Keep drinking the Kool-Aid and pay a 50%+ premium to look like an idiot in a few months when AMD or nVidia release superior hardware at a lower price point.

    I knew someone who bought the original Titan about a month before the 780 Ti came out, and the results were hilarious. Hilariously, the R9 290 (which launched at $400) now beats both, lol.

  • Blockchains

    The GTX 980 Ti was a slightly cut-down Titan X yet sold for significantly less. The nVidia game-plan is to release over-priced hardware, and then release much cheaper but ever-so-slightly less powerful hardware just a couple months later.

    It is easiest to extract the maximum amount of capital from those who don’t understand its value.

  • Douglas Boehme

    Your tears sustain me. Just for that, I’ll buy 3 of them.

  • Blockchains

    I can assure you the sum of the hardware in front of me is worth far more than a few Titan X’s. However, unlike a Titan X being bought by someone who doesn’t understand the value of money, this hardware is instrumental in my capacity to make a very good living. Go nuts and buy any number of Titan X’s you please. Three would suit you well as there’s almost no SLi support for 3-way configurations.

  • Àlfryan Irgie

    Hmm. Good package for white day.

  • Sean

    Curb your expectations. You think corporations will reveal their product plans to random people?

  • Stanley Scouten

    I plan on buying two as well once there are some decent water blocks for them out. First card that’s got me excited since the original Titan and 780

  • HauRock

    So is the 20% price bump you have been doing for your cards going to be the standard every year from now on? Because if so, I’ll be switching over to the red team.

  • Moltakfire

    You realise that was sarcasm, right?

  • Yeltnerb1

    No, you REALLY don’t.

  • DongJun Kim

    How about Titan NX(New X) or Titan DX(Double X)

  • delacroix01 .

    Titan XL?

  • blinkclaw

    What is Volta? Is that the next GPU project? I don’t recall anything with that name.

  • Blink Fast

    I’m getting automatic updates from nVidia once or twice a week for my 1070 through whatever the nVidia applet is called. Unless you have a pre-made proprietary system with modified chipset that only installs manufacturer drivers, I’m not sure why you have this problem.

  • TODO

    They have been doing it for quite some time now.

  • Henry

    Titan XP simple. Do it now.

  • Nathan Bullock

    Titan XP
    “Win XP with the Titan XP”
    “Level up your game with the Titan XP”
    (Credit to LinusTechTips)

  • himura

    For GPU renderers such as Octane, SLI is not necessary (actually it is detrimental), so that is not a valid argument.
    The new Titan X might make sense if the budget allows for it.

  • himura

    Rendering, you guys forget that not all GPUs are used for gaming

  • RailGunra

    No it’s “Titan :p” Way better than “Titan X”

  • RailGunra

    Are you on crack? Since when did the dollar get expensive? It’s lower than ever. $1200, or more likely $1400 here in Sweden, is not that much if you compare it to the price of the 1080, which is around $920. $490 more grants you the best GPU on the market while the dollar is in decline. It’s great actually! 😀

  • RailGunra

    Getting myself one of these, but I think it’s rather due to inflation. Though it is a big jump from $999 to $1200, it’s probably to avoid weird pricing like $1113 and adjusting it upwards until you reach $1200.

  • RailGunra

    You don’t need an excess amount of money to get it. I personally save $85 every month over a 3-year period, which equates to $3060. More than enough for a new rig with almost any parts you’d like.

  • hurin

    Why would you want a card with more FPS than your monitor can display?

  • Harawanagangsta

    Titan XP you fools!

  • Karim

    Both supply and demand have functions of elasticity. That’s what the phrase “supply and demand” is about, haha. And a monopoly — “there are no substitutes” — is a supply problem, not a demand problem.

  • Curtis Allen

    All the people complaining about the price: you realize this is an enthusiast GPU. It is actually an enterprise (business) GPU built for extreme high-end 4K video rendering and other complex compute tasks, but it can also be used for gaming if one so chooses. Universities also use them for film production and editing classes. The more CUDA cores, the faster it renders. This is also why last generation’s 980 Ti will render video faster than a 1080. That is what the last generation of Titan X was; this is no different.

  • Mike Lopez

    Wow lol are you being serious? That’s super rude.

  • Mike Lopez

    Looks sleek! Too bad I don’t want to spend 2400 plus tax on two…but more power to those who will. Lol literally more power!

  • sluflyer06

    Hi. You must be new here. Welcome to the internet. Also, yes, very serious about that. AMD comes out with OK products, but when it comes to the absolute fastest, they just cannot beat Intel or Nvidia, which is why they focus on the value crowd.

  • Aaron Jones

    Titan XP would have been the best thing to call it… It’s pretty confusing to name something the same thing as one of your other products lol.

  • Diablo81588

    The new one will only be sold on nVidia’s website.

  • David Minster

    I built the system myself, MSI Gaming motherboard Z170, I’ll play around with the drivers that I have installed for the MB – thanks.
    Okay, all MSI motherboard drivers removed, I rebooted into a low res environment and downloaded the Geforce experience, everything Nvidia installed perfectly thanks 🙂

  • Carl Sanders

    Well, yeah at 1200 bucks it HAS to be the best consumer card.

  • Mike Lopez

    I understand this is the internet lol but there is no reason to take it as a pass to be rude and condescending. My rule is if you wouldn’t say it to someone’s face or be that way with someone face to face, don’t say it. The Internet brings out the worst in people, no need to be a part of the problem. Also AMD is fine, I just wish they did a better job competing.

  • The Noble Robot

    It might not be impossible to differentiate, but there is pretty much no reason for them to have invited the potential confusion. I know I’m already having a hard time finding news about it on Google since the results are crowded out by press releases and blog posts about last year’s model (they even have similar “best ever” “most CUDA cores” “crazy expensive” headlines).

    I’m hoping to get one of these water-cooled, either pre-assembled from a vendor like EVGA or with a reference-design-compatible waterblock from EK, and I already know it’s going to be annoying finding any information about that online in the coming weeks/months.

    I mean, back when, NVIDIA could have called every card in this series “Titan (20XX),” but they didn’t, yet now they are changing the naming scheme for seemingly no reason. I mean, it’s not just that they used the same name, it’s that they used the same “sub-name,” which changes the meaning of the older name, too. The X *was* synonymous with the 900-series 2015 version of the Titan, and now it’s not. If they wanted to change the naming scheme, just calling it “Titan” would have made more sense here.

    So yeah, not the end of the world, but an objectively dumb marketing decision nonetheless.

  • Socius

    Will there be pre-orders prior to August 2nd? And is PayPal accepted or do I have to put this on my credit card?

  • lightking813

    welp better start saving

  • Gamedick

    literally ANYTHING but titan “x” would be better

    it’s really dumb, you must see that

    maxwell titan x cards are going for $1800 on newegg right now

  • RailGunra

    Downsampling 8K looks amazing, especially at 144Hz or higher 🙂

  • Maxbad

    Is it really the ultimate, though? I bet they’re gonna release a Titan X Black or something with HBM2 and a full-sized GP102 for $2,000 or so next year.

  • RyuTakeru

    K, so maybe this has been answered, but I wanna ask it anyway: what’s the release date for the Titan XP?

    Not that this matters too much; just kinda wanna slot 2 of them into a workstation build I’m working on.

  • princepwnage

    just call it Nvidia GTX Titan X 2016

  • JohnnyBenned

    On German sites the release date is August 2nd 😉

  • JohnnyBenned

    Titan X² would be cool 2

  • JohnnyBenned

    Titan X² pls ?

  • JohnnyBenned

    I’m just so confused at the moment. Should I wait for the 1080 Ti or just get one Titan X for a 3440×1440 display?
    How much will it cost in the EU?
    This price tag is so high…

  • 1011101001001

    I agree. The fact that this does not use HBM2 means I will pass. Hopefully AMD has a beast card coming out to compete with it. I was really wanting to buy the 1080 Ti, but it will just be a lesser version of this.

  • Alexandre Gauthier

    Great idea man. Will try to do that when i start working

  • Slipperyfetus

    Yeah man, if I were you I’d certainly wait it out. I think the 1080 Ti will probably use HBM2, but it’ll be at least 6+ months down the road and cost $1,200+. I wouldn’t be surprised if they released a Titan 2.0 at the same time.

    AMD will definitely be forced to bring out a well-priced card using HBM2; I feel that’ll be the best cost/performance buy.

    It’ll be worth the wait man 🙂

  • Vega June

    When it comes to rendering, there are GPUs made for that, like the Quadro model line.

  • Vincent Hung

    Everyone here is missing the obvious gold mine name. A Titan card based on the 1000 series? T-1000

  • Xander Hawkins

    Will water blocks for custom loops be made, even by third party companies?

  • Blockchains

    We’re not exactly talking about a guy who’s in industry, doing rendering of any kind, lol.

    (I’m fully aware that these GPUs can be used for non-gaming purposes, I own many graphics cards myself)

  • ThisOtherGuy

    ‘60% faster’? The other option is to grab another Titan X once they hit $500, or, even better, grab some AMD cards.

  • Dirk Broer

    I rely on FP64 performance, and FP64 throughput is decreasing with each new generation. I see it is only 317 GFLOPS (343 when boosted). With the old (2014) Titan Z it was 8,121.6 GFLOPS FP32 vs. 2,707.2 GFLOPS FP64, a much better ratio.
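
    For what it’s worth, those figures follow from simple arithmetic; a rough sketch in Python (the 3,584 cores and 1.53 GHz boost clock come from the article above, while the 1.417 GHz base clock and the 1-in-32 FP64 unit ratio for GP102 are my assumptions based on public spec listings):

```python
def theoretical_gflops(cores, clock_ghz, flops_per_cycle=2):
    # Each core can retire one fused multiply-add (2 FLOPs) per cycle.
    return cores * clock_ghz * flops_per_cycle

fp32_cores = 3584
base_ghz = 1.417   # assumed base clock
boost_ghz = 1.531  # boost clock from the article (1.53 GHz)

# Assumed GP102 ratio: 1 FP64 unit per 32 FP32 cores.
fp64_cores = fp32_cores // 32  # 112

print(round(theoretical_gflops(fp32_cores, boost_ghz)))  # ~10974 GFLOPS FP32
print(round(theoretical_gflops(fp64_cores, base_ghz)))   # ~317 GFLOPS FP64
print(round(theoretical_gflops(fp64_cores, boost_ghz)))  # ~343 GFLOPS FP64
```

    Which lines up with the 317/343 GFLOPS FP64 numbers floating around, and with the ~11 TFLOPS FP32 headline figure.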

  • Privat Privat

    Why not titan Æ Ø and Å 😡

  • Privat Privat

    You mean titan X1,6 🙂

  • Tonči Jukić

    And then, for some odd reason, you say **ck you to all your customers that don’t have a country-specific webshop. “No, we won’t sell it to half the EU”.

  • KingK76

    You may think it is a “small increase in performance” but that would be wrong. The reality is that the Pascal Titan X will FINALLY allow new AAA games to be played at 4K resolution and at a vsynced 60fps. The GTX1080 is an amazing card but you can’t play games at a locked 4K@60fps. Even using SLI isn’t a great option anymore as major game engines like Unreal Engine 4 don’t even support SLI. I always buy 2 cards when it’s time to upgrade GPUs (I have dual 7900GTX’s, 8800GTX’s, GTX285’s, GTX480’s and GTX980’s) but now for the first time I can purchase a single card and get the performance I need (I game on a 70″ 4K Vizio P Series…). I will watercool this bad boy and OC it for an additional 20% performance on top of the 30-35% increase it will already have over the GTX1080. I can’t wait.

  • Blasted

    Oh yeah? Still no async compute? “Irresponsible amount of performance.” Yeah, irresponsible is right. I wasn’t impressed with the 980 Ti’s “DX12 Ready”; I am also certain this card will miss the mark. You lost me this gen. Try again some other time when you get around to working with the new APIs instead of brute-forcing “irresponsible” numbers like AMD tries with their CPUs. AMD succeeds this gen in their GPU dept. Why? It isn’t about “high raw irresponsible numbers”; it’s about the correct instruction sets first AND THEN speed. We don’t need “preemption” any longer when async compute does everything all at once and the GPU takes the load off my CPU, FOR ONCE.

    You’ve sunk very quickly, NVidia. Quit being lazy in your hardware-to-software development.

  • Blasted

    Dat Red team’s DX12/Vulkan performance though; sooo good! Tears won’t be shed when I jump to team red, baby! woowoo!

  • uziwooshan

    Red team realised early on that their HW hit the market a bit too soon and that devs wouldn’t really use multi-core systems. Hence why, with DX12/Vulkan, GCN takes the lead. I got 2 R9 290s for 4K gaming, and even though the only game I tested in DX12 was TW: Warhammer, I noticed around ~19 more fps on average than on DX11.

  • uziwooshan

    Sooo you’re telling us that SLI started losing support during this transition to DX12/Vulkan while Crossfire is still going strong?

  • Taylor Stoll

    I doubt it. But maybe… Who cares, though, as long as the performance is there? This card with a 10% overclock on the memory will be at around 530 GB/s of memory bandwidth using GDDR5X. And the GPU is 11 TFLOPS without an overclock (easily 13 TFLOPS with an OC). This card will allow 4K@60fps. That is what I’m after. I wouldn’t care if they were using GDDR3 to achieve it. I think it is silly to walk away just because it isn’t using a tech that won’t even be available until 2017, and that will be of only small consequence to performance (it’s the GPU that gives us the power; the memory is only an issue if it is holding back the GPU… which at ~500 GB/s it won’t be). Think about it.
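
    The ~530 GB/s figure is just the spec-sheet arithmetic with a 10% bump; a quick sketch (the 10 Gbps per-pin GDDR5X data rate and 384-bit bus are the published numbers behind the 480 GB/s in the article):

```python
def mem_bandwidth_gb_s(data_rate_gbps, bus_width_bits):
    # bandwidth = per-pin data rate * bus width, with bits converted to bytes
    return data_rate_gbps * bus_width_bits / 8

stock = mem_bandwidth_gb_s(10, 384)     # 480.0 GB/s, the stock spec
oc_10pct = mem_bandwidth_gb_s(11, 384)  # 528.0 GB/s with a 10% memory OC
print(stock, oc_10pct)
```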

  • Taylor Stoll

    Have fun then….

  • Taylor Stoll

    Super rude???? He’s telling it like it is. How is that rude? Nothing about what he said was “rude”; maybe a little cheeky, but not rude… My god… Mr. Sensitive over here… lol!

  • uziwooshan

    Well, I will have fun with DX12 and Vulkan games, which will soon start coming out more and more…

  • Blasted

    Yup, I have Total War: Warhammer and a 980 Ti, and I am disgusted by my performance, even in DX11. I am so sick of NVidia and their lying crap marketing schemes, just like AMD touting “higher GHz is better” on CPUs, which is not the case. Phew! At least they got the cards right! Awesome for you, dude! I can’t wait to get an AMD GPU. I hope NVidia learns from this.

  • uziwooshan

    I got 2 R9 290s and an FX-8120 OC’d at 4.0 GHz. DX12 does give me about a ~19 fps boost on avg @ 4K.

  • uziwooshan

    Actually it’s not fanboyism but personal experience with both brands. Micro-stutter on Crossfire? Yep, true if we’re talking about CF in 2013–2014… As for scaling, everyone knows CF always scaled better. Umm, what were you referring to? So you comment from multiple accounts? Also, there is no engine-wide support for CF/SLI, but devs can implement them in their own games. Alas, you should improve your judgement. I own devices with components from all 3 teams (R-G-B xD), and so far, for 2 years in the GPU department, my CF AMD drivers have been a lot more stable than the GeForce ones on my notebook (true, it only comes equipped with a GTX 960M). I also had nVidia GPUs for desktop in the past.

  • Mike Lopez

    It’s a matter of perspective, actually. What one considers rude might not be seen that way by another. I understand that; however, there is no reason to put someone down based on something like GPU preference. And yes, he did put that guy down by saying what he said. I am not sensitive, but I think people ought to be more respectful to one another, period. The internet has given rise to a special breed of people that don’t have to answer for what they say.

  • Slipperyfetus

    That’s what you’re after? Well, that is certainly good to know! I am super duper stoked for you.

    Who said anything about walking away? I haven’t even walked up to it; it’s not even out. I, personally, won’t be purchasing it and would prefer to pick up a much cheaper card next year that uses HBM2.

    But hey! You go get ’em, tiger 🙂

  • othertomperson

    If a game engine does not support multi-GPU, it means it supports neither SLI nor Crossfire. Similarly, if a DX12 game supports any of the various forms of multi-GPU, it supports multiple GPUs from either AMD or Nvidia (or a mix of both). You are being called a fanboy because you are saying that multi-GPU is losing support for Nvidia while gaining support for AMD, when in reality, when it comes to DX12, there is no distinction anymore.

  • uziwooshan

    Technically, AMD started investing more into the multi-GPU part of their drivers while nVidia isn’t investing as much in SLI as before. For CF I can confirm it, as I have 2 AMD GPUs. For SLI I can confirm it, as a good friend has 2 970s.

  • othertomperson

    It isn’t driver-based at all with DX12. It’s the dev doing the work; AMD/Nvidia is irrelevant.

  • uziwooshan

    I was referring to pre-DX12. There will be more DX9 and DX11 games before DX12 games appear in larger numbers.

  • Anarchy Bunker

    What a silly post. The obsession with a single feature, async compute, is hilarious. What matters is the output. How they accomplish that is irrelevant. You’re throwing a tantrum because each GPU does it slightly differently. Why are you not raging about how crap AMD was at implementing DX11? Have a good cry, then get over it. Lastly… this gen? AMD has been using *this gen* for ages.

  • Blasted

    It’s not hilarious; it’s pretty serious, actually. Without good instruction sets down at the GPU level, “brute force” means nothing, NOTHING. No, I am not throwing a “tantrum”; I am just displeased by their apparent lack of API compatibility and crap architecture. You can keep your falsely advertised piece of hardware, thank you. I know I am having a “good cry” of laughter at your ignorance, ha! You know nothing of what is going on, do you?

  • marsbound2024

    Please tell me what number they should have used for the “X” in front of “80” or “70”? They went all the way up to 9. Last I checked, there are only 9 digits they could have used, unless you want to use a 0. Am I missing something from your post?

  • Taylor Stoll

    Fair enough then… I can’t wait for midnight! Mind you, it probably won’t actually be on sale until tomorrow afternoon. But I will be on Nvidia’s site right at midnight!

  • Taylor Stoll

    So will I. And I GUARANTEE that Nvidia will still perform better than AMD in DX12 and Vulkan games, just like they do now. You may have to pay for it, but they still EASILY have the lead in performance. I don’t care how they get there… as long as they are the best. And that they are. Hands down.

  • uziwooshan

    I always switch sides based on what suits me best. So far, for 4K I’ll stick with my 2 R9 290s, as Pascal (and most likely Vega as well) won’t be capable enough for 4K. But most likely I’ll stick with AMD, as the GCN architecture is better suited for DX12/Vulkan. The bonus of choosing them is that they also deliver hardware for consoles, so devs who started developing their games from 2014 onward might be more inclined to optimise for the dominant platform (taking the high number of console peasants into account). Now, of course, this is mostly assumption, as both teams have great tech with their new generations, and so far only nVidia has shown their big guns. If AMD hit it right with Vega and drivers (ffs, they should focus more on speed than stability, like nVidia) we might have a new winner (the Fury X had 8.6 TFLOPS, which kept it close to Pascal in terms of raw performance). But if AMD screw it up with Vega, we might see them go bankrupt, with Intel denied the x64 and multi-core patents and nVidia denied GDDR5, 5X and HBM for at least 2 years, until someone else buys AMD’s patents and renegotiates them with the blue and green. Meh… part of me would like to see AMD fail this badly, just to sit back and enjoy the chaos left behind.

  • NoMoreWar

    Exactly! Just bought the one that was released last year thinking it was this one. Now I have to return it. What a stupid idea to give it the same name.

  • NoMoreWar

    I am one of those suckers; I bought one for $1,300 thinking it was this one. I will return it.

  • NoMoreWar

    Why not Titan Trump?

  • Gamedick

    sorry to hear that…you getting a full refund?

  • Sam Hain

    Am curious how my… (gulp) OC’d Gigabyte GTX 980 Ti Xtreme Gaming Edition Windforce (2-way SLI) setup will fare (or fail) against this BEAST??? Any guesses?

  • Sparhawk122

    When are brand variants releasing, and when are stores other than NVidia’s product page going to start getting them? I want one and would get one. However, I want an Asus variant, because my Asus motherboard will then be able to detect it and I can use my Asus mobo software to keep track of it instead of having to use two programs.

  • Sparhawk122

    It’s still a single chip GPU…..

  • Iluv2raceit

    “Much cheaper” answers it all. You don’t even deserve to comment if you can’t afford it.

  • Iluv2raceit

    That’s super honest.

  • Slipperyfetus

    Haha you seriously can’t be that stupid.

    ‘I waste my money on dumb overpriced gear because it’s brand new, even though it will no longer be the best in 8 months’

    Nah good one Marrrd Racer!

  • Slipperyfetus

    Also, since you’re trawling the comments of an article that appeared 10 days ago.. I’m going to correctly assume you were late to the party.

    Surely you’d already be packing quad X’s since you ‘can’ afford it, why are you here? Shouldn’t you be smashing 8K 3000+ fps rich guy?

  • DavidMoreau

    Selling or on sale? Don’t confuse listings with actual sold items.

    Regardless, someone in the comments already admitted to being fooled.

  • himura

    Yes, and you need to pay 4 times the price with no significant improvement. Quadros are more stable, but they share the same architecture (and sometimes even the same GPU) as some of the gaming models. Most indie artists stick to gaming GPUs for rendering, and Nvidia knows that; even some studios rely on those models as well. So… GTXs and Titans are real-time render devices; what you use them for is irrelevant.

  • himura

    That is irrelevant. What matters is whether the product hits the target audience, which in this case is not only gamers.

  • Blockchains

    I personally knew no professionals who used the first Titan X, and I can imagine it was only relevant to those who require a massive frame buffer on a budget. For CUDA work most professionals I know used either 980 Ti’s or Quadros. Since the 1080Ti is taking a while to release this time, it might be a bit of a different story, but the original Titan X was without a doubt a cash grab by nVidia.