New NVIDIA TITAN X GPU Powers Virtual Experience “Thief in the Shadows” at GDC

The annual Game Developers Conference just opened with something big.

Weta Digital, Epic Games, Oculus and NVIDIA unveiled this morning an extraordinary virtual reality experience called “Thief in the Shadows.”

The product of a unique collaboration, “Thief in the Shadows” pushes VR to new levels. Powering the experience is the most advanced GPU ever built: the new GeForce GTX TITAN X, introduced by NVIDIA CEO Jen-Hsun Huang in a surprise appearance with Epic founder and CEO Tim Sweeney at Epic’s keynote this morning, before a packed house of 300 at San Francisco’s Moscone Center. (See also “NVIDIA Opens PhysX Code to New Developers.”)

NVIDIA TITAN X: If Smaug hoarded GPUs, instead of gold, we think the old dragon would pick up this one.

Tim said the new VR experience is so demanding that it requires graphics capabilities beyond anything currently available. “Does anyone have any ideas how we can do this?” he asked the crowd.

A moment of silence. Some rustling. Jen-Hsun quickly emerged from the back of the room, saying he had one. Unwrapping a concealed box artfully labeled TITAN X, he unveiled the company’s new top-end GPU.

He was cagey about details, saying they’d be revealed in full in two weeks at NVIDIA’s GPU Technology Conference. But he said TITAN X, built on the company’s Maxwell architecture, has eight billion transistors, a 12GB framebuffer and took thousands of engineer-years to build.

Smaug is quite a beast. So is the new Titan X. ©2014 WARNER BROS. ENTERTAINMENT INC. AND METRO-GOLDWYN-MAYER PICTURES INC.

“It’s the most advanced GPU the world has ever seen,” Jen-Hsun said, as he presented Tim with the company’s very first production unit.

In “Thief in the Shadows,” gamers wade through acres of coins, pass between stacks of gold ingots and explore subterranean crevices all guarded by an enormous dragon that has no intention of giving up any of his hoard.

“Thief in the Shadows” was created by the Weta Digital talent behind the Hobbit movies. The VR experience runs exclusively on Oculus’s “Crescent Bay” prototype, leveraging the muscle of a TITAN X GPU to deliver the jaw-dropping experience at a smooth 90 frames per second.

NVIDIA CEO Jen-Hsun Huang presented the first production version of our new NVIDIA TITAN X to Epic CEO Tim Sweeney at GDC Wednesday.

“When you come face-to-face with an enormous dragon, that experience has to be believable, visceral and emotional,” said Alasdair Coull, head of R&D at Weta Digital. “NVIDIA’s new TITAN X GPUs provide the platform to deliver exactly that.”

“‘Thief in the Shadows’ would not exist without NVIDIA’s support and amazing hardware,” Tim said. “Together with Unreal Engine 4 and Oculus Crescent Bay, these three pieces of technology place the viewer inside a virtual world of unparalleled detail and action.”

In “Thief in the Shadows,” viewers won’t just see the VR world’s visual richness. They’ll hear it, too, thanks to spatial audio effects that Epic integrated into the experience, driven by the Oculus Audio SDK. The result: viewers will feel as if they’ve really explored a dragon’s lair and come face to face with Smaug himself.

Epic’s latest jaw-dropping demo, Kite, called upon the power of NVIDIA TITAN X (see “NVIDIA Opens PhysX Code to UE4 Developers”).

“Oculus continues to work closely with Epic and NVIDIA to deliver incredible, immersive virtual reality to gamers everywhere,” said Brendan Iribe, CEO of Oculus. “The advancements in NVIDIA’s TITAN X and Epic’s Unreal Engine 4, combined with the latest Oculus Rift hardware and innovative Oculus SDK features like asynchronous timewarp and late-latching, enable developers to reach new levels of performance, comfort and presence in VR.”

GDC attendees can see “Thief in the Shadows” for themselves exclusively at the NVIDIA (#1016), Epic Games (#1024) and Oculus (#1224) booths on the show floor.

More coverage of our announcements at GDC: “Latest PhysX Source Code Now Available on GitHub” and “Why We’re Building SHIELD, the World’s First Android TV Console.”

Similar Stories

  • IdiotTormentor

    Holy cow, this is insane!

  • DeNeDe

    where can i see the recorded presentation ?

  • abc

    Now let’s make a SLI system with the Titan X.

  • AverageGamer

    please dont let it be titan z

  • Guest
  • Terrance Jackson

    Where is AMD in all of these NVIDIA product announcements?

  • FckNvidia

    Nvidia is tottaly overpriced…

  • LocutusEstBorg

    AMD has nothing. They just play catch up like Android and then fail to actually catch up.

  • BleachedSleet

    …you mean the same Android currently powering NVIDIA Shield?

  • Terrance Jackson

    It makes no sense. How can you let your competitor just leave you in the dust like this with a console yesterday to a flagship beast GPU today?

  • Len

    Yea, because Nvidia has their GPU’s in all 3 consoles….GO NVIDIA! Oh….wait

  • Terrance Jackson

    The consoles arent exactly a great example with 900p/30fps games. Shield is doing 1080p and 4k @ 60 fps

  • LocutusEstBorg

    Everything worthwhile in Shield’s Android was done by nVidia.

  • Seth Forbus

    That is why Android is great. Standing on the shoulders of giants. Anyone can take Android and do whatever they want with it. Its an amazing platform. Some developers like to pile on bullshit on top (which is probably why you hate Android) but stock Android is a robust system with the ability to be extremely customized. Its also built upon linux, which has given it a strong developer community.

  • LocutusEstBorg

    Android’s great for these applications. My original comment was a comparison to iOS as a smartphone platform, compared to which Android far behind in terms of quality and actually useful features.

  • Matt

    Ah, the age old Android vs. iOS debate. iOS with it’s additional keyboard support from the start, ability to send files over bluetooth and widgets! Huzzah!

  • LocutusEstBorg

    All of those were useless or poorly implemented till iOS implemented them.

  • Matt

    iOS doesnt have widgets, nor the ability to send files over bluetooth… I guess that is quality implementation 😀

  • Chris Ashmore

    If paying $1000 plus for a single gpu is what you are into then be my “guest”

  • amnesia
  • Vortex

    I’ll never afford a good gaming computer with one of these in it

  • Anthony112409p

    Nvidia’s Titan series has always had ‘dual’ gpus. Both the Titan and Titan Z were dual-gpu. That is why they take up double the space on a motherboard. It would be an extreme about-face if they turned around and made the X a single gpu.

  • malcmilli

    depends on your needs. If you got the cash, then what’s the big deal. And if you’re using it for work purposes, then it could be quite the sound investment.

  • Seth Forbus

    Ok. When you can start an FTP server, run adblock, or unarchive .rar files on the go then we’ll talk. Do you not remember that IOS didn’t even have copy paste functionality when it was new? I was so glad when I jumped from an iphone 3G to an Android device ages ago. Never looked back. The power just isn’t there. There have been some rough android phones over the years, but the platform as a whole is rock solid.

  • malcmilli

    i hate blind comparisons like that… yeah, my phone can play Pokemon Red at 1080p @ 40 frames per second, but come on, it’s NOT the same thing as trying to run Crysis.

  • malcmilli

    I understand somewhat when you refer to quality, but iOS was always behind in features. It was pretty much what your decision is based on to if you should like iOS or Android… do you prefer polish to the T or do you prefer more useful features.

  • Glenn615

    titan x ? come on NVidia … expected something better .
    -I’ve seen the specs and it fails to impress. 390x will kill this card .

  • Glenn615

    derp … double the space eeh ? … my 5870 takes up 2 spaces too..

  • Gigahurts

    Actually, the titan z was the only dual GPU titan released (it was a triple slot card as well). Not sure where you got your facts from.

  • david spagnolo

    I hope so.

  • James Wassall

    “I’ve seen the specs”
    So you work for Nvidia and leak inside information?

    The 390x won’t kill this card. The only card that could kill the Titan Black was the 295×2, then the Titan Z came along and stole the top spot and is very reasonably priced at the moment, all things considered.

  • Anthony112409p

    Relax. You’re right lol

  • Anthony112409p

    Any higher-end GPU will take up two spaces. The Titan Z took up 3 🙂

  • Black Light Shoots

    8billion transistors don’t impress you? The Z only had 7million.

  • Chris_Is_Retarded

    why did you even put guest in quotations. This is the stupidest thing ive seen all day

  • Iosisforfags

    Look at this little Ifag… mad cause his iphone and ipad only has 1 gig of ram

  • himura

    exactly, if you use it for work then perfect, that is what I will do.
    But no game can squeeze the power out of this baby

  • Jake – Klikkit

    Who needs an Octo-core CPU with 3GB of RAM in a phone when the iPhone with 1GB RAM beats it in the benchmarks and day-to-day performance?

  • Jake – Klikkit

    Managing FTP servers and archived files can be achieved with apps on the app store.. adblock can be achieved with a 5 minute jailbreak.

    Android didn’t have copy paste when it was first released either.. what’s your point?

    If your last experience is an iPhone 3G, I can understand why you’re so misinformed, but still..

    Who cares about power when a 1GB iPhone 5S nearing it’s second birthday beats an 8-Core 3GB Android flagship in benchmarks and day-to-day performance?

  • Jake – Klikkit

    More useful features to me = better implementation of the features

    So far no Android device has nailed the fingerprint sensor like Apple has

    Spotlight doesn’t really have a comparable Android equivalent

    iMessage and Handover are brilliant

    And Apple still manage to consistently incorporate one of the best camera sensors into their devices year in, year out.

    Again, it’s not necessarily having more features that wins it for me, it’s perfecting the features that I’ll actually use, which Apple have nailed.

    ApplePay will be another example.. Google wallet had a chance but now Samsung have damaged the market reach with SamsungPay, it’s only going to harm mobile payments on the Android platform by saturating the systems and confusing end-users. Samsung don’t have the smartphone branding power that allows Apple to just introduce an already existing feature (NFC payments) and totally dominate that market.

  • Jake – Klikkit

    My PS4 plays titles at 1080p / 60FPS, why are you PC elitists so picky about the comparisons you make?

  • Tom_Jiang

    I have a brand new EVGA TITAN BLACK SC, with case, and everything is brand new because I have no computer… so, may I trade it for a TITAN X? I’m from China.

  • Anaron

    The GTX Titan Z had 2×7.1 billion transistors which is considerably more than “7 million”.

  • Seth Forbus

    Not managing servers. HOSTING THEM.

  • Anaron

    The 295X2 had 90%+ the performance of the Titan Z at half the price. It even beats it in some games at 4K (e.g. Battlefield 4 and Crysis 3).

    Anyway, I highly doubt the 390X will outperform the Titan X. That card is obviously meant for the ultra high-end enthusiast market. The 390X won’t cost more than $1K but the Titan X will likely cost at least $1.5K.

  • Seth Forbus

    Law enforcement can force you to unlock a phone using a fingerprint, but not a pin or password. Its a useless feature for real security, just a trick to impress your friends. Google it.

    Imessage only works well if all of your friends have imessage, otherwise it causes problems. There is currently a class action lawsuit against apple because after switching from an iphone to another device, users could no longer get sms from their iphone friends. Google it.

    There are many android and windows mobile devices with better cameras. Google it.

    Google wallet has worked perfectly for a long time without issues. How is applepay better? Do you have any real examples based in fact why its better?

    You are a fanboy. I get it. It’s cool. But don’t spout opinion as fact.

  • Anaron

    It’s overpriced. You can get very good performance at 1080p/1440p with current cards.

  • Anaron

    Actually, it likely won’t.

  • Anaron

    You shouldn’t take it personally. The truth is, a lot of next-gen console games are rendered at sub-1080p resolutions and can’t achieve 60 FPS.

  • WildmanSteve

    R9 295 X2 beats Titan Z. Not sure what benchmarks you’re looking at?

  • Jason

    that is a very very small list of titles that are true 1080p 60fps.

    I play at 1440p (Double 1080) @ 120hz (120fps) … so the console experience is quite far behind

  • Jason

    This was all because of AMD willing to make the cheapest chip sacrificing performance.

  • Jake – Klikkit

    I don’t take it personally, I passed the PC gaming phase and occasionally find time to casually game on my PS4 nowadays.

  • golephish

    oh no PC master race.

  • Jake – Klikkit

    1440p is also more like 1.4X 1080p, but I get your point, I have a 5K display which boggles my mind in regards to the math involved.

    PS4 Outputs at 120FPS too.

  • Jason

    No. 2560×1440 is exactly 2x the pixels of 1080p. Ps4 games are all vsync’d to 60fps, and if you have a ps4 you sure don’t have a 5k tv

  • Jake – Klikkit

    “useless feature for real security” – Nope, allows me to comply with the password complexity restrictions put in place by Microsoft Exchange whilst allowing me a near instant unlock with minimal input.

    I don’t need to Google the iMessage thing, it’s so old it’s not even worth mentioning anymore, especially since it was resolved over a year ago:

    I didn’t say there weren’t many Android and Windows devices with better cameras, I said Apple manage to consistently include *one of the best*. Despite being 8MP, it is the 2nd most used camera on Flickr and outputs better than most 40+MP alternatives.

    As for the mobile payments, if you read my comment you’d see I was talking about the adoption of it.

    Even before it went public, support for it had been announced near everywhere, it even prompted many businesses to try and block it’s use by starting their own rival systems.

    Google wallet has been around for years, and while I got some use out of it, it’s rarely supported in Europe and hardly anybody knows of it outside of the tech world. Key is in the brand recognition and marketing that Apple seem to have mastered.

    Call me a fanboy or whatever is the “in-phrase” these days, I really couldn’t care less, if being more educated in something than you to the point you feel the need to name-call is being a fanboy, then fanboy I am.

  • Seth Forbus

    So, you are arguing that Apple is better at marketing, not the actual tech. Sorry, this is the Nvidia blog. Pretty sure most of the people here are interested in the tech more than the marketing. Fanboy means you like a product more than another, no need to insult my education. I’m out, this argument is going nowhere.

  • Jake – Klikkit

    Who said I had a 5K TV? To my knowledge there aren’t any on the market yet. I said a 5K display.

    PS4 games are currently capped at 60FPS because the developers choose to cap them, it’s not a hardware cap, as that cap is 120FPS.

    Also, 1920×1080 = 2073600 pixels while 2560 x 1440 = 3686400 pixels, so no, not double, but closer than my guess, it’s actually 1.77x

    Obviously not comparable to 4K: 3840 x 2160 = 8294400.

    Which is still in peasant territory when compared to my iMac’s 5K res: 5120 x 2880 = 14745600 (Nearly double 4K and exactly 4X your 1440p res)

    You’ve just been served.

  • Jake – Klikkit

    “So, you are arguing that Apple is better at marketing, not the actual tech.”

    Way to pick and choose what part of my argument you actually respond to.. summarising my point by taking what I said about mobile payments out of context.

    “Fanboy means you like a product more than another”

    No it doesn’t.. it means you have unconditional preference over X vs Y whilst ignoring factual arguments in a debate over X vs Y.

    I consider the advantages of Android and choose iOS.

    I was considering buying a Nexus 6 and am now considering an S6, I also have a Nexus 7 to allow me to remote play my PS4.

    Very much a fanboy, I know.

  • Jake – Klikkit

    lol, I love the CAPS LOCK as if it’s a groundbreaking feat.

    Why would you want to host an FTP server on your smartphone? despite the fact that again, App Store apps can handle this through emulation or you could just jailbreak and install an actual FTP server client.

    (Again, not sure why you’d want to install an FTP server on your smartphone, seems like a pretty counter-productive method of file management/transfer when there are far faster and user-intuitive methods out there)

  • Jake – Klikkit

    Actually it has both, jesus where have all you Android “Fanboys” been for the past year?

  • Kougeru

    lol fanboy much? Clearly an apple fanboy too, otherwise you’d know Android not only has the strongest phones (in some models, thanks to Nvidia) but also owns like 80% of the mobile phone market.

  • Kougeru

    I agree with most of what you said except pins and passwords are VERY easy for police to break on phones. Not like they’d need to. There’s ways to access phone data without using any passwords. Of course, if you just don’t do anything illegal then none of this should matter.

  • WildmanSteve

    Titan Z was 2 fully unlocked GK110 cores with 7.1 billion transistors each making it 14.2 billion transistors.

  • Lilyf Tirack AddmePlease

    Aaaaaaaa… now I know why that card costs more than your organs.

  • David Curtis

    Actually it is impossible for ps4 to output 120hz because HDMI locks the frames to 60hz. If you think 120Hz on tv was legit, I am sorry to break it to you, but that is frame interpolation.

  • David Curtis

    Like I stated Before, HDMI locks frames at 60Hz and if your TV outputs 120hz its only Fake frame interpolation. Meaning that HDMI is technically a hardware limitation. Nobody is served.

  • Adam

    Will all 12GB of VRAM be usable? or will it only use 11.5?

  • MortalJohn

    Triple 4K monitors, or just high-resolution VR alone, will still make this thing weep. It’s not about the game as much as the number of pixels a GPU can push out nowadays. Let’s not forget that high-refresh-rate monitors are slowly becoming a thing as well, a 144Hz monitor doubling how many pixels a GPU needs to output.

  • himura

    High-resolution VR? Which game in particular?

    And three 4K monitors? Why would you want to do that?

    Honest question.

  • gg

    heres the crapple fan boy everyone…

  • Anaron

    Okay. It seemed like you did because of your “PC elitist” comment. I’ve always enjoyed console and PC gaming. Nowadays, I primarily game on my PC because nothing out there interests me. The good titles come out well into a console’s life cycle so I’ll wait for those.

  • ChaoticShadow

    Now-a-days Apple is falling behind honestly–especially the mobile market. iOS has pretty much gone downhill; iOS7/8 just have a few new features Android already has and a bit faster (compared to iOS 6), and now it has the stupid looking colorful “Metro” UI.

    Edit: Don’t call me an Android fanboy cuz I ain’t one, it’s just all the other mobile OSes suck.

  • Fantria

    the level of nerd on this topic is insane…lol (this is not in any way an insult…it is an observation….i am nerd too…so relax)

  • Kahai

    Three 144 hz 2250 x 1440 swifts in portrait mode all pushing 3D content should be enough. xD

  • Allen

    Don’t forget the average human eye can only see 60fps/Hz; even for extraordinary eyes, anything above that is a waste because you won’t notice the difference. And 144? Ha, we are over 240Hz now, not that it matters. The only part of your statement that makes sense is extra monitors and higher res; anything else is a waste of money and electricity because you won’t benefit from it anyway, except bragging rights you can’t confirm without testing/benchmark equipment lol

  • Allen

    The Hz is way too high; it’s a waste, as you can only see half that even if you’ve got some killer good eyes.

  • planetofthemage

    You’ve clearly never used a 144hz monitor if you believe that. Your eye recognizes far more frames than 60/sec — that’s scientific fact.

  • planetofthemage

    *2560 x 1440

  • planetofthemage

    AMD is miles ahead in APUs, and game devices use them because they’re smaller and easier to cool than discrete graphics cards.

  • Allen

    actualy its a back and forth game very time relavent..the last titan was actually slower in many games compared to the 290x r9 only a handful of physx based games seen improvements over the 290…the 290 also used less power and ran much cooler than the titan…not to mention amd drivers have always been far superior to nvidia’ not an amd fan boy..really dont care who makes it…just care about getting the best bang for the buck which amd delivers time and time again…ooh ya and the titan was $1000+ where the 290 was in the 500 range

  • Allen

    ios is a bulky os..very limited in features without vioding your warrenty and to get the same hardware in an iphone or ipad you pay doulbe or more…for what exactly? a popularized name point to any apple product and i’ll point you to something cheap AND better

  • kieran boyce

    I love how this is a debate about apple and android now. they are both good for different reasons each one does better than the other in other areas. android has more devices which can have more features but this makes it hard for the android os to cater for stability across so many devices so you trade different features for a few little bugs here and there as well as some minor security issues every now and then. apple goes for a more stable approach as they only have one device. iphones are easy to use and quite stable with fast updates. however because the operating system is not so open source like android it doesn’t offer different price range or cater to different features like better cameras or larger sizes and so on. there is also a lack of customization android has the ability to have custom launchers without a jailbreak and voiding the warranty.

    so in the end it all depends on what you want. customization or stability which is more important to you.

  • ChaoticShadow

    Find me something similar to a MacBook Pro (13 inch, Retina; no Chromebooks) please

  • himura

    You are not really getting anything aside from brute-forcing the GPU.
    Is that necessary at all?
    I can crush the GPU by using a 0.01cm voxel size in a 40K x 40K container, but what is the point of that when I am not going to get that close to the sim anyway?

  • Allen

    ios actually stole from other platforms and did a terriable job of it implementing heavy restrictions and a much higher price for a lower quality product and poor coding which leads to hangups freezing and slower processing..not to mention they dont build any of the hardware

  • Allen

    Word processing is the benchmark Apple refers to… how often do you use a word processor like Office? Yup

  • Allen

    jailbreak=vioded warrenty….lol love your brochure reading abilities and lack of critical thinking..just the kind of consumer they want

  • Old Brian

    Guys, I have read this whole discussion. Very informative and productive. I have weighed both arguments and researched the matter at hand. I think I have come to a conclusion. First however, if I may provide a rebuttal to you Jake… because you like apple products, you are clearly a homosexual. This argument is logically sound. If you can’t see my error free logic, then it must be quite unfortunate to be as mentally impaired as you have lead yourself on to be.


    An informed consumer

  • Nathaniel Marrufo

    Emphasis on “only the last year” lol

  • DeltaEven

    Actually, no, science has proven that the human eye has no framerate; it captures fluid motion and has a resolution far higher than any monitor, if you were to put it into pixels. Although, if you were to capture a single frame of what the eye can focus on, it’s about 7-10MP. The human eye sees fluid motion, whether 30 FPS, 300 FPS, or 3000 FPS; it doesn’t have a framerate.

  • Nadefrenzy

    Note 4 crushes your iPhone 6 and 6 plus in multi-core benchmarks. Don’t spread bs.

  • Allen

    omg brochure reading ignoramus

  • ChaoticShadow

    “because you like apple products, you are clearly a homosexual.” Lol, so all the people who buy Apple products are another Tim Cook?

  • Allen

    trans count alone means can make a huge 1300/1600 ati card and modify it to have 20 billion.wont make any noticable difference.wouldnt even come close to a 5450…

  • Allen

    I very highly doubt you will be able to use more than 4.5GB without running 4K x3 in 3D @ 240Hz, even on a game with crazy memory-mapping issues caused by poor coding. Even 6GB is pushing it very hard, dual-boxing games or something, which will drastically kill performance just to make use of the VRAM. The exception would be CAD programs or protein mapping or something of that nature, and even then I’d be very, very impressed to see someone use 12GB of VRAM outside of a server. If you’re a gamer, it’s a waste.

  • William Melville

    MacBook Pros don’t run ios. Did you want to try to ask your question again?

  • JJ

    Uh. Heard of BSD?

  • meepo

    Surface Pro 3

  • Dawson Harrison

    Linux vs. BSD………im sorry bro but no.

  • Jason

    There is a big difference between 1.4x and 1.77x…

    Who’s closer to the 1.77x truth: 1.4x or 2x?

    Secondly, the PS4 can output 120Hz @ 720p. HDMI does not support 1080p @ 120Hz, and even if a PS4 could output that, the hardware is NOWHERE NEAR capable of running modern AAA games at 1080p/120Hz.

  • Dawson Harrison

    we’d so get along great XD

  • Jason

    You literally say in your last post
    “I have a 5K display which boggles my mind”
    “Who said I had a 5K TV? ”


  • ChaoticShadow

    Of course they don’t, Allen said he’d point me to another laptop just as good as an Apple one. My point is that Apple’s laptops are actually fairly comparable to their Windows counterparts.

  • Dawson Harrison

    preach it

  • ChaoticShadow

    That’s a tablet, Microsoft advertised it that way, so it’ll stick as a tablet.

  • None

    lol and I guess that’s why 65% of the smart phone market is Android phones. Because it’s the worst and has no useable features.

  • None

    Honestly they don’t really need to compete with this, this card is not going into the average gamer’s computer. It’s probably going to be like $2000… And I’d be willing to bet that the R9 295X2 is still faster than this card so they would even still be competing for the most power in a single (dual) slot for small form factor PCs. If there is one thing AMD does right it’s dual GPU cards, their dual GPU cards are always excellent.

  • None

    lol if you shopped carefully at one point you could get 3 R9 290’s for less than a single Titan. And a single R9 290 was usually within 5% of the Titan on pretty much any game. How someone can argue with that value I’ll never understand…

  • Jad

    You are either a sad person, a troll, or a paid shill.

  • ShaneMcGrath

    You forgot to add the $2000 aircon you will need to cool your room down from those monstrosities, And the doctors bills from trying to deal with the Tinnitus of the jet engines. 😉

  • ShaneMcGrath

    Metro Last Light at 4k will destroy any card coming out for the next few years yet. lol

  • gamer555

    Amd Fan boy.. u talk BS

    Nvidia drivers are better..
    Nvidia cards are better and thats why they sell them more.. Amd is allredy losing.

    AMD= Rip
    Nvidia= Winner

    And im not an Nvidia fan boy, i realy dont care who makes just care about getting the best bang for the buck which NVIDIA delivers time and time again.. Nvidia is allAbout Quality and Speed.

    Amd is just.. Meh..

    And im not Nvidia fan boy… Just like you are.nit Amd fan boy (lol yeah)

  • James Wassall

    The only benchmarks I’ve looked at are the non-mantle games. In like 90% (I made up the percentage) of tests it wins, not by a lot though.

  • James Wassall

    From the benchmarks I’ve seen, only in BF4 did the 295 beat the Z. I doubt this will cost that much; if it launches before the announcement of the 390 then it’ll be expensive but if its released just after it’ll be competitively priced.

  • Wile E

    It’s a convertible and marketed as such. It’s not marketed as a pure tablet. But even so, it’s more powerful and more feature packed than a MBP. To overlook it as an option because you don’t like the way it’s marketed would be folly.


    3072 CUDA cores and 200 watts TDP?

  • Gideon

    You mean play catch up like APPLE.

    Android does it first then apple does it years later.

  • γιαννης

    Nvidia drivers are better? Well, you can tell that to the people whose cards burned out because the driver stopped the fan and kept filling the GPU with random data, causing it to overheat.
    Nvidia cards are better? You can’t really have an argument on that: AMD releases a card, then Nvidia releases a card, then AMD releases a better card, then Nvidia releases a better card… there is no point to that. BUT if AMD actually gets HBM memory to work perfectly while Nvidia needs almost 1.5 years to implement it (they said that, not me), well, you can guess what is going to happen when a card with a combined bandwidth of 720GB/s is running alone in the market…
    It’s one of the biggest cartels in the world atm, no surprises here.
    And please don’t say you are not an Nvidia fanboy… you are; it’s clear even to a blind person lol

  • himura

    i would like to see that.

  • DeToNaToR

    I just saw images with this card and WHERE IS THE BACKPLATE ???!!!!!!

  • Jake – Klikkit

    So you think Display = TV?

    You poor soul.

    Display = Display (Monitor/Projector/Tablet Screen/TV)

  • Jake – Klikkit

    I already said you were closer, but I just enjoyed correcting you as you insisted it was exactly 2x, you had no clue it wasn’t.

    Who said the PS4 needs to output via HDMI?

    Project Morpheus is 1080p at 120fps.

    Also, HDMI 1.4 supports 1080p at 120FPS, how do you think 3D Blu Ray works?

    And of course the current gen can’t output max everything, look at CoD2 on the Xbox 360 and then CoD Ghosts on the same console, there’s a vast difference in quality.

  • Jake – Klikkit

    Yeah, he was wrong.

    Android usually do things first.

    Apple just do them better (Fingerprint, ApplePay adoption etc.)

  • Jake – Klikkit

    Brilliant, the only thing you could argue with is my typing at 2AM on a mobile device. Nothing says you just lost the argument quite like that.

  • Jake – Klikkit

    Where did I mention the Note 4?

  • Jake – Klikkit

    I have news for you pal, Jailbreaking doesn’t void your ‘warrenty’.. and I’m guessing you mean Warranty?

    If it voided the warranty why would it be in the brochure?

    You can also remove the Jailbreak by restoring in iTunes.

  • Jake – Klikkit

    emphasis on how often have I sent files over bluetooth since I’ve had the ability? 0

    It’s 2015, if you’re still using bluetooth to transfer data regularly you’re a caveman or my grandparents.

  • Jake – Klikkit

    1: HDMI 1.4b supports 120hz at 1080p (also pretty sure 1.4a supports it too)

    2: 5K is a gimmick to you, but when Dell offer a 27″ 5K monitor for the same price as Apple’s 27″ 5K all-in-one, the iMac is a pretty sweet deal.

    3: I have only tried gaming briefly at 4K on Windows 10 via bootcamp. Despite only having a 4GB M295X, it handles pretty well.

    Not sure why you’d consider gaming on a super-thin all in one though.. maybe you’re just special and can’t afford purpose built machines for different activities.

    And if you’re so insistent on gaming on OS X, why would you buy games for Mac when you can bootcamp and buy games for Windows?

  • Eyoldaith

    Of course the Titans were more expensive, but that was because when they were released, there were no single-GPU AMD cards that could compete with it at all.

    The Titan was released 7-8 months before the R9 290/290X.

  • Andrej Szelle

    For a man who earns €350 a month, it's just a dream :D

  • John

    12 gb? are you sure it’s not 11.5gb?!

  • jsjsjsjjsjs

    You're the idiot. A display isn't a TV.

  • Yahia Rakhi

    11.5GB 😉

    Fanboy war in 3 2 1..

  • Cory Wilson

    Obviously an iOS fanboy, but whatever. As far as AMD goes, both Radeons and GeForces are great. NVIDIA usually has a slight edge over AMD, but AMD gives you much better bang for the buck. I'm not brand loyal to either: I had a Radeon HD 6790, heavily considered the R9 280 and 285, and wound up buying a GTX 960. The RX 300 series is in the works; I'm sure it'll help AMD level the playing field more.

  • malcmilli

    Apple has great implementation of its features… can't knock that. However, those features often come years after they were introduced elsewhere. Personally I have no need for a fingerprint sensor, but I understand its value to others.

    For me, I enjoy new tech and options: things like 4G, NFC and turn-by-turn navigation came years earlier, plus removable batteries, microSD cards, depth cameras, the ability to download MP3s and torrents straight to my phone, more customization, easier ways to transfer files, waterproofing, varying screen sizes (years earlier), double tap to wake, double tap to sleep, etc.

    By “more useful features” I meant quantity of useful features. Apple definitely has quality; they just often wait too long before implementing new stuff.

  • Jake – Klikkit

    The removable batteries and MicroSD aren’t really factors anymore as most Android flagships are ditching them (Nexus 6 and S6 for example)

    The MP3s, Torrents and file management can be done with AppStore apps.

    Of course near enough any feature I want I can post on /r/jailbreak and have it developed asap for free, but I get what you mean about Android incorporating things years earlier in some cases.

  • malcmilli

    Yeah, I mean, I was referring to things excluding jailbreaking/rooting, since I'm not sure of all the capabilities available on either side.

    But as for the batteries and SD cards not being a factor… they only stop being a factor when zero flagships offer them. Right now you can still have the option, which is what it's all about.

    Glad we could have a civil discussion though. Thumbs up.

  • Jake – Klikkit

    Yeah, very rare for a civil discussion when it comes to iOS vs Android, as soon as I say something positive about Apple it’s usually flat out denied and I’m called a lying fanboy.

  • malcmilli

    lol, I've been called an Apple fanboy and a Samsung fanboy despite owning neither device. I can acknowledge a solid product while still preferring something else for personal reasons.

  • PeterJarvis84

    Don’t you mean the opposite? It’s now iOS that is currently playing “catch-up”.

  • Mike

    For some people, and many companies… yeah it is. Some of us have jobs that pay more than 10 dollars an hour. 🙂

  • Mike

    This has been proven wrong. Google “Air Force human eye frames per second”. It's between 300 and 1000Hz, not 60.

  • Mike

    No, but there is a point where the eye will not be able to register things beyond so many frames per second. Google “Air Force Human Eye Frames Per Second” and you'll get a few different answers, but what they said was basically 300 to 1000Hz.

  • Mike

    I really doubt it. Besides, that's still two GPUs in the slot of one; I'd rather have two Titans in SLI. Don't forget that the 295X2 effectively has only 4GB of VRAM: the two GPUs have to split the 8GB pool. So really, it's not going to touch this for higher-end games/resolutions. Like, at all.

  • Guest

    Even the OS? dumbass…

  • Antony

    AMD has the R9 390X, which is 4GB and meant to compete with the GTX 980 🙁

  • jeff hicks

    lol android? you mean like vs iphone cause that’s a joke. android is always better than iphone

  • Rafael Girotti

    Nailed it.


    “Drivers, fans?”… blah blah blah… thus speaks the oldest AMD fangirl.

  • Rapajez

    Apple fan-boy comments on a Tech-Blog and thinks he’s going to win an argument. Granted, there might have been some truth to them, if it was 2009, but those days are long gone. Using Lollipop on a S6 Edge will make the latest iPhone look like a Flip-Phone. My grandpa uses both!
    Those comments may fly at your local Starbucks, but not here, lol.

  • Adriel Garcia

    It’ll probably have 11.5 gb of useable memory… But dur XD

  • Badongski

    Let's see how much this will be.

  • planetofthemage

    I wasn’t implying the eye has a framerate — I meant it more in the sense that the motion will be more clear to your eye at hz frequencies greater than 60hz. (That is, they DO matter.)

  • Alex_Atkin_UK

    Apple implements features on the basis of ease of use > flexibility. It works well for them, and I don't think it's wrong. But for technical users it's a big no thanks.

    ApplePay is purely down to brand loyalty. iPhones are considered fashionable, celebrities tend to have them, so of course it's going to get better commercial support, as it looks good to support the fashionable platform. That doesn't mean it's a better system.

    I personally have no interest in relying on my phone to pay for anything; I much prefer good old-fashioned credit cards, which if lost or stolen can be easily and cheaply replaced.

    Having everything on one device might be convenient, but it's a security nightmare.

  • Kahai

    3d content eats half the refresh rate… 72 hz is near perfect if you’re actually maintaining that during gaming.

  • Adam Reber

    It’s because they are cheap.

  • Alex_Atkin_UK

    I would say more likely to be 8GB full speed, 4GB slow. 😉

  • Clayton Hollister

    AMD is supposed to be coming out with high bandwidth memory for their cards, not sure how much faster they will be.


  • jenxrj

    I disagree, I don’t think iOS is about ‘quality’ & ‘useful features’.

  • Glenn615

    Yikes! AMD has pushed back the 3XX cards' launch until June at Computex…
    Source: Kitguru.net

  • ChaoticShadow

    Maybe I should’ve clarified: I don’t want an Apples to Oranges comparison, I’d like to see a laptop that does the job a MBP does for less.

    If I was to compare the two, for $200 less it wouldn’t be a bad deal. As tablets have limitations, it would be kind of unfair to compare a full fledged laptop to a 2 in 1 laptop/tablet thing.


    Although on the tablet side, Surface does really beat out the iPad 😉

  • Silver Joystix

    I want this so badly – but I would need to replace my entire computer. Plus, I can’t afford this at all. Not even a little bit. I guess, I’ll be waiting on that new Shield to come out 🙂

  • William Harrell

    I love how an announcement for the Titan X turned into a heated iOS vs Android debate.

  • billy bob

    You can if you are still within your return period; otherwise no. Just return the Titan Black to your retailer for a refund and wait for this.

  • ChaoticShadow

    It really started out as a bad comparison in the first place: AMD and NVIDIA are constantly trying to outrun each other in the race for the “best GPU,” whereas with iOS vs Android it's basically iOS currently lagging behind Android. And yes, it's really funny how fanboy wars start in the most random places.

  • Sourav Roy

    I need it, bro… The iPhone is good in terms of its display… force a 2560×1600 display into it and we'll see.

  • Nadefrenzy

    When you said “Octo-core CPU with 3GB of RAM” you were illustrating the point that those beefy specs mean nothing when compared to the iPhone’s average specs. Well you’re wrong, because the Note 4 and other flagships crush the iPhones in multicore benchmarks.

  • Nyx

    Had to sell his grandmother and his soul for that one G of ram too.

  • Adar Tzivion

    wonder how much it needs.
    and wonder how much it will cost

  • Guiga

    Are you serious? I think you are 5 years late, my friend…

  • TeeeeHeeee

    Apple let their new phones stay with old/degraded hardware and change their UI. That is all they do. Kudos to Apple for being one of the first with a 64-bit chipset for their mobile phones and tablets. That is the biggest thing they’ve accomplished for their mobile devices.

    This is how I think it is, and you might disagree.
    The better and more up-to-date the hardware in a gadget, the more room there is to improve software development and mobile gaming. We cannot neglect mobile gaming; it is improving drastically, and many people love it. As of now, in terms of hardware, Android phones are at the top of the chart. Yes, iOS has more apps, and that is because app developers would rather build for the platform most people use. But in terms of capability, iOS is nothing compared to Android, and that is all because of the hardware, which, again I'd suggest, limits software development. An example of this is the NVIDIA Shield, and I cannot imagine iOS being the mobile OS on that.

  • Trent Foley

    It will be priced so ridiculously that it will make normal people laugh at PC gamers for spending so much money to play video games: more ammunition for console fanboys arguing that PC gaming is too expensive. It does look really nice though; I like the design of the reference coolers. Hopefully AMD will release the 390X soon so we will have some options at the high end.

  • LarZen

    When VR comes, and if a beast of a card like this can make a dramatic difference, then I might be willing to empty the savings account marked “toys I must have”…

  • Guest

    Well good for you. Make sure you get one of these titan x’s and upload it on Youtube for the world to see how successful you are. :]

  • Mike Hunt

    Lolz @ the iPhone derp… Not our fault you're too dumb to use a Droid correctly…

  • Mike Hunt

    Yep the 290x & 295x are nothing.. Drugs r bad, mkay..


  • virusfm

    “Why would you want to host an FTP server on your smartphone?”

    Ahhh, the ultimate defense of every Apple fanboy. “Why would you want to X on your Y?”

    1. Because you *can*. That’s all that’s enough for some people, people who are actually interested in computers and what computers can do. Someone who owns a computer simply to “run apps” from a walled garden can’t be expected to understand this though, I guess.
    2. Just because *you* don’t see a need for Xing on your Y, doesn’t mean that no one else does.

    I remember talking about touch-sensitive screens being adopted into
    monitors for Windows 8, talking with some friends of mine who worked at Apple. “LOL,” they said, “who is going to sit there and touch their monitor!? Get real. Touch works for tablets. No one is going to use that on their monitor.”

    Walk into Best Buy right now and count the amount of laptop monitors that are touch-sensitive. Typical Apple
    nonsense. They think they have the One True Way to use a computer, and anyone else who doesn’t conform is a fool. Remember the iPhone 4 antenna fiasco? And Steve Jobs responded, saying people were “holding it wrong”? Guess what? There is no “right” or “wrong” way to do things like hold a phone, or where you host an FTP server from.

    The rest of the world will continue on, using computers as *they* see fit, while Apple users have their use cases dictated to them by the dictators and fascists at Apple. …L.O.L.

  • Gilbert Nicks

    I very much dislike Apple… but dude, he never said Google Wallet is better. He just said Samsung Pay is going to hurt it, because not every Android owner knows to use one or the other. As for who is better: all three phone providers have their place with consumers; it's 100% a matter of opinion. If you just want to run specs, it's kind of a crapshoot, because look at it this way: there is only one company releasing iOS, versus a minor handful for Windows, versus everyone and their mom releasing Android devices, and all three have at least one new device each year. Features come and go. Both Android and Apple have random crashing issues (this is experience; not saying they all do it, but with my luck it happens). On Windows Phone I think I had five crashes, and none of them were a full OS crash, just three apps giving me a hard time (I was running a 1520). When I go looking for a phone I still look at all three OSes, but I'll admit I don't like having to jailbreak and stay behind on updates to jailbreak, so I tend to stay away from iOS. As for Android, sure, rooting is nice (though not all devices can be rooted on the latest firmware), but I'll admit it can be very useful and fun. Now, Windows Phone 7.5–8.1 was GREAT, to me. Again, it's opinion. I could do everything I wanted and not worry about limitations or restrictions: I could FTP, stream, download, convert, etc. from my phone without having to root or jailbreak. I enjoyed the ecosystem that came with my Windows phone, and they are expanding it. Android is the same way if I understand correctly: the Google/Android ecosystem, where all devices talk/update/sync and so on. Yes, I know Apple has a similar setup, but goddamn is it expensive to enjoy that ecosystem.

    Back from rambling: it's an opinion; no one is better than the others. Sure, specs on paper may look nicer one year to the next, but it's still opinion and preference that should decide.

    (All OS builders adopt ideas from others all the time, accept it… just like console developers.)

  • Gilbert Nicks

    Apple keeps iOS to itself, so that negates that one (I still don't like Apple, but your view is flawed). Windows Phone just finally made a phone OS for everyone and not just for businesses, and let's face it, for the past four years people have been all about apps, apps, apps; unless all those apps are on Windows as well, people will ignore it despite the specs (also, we know hackability sells devices, and it hasn't been 100% hacked yet). I also understand Windows Phone was picky about the specs and manufacturers that used their OS, which was stupid and hurt them a bit. As for Android: Google loves selling low and taking that percentage of sales, which makes it a great OS for phone makers to pick up. The fact that Android is pretty much Linux means it's an open build, letting people create proper plugins and such for custom hardware to work in the OS; it also makes it easy to hack, which most device companies label as a feature to help boost sales. For example, NVIDIA with the Shield just said HERE, it's 100% open for root, here is the software, have fun, and that really boosted sales of the Shield (fingers crossed the console is just as easy to get into). Does that make a better OS for the consumer? That can be a big yes or a big no, depending on who your consumer is and what they are capable of. The fact that more manufacturers release Android devices alone explains why 65% of the smartphone market is Android.

  • David Curtis

    I will not repeat myself again: HDMI does not support anything above 60FPS. I have a 120Hz monitor and it requires DisplayPort or dual-link DVI to output 120Hz. HDMI doesn't support refresh rates above 60Hz, and the current spec is HDMI 2.0, which I have on my GTX 970.

    The iMac is not a sweet deal unless you are looking to do photo editing. The rest of the machine is underpowered and isn't good for anything else, such as high-end rendering of video and CGI. Hence it is just a gimmick to me, and I cannot do much with it.



  • Armando Ferreira

    Let’s not forget

  • Eita.K

    Fanboy lol

  • Mike

    Or I could get two and just play games and be happy and you can stew about it on your own… I guess?

  • goomba1

    Umm, AMD is Radeon, and yes, they do have a card that goes side by side with this, and AMD is always cheaper. For context, NVIDIA's next-generation GM200 GPU allegedly rocks 3072 CUDA cores and up to a 1.2GHz boost clock. That being said, for $450 you can get a card with more stream processors and a much higher clock. Hope this helps. Yes, AMD is still here; they are working on the new Islands, I believe, and should be releasing a new architecture soon.

  • Jake – Klikkit

    HDMI doesn't support above 60fps because your monitor doesn't have an HDMI 1.4 port? OK, then it must not exist for anybody, of course.

    You do realise that having HDMI 2.0 on your GPU doesn't magically upgrade your monitor's hardware to support the latest HDMI spec, right? If your monitor only supports HDMI versions below 1.4a, obviously it won't do more than 60FPS.

    A 4GHz i7 with 32GB RAM and an R9 M295X is underpowered?


  • Jake – Klikkit

    So are you just gonna try and insult or actually answer the question?

    Why would you need to host an FTP server on a smartphone?

    (bear in mind the iPhone has this capability)

  • David Curtis

    Unfortunately for you, you are still wrong. HDMI does not allow anything higher than 60Hz (maybe 75Hz if you can clock the monitor a little higher), and the iMac is powered by mobility hardware, not the same thing as desktop hardware. Period. End of discussion. Have a nice day.

  • Jake – Klikkit

    The Vizio P series supports 120Hz over HDMI 2.0…

    I don't care that my iMac has a mobile GPU; it still performs ridiculously well at a resolution you probably won't have for another 5 years.

  • jake

    You know a fingerprint scanner is useless on any device, not just iOS… if you don't encrypt your data, any “front-end security” (fingerprint, PIN, password, etc.) is going to be bypassed and cracked. Just saying: until one company gives full p2p encryption, which will never happen, all of the devices are the same; it comes down to preference.

  • Jake – Klikkit

    Fingerprint scanner is hardly useless.. I don’t want to type out my password for every app I download, or type in a code for every ApplePay purchase I make.

    If it was useless, the majority of the Android manufacturers wouldn’t have started including them on their devices.

    Not sure why you’re comparing biometric security to encryption as they are two entirely different things

  • Jake – Klikkit

    Is this a parody? It's a video about a 3-year-old device with now-redundant claims (NFC, external USB via camera cable, widgets, emulators, theming).

    He was upset that Apple don't open-source iOS for use on TVs and games consoles? LOL.

    Then there's the braindead argument that Apple sell our fingerprints to the NSA… Yeah! And Steve Jobs is still alive in Area 51 with Tupac, working on prototypes for the Apple plane!

  • David Curtis

    Still doesn't change the fact that your PS4 and Xbox One are limited to 60Hz no matter what you do, because of the standard they were built on. And it won't change that your iMac has the equivalent of a Radeon 7970 GPU clocked lower. If you run Windows on it and play Battlefield, I doubt you can play at full resolution without lowering the settings significantly. I'm also sure that LG makes the 5K panel for Apple, and Acer also uses LG panels, so I'd give it a year for something affordable, which I'm not interested in purchasing anyway, as I would much rather buy a VR headset and probably wait for 4K 120Hz OLED.

  • Jake – Klikkit

    It’s funny you mentioned VR Headsets and the PS4 not supporting 1080p at 120Hz, because Project Morpheus outputs 1080p at 120Hz for the PS4.

    I already said I don’t use the full 5K resolution for Battlefield.

    Not sure why you’re talking about panel manufacturers when you could’ve simply said Dell already have a 27″ 5K monitor out (for the same price as the iMac though).

    And you can enjoy your 4K 120Hz OLED when it becomes available (and more importantly affordable) but I’m not sure why you’re comparing something you’ll likely buy as a TV to my All in one computer which has nearly double the resolution.

  • jake

    That is not true at all, actually. If you full-disk encrypt, you type in one password to get in, and that is all; no need for extra security like fingerprint scanners… Just saying, you called out the fingerprint scanner for ease of use… I do like the idea of fingerprint scanners, though, just not as a selling point for a device.

  • mark

    Past year? I have been doing this forever on Android.

  • Jake – Klikkit

    Really? I still don’t even use Widgets or Bluetooth for data transfer, mainly because there are far better alternatives and Bluetooth should only really be used for device pairing

  • Robin Kleven

    The big question is: is this card most suited to CUDA developers, or will the Titan X be a gamer's choice?

  • efex 172

    The Titan X's power is 250W; an AMD Radeon R9 295X2 is 500W… it will be interesting to see what the 4K frame rates are…

  • leignheart

    And I suppose you're equating iOS to NVIDIA then? Man, you are indeed a fanboy. iOS is great for people who love a controlled experience and use a device the exact same way as everyone else who owns it, unchanged from its first iteration to its last. Android users, however, have full control over everything on their device, and no one exists to tell them they cannot do something with it; therefore every Android user is different, their phones are different, the look of their phones is different and unique, and that is a very good thing. I think Apple is just fine, but Android is the place to be if you want your experience to be unique and not the same as everyone else's, since what Apple offers is grey uniformity. They even go so far as to keep that sameness by using tech that's multiple years old. That said, I'm not biased in the sense that I have owned Apple products, have some right now, and probably will in the future. However, Android, or any platform that offers freedom, is the one for me.

  • leignheart

    Truthfully, I feel you will not win this debate; usually Apple fans will argue to the death that Apple products are the best there are and nothing could possibly come close. I hear arguments all the time about how their $5,000 Mac is way better than my $5,000 custom-built PC. And they still think their machine is better even when I fire up the Heaven benchmark and get 60fps at 2560×1600 on Extreme while they get 10fps or lower on average. Even then they tell me their Mac is better. It's hard to debate people who ignore hard facts.

  • joe

    My 2200 rig gets 57 FPS at 4K res. You wasted a lot of money.

  • lonejack

    This “Thief in the Shadows” game awfully resembles The Desolation of Smaug…

  • Iwasa Misaki

    Whoa whoa whoa, don't be dissing Android. Just like NVIDIA making PhysX open source, Android is the same: we continue to evolve… remember, the NVIDIA Shield runs Android too. Ever heard of an Apple Mac user gaming? Anyway, yay! I can't wait to get my hands on a TITAN.

  • Wikipedia

    HDMI 1.4a was released on March 4, 2010, and adds two additional mandatory 3D formats for broadcast content, which had been deferred with HDMI 1.4 in order to see the direction of the 3D broadcast market. HDMI 1.4a defines mandatory 3D formats for broadcast, game, and movie content: it requires that 3D displays implement the frame packing 3D format at either 720p50 and 1080p24 or 720p60 and 1080p24, side-by-side horizontal at either 1080i50 or 1080i60, and top-and-bottom at either 720p50 and 1080p24 or 720p60 and 1080p24.

    HDMI 1.4b was released on October 11, 2011. One of its new features is the ability to carry 3D 1080p video at 120Hz, allowing the frame packing 3D format at 1080p60 per eye (120Hz total). All future versions of the HDMI specification will be made by the HDMI Forum, which was created on October 25, 2011.

    HDMI 2.0, referred to by some manufacturers as HDMI UHD, was released on September 4, 2013. It increases the maximum TMDS per-channel throughput from 3.4Gbit/s to 6Gbit/s, which allows for a maximum total TMDS throughput of 18Gbit/s. This allows HDMI 2.0 to carry 4K resolution at 60 frames per second (fps). Other features of HDMI 2.0 include the options of the Rec. 2020 color space, Dual View, 4:2:0 chroma subsampling, 25fps 3D formats, up to 32 channels of audio, up to 1536kHz audio (for example 7.1 channels of 192kHz), up to 4 audio streams, 21:9 aspect ratio, the HE-AAC and DRA audio standards, dynamic auto lip-sync, improved 3D capability, and additional CEC functions.
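The 18Gbit/s ceiling quoted above can be sanity-checked against standard 4K60 video timing. A minimal sketch, assuming the common CTA-861 4K60 timing of 4400×2250 total pixels (active plus blanking, giving the well-known 594MHz pixel clock) and 10-bit TMDS symbols from 8b/10b encoding; neither figure comes from the comment itself:

```python
# Does 4K @ 60 Hz fit in HDMI 2.0's 18 Gbit/s TMDS budget?
# Assumptions: CTA-861 4K60 timing (4400 x 2250 total, incl. blanking),
# 8 bpc RGB video, TMDS 8b/10b coding (10 bits on the wire per byte),
# and 3 TMDS data channels.

H_TOTAL, V_TOTAL, REFRESH_HZ = 4400, 2250, 60
TMDS_CHANNELS, BITS_PER_SYMBOL = 3, 10

pixel_clock_hz = H_TOTAL * V_TOTAL * REFRESH_HZ
gbit_per_s = pixel_clock_hz * TMDS_CHANNELS * BITS_PER_SYMBOL / 1e9

print(f"pixel clock: {pixel_clock_hz / 1e6:.0f} MHz")       # 594 MHz
print(f"TMDS throughput needed: {gbit_per_s:.2f} Gbit/s")   # 17.82 Gbit/s
print("fits in HDMI 2.0's 18 Gbit/s:", gbit_per_s <= 18.0)  # True
```

Run the same numbers at 120Hz and you get 35.64Gbit/s, which is why 4K at 120Hz had to wait for a later HDMI revision.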

  • Kriss Prolls Crispy

    I'm pretty sure the 390X will kill the Titan X, or maybe if they make a 395X2, haha, who knows.

  • Kriss Prolls Crispy

    they are on the way with 390x 😉

  • Richard Rankin

    I build boxes too. Storage heavy Linux database systems, SSD & processor heavy Linux analytic machines and a Mac for writing letters, reports, graphs, photos, etc. If I gamed I’d build my own machine for that (in fact my analytic machine running Windows is close to that design). I have a lot of tools in my toolbox and can code in several languages. Use the right tools for the job. If you don’t have the right tool, make one.

  • Cameron Holman

    One day I’ll win the lottery and get one….One day….Soon….Soon….

  • Jorell

    This is a 980ti. They added the Titan moniker to crank the price up to retarded.

  • youssef abaza

    I am going to say one thing “Sword Art Online”

  • Left Shark

    Snap-On, MAC, Matco, CornWell, WaterLoo, Extreme, or???


    I know what you meant.

  • renz

    The driver did mess with the fan control, but you're talking nonsense when you say the driver continues to fill the GPU with “random” data, causing it to overheat. Where did you get that from?

    And just because AMD gets HBM ahead of NVIDIA doesn't mean they will dominate NVIDIA. HBM only gives more bandwidth; it does not magically make the GPU core faster. By your logic the 7970 should be significantly faster than the 680, because the 7970 has 260GB/s while the 680 only has 192GB/s.
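The 260GB/s and 192GB/s figures follow directly from each card's memory configuration. A quick sketch, assuming the cards' published GDDR5 specs (a 384-bit bus at 5.5Gbps effective for the HD 7970, 256-bit at 6.0Gbps for the GTX 680); the numbers are for illustration, not taken from the comment:

```python
# Theoretical peak GDDR5 bandwidth = (bus width in bytes) x effective data rate.
# Bus widths and data rates below are the cards' published memory specs,
# assumed here for illustration.

def mem_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

hd7970 = mem_bandwidth_gb_s(384, 5.5)  # 264.0 GB/s (~ the 260 GB/s quoted)
gtx680 = mem_bandwidth_gb_s(256, 6.0)  # 192.0 GB/s

print(hd7970, gtx680)
```

The 7970's roughly 37% bandwidth advantage never showed up as anything like a 37% lead in games, which is exactly the point: more bandwidth only helps when the GPU core can use it.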

  • renz

    But at the expense of power consumption. With the 6990 they were the first to break 300W with a dual-GPU card (the 5970 before that was rated at 294W, if I'm not mistaken). With the 295X2 they reached a new height, making it a 500W card.


  • michael J

    Where can we pre-order the Titan X?

  • Disintegrate

    I don't like Apple, but I can't deny it's good. The price tag bothers me a lot, but hey, some people have more money than me.

    I am using an SM-N9005 and I can say that this phone handles everything I throw at it: remote desktop connections, network port sniffing, coding, conferencing, editing, scripting, systems management, multimedia entertainment, gaming, taking photos and videos and so on. The important part is that I multitask while doing all this, and I can work from anywhere. It gets even better with the docking station, a wireless keyboard/mouse combo and an external display. I am truly mobile, and my PC fits in one pocket with all the equipment.

    On the other hand, Apple makes a very nice design, but the hardware is not powerful enough for 2014, if you ask me. The display resolution is also too low for 2014. Still, it looks nice. It reminds me of WP a little bit. MS says: “WP does not have pumped hardware specs. That is because WP does more with less.”
    Sounds nice, but it's bullshit. More is more, period.

    It is really a personal choice which one is better. But the whole “copied from Apple” thing is what made me write this post. Sure, the name is Apple, but that doesn't mean they invented hot water. Take the fingerprint scanner: the S5 and S5 mini have it. Yes, the iPhone has had it for years, but the last real innovation died with Mr. Jobs. What about the ability for wireless payments on terminals, introduced with the S6?
    Use your phone, be happy with it, be proud of how well educated about it you are. But, sir, please stop believing that everything starts and ends at the very same point. Your view is not the same as mine, but I don't deny yours. And yes, I am an Android fanboy, but I don't feel you are more educated than me, even though I believe you are a fanboy too… you just like Apple 🙂

  • Jake – Klikkit

    “the hardware is not powerfull enough for 2014”

    The hardware might not be powerful enough for you, but an iPhone 6 with inferior hardware runs just as fast as a Galaxy S5

    I'm glad you mentioned the fingerprint sensor, because there still isn't an Android phone with a good fingerprint sensor. Apple nailed it 100% on the 5S, and then we had Androids putting it on the back, under the camera?

    If you actually go and read my posts, you’ll see I’m not anti-Android, I’m just defending Apple devices due to the braindead anti-Apple Droid fans who’ll talk crap all day about Apple devices and then lie about having owned one as if it makes their bs any more credible.