
  • Jon Tseng - Monday, January 26, 2015 - link

    Storm in a teacup.

    The benchmarks everyone ran at launch, which showed it's a screaming card at a great price, still stand.

    Very happy with my 970. It's a screaming card which I got for a great price. Move along now.
  • Inteli - Monday, January 26, 2015 - link

    Agreed. Especially at 1080p, I haven't noticed any drops in frame rate attributable to a lack of VRAM. I think the people hurting the most will be those who bought two for SLI at 1440p or higher. I do feel bad for them, but not bad enough that I'm going to stop buying Nvidia.
  • nutmeg2000 - Monday, January 26, 2015 - link

    Looks as if nvidia had some shills jump into the comments section right quick.

    It's great that sites are starting to investigate this major issue now; it will reveal more of the deception nvidia has engaged in with their gtx 970 marketing. We will learn more soon and see what sort of legal trouble nvidia has created with their illegal marketing.

    Major issue for nvidia, and not surprising to see nvidia shills in comment sections like this trying to downplay what is going to be a huge issue for them going forward.
  • Letitbewilder - Monday, January 26, 2015 - link

    Some paranoia there, but yes, nvidia is in hot water and will be trying to lessen the impact this has on their business. Major trouble here, and surely a lawsuit is coming for them, as in the past with the bumpgate situation. I welcome the test work sites are in the midst of now and hope nvidia makes good on this significant issue with the gtx 970.
  • Taneli - Monday, January 26, 2015 - link

    A clear case for a class action suit here
  • Thepotisleaking - Monday, January 26, 2015 - link

    The coming weeks are sure to be...

    Interesting!
  • FlushedBubblyJock - Friday, January 30, 2015 - link

    I recall not that long ago here, AMD was embarrassed and in a similar situation. I believe it was the Bulldozer core, which was said to have some 2 billion transistors for six months or more after release. Then, right here, as part of another article, Anand posted the updated chart with only 1.5B transistors, and said AMD gave him the new, much lower number with no explanation as to why. And then it was immediately dropped, as if no harm, no foul.
    LOL
    It was amazing at the time; AMD got a big fat free pass on a giant spec lie.
  • Overmind - Wednesday, February 4, 2015 - link

    They didn't disable millions of transistors. It was just a wrong number written somewhere.
  • FlushedBubblyJock - Friday, January 30, 2015 - link

    Well, it was 2 billion transistors for AMD's Bulldozer that dropped, not just to 1.5, but lower, to 1.2:
    http://www.anandtech.com/show/5176/amd-revises-bul...
    "The actual transistor count for Bulldozer is apparently 1.2 billion transistors. I don't have an explanation as to why the original number was wrong.."
    So when AMD does it, it's ok, no explanation necessary... and the next line is ridiculous praise:

    "Despite the downward revision in Bulldozer's transistor count by 800M, AMD's first high-end 32nm processor still boasts a higher transistor density than any of its 45nm predecessors"

    LOL - it's okay, AMD does it and gets a total pass, so don't worry about it.
  • mrloadingscreen - Friday, January 30, 2015 - link

    It's okay because no one who actually cares about technology, much less transistor counts, bought a Bulldozer.
  • piiman - Saturday, January 31, 2015 - link

    Who cares how many transistors if it still runs great? This, however, impacts performance. Dying Light turns into a stuttering mess the second the VRAM goes over 3.5GB. This forces me to go from 5760x1080 surround to a single screen at 1920x1080 with lower graphics settings overall, so I can stay under 3.5GB of VRAM and keep good FPS.
    The whole reason I bought two 970s was to be able to play new 4K games, and this could hurt that plan in a big way.
  • fonolavalencia - Wednesday, February 4, 2015 - link

    Booooh, I have to play on one monitor while people are dying in the Middle East! You gringos will get the refund for sure; imagine myself being in ECUADOR... I hope they have a STEP UP program at least. This is very FISHY business, friends, we should stop this. And by the way, how about using an application to limit your VRAM to 3.5?
  • frodbonzi - Thursday, February 5, 2015 - link

    Lol... this is a high end card; people dying in the Middle East have nothing to do with it, so get your trolling @ss out of here. And if you actually bought 2 of these, you SHOULD NOT HAVE TO get an app to limit yourself to 3.5GB!! Especially if you bought them assuming they could handle 4GB!
  • Morawka - Monday, January 26, 2015 - link

    So what if they call everyone's bluff and offer refunds for the cards?

    That's what I would do if I were nvidia: just offer a refund. The majority of people will still keep their card, and nvidia could offer a free game to all those affected.

    Then once the class action suit fires up, they can show all of the goodwill they offered and the case would lose serious steam.
  • spartaman64 - Monday, January 26, 2015 - link

    I just hope nvidia updates the specs and puts a disclaimer that there is 0.5GB of slow RAM.
  • JlHADJOE - Tuesday, January 27, 2015 - link

    Well, Nvidia's published specs (http://www.geforce.com/hardware/desktop-gpus/gefor...) are technically correct, which some people would say is the best kind of correct.

    It's really only the review sites that posted stuff like ROP counts and cache size.
  • piiman - Saturday, January 31, 2015 - link

    Funny, they removed that link with the specs. Perhaps they weren't that correct after all?
  • fuckNvidia - Monday, February 9, 2015 - link

    That was given to them by nvidia. Brainwashing the public through the media and then saying "we didn't say it" still is not right.
  • xkieth - Thursday, January 29, 2015 - link

    By "majority of people" you mean those who bought a 970 without any knowledge of the issue. nVidia gives them a free game out of the blue (from their perspective), and of course, a free game is free, so they accept it, being the oblivious people that they are.

    Then BAM! Ubisoft 2.0.
  • piiman - Saturday, January 31, 2015 - link

    "So what if they call everyone's bluff and offer refunds for the cards?"

    A better solution is to turn on the ROPs or offer 980s as replacements. Now that would be "good will".
  • ol1bit - Monday, February 2, 2015 - link

    They don't disable all this stuff; it's more about what chips pass what tests. In the past, they would have had to disable more of it, so the 970 would not be as fast as it is. Cheers!
  • Oderus Urungus - Saturday, February 7, 2015 - link

    You can't simply "turn on" the ROPs; they're lasered off, I do believe, which makes it impossible.
  • xenol - Tuesday, January 27, 2015 - link

    A class action lawsuit will take years to settle and in the end all you get is a $30 rebate (while the lawyer who represented the customers gets millions) and you forfeit your right to participate in another class action lawsuit, as the legalese tends to say.
  • 3ricss - Tuesday, January 27, 2015 - link

    Yeah, no class action lawsuit is going to happen. The devil is in the details on this one, and I'd be surprised if even 1% of the users out there have read up on this.
  • Yojimbo - Tuesday, January 27, 2015 - link

    Are you a lawyer? I wouldn't normally dare say whether this is or isn't a case for a class action suit, as I am not a lawyer and really have no idea what I am talking about, but since you took a gander at it without backing it up in any way, I will too, with some explanation of my line of thinking. My first reaction is that NVIDIA would probably have to release false claims of utility and not just false numbers. The 4GB claim is real, not false. I wonder if ROPs and memory bandwidth might be a bit too abstract for the courts to rule that consumers were truly deceived; game performance is the true benchmark as far as consumers are concerned. Secondly, they never advertised the inaccurate information; they released it to review sites in press packets. There's still a responsibility there, but my guess is it's a step down from an active advertising campaign.
  • eanazag - Tuesday, January 27, 2015 - link

    Get off your class action suit gravy train America - them lawyers got yous trained. This could easily have been a mistake. On top of that, what was illegal? Price fixing? No. Ruined the competitive landscape? Absolutely not.

    I don't own a 970. The price to performance has only helped consumers.
  • SkyBill40 - Wednesday, January 28, 2015 - link

    A mistake? Unlikely. It seems pretty clear this was known prior to launch, yet they made no mention of it. That's deliberately falsifying the specs of the card. While it's not really a huge deal performance-wise, it IS a rather huge hit to trust in Nvidia's product specs. Full disclosure is what it should be about from the beginning... not dropping a "mea culpa" afterwards and expecting everyone to just buy into it.

    You can buy into that if you wish, but not I. I guess it's good that I've waited to upgrade. I'll either pick up a 980 or one of the Ti variants should they be released.
  • rafaelluik - Wednesday, January 28, 2015 - link

    Are you kidding, or are you out of your mind? Will you keep buying from NVIDIA from now on?!
  • Yojimbo - Wednesday, January 28, 2015 - link

    NO! BURN THEIR CORPORATE HEADQUARTERS! (My legal counsel insists I point out that this was sarcasm.)
  • Yojimbo - Wednesday, January 28, 2015 - link

    What exactly was known prior to launch, the way the card was designed? I hope so. The mistake was made AT launch: the improper information given to the reviewing press. The card works as designed. It doesn't seem worth it to try to fool consumers into thinking there are 64 ROPs instead of 56, or 224 GB/s of bandwidth instead of 196 GB/s, especially when there's so much fancy hardware and software engineering behind making it work. If the wrong information really was released on purpose, it seems like a stupid decision. Sure, consumers would have more peace of mind thinking there was that extra headroom, but on the other hand consumers would also be impressed with the innovations that make it work. Probably the net result is that the card would get more positive attention from the better specs, but I hardly think it could be considered enough to choose to lie about the specs, knowing the shit-storm that would be unleashed if and when the truth came out. I mean, games which usually allocate the whole VRAM are only allocating 3.5GB on the 970, so, in hindsight at least, it's pretty obvious that something is noticeable.

    This card is out of my price range so I wouldn't have got one anyway, but I don't see any reason to avoid it. It's SM-bound, not ROP- nor memory bandwidth-bound, and the .5 GB of slower RAM hasn't been shown to create a problem, from what I've seen. NVIDIA just managed to make the whole thing fit together more efficiently from a manufacturing cost standpoint. Hence they can sell a card that works equally well as a 64 ROP and 224 GB/s card for less money than they would be able to sell a 64 ROP and 224GB/s card for.
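
    For reference, both bandwidth figures above fall straight out of bus-width arithmetic; a quick sketch, assuming the commonly quoted 7 Gbps effective GDDR5 data rate:

```python
# Peak GDDR5 bandwidth (GB/s) = (bus width in bits / 8) * effective data rate in Gbps.
def peak_gbs(bus_bits, data_rate_gbps=7.0):
    return bus_bits / 8 * data_rate_gbps

print(peak_gbs(256))  # 224.0 GB/s -- the advertised 256-bit figure
print(peak_gbs(224))  # 196.0 GB/s -- 7 of 8 32-bit channels, i.e. the fast 3.5GB segment
```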
  • piiman - Saturday, January 31, 2015 - link

    "not ROP- nor memory bandwidth-bound, and the .5 GB of slower RAM hasn't been shown to create a problem, from what I've seen."

    Go buy Dying Light and watch what happens the second it goes over 3.5GB. I can tell the second it does because the game begins to stutter, going from 150 FPS to 2 FPS. I have to lower settings and can't use triple monitor surround in order to keep it under 3.5, so I lose LOTS of graphical goodness. Below 3.5 the game runs great; over it, it makes you want to throw the card out the window.
  • gw74 - Thursday, January 29, 2015 - link

    nope
  • gw74 - Thursday, January 29, 2015 - link

    Nope. A clear case for a refund or exchange perhaps, if you even want one, at most.
  • adamrussell - Friday, January 30, 2015 - link

    Doubtful. The most you could ask for is a full refund and is that what you really want?
  • ol1bit - Monday, February 2, 2015 - link

    Class action? Are you nuts? Really, just a few years ago no one would even have cared. You're still getting the same kick ass card the benchmarks ran, for a kick ass price. You can't expect it to be the same as the 980.
  • Stas - Saturday, February 7, 2015 - link

    They better give us free shit. I don't care about some lawyers collecting millions and sending me a $20 check. I'd take a discounted trade-up to 980 or maybe a $50 coupon toward any nVidia product purchase valid for 2 years.
  • StevoLincolnite - Tuesday, January 27, 2015 - link

    The big issue is... These specifications have been KNOWN for a long time, yet nVidia did nothing to notify ANYONE about the inaccuracies that *every* review site posted.
    It was only when they were "caught out" that they became apologetic.

    This would probably be a good moment for AMD to launch its 300 series of cards to capitalize on this fumble. I doubt they will though; they haven't exactly been quick to react for many years.
  • HisDivineOrder - Tuesday, January 27, 2015 - link

    More likely, AMD will announce a re-re-release of the 7970..er... 7970Ghz...er...R9 280X line as the R9 284X line with a "Never Settle Not Ever No Way" bundle that includes three of the following: Deus Ex Human Revolution, Sleeping Dogs (non-Definitive), Hitman Absolution, Alien Isolation, Saints Row 4, Saints Row: Gat Out of Hell, a ship for Star Citizen, or Lego Batman 3.

    Meanwhile, the R9 285X may finally arrive by the time nVidia finishes filling out the Maxwell stack with the GM200.
  • Kutark - Tuesday, January 27, 2015 - link

    Well played sir, I got a nice chuckle out of that post ;-)
  • Yojimbo - Tuesday, January 27, 2015 - link

    Four months ago would have been a better time for AMD to launch their 300 series cards, but AMD doesn't have any 300 series cards to launch at the moment and if the rumors are true, won't have them until June.
  • bigboxes - Monday, January 26, 2015 - link

    Agreed. How can AnandTech just blindly accept this excuse? We can only hope that there is a price drop that follows this outing.
  • JarredWalton - Monday, January 26, 2015 - link

    People buy these cards based on performance, not on raw specs. For their part, NVIDIA doesn't even publicly list ROP counts for the various parts. I can't imagine any credible lawyer trying to start a class action suit over this information.
  • bigboxes - Monday, January 26, 2015 - link

    I'd say they do both. No one is saying that the 970 is a bad card; it's just not as good as advertised. I've been seriously considering purchasing a 970, and for the last couple of weeks I have been researching the different models. I think Newegg and Amazon keep directing web pages and e-mails at me relating to that fact. I'm now waiting to see how this whole thing settles.
  • Thepotisleaking - Monday, January 26, 2015 - link

    On point! All metrics are considered for purchases, and there is no doubt that somewhere out there are people who bought based on specs, particularly VRAM, who were defrauded.

    Jon Tseng is likely an nvidia employee or just a fanboy simpleton. Nice to see this issue acquiring traction, leading to another payout from nvidia.

    "Not worth the lie" is what they are likely thinking down in Santa Clara today :)
  • Thepotisleaking - Monday, January 26, 2015 - link

    Further, many sites are on this issue now; a lot more data on the performance impact of this issue will be coming to light.

    #ROPGATE 2014
  • Jon Tseng - Tuesday, January 27, 2015 - link

    >Johntseng is likely an nvidia employee

    Incorrect. You can look me up on Linkedin if you want to know where I work. Unlike some people I choose not to hide behind anonymous pseudonyms when I post online.

    >or just a fanboy

    Incorrect. From 2009-2014 I used a 4870X2 (scan.co.uk invoice #AQX84951 if you don't believe me), which was an incredible card for the price (£240). From 2003-06 I used a 9800 Pro (another great piece of silicon). I also had a 4850 in the HTPC and used an X1950 Pro for a while while I was weaning myself off of AGP. Strange behaviour for an NVDA fanboy.

    >simpleton.

    Maybe if you tried engaging with my arguments rather than conducting silly ad hominem (https://yourlogicalfallacyis.com/ad-hominem) attacks you might make more headway.

    GTX970 is still a great card at an awesome price. Nothing that has come out today has changed either the performance benchmarks we have all seen or the price point we can buy it at. Actually I'm secretly hoping all the dumb publicity makes the price come down a bit so I can grab a second one to SLI...
  • Kutark - Tuesday, January 27, 2015 - link

    Don't waste your time responding to these morons. The irony is he called you a fanboy, when he is clearly an ATI fanboy.
  • Fishman44 - Friday, January 30, 2015 - link

    The issue isn't how great the card is. The issue is that they knowingly deceived their customers.
  • Oxford Guy - Tuesday, January 27, 2015 - link

    That's empty rhetoric.

    Performance depends on specs. 4 GB of VRAM performs better than 3.5 GB of VRAM when the game needs more than 3.5 GB of VRAM.
  • nevertell - Tuesday, January 27, 2015 - link

    Outside of situations where the GPU is used for general-purpose computation, the performance implications of partitioned memory are not noteworthy, given proper memory management in the driver. The reason games utilize as much memory as possible is that they cache textures, vertices and intermediate results. The reason the GTX 970 utilizes 3.5 gigabytes in most scenarios is that storing a cache in the partitioned memory would deliver lower performance. I can assure you that a game that needed 4 gigabytes of data to perform a non-trivial draw call would be bound by compute, not I/O. Research sparse texture arrays and all the other nice things coming in the next-gen OpenGL spec (available today through vendor-specific extensions, and sort of implemented in Mantle and DX12), understand how caches work, and this will come as common sense.
    The real question is, why didn't they just sell a 3.5GB card instead and save on memory chips?
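
    A purely illustrative sketch of that kind of segment-aware placement (the class and numbers below are assumptions for illustration, not NVIDIA's actual driver logic): fill the fast segment first and treat the slow segment as a spill area, which is exactly why monitoring tools top out around 3.5GB.

```python
# Toy allocator: prefer the fast 3.5GB segment, spill low-priority data
# to the slow 0.5GB segment only once the fast one is exhausted.
FAST_MB = 3584  # fast segment, ~196 GB/s
SLOW_MB = 512   # slow segment, ~28 GB/s

class SegmentedVram:
    def __init__(self):
        self.fast_used = 0
        self.slow_used = 0

    def alloc(self, size_mb):
        if self.fast_used + size_mb <= FAST_MB:
            self.fast_used += size_mb
            return "fast"
        if self.slow_used + size_mb <= SLOW_MB:
            self.slow_used += size_mb
            return "slow"  # last resort: low-priority surfaces only
        raise MemoryError("VRAM exhausted")

vram = SegmentedVram()
print(vram.alloc(3500))  # "fast"
print(vram.alloc(200))   # "slow" -- the fast segment is effectively full
```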
  • Kutark - Tuesday, January 27, 2015 - link

    Because consumers are dumb, and if the 970 had 1MB less RAM than 4GB it would have decreased sales. There is a reason they moved away from stuff like 1.2GB or 1.5GB cards. People like big solid numbers.
  • HisDivineOrder - Tuesday, January 27, 2015 - link

    You give lawyers too much credit. Often, lawyers like being the one to start such things and don't care much whether they manage to finish them.
  • maximumGPU - Wednesday, January 28, 2015 - link

    I'd say both, Jarred. Sure, I look at performance first, but performance metrics tell me how good the card is NOW. The next thing I do is look at the specs and try to estimate how future proof my purchase will be.
    A 3GB 970 would show great metrics at 1080p, but I wouldn't buy it because I know RAM is ever more important thanks to the consoles catching up.
    Since I game at 1440p, 4GB was my minimum RAM threshold. I thought I got that with the 970, but instead got 3.5GB + 0.5GB of slow RAM. That makes my card less future proof than I thought, and it could well have affected my buying decision, regardless of its current performance metrics.
  • Ranger101 - Tuesday, January 27, 2015 - link

    No surprises as to Nvidia's behaviour, as a company they are of course a rapacious juggernaut, but Tut Tut Anandtech, what would the great founder have to say?

    Having read many recent articles in the GPU section, I am mostly impressed by the high quality of writing. However, those who read between the lines of Mr Smith's GPU reviews realise that appearances of impartiality in his writing are misleading, and that he is in fact a staunch and unrelenting supporter of camp green. (Everyone is of course biased; it's just less appropriate to let it shine through in technical website reviews.)

    It should therefore come as no surprise that in his initial review these issues "escaped" his attention, despite the fact that "a limited number of flags were raised", and that in the follow-up article he unashamedly wields the Bastard sword of Nvidia. LOL.

    You must remember these things happen for a reason, Ryan, and I once again encourage you to temper your bias in forthcoming utterances... AMD still make good cards, and a little competition is good... right? :)
  • just4U - Tuesday, January 27, 2015 - link

    I think the fact that Ryan gets accused of favoring both AMD and Nvidia means he's doing a pretty good job of not really being in either camp. If anything, I'd simply suggest his expectations on performance are conservative, and when the cards actually do better he tends to point that out. Not really a bad thing, considering how underwhelming hardware leaps are these days in most segments: smaller jumps, not the leaps and bounds we were all once used to.
  • OrphanageExplosion - Tuesday, January 27, 2015 - link

    Oh do behave. When Anandtech had the AMD News Center sponsorship all we ever heard from the commentariat was that the site, and Ryan specifically, were AMD biased. I think we all know where the bias is on Anandtech - and it's in the comments, not the editorial.
  • HisDivineOrder - Tuesday, January 27, 2015 - link

    You're talking about the same guy who just took AMD at their word that Mantle was going to be "virtually identical" to the Xbox One's low level access API, and that Mantle was subsequently AMD bringing the Xbox One's low level access language to PC gaming.

    Seriously. If the guy is biased toward anything, he's biased toward believing more of AMD's statements than he really ought to, but I've had a hard time really blaming him, since AMD had JUST paid for him (and his buddy journalists) to go to Hawaii on a beach trip and vacation under the excuse that it was to present the GPU part called "Hawaii." I mean, if I was taken to Hawaii, I'd probably be willing to believe anything they told me, too.

    Still, don't mistake the man for an nVidia fanboy; he's clearly not. Lots of other people questioned that AMD party line far more than Anandtech did back in the day, and it took a long time before they acknowledged that AMD had hoodwinked them; they never REALLY admitted it wholeheartedly.

    Because AMD suggesting that Mantle was anything but a completely proprietary and locked-in API was a lie, and no hardware company has signed up, in spite of the fact that Intel tried very hard to research the subject and was rebuffed by AMD for months.

    Intel likes to do anything they can do for free and they read all the press (like Ryan's) that suggested Mantle was going to be free and freely available, but as it turned out, that was more hyperbole on the part of AMD.

    Yet I saw nothing of that on Anandtech. No, I don't think there's much evidence of his being "a staunch and unrelenting supporter of camp green" unless you're recalling the heady days of AMD's time as a "green" company.
  • Gothmoth - Tuesday, January 27, 2015 - link

    As if you read or even UNDERSTAND what ROPs mean before you buy a card...

    You and all the others are just trolls who have too much time on their hands...
  • Kutark - Tuesday, January 27, 2015 - link

    I honestly don't understand why people are so up in arms over this. At the end of the day the performance figures still stand. The situations in which this news could actually arise and cause any problems are so limited it's not even funny. The resolutions and settings most games operate at don't use anywhere close to 4GB of VRAM.

    Honestly if i didn't have SLI'd 760's i'd go out and buy a 970 tonight, regardless of any of this information.

    That being said, this is another article that proves why anandtech is easily the best tech website out there. Thorough and honest, unbiased, just, amazing, love it. Sorry for all the commas.
  • Kutark - Tuesday, January 27, 2015 - link

    Meant to say gamers, not games. Regardless.
  • vegemeister - Saturday, January 31, 2015 - link

    They do *not* stand. Anandtech's own 970 review did not present frame interval statistics. FPS measurements are only useful when comparing trials in a controlled experiment: overclocking, changing AA levels, and the like.
  • FlushedBubblyJock - Friday, January 30, 2015 - link

    Oh blindly accepting NO EXCUSE looks to be the standard here: " The incorrect number, provided to me (and other reviewers) by AMD PR around 3 months ago was 2 billion transistors. The actual transistor count for Bulldozer is apparently 1.2 billion transistors. I don't have an explanation as to why the original number was wrong, just that the new number has been triple checked by my contact and is indeed right. "

    LOL - it's okay man, AMD did it huge and never gave a reason for the giant PR lie, and we were all required to pretend it didn't matter, as we were told in the write-up here.

    http://www.anandtech.com/show/5176/amd-revises-bul...
  • Jon Tseng - Monday, January 26, 2015 - link

    Hey man if you really think this is a terrible card I'd be happy to buy yours off you (I assume from your righteously aggrieved tone you /must/ be a betrayed GTX 970 owner right?).

    How about you sell it to me for $200? I can do PayPal! After all, given it's /such/ a PoS card, I'm sure you can't argue it's worth any more than that, right?

    Then I can go and muck around SLI-ing AC:Unity at stupid resolutions (plus I hear it might be able to actually run Crysis), and you can go off and sue NVidia for the extra $150. Then we're both happy! :-) :-)
  • tuxRoller - Monday, January 26, 2015 - link

    Hey, given the card is so fantastic, why not $400?
  • cuex - Tuesday, January 27, 2015 - link

    http://www.newegg.com/Product/ProductList.aspx?Sub...

    how about a kg of shit?
  • Jon Tseng - Tuesday, January 27, 2015 - link

    Ummm... Because I can buy one new for $350?
  • Oxford Guy - Tuesday, January 27, 2015 - link

    "Hey man if you really think this is a terrible card I'd be happy to buy yours off you..."

    This is a red herring.
  • Jon Tseng - Tuesday, January 27, 2015 - link

    Not really. I'm actually making two serious underlying points:

    1) A lot of the people who are raising up a sh*tstorm on this thread ("OMG NVDA ARE EVIL THIS IS AN EVIL MASTER PLAN JEN HSUN IS THE ANTICHRIST") likely don't own a GTX 970. Note that comment from GTX 970 owners (myself included) is largely positive.

    By calling out people to put up and sell out I am highlighting this fact - most critics can't put up and sell out because they are not GTX 970 owners with first hand experience of the product.

    2) The underlying logic of critics is that based on the news we heard yesterday this card is "gimped" - i.e. it's suddenly worth less than we thought it was.

    However, this is despite the fact that the real-world performance of the card (which we all have seen in independent benchmarks) has not changed, and the price of the card has not changed; yet somehow the card is supposed to have become bad overnight. Remember, there were plenty of UHD benchmarks on real world games conducted at time of launch. Those numbers have not changed.

    By posing the rhetorical question "has the card suddenly changed overnight so that its real world value has fallen from $350 to $200?" I am trying to bring out the incongruity between the two positions.
  • Sushisamurai - Tuesday, January 27, 2015 - link

    I think ur getting trolled...

    Anyways, this just reminds me of the more in-depth analysis of Apple's A8X chip - If only the chip got "faster" overnight....
  • neilbrysonmc - Wednesday, January 28, 2015 - link

    I'm a new GTX 970 owner as well and I'm loving it. As long as I can run almost every game on ultra @ 1080p, I'm fine with it.
  • GGlover - Wednesday, January 28, 2015 - link

    So Mr. Tseng, how much did you pay for your 970? I bought 2 of them at launch and I am indeed angry that I was misled by inaccurate specs. I could care less about the real performance as I was lied to. I don't want to give money to liars. You should be a little more upset unless you got your card for free...
  • ThorAxe77 - Friday, January 30, 2015 - link

    "I could care less about the real performance" - that's the funniest thing I have heard in a while!!
    Maybe you should buy a 4GB R7 240... it's got 4GB so it must be great for 4K!
  • piiman - Saturday, January 31, 2015 - link

    "By calling out people to put up and sell out I am highlighting this fact - most critics can't put up and sell out because they are not GTX 970 owners with first hand experience of the product."

    If by selling out you mean taking a big loss, then I'll talk to ASUS first and see if they will take it back at cost. Thanks for trying to take advantage of us.

    "2) The underlying logic of critics is that based on the news we heard yesterday this card is "gimped" - i.e. it's suddenly worth less than we thought it was."

    And since you don't think it is, how about you pay full price?

    "However this is despite the fact that the real-world performance of the card (which we all have seen in independent benchmarks) has not changed,"

    Lots of benchmarks, if not most, didn't fill up the 3.5GB, so we'll see what happens when you go over; from what I have seen, once that happens your card is "gimped".

    We'll see soon, I guess.
  • Gothmoth - Tuesday, January 27, 2015 - link

    quick get your tinfoil hats on..... lunatics.
  • gw74 - Thursday, January 29, 2015 - link

    Looks as if you're an AMD shill
  • ol1bit - Monday, February 2, 2015 - link

    Shills? AMD fanboys, I'd say. The benchmarks haven't changed, you still get 4GB of RAM, and it still runs at 85-90% of the 980 for roughly half the price. How do you think they make slower versions of the same chipset anyway?
  • Alexvrb - Monday, January 26, 2015 - link

    They should have advertised it as 3.5GB + 512MB buffer or something of that nature. Nobody would have really batted an eye. If anything it might have driven a few customers to spring for the 980 over the 970, despite the comparatively large price gap and the not-so-large performance gap.

    Regardless... true or not, I love Nvidia's version of events. "Uh, well, we just didn't catch the technical documentation error and um, Group A doesn't talk to Group B. Honest. If we had caught wind of the mistake at any point in the several months before the story spread across the net like wildfire, we would have totally come out public with the info on our own. We swear! We come out and knock our own products down a peg months after the reviews come out all the time, with no prompting, yep."
  • 3ricss - Tuesday, January 27, 2015 - link

    I don't know. I still don't think they are wrong in advertising as 4GB, but they should have definitely offered a footnote as to the allocation of that memory.
  • Samus - Monday, January 26, 2015 - link

    Yeah, simply the best $300 I ever spent on a GPU, even if it realistically has 3.5GB of usable memory.
  • Laststop311 - Tuesday, January 27, 2015 - link

    But it's not $300. It's more like $350+ including shipping, unless you get the crappiest ones with the cheapest coolers.
  • Mondozai - Monday, January 26, 2015 - link

    When a company intentionally lies to its consumers, that isn't a storm in a teacup. Ryan may believe them but I don't. I agree with him that it's incredibly stupid to do this kind of stuff, but the notion that they didn't know, even after all the manuals were passed around the company? Knowing the number of ROPs is basic stuff for technical marketing.

    And okay, maybe this got missed a single round. But in successive rounds, over a period of almost half a year? C'mon. Nvidia knows it wouldn't sell as well if they marketed it as "3.5GB VRAM", and they tried to cover this shit up.

    I'm guessing Jonah Alben didn't have anything to do with this, and I'm guessing he's pissed as fuck. The big question is whether Jen-Hsun knew or not. Their marketing team are not exactly people I'd trust (watch Tom Petersen in any stream and you'll know what I mean).

    Throwing the marketing guys under the bus is poetic justice. But also an easy move. Again, did the CEO know?
  • mapesdhs - Monday, January 26, 2015 - link

    "Intentionally lies"... yeah right! So you're saying this is not acceptable, and yet it's ok for AMD (and indeed NVIDIA) to market dual-GPU cards by advertising the sum of the VRAM on both GPUs, even though an application can only see & access the individual amount? Look at *any* seller site spec list for an AMD 295X2 (they all say 8GB, ditto the specs page on AMD's site), while Anandtech's own review shows quite clearly that it's just 2x4GB, so the real amount accessible by an application is 4GB, not 8GB. Surely this is far more of a deception than the mistake NVIDIA states they have made with the 970 specs.

    So I call out hypocrisy; your comment is just NVIDIA-bashing when there have been far more blatant deceptions in the past, from both sides. NVIDIA does the double-up VRAM nonsense as well, e.g. the sale ads for the Titan Z all state 12GB, as do the specs on the NVIDIA web site, but again it's just 6GB per GPU, so 6GB max visible to an application. Look back in time and you'll see the same mush published for cards like the GTX 295 and equivalent ATIs from back then.

    So quit moaning about what is merely a mistake which doesn't change the conclusions based on the initial 970 review performance results, and instead highlight the more blatant marketing fibs, especially on dual-GPU cards. Or of course feel free to cite *any* dual-GPU review where you complained about the VRAM diddle.

    Sorry if I sound peeved, but your comment started by claiming something is true when it's just your opinion, based on what you'd like to believe is true.

    Ian.
  • alacard - Monday, January 26, 2015 - link

    "So you're saying this is not acceptable, and yet it's ok for AMD (and indeed NVIDIA) to market dual-GPU cards by advertising the sum of the VRAM on both GPUs, even though an application can only see & access the individual amount?"

    That's what's known as a straw man; he never mentioned anything about dual GPUs. His point about ROPs is perfectly valid, and no, Ian, it's not ok to lie about that, nor about the amount of cache.

    "Sorry if I sound peeved, but your comment started by claiming something is true when it's just your opinion, based on what you'd like to believe is true."

    Why would you give Nvidia the benefit of the doubt here? If you really and truly believed no one brought this up before release or noticed it afterwards, then you're a bigger fool than I could have ever guessed.

    Sorry if I sound peeved, but your comment started by claiming something is true when it's just your opinion, based on what you'd like to believe is true.
  • dragonsqrrl - Monday, January 26, 2015 - link

    "Why would you give Nvidia the benefit of the doubt here?"

    Why would Nvidia want to deceive the whole PC gaming world over something so minor? As Ryan stated in the article that would be genuinely stupid. Can you think of a reason why Nvidia would intentionally seed a slightly inaccurate spec sheet to the press? What would they gain from that? I don't think there's any reason to believe the initial spec sheet was anything other than a mistake by Nvidia, and neither does any credible tech journalist I know of.

    That being said I also highly doubt they weren't aware of the mistake until now. While I think their response to this incident has been good so far, I really think they should've come out with this information sooner (like last week when this started to really heat up). But I think that time was probably spent confirming what had happened and how to present it to the press.
  • alacard - Monday, January 26, 2015 - link

    " Can you think of a reason why Nvidia would intentionally seed a slightly inaccurate spec sheet to the press?"

    Is this a real question or some sort of a joke? You're asking, with a straight face, why a company would knowingly inflate a spec sheet for a product it wants to sell? Is that PT Barnum's johnson I see swinging from your asshole?
  • Galidou - Tuesday, January 27, 2015 - link

    People buy performance, don't say a thing about memory bandwidth, ROPs and such; just install it in your computer. If you paid less than for some video cards it outperforms and don't care about stats, you're on the right track.

    Companies lie to us in advertising for all sorts of things, on TV and so on. I've seen many LCD monitors advertise X nits and not totally deliver that amount, and no one ever sues them. If the monitor still averages better or the same image quality as the best monitors in its price class, who cares about the advertisement?

    Not saying that lying to improve sales numbers is right, but SO MANY companies do that. If it turns out to be a really bad product for the price you paid, then sue them. But don't whine when there's a SLIGHT difference but the card still outperforms everything in its price class, uses less power, has good drivers and so on.

    The only reason Nvidia would have had to do this intentionally would be to prop up a mediocre video card, a kind of semi-failure, which the GTX 970 SURELY isn't. Why would a company need to boost sales when they know the card is going to be sold out for the next month because of its price/performance ratio?
  • FlushedBubblyJock - Friday, January 30, 2015 - link

    Oh, so that's why AMD lied about the number of transistors in the Bulldozer core, claiming it was 2 billion, then months later correcting their lie to journalists and revising it downward quite a large leap to 1.2 billion, a full 40% drop.
    Yes, lying about a cruddy product that never met expectations, pumping up the core transistor count to give the impression of latent power just not yet utilized, pending, say, optimizations required for the Windows OS to use all the "8"/(4) cores better with improved threading...

    Hahahhaaa no it's not a joke...

    http://www.anandtech.com/show/5176/amd-revises-bul...
  • dragonsqrrl - Tuesday, January 27, 2015 - link

    Wow, disproportionately aggressive response to appropriate and logical questions. I can't tell if you're trying to intentionally mislead others or if you really have no clue what you're talking about. Yes, I'm asking why Nvidia would conspire to intentionally lie about something so minor in the initial spec sheet that would almost certainly be discovered soon after launch? I even tried to help you out a little: What would they gain from that?

    It just takes a simple risk assessment and a little bit of logic to pretty much rule this out as an intentional deception.
  • Galidou - Tuesday, January 27, 2015 - link

    Nvidia's way of thinking, according to the mad community: "With the performance-to-cost ratio of this card at launch, it will be sold out for weeks to come even if we give the true spec sheets! Let's speak to the marketing department and modify them so it can be SOLD OUT TIMES 2!! YEAH, now you're talking, let's make the community so mad they have to wait for it! YEAH, we want the community to HATE US!"
  • alacard - Tuesday, January 27, 2015 - link

    Galidou, dragonsqrrl: can you explain how a 970 with one of its DRAM banks partitioned off for low-priority data is supposed to operate at 256 bits? Given that the last 512MB chunk is only accessed as a last resort, and only after all the other RAM is occupied, the memory subsystem can only be operating at 224 bits in the majority of cases.

    I could be wrong, but I just don't see it. Given that, we're not merely talking about diminished ROP and cache counts, but also a narrower memory interface, which NVIDIA marketed specifically as being exactly the same as the 980's. Here is a direct quote from their reviewer's guide:

    "Equipped with 13 SMX units and 1664 CUDA Cores the GeForce GTX 970 also has the rending horsepower to tackle next generation gaming. And with its 256-bit memory interface, 4GB frame buffer, and 7Gbps memory the GTX 970 ships with the SAME MEMORY SUBSYSTEM as our flagship GEFORCE GTX 980"

    If it really is only operating at 224 bits, THIS IS A BIG DEAL. Even if it were an honest mistake, it's still a big deal. Giving them the benefit of the doubt and assuming their initial materials were wrong, the idea that they didn't notice it after release... come on.
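
    To put rough numbers on the bus-width point (a back-of-the-envelope model; the assumption that non-overlapping segment accesses combine as a harmonic mean is mine, using the 196 GB/s and 28 GB/s segment figures from the article):

```python
# If the fast and slow segments cannot be read in parallel, a workload pulling
# a fraction p of its bytes from the slow segment sees the harmonic mean rate.
def effective_gbs(p_slow, fast=196.0, slow=28.0):
    return 1.0 / (p_slow / slow + (1.0 - p_slow) / fast)

print(effective_gbs(0.0))    # 196.0 GB/s -- all traffic stays in the 3.5GB segment
print(effective_gbs(0.125))  # 112.0 GB/s -- naive striping across the full 4GB
```

    Which, if anywhere near right, is exactly why the driver treats the slow segment as a last resort.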

    BTW that PT Barnum comment was just a joke that popped in my head at the last second and i couldn't resist adding it.
  • Galidou - Tuesday, January 27, 2015 - link

    The performance remains amazing for the price. They wouldn't have had to describe the specs to me; I would have bought it anyway if I didn't already have a good enough card for what I do/play.

    What's a big deal to me: performance-to-cost ratio, end of the line. I never cared about anything else.
  • Galidou - Tuesday, January 27, 2015 - link

    alacard, did you really read the article? It says about the memory bus: "Ultimately this link is what makes a partially disabled partition possible, and is also what makes it possible to have the full 256-bit memory bus present and active in spite of the lack of a ROP/L2 unit and its associated crossbar port."

    If you have to be on the mad side of the community, at least know your subject.
  • alacard - Tuesday, January 27, 2015 - link

    Galidou, did you read my comment? It CAN'T be running at a 256-bit bus width with the last 500MB DRAM module empty, which it will be most of the time. Please don't be dense with your replies about me not knowing my subject when you clearly don't know yours.

    "People buy performance, don't say a thing about memory bandwidth, ROPs"

    Do you have a crystal ball? How do you know whether or not people buy things for specs; are you clairvoyant? Can you see backward and forward in time? I buy tech based on specs all the time; in fact, I don't know anyone personally who doesn't. A 256-bit bus vs a 224-bit bus would cause me to think more carefully about my decision. Maybe I have a program with extremely high bandwidth needs that would run faster with 256. Maybe I plan on 4K gaming, so I want the extra ROPs just in case. Maybe 2MB of cache sounds better to my ears than 1.75MB.

    My guess is that the above - people like me - is why NVIDIA did nothing to correct specs they had to know were false. Now they're reaping what they sowed, and I hope it's a huge harvest. They've earned it.
  • Galidou - Saturday, January 31, 2015 - link

    Nothing can be done with you; you're the ultimate truth. Next time, I will buy based on specs. Oh god forgive me for thinking I had to buy a GTX 970 because on paper it performs better than an R9 290, which has a better spec sheet.

    I will know better from now on: buy the R9 290 because of the 512-bit memory bus and 2560 stream processors, EVEN if it gives me worse fps for the price. Oh almighty specs, forgive me for being such an ignoramus; I once thought performance was more important than specs. Now that alacard has enlightened me, never shall I make the same mistake.

    P.S.: I bought a GTX 660 Ti even when people were arguing against its 192-bit memory bus, and it's still one of the best video cards I've bought, even two and a half years later.
  • Galidou - Saturday, January 31, 2015 - link

    Oh, another point: you want to buy a video card, you look at benchmarks of the games you play at your resolution, and you buy the card that gives you the best fps for the price. Oh NOO, I forgot again, no one buys BASED on performance, I HAS NO CRYSTAL BALL OH NO! Quick, to the spec sheets: OH yeah, I dunno what the performance is but THE SPECS, OH, THEY SPEAK TO ME!

    Darn, my friend plays the same games as me, bought a video card for a cheaper price based on benchmarks and gets better performance than me; alacard screwed me!!!
  • Galidou - Saturday, January 31, 2015 - link

    We sure look at specs, but again, we buy based on performance, and I know I'm right. If I wasn't, everyone would buy the R9 290 and R9 290X because of the memory bus and quantity of stream processors, not considering that a 256-bit bus with superb, lossless compression can give you better performance.

    But nowhere on any spec sheet, at any online retailer, will you see that nvidia's effective bandwidth surpasses the 512-bit bus of AMD. So no, specs aren't everything; they don't say a thing about the optimization behind them.
  • spartaman64 - Monday, January 26, 2015 - link

    the 970 is still a great card but we should hold nvidia accountable
  • bigboxes - Tuesday, January 27, 2015 - link

    Good lord, you're shilling on HardOCP as well. You should be banned for this kind of crap. C'mon mods.
  • Ranger101 - Tuesday, January 27, 2015 - link

    What the hell do you know?
  • jackstar7 - Tuesday, January 27, 2015 - link

    So the life of your card was cut short (as games continue to use more and more VRAM) and you're not bothered by that? Is it that you would upgrade again before this became an issue, and you don't mind losing value on potential resale? I just don't understand the mindset of someone who is okay with finding out their purchase was not made with all the correct information, in this case because the company specifically screwed up in providing it.
  • Stuka87 - Tuesday, January 27, 2015 - link

    Except it's not a great price. It's 5% faster (on average) than the 1.5-year-old 290X, but costs $100-$150 more. How is this a good price?
  • Chaser - Wednesday, January 28, 2015 - link

    Couldn't agree more. Frame rates and superb low power consumption for the price.
  • Sh!fty - Friday, January 30, 2015 - link

    I got the card for its longevity. I bought 4GB of RAM and am now being told I only received 3.5GB.

    REGARDLESS of how good the card is, this is false advertising.
  • piiman - Saturday, January 31, 2015 - link

    Well, I recently picked up Dying Light, and every time its VRAM use goes even a little above 3.5GB the game turns into a stuttering mess.

    I also bought the card BASED on that faulty data sheet, and I'd like what I paid for.
  • Overmind - Wednesday, February 4, 2015 - link

    Not a storm. A marketing trick.
    Remember the fake nV video card presentation, the one with wood screws?

    The question one should ask is how much they are willing to cheat.
  • fuckNvidia - Monday, February 9, 2015 - link

    "Be very happy with it!" <------ nvidia employee. As a customer, I'm returning both of mine.
  • mluppov - Sunday, March 1, 2015 - link

    Oh shut up, Jen-Hsun.
  • Cullinaire - Monday, January 26, 2015 - link

    Perception will become reality soon as games start to use more and more VRAM...
  • Veritex - Monday, January 26, 2015 - link

    That's what Euro review sites indicated: with the PS4/Xbox One carrying 8 GB of memory, around 30 million sold now and heading toward 50 million in the near future, gimped cards like the GTX 970 will lose performance and value more quickly than in the past.

    Another factor about perception is that Nvidia misrepresented the specs all through 2014 and only came clean after so many users were questioning them. They conveniently waited until the Christmas shopping season was past before disclosing the truth to consumers.

    So it is hard not to have an overall perception of Nvidia as a shady and less than truthful company.
  • looncraz - Monday, January 26, 2015 - link

    That's my concern about all this: it took so long for it to come to light that someone at nVidia HAD to have noticed. These people get paid to read and interact with reviewers, and even the engineers would likely have read the reviews and seen that the specs were wrong (and one or more of them probably brought it up to management).

    nVidia can prevent this problem in the future by adding a specification review phase, or just by having the engineers write down the specs... really not that complicated.
  • Jon Tseng - Monday, January 26, 2015 - link

    If it's such a gimped card I'll buy yours off you for $200. After all if it's really going to lose performance and value so quickly it can't be worth much more than that.

    You can sue NVidia for the extra $150. I can finally get FSX running at 4K*. Everyone's happy! :-) :-)

    * Bonus point if you can spot the deliberate "Kessel Run in 12 Parsecs" logic here.
  • JarredWalton - Monday, January 26, 2015 - link

    Not likely. Most games target specific amounts of VRAM, with 1GB, 2GB, 3GB, 4GB, 6GB, and 8GB all likely candidates, and usually the targets have some leeway. The reason is that you budget memory use based on textures and shadow maps, but you know that you also have to keep frame buffers and other elements in VRAM (which aren't usually directly under the control of the game). So a game that targets 4GB VRAM will usually target more like 3-3.2GB, leaving the rest for the GPU to use on frame buffers, Z-buffers, etc.

    To that end, I've seen games where a GTX 970 runs great at Ultra QHD, but Ultra 4K really kills performance -- because where Ultra QHD might be just under the 3.5GB VRAM of the 970, Ultra 4K ends up going over the 4GB barrier. (The same performance drop occurs with the GTX 980 as well in my experience.) And do you know what most gamers will do if they hit the point where performance takes a big dive? They'll drop one or two settings to "fix" things.
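
    A toy version of that budgeting heuristic (the 20% headroom figure is made up for illustration, not taken from any actual engine):

```python
# Reserve a slice of nominal VRAM for frame buffers, Z-buffers, and other
# driver-managed surfaces; spend the rest on textures and shadow maps.
def asset_budget_mb(nominal_vram_mb, headroom_fraction=0.2):
    return int(nominal_vram_mb * (1.0 - headroom_fraction))

print(asset_budget_mb(4096))  # 3276 MB -- roughly the 3-3.2GB target on a "4GB" card
```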

    And that's where NVIDIA's GeForce Experience can help the majority: they go in, select their resolution, and let the game do the tricky part of selecting ideal settings. Maybe it won't be perfect, but for most gamers it's sufficient.

    TL;DR: Much ado about nothing.
  • Samus - Monday, January 26, 2015 - link

    And for those games, you'll need a higher end card. The realistic difference between 3.5GB and 4GB of VRAM for texture cache means very little, even at 4K, where 4GB is the ceiling NOW. Let's face it: with consoles having 8GB and high end cards having 6GB, 4GB cards just won't cut it in a few years, let alone 3.5GB cards.
  • Mvoigt - Monday, January 26, 2015 - link

    You fail to understand that the consoles have a total of 8GB of RAM... not all dedicated to graphics. The OS uses some, the game uses some, and the graphics use a portion of that. Then I could say consoles fail, since my graphics card has 4GB of RAM and my machine has 32GB, so I have a combined 36GB available vs 8GB on the consoles...
  • Kevin G - Monday, January 26, 2015 - link

    The thing is that the GPU and CPU have independent memory pools. If a game only uses 1 GB of that 32 GB main memory, you have 31 GB going to waste. Attempting to utilize that extra memory for a texture cache tends to make games crawl due to the latency bottleneck.

    On a console, whatever isn't used up by the host OS (512 MB, last I checked) and core game logic can all go toward game assets. That can easily mean more than 6 GB of textures and models. With PC games typically running at higher resolutions and using even higher resolution assets, it could easily equate to a demand for 6 GB and 8 GB graphics cards this year or next.
  • hermeslyre@hotmail.com - Monday, January 26, 2015 - link

    Last I checked, both consoles reserve around 3.5GB for the OS, with the PS4 having 512MB of that reserved pool as flexible. Which leaves not a megabyte more than 5GB available to developers to do their thing, on the PS4 at least.
  • McC54u - Monday, January 26, 2015 - link

    You guys act like consoles are running on Titan Zs or something. They are running on a Radeon 7850 at best. There will never be a really graphics-intense 4K game on consoles that can't even do 1080p on most of their launch titles, even with all that RAM. Unless they get on board with some serious streaming tech for new titles, we have seen almost the peak of what these consoles can do.
  • Galidou - Monday, January 26, 2015 - link

    What you don't realize is that the 8GB is usable for textures. Games tend to look very good on consoles even if they use an underpowered GPU. Take for example the modified GeForce 7800 in the PS3: how far did it go? I think it did A LOT better than the 7800 on the PC side.

    Fallout 3 was designed as a console title ported to the PC. Computer graphics cards had more VRAM than the consoles, and people developed texture packs for the game, making it a lot more beautiful but using much more VRAM, which was impossible for the PS3 and Xbox 360.

    The same will happen with console games of this gen: at some point they won't have the vertex/shader-heavy capability of PC cards, but they'll have the memory for really beautiful textures at 1080p. Take those console ports on PC with very beautiful textures and play them at 1440p or 4K... there you go, VRAM utilisation way beyond what has been seen in the past.
  • anandreader106 - Monday, January 26, 2015 - link

    Did you read the post by Mvoigt? You DO NOT have 8GB for textures! The OS eats up over a GB of RAM, and the CPU will use RAM while processing whatever is going on in the game world. Maybe you can get to 5GB of RAM for graphics... maybe. But that means the world you're traversing is pretty static and boring.
  • Galidou - Monday, January 26, 2015 - link

    Yep, 4-5GB of RAM for graphics on consoles is possible, but then, didn't you read what I said?

    I was focusing on the fact that a game that can use up to 4-5GB of textures at 1080p, once ported to PC users who play it at 1440p and 4K with graphical options not available on console... there you go, games that can easily make use of 6GB of VRAM.
  • Galidou - Monday, January 26, 2015 - link

    I should have specified that the 8GB is "usable" for graphics but will not be totally used, considering the OS, normal usage and so on.

    Let's not forget the Xbox One has a Hynix H26M42003GMR 8 GB eMMC NAND flash module for storing OS information and caching, to give games the maximum use of the 8GB of RAM. But you already knew that, didn't you? The OS goes on standby when you enter a game, is cached to the NAND flash, and is then loaded from flash when it needs to wake up.
  • hermeslyre@hotmail.com - Monday, January 26, 2015 - link

    Your point is sound, but the facts are wrong. Both consoles reserve a whopping 3.5GB or so for the OS. You can Google it to confirm if you like. The PS4 has 512MB of that reserved pool as flexible. At most, console developers have not a megabyte more than 5GB to use, and that's total system RAM. The only way a developer can dedicate 4GB to VRAM is to have only 1GB for the rest of the game, which is very unlikely; take a look at system RAM usage in the task manager when playing any modern game. My 8GB of system RAM gets very close to filled in many games.

    3GB of dedicated VRAM would require the devs to use only 2GB for general use in the game. A better situation, but still skewed. 2GB for VRAM leaves 3GB for the rest of the game. Most developers are going to choose one of these two, or something close, which does not paint as dire a picture as you paint above.
  • hermeslyre@hotmail.com - Monday, January 26, 2015 - link

    Last I checked, both consoles reserve around 3.5GB for the OS, with the PS4 having 512MB of that reserved pool as flexible. Which leaves not a megabyte more than 5GB available to developers to do their thing, on the PS4 at least.

    That's total available RAM for games, not just VRAM. The only way a developer can take 4GB of that RAM for video is to use only 1GB for everything else.
  • Oxford Guy - Tuesday, January 27, 2015 - link

    That's ridiculous. Since when does a console need an OS that hogs 3.5 GB?
  • Galidou - Tuesday, January 27, 2015 - link

    +1 Oxford Guy, my Windows 7 uses 2.41GB of RAM and I can guarantee it's WAY heavier than any console OS. Plus the guy didn't read about the Hynix H26M42003GMR 8GB eMMC NAND flash memory used to cache most of the OS's non-essential GUI resources and such while gaming, to alleviate RAM usage during games.
  • Mvoigt - Tuesday, January 27, 2015 - link

    What's your point... I have a 128GB cache drive in my machine... I like consoles... but get off your high horse and get your facts straight... show me one game that looks better on console than PC and I will rest my case...
  • Mvoigt - Tuesday, January 27, 2015 - link

    http://www.geek.com/games/ps4-gives-5-5gb-of-ram-t... read the freakin' link, boy, and again http://www.eurogamer.net/articles/digitalfoundry-p...

    Why don't you Google instead of coming off like an ignorant mofo....
  • Mvoigt - Tuesday, January 27, 2015 - link

    And the xbox one link... http://n4g.com/news/1262357/microsoft-confirms-xbo...
  • OrphanageExplosion - Tuesday, January 27, 2015 - link

    It's actually 3GB on both consoles.

    The consoles are designed with a ten year lifecycle in mind. The reservation is that - a reservation. The OS and overall functionality will grow over the years and both MS and Sony gave themselves the room to expand.

    You can't *increase* the reservation going forward with new features in mind without breaking compatibility with older games.
  • Kevin G - Monday, January 26, 2015 - link

    The weird thing is that the PS3 could offload vertex processing to Cell, where it could be processed faster. Also, the FlexIO link between Cell and the RSX chip in the PS3 was remarkably faster than the PCIe 1.0 x16 speed the 7800 GT had on the PC side. This faster bus enabled things like vertex processing offloading and sharing the XDR memory pool for texture caching.

    Similarly, the Xbox 360 had eDRAM for massive bandwidth and used a special 10-bit floating point format for HDR. That console could perform remarkably well for its actual hardware specs.

    In reality, the greatest handicap the PC platform has isn't in hardware but rather software: Windows is a bloated mess. This is why APIs like Mantle, DX12 and a rebirth of low-level OpenGL have the hype they do, as they cut away the cruft from Windows' evolution.
  • Galidou - Monday, January 26, 2015 - link

    It's not totally Windows' fault; the problem is it has to be compatible with everything, and god knows there's a HOLY ton of software and hardware it needs to consider, both existing and not yet existing.

    It's easier to design a link between a CPU and a GPU helping each other when they will be paired together for life.
  • Kevin G - Tuesday, January 27, 2015 - link

    The bloat didn't stem from abstracting different types of hardware from each other so that they could be compatible. Rather, it was the software architecture itself that became bloated to maintain compatibility with existing applications using that API while the hardware continued to evolve. Many of the early assumptions about GPU hardware no longer apply, but legacy DirectX imposes artificial limitations. For example, AMD and nVidia GPUs have supported far larger texture sizes than what DirectX lists as the maximum.
  • Flunk - Monday, January 26, 2015 - link

    I personally don't think this would be a big concern to me. Even if it only had 3.5GB of RAM, the 970 would still be a good deal.
  • Oxford Guy - Tuesday, January 27, 2015 - link

    "Even if it only had 3.5GB of RAM the 970 would still be a good deal."

    Red herring.
  • dagnamit - Monday, January 26, 2015 - link

    I see why you're willing to give them the benefit of the doubt here, and I may after some time, but holy cow, that's a pretty big boatload of mistakes. I mean, NO ONE at Nvidia saw the reviews and felt the need to correct them (or, if they're barred by contract from contacting review sites, to send the reports up the chain of command)? It strains credulity, but stupider things have happened, I guess.

    That shuffling noise you hear is the sound of thousands of lawyers attempting to file to be the representative for the inevitable class action.
  • jeffkibuule - Monday, January 26, 2015 - link

    Would the people with this kind of intimate knowledge really bother reading in detail a review of a product they worked on for months/years anyway?
  • airman231 - Monday, January 26, 2015 - link

    One possible reason would be to 'correct' any mistakes or misrepresentations that a major reviewer might make. And IIRC, there are instances where some review sites have made noted edits to their reviews after being contacted by NVIDIA (or AMD).

    I suspect many wouldn't want to work so hard on a product and see it misrepresented. They take the time and bear the cost of shipping free cards to some of these review sites, so I wouldn't be surprised if they had some interest in how the product is reviewed and viewed by media that can influence opinion and sales.
  • RazrLeaf - Monday, January 26, 2015 - link

    I know from working on long term projects/products that once you're done, you tend not to look back. I've only ever looked back when someone came to me asking questions.
  • Ryan Smith - Monday, January 26, 2015 - link

    This is actually a very good point. Especially in chip design due to the long development cycle.

    By the time GTX 970 launched, the architectural team would already be working on the n+2 GPU architecture, and the manufacturing team would be working on finalizing the next card. The only people actively vested in the product at launch were support, product management, and technical marketing. The last of those is technically the division that evaluates reviews, and they of course thought 64 ROPs was correct.

    We get quite a bit of traffic from the companies whose products we review. But when most of those employees either don't know the specs of a specific SKU (Jonah Alben won't know that GTX 970 has 56 ROPs off of the top of his head) or those employees have the wrong value, there really isn't anyone to correct it.
  • slickr - Monday, January 26, 2015 - link

    Oh, come on. You sound like Nvidia PR. So after working months/years on this architecture, you won't take several days' rest and look at some of the reviews of your baby, of your product? No one in the company did? No one even skimmed through the first page of reviews and news?

    We've had reviews from when the cards officially launched, to months later reviews of custom cards, etc...

    To me you sound like you are on Nvidia's payroll. I'm sorry, but I expect a critical view from the media, not PR talk. Either change the definition of this site from "website" to "fansite" or start doing critical journalism, not this whitewashing PR bullshit!
  • Ryan Smith - Monday, January 26, 2015 - link

    To truly understand this, you probably would need to have been on the phone with Jonah. The team that designed the architecture is not the team that picked the individual product configurations, and as a result they have no clue how many ROPs a part is supposed to have. Never mind the fact that they haven't looked at the architecture in a year or more.

    Modern product development is highly specialized, with each team working on its own little niche. This means they're generally blind to what everyone else is doing.

    Whether you find that answer satisfactory or not is up to you. But the employees reading the review are generally not going to be the employees who know that 64 is the wrong number of ROPs on one specific SKU.
  • OrphanageExplosion - Monday, January 26, 2015 - link

    I find that hard to believe to be honest, Ryan.

    You think a guy like, say, Tom Petersen who knows the product and knows the press inside out doesn't read the reviews and doesn't know the true specification of the GTX 970?
  • anandreader106 - Monday, January 26, 2015 - link

    "....But the employees reading the review are generally not going to be the employees who know that 64 is the wrong number of ROPs on one specific SKU."

    So the employees that would know that 64 was the wrong number do not read reviews on the internet?
  • JarredWalton - Monday, January 26, 2015 - link

    Correct, because they're doing hardware design. I highly doubt Jonah does more than a cursory glance at most of the reviews; he's paid too much to be doing that. Even guys like Jen-Hsun aren't going to read all the hardware reviews -- they'll get the executive summary of how the launch went.
  • mapesdhs - Monday, January 26, 2015 - link


    In the 1990s I knew someone involved with CPU development who told me about the nature of his work, the pay, the hours, etc. I fully agree with Jarred; these people work very hard and are extremely well paid. They're not going to be reading reviews, they're far too busy, almost certainly working on whatever's coming next.

    Ian.
  • alacard - Monday, January 26, 2015 - link

    Hook, line, and sinker. Watching all these tech journalists rushing to Nvidia's defense is just priceless. Guys, when your masters knock some shit off their table for you to eat, have some self respect and try to recognize it for what it is, so you can treat it accordingly in your write-ups. Because as it stands right now you're all Exhibit A in the inevitable inquiry into the death and total collapse of the fourth estate.
  • dragonsqrrl - Monday, January 26, 2015 - link

    Yes, all these credible tech journalists are conspiring with Nvidia to cover this all up. Has Demerjian issued a statement yet?
  • yannigr2 - Tuesday, January 27, 2015 - link

    When AMD released Hawaii and the hardware sites were seeing the core clock throttling, it was not about performance, but about the core clock of the GPU. Now that Nvidia has lied about the specs, it's not about the specs, but about how small the performance penalty is. Do you see the difference?
  • OrphanageExplosion - Tuesday, January 27, 2015 - link

    So nobody from the NVIDIA tech team read the reviewer's guide either then? I genuinely think it probably was a mistake, but the notion that nobody at NVIDIA noticed that *all* the public specs on the card are incorrect beggars belief and the notion of journalists *literally* making excuses for NVIDIA is stunning.

    The bottom line is this: GTX 970 is a fantastic card, the perf is stunning, but NVIDIA released dodgy specs, didn't correct them when they became public and should have been transparent about the 512MB partition right from the get-go. It wouldn't have changed anything in the reviews, and the card would still have been a huge success, 56 ROPs or not.

    This is far less a perf issue and much more about trust.
  • yannigr2 - Tuesday, January 27, 2015 - link

    They live deep in a hole and Nvidia forbids them to have any contact with the outside world. No internet, no telephone, no TV, no sunlight. That way they secure that their intellectual property will not leak to the reds.
  • dragonsqrrl - Monday, January 26, 2015 - link

    I thought last week Ryan was bought out by AMD?
  • alacard - Monday, January 26, 2015 - link

    This isn't a coverup, this is journalism acting as the public relations arm of a company they work closely with. Think of it sort of like regulatory capture but instead of a public-private relationship you have a private-private one.

    With regard to your puerile Nvidia vs AMD remark, you might want to elevate yourself above those pathetic, tribalistic, and base arguments and join adulthood, where we don't give a shit who is doing the lying, only that they get called out and criticized for it in an effort to curtail similar behavior in the future.
  • dragonsqrrl - Tuesday, January 27, 2015 - link

    ... you responded to the wrong comment. Can't really blame you though; at a certain point this layout makes it impossible to tell who's responding to whom.

    "you might want to elevate yourself above those pathetic, tribalistic, and base arguments and join adulthood"

    You don't do much self-reflection, do you? This sort of critique only helps your credibility if you bring it up while you're not flaming other people.
  • D. Lister - Tuesday, January 27, 2015 - link

    "you might want to elevate yourself above those pathetic, tribalisic, and base arguments and join adulthood"

    http://www.anandtech.com/comments/8935/geforce-gtx...
    "Is that PT Barnum's johnson i see swinging from your asshole?"

    http://www.anandtech.com/comments/8935/geforce-gtx...
    "Guys, when your master's knock some shit off their table for you to eat,..."

    If irony were a dollar a pound, you could be sitting on a gold mine right there.
  • maximumGPU - Tuesday, January 27, 2015 - link

    Priceless!
  • alacard - Tuesday, January 27, 2015 - link

    Eh, just having some fun with that colorful language. Dragonsqrrl, like you said, I thought you were responding to me, and one thing I cannot stand is every discussion devolving into two cheerleading squads fighting over their favorite corporate logo. Corporations who would gladly hire those same cheerleaders for 18-hour shifts at slave wages so their executives and board members can buy another yacht.
  • tspacie - Monday, January 26, 2015 - link

    Sure. It's fun to see what reviewers think of a product after you've made it. That said it would have been easy to miss the recent 3.5GB controversy unless you were looking for it.
  • limitedaccess - Monday, January 26, 2015 - link

    What about other Maxwell GPUs? GTX 980M, GTX 970M, GTX 965M, and GTX 750. Are they all correctly listed, and therefore do not exhibit this issue?

    In particular, a comparison between the GTX 980M and GTX 970 would be interesting.

    And what about Kepler?
  • bwat47 - Monday, January 26, 2015 - link

    It's impossible for Kepler to have this issue because only Maxwell can partially disable a ROP/memory controller partition.
  • Ryan Smith - Monday, January 26, 2015 - link

    I have updated the article with more information (page 2).

    In short only GTX 970 has this configuration. 980M is a full memory controller layout, and 970M disables a whole ROP/MC partition, which means it's still balanced.
  • OrphanageExplosion - Monday, January 26, 2015 - link

    Interesting - 970M looks rather like a downclocked version of how I'd envisage a GTX 960 Ti, so hopefully there'll be no issues with that product.
  • bznotins - Monday, January 26, 2015 - link

    /shrug

    As a 970 owner, knowing this wouldn't have changed my buying decision whatsoever. All I really care about is output (quality/framerate) and price (thus value). All the technical details under the hood really don't change that.
  • mapesdhs - Monday, January 26, 2015 - link


    Succinctly and sensibly put.

    Ian.
  • just4U - Tuesday, January 27, 2015 - link

    Not quite... prospective buyers look at the reviews and likely go by those with an initial purchase. It's quite possible that the reviews would have been slightly altered, which could change a person's buying decision... either getting something better (smaller chance) or holding off (greater chance) and sticking with what they have a little longer.

    The reviewer may not go into as much investigative detail in certain areas because they make assumptions based on the spec sheet they're given. If they'd known what was fundamentally off from the get-go, they'd have gone into greater detail and their summaries might have been toned down a little.
  • cgalyon - Monday, January 26, 2015 - link

    My concern, which may be completely unfounded, is that developers assuming 4GB of memory will opt to take advantage of that (either by over-allocation or actual usage) and the card reports it has that much available. When items enter into the segmented memory space, or a texture simply spans between the two (overflow of some amount), how will the card handle it? Will it produce stuttering, asynchronous events, or just an overflow crash? If any of those should happen, nVidia may be able to address that scenario (as you described in the article), but it would seem that 970 owners would be more dependent on nVidia's support. If that support is not forthcoming, then 970 owners would either experience performance problems or have to run at lower settings to prevent the problem from occurring.

    It currently seems like a non-issue, for which I'm glad (as a 970 owner), but if it ultimately means this card frequently encounters problems that need to be fixed, then this card becomes a bit of a headache to own.
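    The community probes that first surfaced this behavior (Nai's benchmark being the best known) boil down to something like the following CUDA sketch - a simplified, hypothetical reconstruction rather than Nai's actual code, with the 128MiB chunk size and the read+write traffic accounting assumed for illustration. Allocate VRAM chunk by chunk, time a streaming kernel on each, and the chunks that land in the final 512MB segment report a fraction of the bandwidth of the rest.

        // Hypothetical Nai-style VRAM probe (a sketch, not the original tool).
        // Allocates VRAM in 128 MiB chunks, then times a streaming kernel per
        // chunk; on a GTX 970 the last chunks drop to a fraction of the speed.
        #include <cstdio>
        #include <vector>
        #include <cuda_runtime.h>

        __global__ void touch(float* p, size_t n) {
            size_t i = blockIdx.x * (size_t)blockDim.x + threadIdx.x;
            if (i < n) p[i] += 1.0f;  // one read + one write per element
        }

        int main() {
            const size_t bytes = 128ull << 20;             // 128 MiB per chunk
            const size_t n = bytes / sizeof(float);
            std::vector<float*> chunks;
            float* p = nullptr;
            while (cudaMalloc(&p, bytes) == cudaSuccess)   // grab VRAM until full
                chunks.push_back(p);
            cudaGetLastError();                  // clear the expected OOM error

            for (size_t c = 0; c < chunks.size(); ++c) {
                cudaEvent_t t0, t1;
                cudaEventCreate(&t0); cudaEventCreate(&t1);
                cudaEventRecord(t0);
                touch<<<(unsigned)((n + 255) / 256), 256>>>(chunks[c], n);
                cudaEventRecord(t1);
                cudaEventSynchronize(t1);
                float ms = 0.0f;
                cudaEventElapsedTime(&ms, t0, t1);
                // 2x for the read+write traffic per element
                printf("chunk %2zu: %6.1f GB/s\n", c, 2.0 * bytes / (ms * 1e6));
                cudaEventDestroy(t0); cudaEventDestroy(t1);
            }
            for (float* q : chunks) cudaFree(q);
            return 0;
        }

    On a card with a uniform memory crossbar every chunk reports roughly the same figure, which is what makes the 970's segmented tail so easy to spot with this kind of probe.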
  • DarkStryke - Monday, January 26, 2015 - link

    So you're telling me, the 970 isn't as good as the 980, who'd have thought?!
  • Horza - Monday, January 26, 2015 - link

    Thanks for explaining it so simply; who needs the article at all. /s

    The 970 isn't as good as they said it was.
  • Fernando H - Monday, January 26, 2015 - link

    I got my GTX 970 for its excellent speed and great price. Now, when I say price, I consider the performance and how long I will keep it in my computer. Since the fast memory has been reduced by around 13%, it is still a great performer, but I'm not too sure about its future. As games demand more and more memory, it could soon become outdated. At least sooner than it would if it actually had a full-speed 4GB.
  • anandreader106 - Monday, January 26, 2015 - link

    You should be compensated, probably by class action lawsuit, for that extra value you thought you were getting.

    Even if it's just $20, you can put that towards your next card whenever that time comes.
  • mapesdhs - Monday, January 26, 2015 - link


    This obsession with lawsuits never ceases to amaze me. Nothing I've read suggests any action would have even the slightest chance of success.

    Ian.
  • anandreader106 - Monday, January 26, 2015 - link

    Riiiight. Nvidia is scrambling to explain away their mistake just for good faith. Companies have never been successfully sued for false technical specifications. /s

    This guy just gave a solid example of how knowing that unknown could have changed his buying decision. Lawsuits have a role in keeping companies honest. If companies didn't fear being sued for false advertising, do you realize how fucked up the market would be?

    Stop apologizing for a corporation. They messed up, simple as that. It's not the end of the world for you or Nvidia. But they need to learn from this.
  • zodiacsoulmate - Monday, January 26, 2015 - link

    So the card will decide when to use the slow part of memory?????
    That is just annoying!!!
  • Dave4321 - Monday, January 26, 2015 - link

    It's actually better because it will always use it last.
  • makerofthegames - Monday, January 26, 2015 - link

    I knew everyone was overblowing this. Very interesting explanation though. Glad I waited until AT dived in before I made any judgement either way.

    And honestly, anyone running >3.5GB loads is going to be running 980s anyways.
  • Dave4321 - Monday, January 26, 2015 - link

    I think the bottom line is the GTX 970 has 3.5GB of usable VRAM, and saying it has 4GB of RAM is misleading to consumers. You wouldn't advertise a PC with a quad-core CPU if two of the cores had no possibility of adding performance.
  • RazrLeaf - Monday, January 26, 2015 - link

    But saying that there's 3.5 GB of VRAM wouldn't be entirely truthful either. Sure, it'd be the other side where you get more than advertised, but they're not wrong to say it has 4GB.

    Nvidia put themselves in to an interesting marketing corner with this memory setup.
  • MaikT - Monday, January 26, 2015 - link

    "NVIDIA has disabled 1 ROP/L2 unit, removing 8 “ROPs” (or rather 1 unit capable of 8 pixels/clock) and 256KB of L2 cache from the GTX 970."

    How many of those ROP/L2 units are disabled in a GTX 980M mobile variant?
  • Ryan Smith - Monday, January 26, 2015 - link

    0.
  • slickr - Monday, January 26, 2015 - link

    So basically Nvidia lied/omitted/didn't disclose the true information: this card is actually a big turd, a big cripple, that only has 3.5GB of memory and NOT 4GB as advertised. So false advertising right there.

    Then it doesn't have the 64 ROPs everyone was led to believe it had, so false advertising once again, another fraud on their part.

    Finally, the memory bandwidth figure is false again, since the maximum theoretical speed is lower than the advertised one. So once again false advertising, fraud by Nvidia.

    This begs for a class action lawsuit against Nvidia and for people to start voting with their money against Nvidia. Don't buy their products, since they are lying fraudsters.
  • MrSpadge - Monday, January 26, 2015 - link

    Of course, everyone should buy the better product from the other guys.. oh wait, what's the better alternative if one pays for electricity? You can get about the same performance for a similar price from the R9 290X, but you pay for it with ~100 W extra power consumption.
  • garbagedisposal - Monday, January 26, 2015 - link

    Oh please, a couple of dollars more per year for the vast majority of people. You must be one of the few who games 24/7. Please go rest your eyes.
  • DarkXale - Monday, January 26, 2015 - link

    Power consumption is equally much about heat output, which in turn dictates system noise. A 100W heat output difference makes cooling substantially easier.
  • MrSpadge - Tuesday, January 27, 2015 - link

    No, I'm one of the guys who doesn't have much time for gaming but is running GPU-Grid 24/7 on my GTX970. I know that's not typical usage, so I didn't say it directly.

    100 W more at the wall in 24/7 operation would cost me about 200€/year in Germany. For average gaming of 3 hours a day that's still 25€/year. Not huge, but if you keep the cards for a few years it makes the AMD significantly more expensive.
  • Oxford Guy - Tuesday, January 27, 2015 - link

    The bottom line is that some people would have bought the 980 instead had they known that it actually has a full 4 GB of VRAM operating at full speed. How much you or anyone else likes the 970 is irrelevant. People make purchasing decisions based on the known characteristics of the product, the specs.

    Some people who decided to get two 970s for SLI may have not done that if they had known about the RAM.
  • slickr - Tuesday, January 27, 2015 - link

    Change your light bulbs, hippie. Going from two 100 W or 125 W bulbs to two 75 W bulbs saves you something like 1-2 euros per year. Big deal.

    Do you own an electric car? Do you own solar panels? Do you unplug all your electric devices when you leave the house? Do you air-dry your clothes? Do you only use cheap, abundant electricity at night after 12 o'clock midnight? Do you leave the oven door open, with all the smells coming out, in the winter to save on heating electricity?

    If the answer is no to even one of those, stop with your absurd bullshit about how 50 W or 100 W matters in a graphics card. It doesn't, unless you are an Nvidia shill or a mindless green hippie.
  • MrSpadge - Wednesday, January 28, 2015 - link

    Dude.. WTF?! You do realize this is about money and not being a hippie, don't you?

    2 x 125 W vs. 2 x 75 W light bulbs results in savings of 100 W. Regular light bulbs run for ~1000 h, halogens for ~2000 h. Thereby the regulars would save 100 kWh over their lifetime and the halogens 200 kWh. That's about 25€ for the regulars and 50€ for the halogens! You may not care about such sums; others do. BTW: I'm using 3 to 15 W LED / fluorescent lighting.

    - electric car: too expensive to be worth it
    - solar panels: almost cheap enough (but not without subsidies), but you don't attach these to a rented flat
    - unplug all devices: only the ones which are not in frequent use (we're typically talking about <1 W here, 2€/year per device)
    - air-dry clothes: of course
    - cheap electricity after midnight: no such contract here, but my PC does number crunching 24/7 anyway
    - oven: heating with gas instead of electricity - much cheaper

    As I said before: 100 W more at the wall in 24/7 operation would cost me about 200€/year in Germany. For average gaming of 3 hours a day that's still 25€/year. Not huge, but if you keep the cards for a few years it makes the AMD significantly more expensive.
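    Spelling out that recurring arithmetic (assuming roughly 0.25€/kWh for German household power): 100 W x 24 h x 365 days ≈ 876 kWh per year, or about 220€; at 3 hours of gaming per day it is ≈ 110 kWh, or about 27€ per year - in line with the figures above.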
  • TallestJon96 - Monday, January 26, 2015 - link

    I am going to buy a 970 next week, and while this doesn't change my purchasing choice, I'm glad it came out.
    I game at 1080p, so an effective VRAM of 3.5GB is enough, and is probably enough for 1440p, at least for a while. The ONLY real-world problems I see with this are SLI configurations for 4K, and the lifespan of the card. The memory is still more than this card needs, so it doesn't change much. Bad press for NVIDIA though.
  • slickr - Monday, January 26, 2015 - link

    It won't be. Shadow of Mordor, Dragon Age: Inquisition and AC: Unity among current games, plus new and upcoming games like The Witcher 3, Star Citizen, etc... will all be utilizing 4GB of RAM and more.
  • Goranm - Monday, January 26, 2015 - link

    Dragon Age: Inquisition uses ~2GB of VRAM at 1080p with all settings on Ultra...
  • dragonsqrrl - Monday, January 26, 2015 - link

    It's like you're shooting in the dark but nothing's hitting.
  • Dave4321 - Monday, January 26, 2015 - link

    Looking at the review linked below, it seems there was already evidence of this happening in reviews; no one put two and two together. The Black Flag ultra benchmarks show a very large frame time variance at 4K, almost three times as much as the 290X. The 980's frame time variance is also higher than the 290X's, but not nearly as much as the 970's. The 970 has an even bigger spike in frame time variance in Watch Dogs on ultra.

    Scroll through the slides for each game until you get to the 4K frame variance here:
    http://www.tomshardware.com/reviews/nvidia-geforce...

    BF4 on ultra 4K also has a big frame time spike, but strangely so does the 980. The 3GB 780 Ti has the same problem; the 290 and the 290X do not.

    If you look at the Thief 4K benchmarks on the normal preset rather than ultra, there is very little frame time variance, probably because at that setting the game uses less than 3.5GB of VRAM.

    http://www.tomshardware.com/reviews/nvidia-geforce...
  • taekcool - Monday, January 26, 2015 - link

    Good find. There is definitely a tendency to create spikes in frame time. Now I am wondering why no one has published new data focusing on frame time variance to prove that the GTX 970 does not, in fact, have any issues arising from the 3.5GB VRAM cap.

    Average FPS numbers probably won't be affected by this, but frame time variance is also very important... so... we need data? Why are tech gurus focusing so much on the story and not the data this time? Are they in denial?
  • FlushedBubblyJock - Friday, January 30, 2015 - link

    I believe the author of this article pointed out that he has been trying like heck to produce a scenario where the 3.5/0.5 split-speed RAM issue comes to light, but so far his efforts have failed, though he will keep trying to cause a problem....

    So the issue is, no one can produce an issue...
  • yannigr2 - Monday, January 26, 2015 - link

    Nice one.
  • Mr Perfect - Monday, January 26, 2015 - link

    "making available to us a “triage team” of sorts of technical marketing, product management, and engineering/architectural personnel to answer questions and to better explain the issue."


    With all due respect to the marketing and product people, this is not a time when people are going to want to hear from them. This is when the engineers and architecture staff are who need to be explaining the hows and whys. If the marketing and product people had been upfront with the specs in the first place, no one would have cared and this would be a non-issue.
  • ruthan - Monday, January 26, 2015 - link

    Most alarming is that in this age a company could simply lie about real hardware parameters, and media and customers were unable to check it.
    There could be lots of similar design "features"...
  • Shadowmaster625 - Monday, January 26, 2015 - link

    So after all this time, no one realized that the GTX970 was missing 8 ROPs? This kind of shakes my faith in the enthusiast and review community.
  • MrSpadge - Monday, January 26, 2015 - link

    The fill rate tests looked strange, but people couldn't explain it.
  • yannigr2 - Monday, January 26, 2015 - link

    So, Nvidia left a lie up for months everywhere on the net about the specifications of the card, and only fixed it because they felt pressured to do so. Nice.
  • nevcairiel - Monday, January 26, 2015 - link

    It's only a lie if it's intentional. You are free to believe that; however, as pointed out in the article, there is no scenario where this "lie" would get them anything positive, so it's rather unbelievable that it would be intentional. So let's call it a mistake, an error, shall we?
  • zmeul - Monday, January 26, 2015 - link

    Quick question: what cache size does the Maxwell-revised version of NVIDIA's DeviceQuery CUDA application show?
  • Ryan Smith - Monday, January 26, 2015 - link

    For GTX 970 all versions of DeviceQuery show the same result whether they know about Maxwell or not: 1.75MB
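    That figure is exposed directly by the CUDA runtime, so anyone can check their own card; a minimal sketch (querying device 0 is an assumption - adjust the index on multi-GPU systems):

        // Queries the L2 cache size the driver reports -- the same figure
        // DeviceQuery prints (1.75 MB on a GTX 970 vs. 2 MB on a GTX 980).
        #include <cstdio>
        #include <cuda_runtime.h>

        int main() {
            int l2Bytes = 0;
            cudaDeviceGetAttribute(&l2Bytes, cudaDevAttrL2CacheSize, 0);
            printf("L2 cache: %.2f MB\n", l2Bytes / (1024.0 * 1024.0));
            return 0;
        }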
  • frodbonzi - Monday, January 26, 2015 - link

    No scenario where the "lie" would get them anything positive??? How do you figure?? They get to claim the card has the same memory as all the "big" cards, and sell it for a fraction of the price... they have EVERYTHING to gain by lying!!!

    Alas, they have lots to lose by having their lie FOUND OUT.... hence the delay in admitting they lied...

    And since when is a lie only a lie if it's intentional? I think you might want to read a dictionary....
  • frodbonzi - Monday, January 26, 2015 - link

    Lie: an inaccurate or false statement; a falsehood.
  • yannigr2 - Monday, January 26, 2015 - link

    You mean that all those thousands of employees at Nvidia, and all those top execs who were talking to the press about the new cards, don't use the internet at all? Because that's the only explanation I can give to the question "Why did no one from Nvidia send an email to the press about the correct number of ROPs and cache all these months?". Well, because they live under a rock and they don't use the internet at all.

    Come on. Even kindergarten kids are less naive.
  • mapesdhs - Monday, January 26, 2015 - link


    Clearly, plenty of people here have chosen to believe that lies have been said, no matter what actually happened. They want it to be the case, so for them, it is, irrespective of evidence or logic. 1st Rule gets broken again.

    Ian.
  • frodbonzi - Tuesday, January 27, 2015 - link

    "Clearly, plenty of people here have chosen to believe that lies have been said,
    no matter what actually happened. They want it to be the case, so for them, it
    is, irrespective of evidence or logic. 1st Rule gets broken again."

    OK.... give me the evidence and logic that says lies HAVEN'T been said. I think all the evidence says that they DID lie!! They stated "facts" that were later proven to be untrue.... then they admitted that they did this... How is this NOT lying?!?!

    Definition of the word "lie": stating something that is untrue....
  • moosehead85 - Monday, January 26, 2015 - link

    I just bought the MSI Golden 970 4GB.
    That was false information; these cards have 3.5GB.

    What will Nvidia do for the people who have already bought all those 970s??

    I didn't want to buy a 3.5GB card...

    They'd better give us something. I can't believe this...
  • mapesdhs - Monday, January 26, 2015 - link


    If you've just bought it then presumably you have a right of return, not that this whole issue makes the slightest difference to the performance results which I assume initially helped inform your purchasing decision.

    Ian.

    Ian.
  • Michael Bay - Tuesday, January 27, 2015 - link

    Come on, that's clearly an AMD shill trying to get shekels.
    They are truly desperate these days.
  • nicolapeluchetti - Monday, January 26, 2015 - link

    "As part of our discussion with NVIDIA, they laid out the fact that the original published specifications for the GTX 970 were wrong"
    Since they published them and advertised them, that would be "false advertising" at best, wouldn't it?
  • nevcairiel - Monday, January 26, 2015 - link

    Well, they weren't advertised publicly, at least not directly. Their website and ads don't contain information about ROPs or details like this, and despite what some people claim, the card does in fact have 4GB of memory like the specs claim, even if in an unusual configuration.
  • mapesdhs - Monday, January 26, 2015 - link


    Plus, nico, you're getting into legalese at this point. I'm not a lawyer, but I doubt the notion of false advertising is immediately linked to deliberate intent. It's obviously possible to advertise incorrect data without meaning to do so. And as this article makes clear, whether or not you think the card does have 4GB is open to interpretation, but strictly speaking, it does. If you think it does not (because of how applications are able to use the card), then please post an immediate critique of all dual-GPU cards, which bear far more egregious spec claims; you can slag off the AMD models, others can toast the NVIDIA units. :}

    Ian.
  • frodbonzi - Tuesday, January 27, 2015 - link

    They gave the FALSE SPECS to review sites - such as this one... and those specs were not corrected until last week....

    Whether or not they "knew" they had lied... they STILL LIED!!

    And the only point of releasing specs to a review site is to have them advertised to the purchasing public.... whether or not they are legally culpable, it is pretty clear that they are morally in the wrong...
  • FlushedBubblyJock - Friday, January 30, 2015 - link

    Since you know so much about morality, may I have your assessment of AMD's giant PR lie?
    http://www.anandtech.com/show/5176/amd-revises-bul...

    AMD lied and said 2B transistors for months, then corrected the number downward 40%, with no explanation of the fraud.
  • Oxford Guy - Tuesday, January 27, 2015 - link

    That "unusual configuration" is a break with many many years of tradition. Consumers were not expecting an unannounced "innovation" like this.
  • Taneli - Monday, January 26, 2015 - link

    I smell a class action lawsuit coming up
  • bigboxes - Monday, January 26, 2015 - link

    I was seriously looking into purchasing a 970 for my new build. I think I'm going to wait for a while to see how this whole fiasco turns out.
  • mapesdhs - Monday, January 26, 2015 - link


    Why? In what way does any of it change the performance results?

    Ian.
  • anandreader106 - Monday, January 26, 2015 - link

    Maybe because he games at a resolution higher than 1080p and wants a card with more longevity at that price point?

    Are you blinded by your brand allegiance? Why are you defending them so fervently?
  • mapesdhs - Tuesday, January 27, 2015 - link


    I don't have a brand allegiance, I own over 50 cards, from both sides. :D

    I'm merely pointing out the total nonsense so many people are posting.

    And for gaming higher than 1080p, I'd just get a 980 or whatever anyway. Indeed, numerous older SLI/CF options are more than potent (e.g. 7970 CF).

    Ian.
  • caioc2 - Monday, January 26, 2015 - link

    I could believe they didn't know about it at first, but surely they noticed it long before they admitted it.
    Saying it's still the same card as before means nothing; the fact is they sold a lie, and there is no excuse for that.
    If there is no punishment, mistake or not, then in some months/years it will happen more frequently and respect for us customers will disappear.
  • hübie - Monday, January 26, 2015 - link

    I don't get the 56 ROPs? :| 3 * 4 ROPs = 12. 64 - 12 = 52. So how do you get to 56? I think they are organized in quads.
  • JarredWalton - Monday, January 26, 2015 - link

    There are 4 memory units, each with two 32-bit memory interfaces, and each of those interfaces has an L2/ROP block of 256KB/8 ROPs. Previously you could only enable/disable at the higher level of one of the four memory units. Now you can go in and disable just one of the L2/ROP blocks.
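    Worked out: the full GM204 has 8 x 32-bit interfaces, so 8 x 8 ROPs = 64 ROPs and 8 x 256KB = 2MB of L2. Disabling one block on the GTX 970 leaves 64 - 8 = 56 ROPs and 2MB - 256KB = 1.75MB of L2, matching the DeviceQuery figure mentioned earlier in the thread.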
  • ilyIzzy - Monday, January 26, 2015 - link

    What a shitty-ass thing to do. Picking up a 290X or a 390 at a future date.
  • Michael Bay - Tuesday, January 27, 2015 - link

    Of course you are.
  • inolvidable - Monday, January 26, 2015 - link

    Nvidia is doing damage control. They remove every new thread in their official forum, and (allegedly) they comment on every major forum trying to downplay the fact that they lied. They advertised and sold us something knowing it was not true. The thread Nvidia has removed:

    https://forums.geforce.com/default/topic/803518/ge...

    More than 1700 complaining posts can't be good for business, right? Meanwhile, here we are, stuck with our "unpredictable" card and 350€ poorer in the best case scenario.
  • inolvidable - Monday, January 26, 2015 - link

    The original thread is available again. It seems there was a mysterious bug that deleted every new thread about this issue...
  • D. Lister - Monday, January 26, 2015 - link

    It is more helpful for all parties concerned if all the posts regarding a single issue remain contained in a single thread. Instead of 50+ threads about the same topic. It is what all websites do to keep their forums clean.

    Although I'm sorry if several of your threads, even those under different user IDs, got deleted. That would be a lot of effort wasted, if it actually happened.
  • bigboxes - Monday, January 26, 2015 - link

    Uh yeah... you're not a shill.... :eyeroll:
  • D. Lister - Monday, January 26, 2015 - link

    Oh splendid, someone recently got introduced to sarcasm. Fun to use, isn't it. Okay let's play, my turn, you sounded very clever. Go on, give it a try - with practice you might eventually get rather intimidating. Or, alternatively, we could just behave maturely and avoid petty name-calling. Your call...
  • Oxford Guy - Tuesday, January 27, 2015 - link

    The marketing people didn't know about the bug, just the engineers.
  • michal1980 - Monday, January 26, 2015 - link

    Looks like the new ownership is already showing its influence over this site. "NVIDIA gains nothing by publishing an initially incorrect ROP count for the GTX 970"

    Really, they gain NOTHING by putting out fake numbers? How about more sales? Too much Nvidia ad money flowing to call a spade a spade, a cheat a cheat. AnandTech's credibility takes another hit.
  • zmeul - Monday, January 26, 2015 - link

    along with PCPer, which has a damningly apologist article
    I call this #HardwareGate
  • mapesdhs - Monday, January 26, 2015 - link


    So presumably you agree that calling a 295x2 an 8GB card is even more fake? Ok, good.

    Ian.
  • EasterEEL - Tuesday, January 27, 2015 - link

    I can't imagine there are many GTX 970 owners who are OK with this memory state of affairs, along with the coil whine. Anybody willing to drop £300 on a graphics card has to be a power user expecting every ounce of performance. Nvidia needs to undertake a PR exercise, maybe offering a £40 refund or a lifetime rebate off the next card purchased.
  • Kjella - Tuesday, January 27, 2015 - link

    Single-card owners are almost certainly going to become rendering-bound before they're memory-bound; you can find reviews of the GTX 970 in SLI, and it does pretty well at 3840x2160 with still only 3.5GB/4GB of effective memory. Maybe the gap to the 980 will be bigger in future games, but all that's changed is on paper. The performance you saw in the reviews remains unchanged.
  • FH123 - Monday, January 26, 2015 - link

    I bought a 970 last year. An (admittedly small) part of the purchasing decision was based on it offering the same amount of memory and, hence, the same amount of future-proofing as the 980 in that regard.
  • D. Lister - Monday, January 26, 2015 - link

    Ah, so we're still milking the "controversy", eh? On the bright side, at least it gives the AMD stockholders something to cheer about, especially after the recent Q4 fiscal reports. Heaven knows it's been a while since we heard a peep from that direction. :p
  • bigboxes - Monday, January 26, 2015 - link

    Are you listening to yourself? You sound like a fanboy. I've owned AMD and Intel, ATI and Nvidia. Can't believe you brought AMD into this, like they had something to do with it.

    "Yeah, it's AMD's secret plot to save their dying company." - D. Lister
  • D. Lister - Monday, January 26, 2015 - link

    lol, you're reading far too much between the lines. I am not into conspiracy theories, and never stated or implied that AMD somehow impacted Nvidia's design choices, and hence had anything to do, directly or indirectly, with the press nightmare that Nvidia is in.

    All I said was that it is to AMD's advantage, hence all the ensuing excitement from the otherwise lately rather lulled fan camp. A flimsy advantage, if one were to further speculate, but understandably clutched at, in times of fiscal desperation.

    Just calm down and remember what master Yoda said: "lead to high blood pressure, rage does." :p
  • mapesdhs - Monday, January 26, 2015 - link


    Hear hear! :)
  • dgingeri - Monday, January 26, 2015 - link

    It's like you buy a car advertised at 275hp, only to find out later that it actually has 260hp. Sure, the car still performs the same, but it was still a marketing lie to claim the higher horsepower.

    It's still false advertising, no matter if the card is still a great deal. Nvidia needs to make an apology for this. I don't care if it is a misunderstanding by the technical marketing team. They should be given accurate information from the engineers. This is the whole company that falsely advertised.
  • mapesdhs - Monday, January 26, 2015 - link


    Don't you know the old net adage that as soon as one reverts to a car analogy, one has lost the argument? ;)

    Ian.
  • alacard - Monday, January 26, 2015 - link

    Here's a template Nvidia could use for that very scenario: http://usatoday30.usatoday.com/money/autos/2003-09...
  • Daniel Egger - Monday, January 26, 2015 - link

    Still sounds like a decent card to me. If the prices continue to drop due to this, all the better.

    One thing I'm wondering, though: instead of leaving all the decision-making to the drivers and games, I'd rather have the option to restrict the amount of usable VRAM to 3.5GB to make sure the extra dud memory never gets used. How about that, NVIDIA?
  • xrror - Monday, January 26, 2015 - link

    THIS. This so much.

    Since the 512MB segment completely blocks the rest of memory (seriously, wtf), it's effectively useless.

    I keep getting stuttering in-game with my 970, and I'm only running a single monitor at 1920x1200. The 970 should be complete overkill for this, but playing Borderlands TPS, if you jump off the roof in Triton Flats, the framerate tanks.

    An older game like this at a lower resolution shouldn't touch a 970, yet... there it is. The Radeon 280 I had in before had no problems with this. So at the very least I'd like the option from nVidia to just totally disable the 512MB segment, so I could rule it out as the problem.
  • D. Lister - Monday, January 26, 2015 - link

    Borderlands TPS is using more than 3.5GB of VRAM at only 1200p? Borderlands TPS? The game based on the aging UE3 engine? Maybe there's something else wrong with your system, or it has a very underpowered CPU, because if a problem were to occur in your alleged 970 because of fragmented VRAM, it would only occur at extreme VRAM usage (~4GB), which, at a mere 1200p, is rather impossible for any Borderlands game even at ultra settings and 16xAA.
  • xrror - Tuesday, January 27, 2015 - link

    my "alleged" 970? Please .... go home. Borderlands TPS - sorry, TPS = The Pre Sequel. But sorry it's such an obscure game to you.

    when I say "While an older game like this at a lower resolution shouldn't touch a 970" maybe you should take a clue then Mr. Condescending One, and get that indeed this older engine game should not even remotely tax this card.

    My irritation is yes, a 2GB Radeon 280 didn't stutter on this game (cause yea, Borderlands doesn't use this much memory, no fuc**** s*** genius) but hey, MAYBE the card doesn't fill memory linearly or something.

    PLUS HEY, 1440p players should hit this wall way harder, but strangely I can't seem to find many reports, sadly.

    My underpowered CPU is a socket 1366 Xeon W3670 running at 4.3GHz with 24GB of RAM. If you don't believe me, then check here http://boincstats.com/en/stats/14/host/detail/1260...

    Next time, don't be a dick.

    Now for the constructive part, for the nobody else who will ever read this reply buried inside this thread: yes, I'm acutely aware that socket 1366 is an old platform, but it's still PCIe gen 2, I think? Having the option to "turn off" the gimped 512MB memory segment would greatly help me rule out the rest of my computer.
  • D. Lister - Tuesday, January 27, 2015 - link

    "Borderlands TPS - sorry, TPS = The Pre Sequel."

    Don't worry, I got that, I repeated the name to express surprise, as should've been obvious by my next sentence. Surprise, because it is made with the "Unreal Engine 3" (UE3), which is nowhere near as demanding as the modern engines like Frostbite 3 or CE3, or the newer UE4. But I digress...

    "My irritation is yes, a 2GB Radeon 280 didn't stutter on this game (cause yea, Borderlands doesn't use this much memory"

    Exactly, and the problem that this particular article addresses is that the VRAM is divided into two segments, where the larger segment (3.5GB) is allocated first, and the potentially problematic 0.5GB only kicks in when the first one completely fills up.

    Now, since we both agree that Borderlands TPS cannot use that much RAM, deductive logic says the problem might be elsewhere. Which is what I said earlier.

    "My underpowered CPU is a socket 1366 Xeon W3670 running at 4.3GHz with 24GB of RAM. If you don't believe me, then check here"

    That's a good CPU indeed and I believe you, I just mentioned that as one of the several possibilities. What about other games? Are they all behaving similarly?
  • xrror - Tuesday, January 27, 2015 - link

    Okay, my hostility is dialed down. Thanks for a measured response.

    I want to clarify: my ranting isn't to pile onto the "hey, nVidia messed up the 970" bandwagon, it's "why am I having these weird performance issues on this card, when it should be total overkill for what I'm doing?"

    I don't want a class action settlement or whatever, I just want nVidia to find a workaround or fix for this. It helps everyone. OR maybe there is no "fix" because MY machine is just farked. But I can't seem to nail that down, which is driving me nuts. It also doesn't help that people who do report issues either got... an unhelpful response, or just nothing. And ugh, now forums have lit up with 970 issues, but it's so hard to sort the static from the people piling on now.

    And yea, as I type that I now get your hostility... too many people piling on the BS bandwagon. D'oh.

    I'm trying to figure out a way to rule out it being a quirk with my cpu/platform. I'm not trying to jump on the 970 as the problem - it was just totally unexpected. "Hey, the 970 is equal to the radeon 290x with way less heat, and heck let's try nVidia this round, their drivers are the gold standard" and then have stutter/lag to 15fps in this old (engine) game. Like huh??? Feels like my time with the Radeon 6950 again - good card, but by the time the drivers didn't suck the card was out of date. *sigh*

    I need a good utility to show me VRAM usage. Afterburner just pegs at the "3505" for "Memory Usage" as soon as I start the game, which surely can't be right? As we've both said, there shouldn't be any way for this game to use that much at 1920x1200.

    Is it confirmed that the 970 explicitly only uses segment 0 before even thinking about segment 1? I know that's what nVidia implies, but they don't outright say it. I know, I'm a cynic but...

    Again, I'm completely open to the idea that it isn't the video card, and it's some quirk with this being an older arch (X58 / socket 1366), but again, this wasn't a problem with the Radeon 280. And don't read that as me being an ATi (sorry, AMD) shill... let's just say in my past, going from a Mach64 to an Xpert@Play wasn't so impressive when the Voodoo 3 was on the market.

    My other test cases for games not working now work (!!! yay?!), which I'm guessing is down to driver updates?

    One of the "non-mainstream" ones being doom64ex... which now works great (it was horrible at 970 launch) so maybe there is hope for me.

    https://doom64ex.wordpress.com/

    So yea, I need a good util to measure video memory usage - if all these things I'm trying are way under 3.5GB (I really suspect they are) then it's something else. And/or I'm doing something weird to drive up the usage (no idea how? or what?), but again, I can't report what I can't diagnose. I'll be the first to admit if I screwed something up... but if not, it might be a good data point for others.
  • D. Lister - Tuesday, January 27, 2015 - link

    If the other games are working as expected, then the problem is most probably not in the hardware. It could just be a corrupted installation.

    Another possibility is older AMD drivers not having fully uninstalled. Download the "Display Driver Uninstaller" from Guru3D (link: http://www.guru3d.com/files-details/display-driver... ) and use it to remove all display drivers, past and present, reboot, and install the latest nvidia driver again.

    As for a "reliable" VRAM monitor, under the circumstances, for a 970, there is probably none. I personally use the sensor output of GPU-z for such tasks, but for a 970, I won't even trust that at least until its future update.

    PS: My apology for the "allegedly", it was quite unnecessary.
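    For what it's worth, GPU-Z and Afterburner both read the driver's own counters, which can also be queried directly through NVML (the monitoring library that ships with the driver). A minimal sketch - treating device index 0 as the 970 is an assumption, and it reports only the driver's used/total figures, not which segment the allocations sit in:

        // Prints used/total VRAM exactly as the driver reports it, with no
        // overlay or third-party tool in between.
        #include <cstdio>
        #include <nvml.h>

        int main() {
            if (nvmlInit() != NVML_SUCCESS) return 1;
            nvmlDevice_t dev;
            nvmlDeviceGetHandleByIndex(0, &dev);   // first GPU in the system
            nvmlMemory_t mem;
            nvmlDeviceGetMemoryInfo(dev, &mem);    // values are in bytes
            printf("VRAM used: %llu / %llu MiB\n",
                   (unsigned long long)(mem.used >> 20),
                   (unsigned long long)(mem.total >> 20));
            nvmlShutdown();
            return 0;
        }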
  • xrror - Tuesday, January 27, 2015 - link

    Yea, I've already used DDU (which is an awesome program, for those who haven't tried it). I think I'm going to just pull the card out and try it in another machine temporarily that has different hardware as a test.

    Being an "nVidia Way Its Meant To Be Played" title, something like this would have shown up already in testing, especially at higher resolutions.

    Also going to play around with PhysX (is there any way to turn it off? It's already at Low); I wonder how that might be affected by the 970's reduced cache, or memory, or... how does that even work?

    But considering there don't seem to be others reporting having issues like this, looking more like it's just my machine. More testing to follow.
  • D. Lister - Wednesday, January 28, 2015 - link

    "Also going to play around with PhysX (is there any way to turn it off? already at Low) "

    You can change that from the Nvidia Control Panel. By chosing "CPU" as the physx processor, which would shift physics load from all games to your Xeon, just like if you were running an AMD GPU. Although it is best left at the default "Auto" setting, to let the profiler decide individually for different games.
  • obsidian24776 - Monday, January 26, 2015 - link

    If this was a "miscommunication" between marketing and engineering, how does that explain the fact that GPU-Z reports 64 ROPs for this card? Did marketing write the BIOS for the cards?
  • MrSpadge - Monday, January 26, 2015 - link

    Because GPU-Z takes these values from a database, which contains mostly the same information the reviewers have. It can't read out much more than shader counts and memory sizes.
  • MrSpadge - Monday, January 26, 2015 - link

    Ryan, you're especially careful to state that the GTX 970 cannot read from all 8 memory controllers at once. What about writing? Is that pipe big enough? Are there any other cases where the "8th" memory controller could be put to good use, like hiding latency on the 7 main controllers?

    And I don't care much about the 3.5 vs. 4GB debate. But seeing the practically usable memory bandwidth reduced by 1/8 hurts [me].
  • JarredWalton - Monday, January 26, 2015 - link

    In general, GTX 970 isn't going to be dramatically impacted by the loss of bandwidth at settings that matter. Usually the shader/CUDA core performance is going to be a bigger obstacle (except in a few select games).
  • Ryan Smith - Monday, January 26, 2015 - link

    You are correct on all counts. The GTX 970 can't read from all 8 channels at once, but with a mix of reads and writes it's still possible to keep everything busy at once. e.g. sending a write to the 8th MC while the first 7 are doing a stride.

    Read speed is being highlighted because of the particular limitation of the single read return path, and because in consumer graphics operations most of your resources are static and therefore are read-heavy.
  • MrSpadge - Tuesday, January 27, 2015 - link

    Thanks, Ryan!

    Exploiting this possible "read+write interleave" (or whatever one wants to call it) would be very difficult, software-wise. And pretty low-level, for what is currently just a single GPU model.

    A better idea would be to "fix" this on the hardware side: make one 32-bit port of the bidirectional bus between the crossbar and L2 configurable for read or write. This way the link could transfer 2 x 32 bits per clock in the same direction, if needed, using the same number of lines as before. This would require some additional hardware, though, which would probably remain unused in regular ROP/L2/memory controller clusters.

    MrS
  • LoccOtHaN - Monday, January 26, 2015 - link

    No comment. :D Waiting for the R380X WC, and that's all.
  • toyotabedzrock - Monday, January 26, 2015 - link

    You mean to tell me engineers don't enjoy reading marketing fluff and only skimmed it before approving it?
    I'm shocked
  • ZeDestructor - Saturday, January 31, 2015 - link

    Funnily enough, no. Give me a short technical spec sheet or technical document instead... oh wait, this is a project I closed about 2 years ago; I don't care anymore, next.

    Plus, with the salaries the average hardware engineer is paid, they just beeline straight for the flagships instead - who needs cut-down hardware when you can trivially afford the top-of-the-line GPUs? That, or they don't play games anymore, so they don't care.
  • terror_adagio - Monday, January 26, 2015 - link



    If anyone believes this was a communication problem between different departments at NVidia, they might as well believe in Santa Claus. NVidia was clearly hoping no one would notice; it slipped by some of the best hardware review sites and was found out by the genuinely smart community instead. Shame on NVidia and shame on the hardware sites for failing to notice any of this.
  • tuxRoller - Monday, January 26, 2015 - link

    I'm sure this is interesting to some, but are we going to see that Nexus 9 review? That might end up being the only product with the Denver core.
  • Ryan Smith - Monday, January 26, 2015 - link

    Very soon.
  • tuxRoller - Thursday, January 29, 2015 - link

    Thanks, Ryan. Great to hear.
  • bigboxes - Monday, January 26, 2015 - link

    On a tech site? About computers and computer parts? Egads! I can't believe that computer enthusiasts would be talking about marketing deception. Sorry to take your attention away from a tablet. Here's to AnandTech pooh-poohing the whole fiasco in favor of a tablet review. :eyeroll:
  • tuxRoller - Thursday, January 29, 2015 - link

    I don't really care about the tablet. I care about the Denver core. No one has done an in-depth analysis of it that I've been able to find. My hope is that AT will do something similar to what they did with Cyclone/Swift.
    ...I thought this was clear, since I called out Denver in particular...
  • Will Robinson - Monday, January 26, 2015 - link

    Somewhere the former Iraqi Information Minister is smiling at these revelations.
    "There is nothing wrong with the Coil Whine Missing Memory Edition"
    "Do not worry"!
  • Qbgobbler - Monday, January 26, 2015 - link

    Interesting that there aren't any 960s with 4GB. As someone who almost didn't purchase the 970 because I wanted to hold out for an 8GB card, I'm pretty upset right now. Titanfall lags like crazy, and I know Titanfall used to kill the 2GB cards. Far Cry 4 runs horribly.

    I think nvidia just needs to enable the missing features with a firmware update to make things right
  • D. Lister - Monday, January 26, 2015 - link

    Missing features? What missing features? Be honest, you didn't even read the article before commenting, did you?
  • Elixer - Tuesday, January 27, 2015 - link

    Sorry, this doesn't pass the smell test.
    Nvidia is a public company, and if they were passing incorrect specs/documentation to review sites in order to hide the shortcomings of their card, then this was blatantly and knowingly done.

    You very well know that Nvidia has multiple people reading, and fact checking EVERYTHING in the "review" in order to correct the information ASAP. This has been seen numerous times in the past when Nvidia points out some correction, that they feel should be changed.

    In this case, you did not hear one word from them, and this only came to light because some users of the card found the issue.

    This means that these "reviews" these days are nothing more than PR for the company in question, all so they can continue to receive their free review samples.

    What Nvidia needs to do is fully own up to all the 970 owners, and offer them either refunds, for those who feel they were not given all the facts, or free games, for those who feel they had been wronged but aren't that upset about it.

    Review sites need to stop believing what the companies' PR departments put out; there is no excuse for not checking the facts yourself and calling them out on it.
    This isn't a one-time thing for this site (and others) either; there have been other companies that have played with the facts, and sometimes there is an update.

    Heck, if Nvidia wanted to, they could just advertise the card as 3.5GB and leave it at that, I doubt they would have lost sales because of the "missing" .5GB, even though it is there, but severely crippled.
  • Oxford Guy - Tuesday, January 27, 2015 - link

    "NVIDIA gains nothing by publishing an initially incorrect..."

    Um... 970 SLI sales? VRAM limitations are a major factor in purchasing decisions when it comes to SLI.
  • Oxford Guy - Tuesday, January 27, 2015 - link

    Also, the word initially is a bit questionable since this product has been out for months.
  • beginner99 - Tuesday, January 27, 2015 - link

    lol. That's what happens when you are a fanboy. If you are not a fanboy the 290 is a much better deal performance/dollar and you would have bought that.

    Now what I don't like about this article: "And in this case the GTX 970 would still perform better than a true 3.5GB card since the slow segment is still much faster than system memory". Wrong, because the true 3.5GB card would get the full bandwidth of the fast 3.5GB all the time, whereas this turd gets 0 bandwidth from the 3.5GB while the other 0.5GB segment is accessed. IMHO that is the real problem.
  • D. Lister - Tuesday, January 27, 2015 - link

    :rofl: stop, stop, oh my god you're killing me. That is an incredibly accurate (and hilarious) impersonation of a technically inept person. Especially the last sentence - anyone who read that, and thought you were being serious, would completely be fooled into thinking you actually had no idea what you were talking about. Well done indeed sir, for its sheer comical brilliance, your post shall be remembered and valued long after we're all dead.
  • chopdok - Tuesday, January 27, 2015 - link

    Bandwidth is a measurement over time. Gigabytes per second. When talking about the issue of the GTX 970's memory banks: they cannot both be accessed at the same time. At the same clock. But to say that the bandwidth of the 3.5GB segment will be 0 when the 0.5GB is pulled is wrong. In 1 second, there are millions of clock cycles. Accessing the 0.5GB segment will reduce the effective bandwidth, measured over time, of the 3.5GB segment. In the worst-case scenario, by a factor of 1/8, if you assume an even spread of data, where the 3.5GB portion and the 0.5GB portion need to be accessed at rates relative to their size. That doesn't happen in practice, of course, because as was mentioned, the 3.5GB segment has priority, and the GPU will try to fill it with bandwidth-sensitive data before it touches the 0.5GB segment.

    When memory usage is under 3.5 GB, the card performs as a "true 3.5 GB card". When it's over 3.5 GB, it will perform better than a "true 3.5 GB card" but worse than a "true 4 GB card".

    By no means am I defending nVidia. They dun fucked up. And to me, it's the reduced ROP count and the increased performance hit from AA (a larger hit than I expected, especially in future titles) that is the killer. But other people might have other issues. They screwed us over and lied to us, but that doesn't mean we should spread misinformation and accuse them of the wrong things. There are plenty of actual things they screwed up, so inventing "0 bandwidth" stuff is not necessary.
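    One way to put rough numbers on the even-spread case, as a back-of-the-envelope sketch: the 196 GB/s and 28 GB/s per-segment figures follow from the card's 224 GB/s total split 7:1 across channels, and the fully serialized access model is my own simplifying assumption, not a claim about the real memory controller.

```python
# Worst-case blended bandwidth if accesses to the two segments fully
# serialize (never overlap). Fast segment: 7 channels x 28 GB/s = 196 GB/s;
# slow segment: the single remaining channel at 28 GB/s. The serialization
# model is a simplifying assumption for illustration.

def effective_bandwidth(slow_byte_fraction, fast_bw=196.0, slow_bw=28.0):
    """Average GB/s when a given fraction of bytes comes from the slow
    segment and the fast segment idles during those transfers."""
    time_per_byte = (1 - slow_byte_fraction) / fast_bw + slow_byte_fraction / slow_bw
    return 1 / time_per_byte

print(effective_bandwidth(0.0))   # all traffic in the fast 3.5GB: 196 GB/s
print(effective_bandwidth(1/8))   # even spread over all 4GB: 112 GB/s
```

    Under these assumptions, the even-spread case lands at about half the fast segment's rate, which is why the driver steering bandwidth-sensitive data into the 3.5GB segment first matters so much.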
  • vred - Tuesday, January 27, 2015 - link

    Why would 3.5 GB and 0.5 GB segments have mutually exclusive access? From the ROP/MC diagram, I would expect that 2 MCs have mutually exclusive reading, that's all. That is, 3 GB are full speed, and the two remaining 0.5 GB areas share bandwidth with each other, hence only one of them is used when possible. Also keep in mind these are not contiguous areas - they are interleaved.
  • tabascosauz - Tuesday, January 27, 2015 - link

    Looks like my original comment was more or less correct. Nvidia attempted to pull off the "2GB 192-bit GDDR5" trick again, this time on a much subtler level and with more drastic consequences.

    It's easy to see how this might have been easy to overlook from an engineering perspective, but it's up to Nvidia to avoid these messy situations in the first place. How do you explain to a non-savvy enthusiast that the GTX 660 Ti's 192-bit bus is essentially a 128-bit + 64-bit bus? How do you explain to a crowd that has always been accustomed to the disabling of entire SM/SMXes that the GTX 970's GM204 has units that are both enabled and disabled at the same time?

    Again, if there's one thing Nvidia can learn from this fiasco, it would be to stop skimping on VRAM configurations in the future. They really have no reason not to; the power efficiency comes from the architecture of the GPU core, not the GDDR5, and being a little more robust and complete on the memory side would bring a bit more performance in their fight against AMD. Maxwell is ROP-heavy, and with a wider VRAM bus could easily perform better against the likes of Hawaii, which is surprisingly (and shamefully for Nvidia, who still crows about their "texture compression tech" offering revolutionary perf at higher resolutions) competitive with GM204 at higher resolutions.
  • Klimax - Tuesday, January 27, 2015 - link

    Dear author, could you please use "damage control" next time only when it is actually appropriate? Otherwise it looks like quite a bias there, and a very incorrect usage. (I guess that "AMD center" may have some negative effects...)
  • Klimax - Tuesday, January 27, 2015 - link

    And reminds me of logical fallacy called "Poisoning the well"...
  • Will Robinson - Tuesday, January 27, 2015 - link

    That sounds rather butthurtey.
    Yes,the GTX970 has a problem with memory allocation and a fair bit of coil whine.
    You'll just have to deal with it.
  • HalloweenJack - Tuesday, January 27, 2015 - link

    Is reading and writing over the same crossbar at the same time per segment even possible? Reading from the 3.5GB and writing to the 0.5GB just to combat the latency seems a horrific workaround.
  • Achaios - Tuesday, January 27, 2015 - link

    I am not buying NVIDIA's excuses. They sound quite disingenuous to me. I hope that a Consumer Protection Agency in the USA sues NVIDIA over this so they may learn, the hard way, to respect consumers more in the future. I hope they pay a huge fine. EU is a ridiculous puppet, and there are no consumer protection agencies that are not sold out to the big company trusts, so I am not going to bother calling EU's attention to this. Once again, global consumers depend on action taken by the USA to teach unscrupulous and dishonest vendors a lesson.
  • SloppySlim - Tuesday, January 27, 2015 - link

    thanks for the tour de force schooling ;)
    my guess was out in left field .
    if there are no stall multipliers then the worst case is when the .5g bank gets hit exclusively , that would account for the 'up to' 70% frame hit .
    if I allocate and lock the first bank and then load new data into the slow partition , if/while all my drawing or compute is in the slow partition , We is screwed .

    essentially , we don't want to step on that half gig if we can't hide the latency .

    I have to disagree about it not being unexpected behavior, as no one outside of NVidia had a clue about the differing partition speeds.

    depending on end user , there are situations where this might , or may increasingly become more than an annoyance .
    I'm not one presently , and I appreciate NVidias disclosure .
  • Galatian - Tuesday, January 27, 2015 - link

    I think their explanation is a little fishy. I mean, they obviously had the driver ready, so quite a few employees knew about the architectural quirks of Maxwell. Also, how convenient to just come out after the Christmas sales.

    What I would like to look at are FCAT results. From what I hear, people complain about frame time issues, not about low FPS, in games demanding more than 3.5 GB of RAM.
  • Rock1m1 - Tuesday, January 27, 2015 - link

    I don't understand the picture used for the article, though. Did a reference GTX 970 ever exist?
  • irsmurf - Tuesday, January 27, 2015 - link

    Yes, the reference GTX 970 exists. The only store selling them is Best Buy: http://www.bestbuy.com/site/nvidia-geforce-gtx-970...
  • yannigr2 - Tuesday, January 27, 2015 - link

    I hate to see Anandtech becoming Tom's Hardware. Nice cover up guys.
  • Michael Bay - Tuesday, January 27, 2015 - link

    You forgot to tell us you'll never visit again.
  • yannigr2 - Wednesday, January 28, 2015 - link

    I will. I saw Tom's article today about the 970. It's much worse than this one.
  • irsmurf - Tuesday, January 27, 2015 - link

    This is incredibly disappointing to me. While my GTX 970's perform better than the GTX 680's I just upgraded from, I upgraded EXCLUSIVELY for more RAM. Star Citizen at 2560x1600 was suffocated by my 2GB 680's, and it's still struggling with the 3.5 GB on the 970's. I need more memory. That's all I've wanted since I went to 2560x1600, five years ago. I wouldn't have upgraded to the 970's if I'd known they didn't even have 4GB of fully functional RAM.
  • Dr.Neale - Tuesday, January 27, 2015 - link

    To me, the issue is very simple. NVidia marketed the 970 as having the SAME memory subsystem as the 980. Not "almost the same with a really great workaround in the drivers" but the SAME.

    That was a big selling point. And it allowed NVidia to sell all those 970 chips (with a minor memory defect and a great workaround) as having the "same" memory performance as a 980 chip (with no defect), even though it wasn't strictly true. The mere existence of the great workaround proves that. But the workaround was so great that the performance hit was virtually nonexistent in practice.

    So they told a "white lie" and made it through the holiday shopping season before they got caught and "fessed up".

    Yes, 970 buyers got a bargain price. But, anybody who buys a "factory second" with "cosmetic" defects "negligibly" affecting product performance ALWAYS get a bargain price. That's why scratch and dent sales are always so popular.

    But most people are peeved if they bought something supposedly of "first" quality, only to discover later that it was actually a "second" with a really great touch-up job. They feel cheated, no matter HOW great the touch-up job looks.

    NVidia failed to disclose that the memory subsystems of 970 chips were "seconds", and sold them as being the SAME as the "firsts" in 980 chips.

    People love a great bargain. But they need to know exactly what they are giving up to get that great bargain BEFORE they buy, not AFTER, in order to make a FULLY INFORMED purchase decision.

    It's a little like giving informed consent before surgery. It's a matter of ethics, not outcome.

    Some people only care about the outcome (performance vs cost). Other people want to know exactly what they are getting before they sign on the dotted line.

    Most people will tolerate a minor flaw for a major savings, but "minor" and "major" are both judgement calls.

    NVidia acted unethically in not making full disclosure before marketing chips with a known flaw and a great workaround as being the "same" as chips without that particular flaw.

    No matter how big the price reduction, or how little the performance reduction, they hid the flaw until they were forced to reveal it and confess.

    That's what bothers me. How much can I trust the published specs on ANY tech purchase, if NVidia gets away with this Scott Free?

    It's not the cheating, it's the betrayal of trust. How many NVidia Lovers will become Exes over this? How many will forgive them, believe their promises that they will never do it again, and give them another chance? How many will divorce NVidia and walk?

    Only time will tell.
  • chuwdu - Tuesday, January 27, 2015 - link

    Maybe this is not a big deal, but still, this is very simple: 3.5GB is not 4GB.
  • Nilth - Tuesday, January 27, 2015 - link

    Yeah, my thought too. And I'd like to remind everyone that this isn't a cheap card at all. Right now I have the Asus Strix on Amazon at 361€, and knowing that in the future I won't be able to use the full 4GB of VRAM and have proper memory bandwidth and performance is really, really bad.
  • DiReis - Tuesday, January 27, 2015 - link

    I did some testing on my GeForce GTX 680 and found the same fast/slow memory pattern on it.
    Mine has 2GB of VRAM, and when it goes over into the last ~250MB it slows down from ~200GB/s to ~20GB/s.

    I used the same "benchmark" app I saw some sites using (a simple text-only one) and it seems to mirror the GTX 970 behavior.

    I see they mentioned this was used on the 660, but nothing on the 680.
  • jann5s - Tuesday, January 27, 2015 - link

    Is there an indication that the card can disable ROPs on the fly, to save power? That would be great.
  • YoloPascual - Tuesday, January 27, 2015 - link

    The sales talk said I would get a 5.0L V8 in the Mustang I was buying, but for some reason the Mustang you gave me has a 3.0L V6 :(
    It gets the job done, but...
  • Ballist1x - Tuesday, January 27, 2015 - link

    My analogy is as follows:

    It's like buying a V8 engine car, except it can only ever run as a V7 or a V1. Never as a V8.

    As a V1 you get a lumpy ride and as a V7 you never truly get the full performance of a V8.

    Can it be sold as a V8?
  • Michael Bay - Tuesday, January 27, 2015 - link

    AMD does it all the time when selling multicore CPUs, yet there is no outrage.
    Maybe that's because nobody is expecting anything out of them anyway.
  • AnnonymousCoward - Thursday, January 29, 2015 - link

    Yolo, your car comparison isn't valid. 5.0L and V8 are significant for bragging rights, while someone who buys a non-top-of-the-line 970 won't even know what the hell a ROP is. Second, performance is the thing to focus on.

    So here's a better analogy: you thought you bought a 450 horsepower 5.0 V8 with 64 air inlets, but it turns out you have a 450hp 5.0 V8 with 56 air inlets.

    Moral of the story: why give a shit?
  • iamKG - Tuesday, January 27, 2015 - link

    I really wonder...
    Does the GTX 980 also have the same memory structure (3.5GB + 0.5GB)?
  • mapesdhs - Tuesday, January 27, 2015 - link

    No.

    Ian.
  • Rollo Thomasi - Tuesday, January 27, 2015 - link

    Couldn't the 970 potentially be worse than a card with the same GPU and only 3.5GB?

    While the GPU is trying to access the slow last 0.5GB, the first 3.5GB is inaccessible, right?

    If the card had only 3.5GB and the game needed 4GB, it would have to use a painfully slow 0.5GB of main memory through the PCI Express bus, but at least it could still access the first 3.5GB while waiting.

    Am I right?
  • zlandar - Tuesday, January 27, 2015 - link

    I would be upset if I found out my video card has 3.5 GB of video RAM when it's advertised as 4 GB.
  • xenol - Tuesday, January 27, 2015 - link

    Even if NVIDIA "overestimated" the specs and "lied to customers", this just makes the card actually appear better considering how well it performs.
  • koss - Tuesday, January 27, 2015 - link

    I am really impressed by the comment section. Those doing reviews are aware they'd soon be out of business and join the dark PR side. 'Performance does not suffer,' says a guy at AnandTech who talks the talk of a GPU mastermind, yet walks the walk of his colleagues on the nVidia PR team: the very same people who supposedly didn't know the specs of a product they worked on (and it's not a joystick or DVD player, something their company is trying for the very first time and has never heard of or seen before). You come across as either a real gem of an employee, or you must think we're all idiots.

    Btw, why didn't your professional reviewers find the problem, instead of the people who aren't supposed to question specs and should just TRUST you because they don't understand better? You know, the same people who don't talk with engineers and can't figure out how a review of a product the company worked on for two years describes behavior the company claims it had never seen. The only logical explanation being: it was done by an unbiased expert, just like you, and was therefore easily predictable.
  • dejo1967 - Tuesday, January 27, 2015 - link

    I have a GTX 970 and think it's a fantastic card. The problem comes from the fact that Nvidia would rail their mother to make a few extra dollars. As customers, we don't earn any respect from Nvidia! I am one of the ones who purchased a 6800GT that was broken at the chip level and wouldn't play high-def video as they stated it would. That card would use 100% CPU load to do anything. They came out with the 6600GT and it would use roughly 30% CPU to play HD video. Nvidia then at least did come out of the closet and state that the chip was broken. But they didn't offer to take care of those of us who did pay full price for a card that didn't do what was stated.
    In the end they came up with a software workaround that did bring CPU usage down a bit. But they then wanted another $40 for that fix. Nvidia is the worst company I have ever dealt with when it comes to taking care of customers.
    This whole fiasco is about Nvidia wanting to make something look better than it is and to take as much advantage of the customer base as they can. There are zero morals to be found in the whole Nvidia corporation. Take that as the facts.
  • Wesleyrpg - Tuesday, January 27, 2015 - link

    This review has been sponsored by nvidia

    ;)
  • Exchequer - Tuesday, January 27, 2015 - link

    Obviously a lot of time and effort went into writing this article.

    However the one thing I do not get is why there are no frametimes performance figures. Nvidia has commented that the performance degradation is only 1-3%. However this is measured in 'old school' average fps.

    It is possible (maybe even likely, if I understand the story correctly) that a game running at 3.6 GB of VRAM might show 100 100 30 100 100 100 100 30 100 100 100 100. In terms of average FPS you will see nothing worrying here, but in terms of percentile performance you will see annoying lag spikes dropping from 100 to 30 fps.

    So instead of knowing the reduction in average fps at 3.5GB vs >3.5GB loads, we NEED to know the increase in "worst" percentile frametimes (or framerates). Only then can we be sure that no annoying micro stutter is introduced at 3.5+ GB loads.
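    To illustrate with the hypothetical pattern above (made-up numbers, not measurements):

```python
# Average FPS vs. worst-percentile frame times for the hypothetical
# frame pattern above (each value is one frame's instantaneous FPS).

fps = [100, 100, 30, 100, 100, 100, 100, 30, 100, 100, 100, 100]
frametimes_ms = sorted(1000.0 / f for f in fps)

naive_avg = sum(fps) / len(fps)                  # "old school" average of FPS values
true_avg = len(fps) / sum(1.0 / f for f in fps)  # frames divided by total time
worst = frametimes_ms[-1]                        # worst frame in this window

print(round(naive_avg, 1))  # 88.3 - looks perfectly healthy
print(round(true_avg, 1))   # 72.0 - still looks fine
print(round(worst, 1))      # 33.3 ms frames - visible hitching at 60Hz
```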
  • Ryan Smith - Tuesday, January 27, 2015 - link

    "However the one thing I do not get is why there are no frametimes performance figures."

    We only had 12 hours (overnight no less) to prepare the article, which meant there wasn't time to do anything more than this.
  • Bytales - Tuesday, January 27, 2015 - link

    Now it would be interesting to see how the GeForce 750 is partitioned compared to the full chip, the 750 Ti!
    Does it have the same issue that just wasn't discovered because these are cheaper cards, and anyone shopping in that range will probably get the 750 Ti anyway?
  • snouter - Tuesday, January 27, 2015 - link

    I had two 2GB GTX760 in SLI. I got tired of fussing with the SLI and sold those cards and got a 4GB* GTX970.

    I play games, I don't really sit around benchmarking and blah blah. I knew I was taxing my 2GB cards though and SLI does not pool memory, so it was not like I was in a 2GB+2GB situation.

    The 970GTX works fine. Except... when I do grow into it, my ceiling won't be 4GB, it will be 3.5GB. When I go to sell it used, it will "be that crippled card."

    It's not the end of the world. My girl loves me and I have meat in the fridge, but... there is no way around it. This is not the video card I thought I bought.
  • Quad5Ny - Tuesday, January 27, 2015 - link

    @Ryan Smith
    Do you think it would be possible to have a driver option to use 3.75GB and forgo the split partitioning? Or would that not be possible because of the 1K stripe size?
  • Ryan Smith - Tuesday, January 27, 2015 - link

    3.5GB you mean? In theory I don't see why that shouldn't be possible. But keep in mind if you were to do that, you'd start spilling into system memory instead of the second segment.
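    As a rough sense of scale (ballpark peak figures only, ignoring latency and driver overhead):

```python
# Ballpark comparison of where overflow data could live. The slow
# segment is still local GDDR5 (~28 GB/s for one channel), while a hard
# 3.5GB cap would push overflow across PCIe 3.0 x16 (~15.75 GB/s
# theoretical per direction). Peak figures only; latency ignored.

slow_segment = 28.0  # GB/s, one GDDR5 channel
pcie3_x16 = 15.75    # GB/s, theoretical per-direction peak

print(round(slow_segment / pcie3_x16, 2))  # ~1.78x advantage for the segment
```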
  • Quad5Ny - Thursday, January 29, 2015 - link

    Yup 3.5GB. For some reason I was thinking each chip was 256MB while writing that.
  • 3ricss - Tuesday, January 27, 2015 - link

    At $329 the GTX970 is a compromise I'm willing to take. And did.
  • gudomlig - Tuesday, January 27, 2015 - link

    I own an MSI Gaming 970. It runs everything at 1080p smooth as butter and runs most of my games in 1080p 3D (Vizio passive HDTV) with no trouble. The 3D solution by NVIDIA is a bit weak; I had to do some driver workarounds to get around the 30fps lock at 1080p. My 7950 with TriDef seems better in some games, but given there are like none of us trying to use 3D, I guess I can't complain that much. So what if they screwed up the specs by a tad? It's not like this isn't still a serious kick-a$$ card. The benchmarks speak for themselves. Find me a practical application where this matters and maybe then I'd care, but probably not.
  • HisDivineOrder - Tuesday, January 27, 2015 - link

    I think the theory laid out here for why nVidia would be a fool to lie assumes the lie was out the gate intended to be a lie OR that they could have just been the victim of a terrible mixup. I think the answer is somewhere in between.

    I think the far more likely scenario is they did not set out to lie to the press, but when the mixup happened and they discovered it (almost right away), they realized that they could wait a few months and let the thing play out through the holiday season. They would make a ton of sales, they could focus the press entirely on the performance given rather than the specs and when the truth was discovered they could shrug it off as unimportant because really performance was all that mattered. Not specs.

    The fact that they knew for months would mean little because ultimately the performance and benchmarks would still be (mostly) applicable and people who bought in got exactly what they were promised even if they didn't know to ask the precise question that would have illustrated greater weaknesses than they expected in the long run.

    So the deception carries on for months and then when pressed about it, delaying talking about it for a month (Dec-Jan, big sales month), they admit it after all the sales and virtually all the return periods are up. Then they shrug and say, "But the performance is the same anyway, so hey."

    That's the way they went. Imagine if they had not. Imagine instead if they had announced it as soon as they realized it after the initial reviews went out. Suddenly, the big story is not the amazing performance of the card, the value of the card compared to AMD's pricing at the time, or the percentage of performance you get compared to the nVidia high end. The story is how the press were mislead and had to change the specs. The story becomes what it is now, except without all the sales in front of it.

    Suddenly, the 970 has a stink of failure on it and people avoid it even though the performance is just as good as it seems. "nVidia tried to pull a fast one," people would say (like they are now). Except BEFORE all those sales happened. Now, the card won't sell and all because of a mixup in the marketing department. Now nVidia's got the stink of fail on them from being brave and admitting what they'd done by mistake, leading to story after story of how nVidia mistakenly mislabeled the card's technical specs.

    Tanking sales through the holiday season by a decent margin and costing nVidia tons of money.

    That's the lie, people. The lie is not the mixup as though they don't happen. They absolutely happen. The lie is nVidia not knowing almost immediately they'd mixed things up. You know they did. And unlike the writer of this article, I see a clear and easy motive for why they'd continue the lie. They wanted to stall and shrug and gesture and act like they were figuring out what happened right up until the cards they'd sold between November and December were all universally securely at home in buyer's possession.

    Once the holiday return periods were up and once the cards were mostly bought as much as they were going to be in the mad rush, that's when they fess up.

    It's the old adage: It's easier to be forgiven than ask permission.

    There's your motive for deceit. I'm not saying it's right. I'm just saying that's the motive and that's why they did it and that's the timeline for how they did it. The sad part is the article here is not wrong that if nVidia had made no mistake in the first place, the story would have been squarely on how great a value the 970 was.

    But after the mistake, nVidia had the choice of fessing up and losing a ton of sales to bad press surrounding a non-issue or stall for a few months until purchases were settled and unreturnable (mostly), then fess up instead and grin and say, "Whoops."
  • SunnyNW - Tuesday, January 27, 2015 - link

    Except the people in the forums are the ones who brought this up, not nvidia on their own... It's just that so many cards had been sold that a larger percentage of people started noticing issues. But of course they knew, I agree; not initially, but pretty soon after (within days for sure). Everything just played out (time-wise) as well as it could for nvidia, considering the circumstances.
    The issue here is the performance of the card, contrary to what most keep saying: specifically, the performance of the memory. The card simply does not act the same way a "traditional" 4GB card would. Yes, the extra .5GB is better than system memory, but that does not change that fact.
  • Expressionistix - Tuesday, January 27, 2015 - link

    Most of the people buying these things just use them to play video games on the computer - does anyone really care?
  • R. Hunt - Wednesday, January 28, 2015 - link

    Gamers pay good money for these things, so I don't see why not.
  • nos024 - Tuesday, January 27, 2015 - link

    wow...as if knowing this info changes all the benchmarks. i am more disappointed with the 128bit memory bus on 960gtx.
  • nos024 - Tuesday, January 27, 2015 - link

    Oh, and I bought a brand spanking new GTX 970 today, even after reading this article. MSI version.
  • Dr.Neale - Wednesday, January 28, 2015 - link

    Under the circumstances, I strongly believe that NVidia should be forced to accept the return of any 970 the customer no longer wants to own, on the grounds that it does NOT MEET THE PUBLISHED SPECIFICATIONS and is therefore DEFECTIVE in that it was NOT AS DESCRIBED.

    For example, AMAZON has exactly this policy, giving the customer (at least) 90 days to return any such product sold through Amazon Marketplace, for a full refund of all costs.

    Now that NVidia has admitted that the original published specs are NOT MET by EVERY SINGLE 970 card, they would have no way to deny any customer claim.

    I believe that Consumer Protection Laws would also dictate that a full refund must be issued within a reasonable time after the defect is "found".

    So, to those who are unhappy with their 970 purchase, use this as a means to get a full refund, and buy something else instead.

    To those who aren't willing to give up their wonderful 970, simply accept the fact that this memory defect is main reason the 970 is so much cheaper than the defect-free 980, and move on.

    I further believe it would be in NVidia's long-term interests to facilitate the return of any unwanted cards, and to offer some freebie to compensate those willing to keep their 970 cards, despite the defect.

    Anything less is unacceptable.
  • GGlover - Wednesday, January 28, 2015 - link

    Early adopter here. I paid for two 970s over one 980 because I was led to believe that the specs were extremely close and that the two 970s were slightly cheaper than a single 980. I believed that they would perform better than a single 980 (extra RAM etc.). I would probably have gotten a 980 had I known that there was in fact a much larger difference in specs, real-world performance or not. The numbers weren't really in at that time. So I was misled by a bait and switch.
  • Oxford Guy - Thursday, January 29, 2015 - link

    SLI is definitely the biggest problem Nvidia is facing.

    This article's author said he couldn't think of a reason why Nvidia would benefit from misleading consumers, but SLI purchasing decisions are heavily influenced by the VRAM amount on a card. Having the 980 be ostensibly the same in terms of VRAM was a very significant factor as well as the claimed amount for the 970 by itself.
  • jbluzb - Wednesday, January 28, 2015 - link

    I do not like their unlawful business practice of false advertising. They waited until the Christmas season was over before acknowledging that there was indeed a problem in the reported specs.

    That is what really turned me off from the company. This will be last NVIDIA card that I will ever buy because I do not want to support a company who does such things to its customer.

    Also, it made me wary of review websites. It is a big eye opener for me: they are just different websites handled by a marketing team. They cannot talk negatively about a company because it is a major sponsor. There is no such thing as truth in journalism. :(
  • Will Robinson - Wednesday, January 28, 2015 - link

    You're going to love this then...
    http://gamenab.net/2015/01/26/truth-about-the-g-sy...
  • Oxford Guy - Thursday, January 29, 2015 - link

    Fascinating link, for sure.
  • mudz78 - Wednesday, January 28, 2015 - link

    "we have also been working on cooking up potential corner cases for the GTX 970 and have so far come up empty"

    Riiight.

    "As part of our discussion with NVIDIA, they laid out the fact that the original published specifications for the GTX 970 were wrong, and as a result the “unusual” behavior that users had been seeing from the GTX 970 was in fact expected behavior for a card configured as the GTX 970 was."

    Nvidia has already admitted they had complaints about performance.

    If you want to come up with scenarios where the 970 shits its pants you should really try harder:

    http://www.overclock.net/t/1535502/gtx-970s-can-on...

    http://forums.guru3d.com/showthread.php?t=396064

    https://www.reddit.com/r/hardware/comments/2s333r/...

    http://www.reddit.com/r/pcgaming/comments/2s2968/g...

    All of those threads had been around for weeks before Nvidia's announcement.

    Who cares what Nvidia's take on the situation is? It was an accident? Oh, no worries, mate!

    They are a business that lied, there's consequences to that. Nobody cares that they didn't mean it.

    Refunds will start rolling out in coming weeks.
  • Yojimbo - Wednesday, January 28, 2015 - link

    Hey, can you link to the actual relevant part of those threads where someone is posting his methodology and results for creating a performance problem? The overclocker link seems to be a link to a 106-page thread whose first message is just a link to the other 3 threads you posted. The first message in the guru3d thread claims that the card can't use more than 3.5GB at all, which we now know to be completely false. It's like you're throwing us a cookbook and flour and saying "Here, there's a pie in here somewhere." If it's somewhere in there, and you have seen it before, could you please find and point to the methodology and claimed results so that people can try to repeat it rather than you just saying "you really should try harder"?
  • mudz78 - Wednesday, January 28, 2015 - link

    I think a more fitting analogy would be, somebody is complaining they can't spell and I am handing them a dictionary. I'm telling you the information is in there, so have a read and find it.

    Maybe if you bothered to read beyond the first post in each thread you would have some answers?

    " The first message in the guru3d thread claims that the card can't use more than 3.5GB at all,"

    No it doesn't.

    "I think (maybe) is here a little problem with GTX 970. If I run some games, for example Far Cry 4, GTX 970 allocate only around 3500MB video memory, but in same game and same scene GTX 980 allocate full 4000MB video memory.
    But if I change resolution to higher - 3840x2160, then all memory is allocated.
    Same problem exist in many other games like Crysis 3, Watch Dogs etc..

    Where is problem?? I really dont know..."
    http://forums.guru3d.com/showthread.php?t=396064

    "I didn't believe this at first, but I just decided to try and test it myself with texture modded Skyrim and my SLI 970s. I tried to push the 3.5 GBs barrier by downsampling it from 5120x2880 with the four following experimental conditions:

    1. No MSAA applied on top
    2. 2xMSAA applied on top
    3. 4xMSAA applied on top
    4. 8xMSAA applied on top

    Since MSAA is known to be VRAM heavy, it made sense. I also kept a close eye on GPU usage and FPS with the Rivatuner overlay as well as VRAM usage. All of this was done running around Whiterun to minimize GPU usage. My results were as follows.

    1. Skyrim peaked at about 3600 MBs in usage with occasional brief hitching while loading new textures in and out of VRAM. GPU usage remained well below 99% on each card.

    2. Skyrim once again peaked at about 3600 MBs with the mentioned hitching, this time somewhat more frequently. Once again, GPU usage remained well below 99%.

    3. Skyrim yet again peaked at about 3600 MBs and hitched much more prominently and frequently at the same time as VRAM usage droppped down 100-200 MBs. GPU usage was below 99% again with FPS still at 60 aside from those hitches.

    4. Now Skyrim was using the full 4 GB framebuffer with massive stuttering and hitching from a lack of VRAM. This time, I had to stare at the ground to keep GPU usage below 99% and retain 60 FPS. I ran around Whiterun just staring at the ground and it remained at 60 FPS except with those massive hitches where GPU usage and framerate temporarily plummeted. This last run merely indicated that Skyrim can indeed use more VRAM than it was with the previous 3 settings and so the issue seems to be with the 970s themselves rather than just the game in this example. The performance degradation aside from VRAM was severe, but that could just be 8xMSAA at 5K taking its calculative toll.

    So it seems to me that my 970s refuse to utilize above ~3600 MBs of VRAM unless they absolutely need it, but I've no idea why. Nvidia didn't gimp the memory bus in any overly obvious way from the full GM204 chip therefore the 970s should have no issue using the same VRAM amount as the 980s. I don't like what I see, it's like the situation with the GTX 660 that had 2 GBs but could only effectively use up 1.5 without reducing its bandwidth to a third, so it tried to avoid exceeding 1.5. The difference is that was predictable due to the GK106's 192-bit memory bus, there's nothing about the 970's explicit specifications that indicates the same situation should apply.

    A similar shortcoming was noticed sometime back regarding the 970's ROPs and how the cutting-down of 3 of GM204's 16 SMM units affected the effective pixel fillrate of the 970s despite retaining the full 64 ROPs. It's possible that Maxwell is more tightly-connected to shader clusters and severing them affects a lot about how the chip behaves, but that doesn't really make sense. If this is an issue, it's almost certainly software-related. I'm not happy regardless of the reason and I'll try more games later. Anecdotally, I have noticed recent demanding games peaking at about 3500-3600 MBs and can't actually recall anything going beyond that. I didn't pay attention to it or change any conditions to test it."
    http://www.overclock.net/t/1535502/gtx-970s-can-on...

    "I can reproduce this issue in Hitman: Absolution.
    Once more than 3.5GB get allocated, there is a huge frametime spike.
    The same scene can be tested to get reproducible results.
    In 4k, memory usage stays below 3.5GB and there is no extreme spike. But in 5k (4x DSR with 1440p), at the same scene, there is a huge fps drop once the game wants to allocate 2-300MB at once and burst the 3.5GB.
    It happens in the tutorial mission when encountering the tennis field.

    With older driver (344.11 instead of 347.09), memory usage is lower, but you can enable MSAA to get high VRAM usage and thus be able to reproduce by 100%.

    Could a GTX 980 owner test this?"
    http://www.overclock.net/t/1535502/gtx-970s-can-on...

    "Without AA or just FXAA, I have around 3.5GB used in AC: U and mostly no stuttering. With 2xMSAA it rises to ~3.6-3.7GB and performance is still ok. But when I enable 4xMSAA and it needs ~3.8GB, I often have severe stuttering.
    When I set resolution to 720p and enable 8xMSAA, VRAM usage is well below 3GB and there is no stuttering at all."
    http://forums.guru3d.com/showpost.php?p=4991141&am...

    "In Far Cry 4 @ 1440p
    No AA: 3320MB Max Vram, locked at 60 fps
    2x MSAA: 3405MB Max Vram, locked at 60fps
    4x MSAA: 3500MB Max Vram, 45-60fps
    8x MSAA, starts around 3700-3800MB @ 4-5fps, stabilizes at 3500MB @ 30-40fps."
    http://forums.guru3d.com/showpost.php?p=4991210&am...

    There's plenty more evidence supporting the acknowledged (by Nvidia) fact that the GTX970 has performance issues with VRAM allocation above 3.5GB.

    And all those people posting "my games run fine at 1080p", you are clearly missing the point.
  • aoshiryaev - Wednesday, January 28, 2015 - link

    Why not just disable the slow 512mb of memory?
  • SkyBill40 - Wednesday, January 28, 2015 - link

    Why not just have the full 4GB at the rated speed as advertised?
  • Oxford Guy - Thursday, January 29, 2015 - link

    Ding ding ding.
  • MrWhtie - Wednesday, January 28, 2015 - link

    I can run 4 games at 100+ fps on 1080p simultaneously (MSI GTX 970). Power like this used to always cost $500+. I have no complaints; I didn't have $500 to spend on a GTX 980.

    I feel Nvidia is doing us a favor by significantly undercutting AMD.
  • mudz78 - Wednesday, January 28, 2015 - link

    Yeah, a huge favour. By lying about their product specs, undercutting the competition and cementing market share, they set themselves up to hike prices in the future.
  • MrWhtie - Thursday, January 29, 2015 - link

    "they set themselves up to hike prices in the future" hyperbole at its finest. Try again once you calm down and start speaking with some logic and reason.
  • Casecutter - Wednesday, January 28, 2015 - link

    ^ It's simple: they can't... They fused off one damaged/defective L2 because they didn't have the volume of good GM204 to go to market with enough 970 parts. I would believe that the GTX 980M is a 12 SMM part, although it has all the L2 of the GTX 980. They really screwed the pooch by touting "the GTX 970 ships with THE SAME MEMORY SUBSYSTEM AS OUR FLAGSHIP GEFORCE GTX 980".

    Nvidia came out and said "This team (PR) was unaware that with 'Maxwell,' you could segment components previously thought indivisible, or that you could 'partially disable' components."

    Had Nvidia called it 3.5GB with Active Boost... or 4GB Memory Compression... or something like that and explained it, I think a lot of folks would've taken interest and weighed that when making a choice... But we weren't privy to that information.
  • Elixer - Wednesday, January 28, 2015 - link

    Looks like Nvidia is trying to make good.

    They will help you get a refund/credit for your 970 if you feel you want one.

    http://forums.anandtech.com/showpost.php?p=3712051...

    "I totally get why so many people are upset. We messed up some of the stats on the reviewer kit and we didn't properly explain the memory architecture. I realize a lot of you guys rely on product reviews to make purchase decisions and we let you down.

    It sucks because we're really proud of this thing. The GTX970 is an amazing card and I genuinely believe it's the best card for the money that you can buy. We're working on a driver update that will tune what's allocated where in memory to further improve performance.

    Having said that, I understand that this whole experience might have turned you off to the card. If you don't want the card anymore you should return it and get a refund or exchange. If you have any problems getting that done, let me know and I'll do my best to help."
  • MrWhtie - Wednesday, January 28, 2015 - link

    Yeah right, they can't have my 970 back :P What am I going to buy instead, an AMD? (LOL!)
  • CX71 - Wednesday, January 28, 2015 - link

    > "What am I going to buy instead, an AMD? (LOL!)"
    That would be a good move ... TROLOLOLOLOL
  • mudz78 - Wednesday, January 28, 2015 - link

    That's great news. Good to see a company owning its mistakes and doing the right thing by consumers.

    The GTX 970 is still a good card, but purchasers have a right to make an informed decision before handing over their hard earned dollars.
  • Man_Of_Steele - Thursday, January 29, 2015 - link

    If they really are accepting returns and refunds regardless of when you purchased your card, that would make this seem like it really was an honest mistake - despite the mistake seeming highly unlikely.
    As for me, 3.5GB of VRAM isn't enough to justify $350-$400 (EVGA SC or SSC). I will wait and see what AMD does with the 390X, and see if NVIDIA launches an 8GB (read: 7GB) 970.
  • Man_Of_Steele - Wednesday, January 28, 2015 - link

    I understand that the cache isn't a huge issue, but as far as a future-proof card goes, only 3.5GB of VRAM..?
    You guys can form your own opinions, but that is a lot of people within the company for this error to just so happen to slip past. There is also a notable stutter in Shadow of Mordor, for those who haven't seen that yet.

    What I am curious to see is whether there is a way to re-enable the ROPs and increase the clock on that last 500MB of VRAM. This may be impossible and I'm just ignorant on the issue, but if someone knows, please let me know if it's possible!
  • MrWhtie - Thursday, January 29, 2015 - link

    I don't think it's possible; that last 500MB is slow because one of the ROP/L2 units is disabled (compared with 8 fully active on the 980). This forces one of the remaining ROP/L2 units to communicate with TWO of the 500MB DRAM sections (instead of 8 ROP/L2 units each having their own 500MB of DRAM). Since it cannot access both 500MB sections at once, only 500MB can run in "fast mode"; when it tries to access both, the speed is cut in half, since normal operation is ONE ROP/L2 per 500MB of DRAM. (The diagram on the second page illustrates this.)

    It's a hardware thing; it's not a software issue whatsoever.
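    A toy sketch of that time-sharing, using an assumed ~28 GB/s per-section peak (illustrative figures, not measurements):

```python
# Toy model of one ROP/L2 unit time-sharing two 500MB DRAM sections:
# with one section active it runs at the full channel rate, but touching
# both in the same window cuts each section's effective rate in half.
# The 28 GB/s per-section peak is an assumed, illustrative figure.

PER_SECTION_PEAK = 28.0  # GB/s, assumed

def effective_rate(sections_active):
    """GB/s each active section sees through the shared ROP/L2 unit."""
    return PER_SECTION_PEAK / sections_active

print(effective_rate(1))  # 28.0 - one section: full rate
print(effective_rate(2))  # 14.0 - both sections: rate cut in half
```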
  • Man_Of_Steele - Thursday, January 29, 2015 - link

    I saw the diagrams on a couple different sites, I just wasn't sure if they disabled it the smart way or the lazy way.
    Thanks for the clarification!
  • Harry Lloyd - Thursday, January 29, 2015 - link

    This card needs 8 GiB of VRAM with eight 8-Gbit GDDR5 chips (instead of eight 4-Gbit ones). The price would not be much higher, but we would get 7 GiB at full bandwidth. That would be enough for pretty much anything until Pascal comes along.
  • Ranger101 - Thursday, January 29, 2015 - link

    Nvidia HAS BEEN LYING and they should be ROASTED not meekly forgiven as Ryan Smith suggests. It remains a mystery as to why Anandtech should be so keen to absolve them....
  • Man_Of_Steele - Thursday, January 29, 2015 - link

    I agree with you there. No one seems to really be giving them a hard time... the mistake isn't as simple as Anandtech is trying to make it seem IMO.
  • TEAMSWITCHER - Thursday, January 29, 2015 - link

    What has changed in light of this information? Did the GTX 970 benchmarks suddenly decline? Will AMD raise the price of the R9 290 and R9 290X now that the GTX 970 scandal has been "exposed"? No...all around.

    Same Process, More Transistors, More Performance, Lower Power, and Lower Cost, all the while using a non-symmetric memory partitioning scheme to maximize high-speed VRAM. Nvidia's only fault was not telling us about it. If it bothers you that much, spend another 40% and get the GTX 980, but know this... it will NOT get 40% more performance.
  • itproflorida - Thursday, January 29, 2015 - link

    With GTX 970 SLI, there is frame time lag when enabling 2x/4x MSAA or TXAA @ 4K, and in some games at 1440p. Even though most games run fine @ 4K with no AA, FXAA, or 1x SMAA enabled, some, like AC Unity, have hitching or frame time lag with no AA at maxed settings. It's not just a VRAM issue like this site and others are proposing.

    I have a video of AC Unity @ 1440p native resolution, Ultra settings, HBAO+ and soft shadows, using FXAA. Very intense action scenes with no lag, averaging 60+ fps, with VRAM at 3990MB.

    Yet I can experience frame time lag in FC4 with VRAM at only 3436MB @ 4K with 2xMSAA enabled.

    CODAW @ 4K Ultra, maxed settings, cached textures, with 1x SMAA is fine also, while it goes over 3500MB of VRAM.

    So I am not convinced that it is just the VRAM segmentation, the slower speed of that segment, and how the drivers handle memory allocation.

    Are they still great cards? Yes, as long as you know how to tweak each game.
  • piiman - Saturday, January 31, 2015 - link

    Or just return my 2 970's for 1 980 and save 40%
  • wolfman3k5 - Thursday, January 29, 2015 - link

    NVIDIA, a company that has the engineering talent to produce highly complex GPUs with billions of transistors, wants us to somehow believe that they made a mistake when they publicized the specs for the GTX 970? And now they are apologizing like that will make everything okay? I am so, so sorry, but no amount of "mea culpa" will make things right. "I am sorry" doesn't pay the bills, doesn't feed the kids, and most certainly doesn't make up for the deception. I own two GTX 970s, and while I have never ever been satisfied with their performance, now I know why. I purchased them both from NewEgg.com and they are in mint condition. I would like to return them and at least get the GTX 980, which is more in line with the specs that they published originally, minus some CUDA cores. No NVIDIA, you will not lose me as a customer, but I want what I "thought" I paid for. I will foot the bill for the price difference. Please, someone from NVIDIA, if you are reading this, contact me at wolfman3k5_at_gmail_dot_com and tell me how you can help me return my GTX 970 cards for a refund. Thank you.
  • piiman - Saturday, January 31, 2015 - link

    " I will foot the bill for the price difference."

    If you bought two 970s, they will owe you money. My two 970s cost almost $700.00 just for the cards; the 980 is going for $550.00.
  • piiman - Saturday, January 31, 2015 - link

    Oh, and have you tried calling Newegg? They are very understanding and will work with you/us.
  • Magictoaster - Thursday, January 29, 2015 - link

    The reason no one is giving Nvidia a hard time is because most people don't buy a card based purely on its published specifications. They look at the benchmarks for multiple games, and the price, and if it's a good fit, they buy it.

    I bought a GTX 970. I love the card. It plays all my games at 1080p (my monitor's max resolution) flawlessly. It performs exactly like the benchmarks said it would. I don't really care that the internal workings of the card are not the exact same as a 980's. They are not supposed to be; that's why I paid $200 less than for a 980.

    Those suggesting litigation would need to consider what the damages are, and really, there are very few. Nvidia's published specs were accurate (though incomplete), Newegg's specs were accurate (and incomplete), and the card performs as expected. I don't see any deliberate attempt at fraud. I got what I paid for, and the card works as expected.
  • CX71 - Thursday, January 29, 2015 - link

    It has nothing to do with damages; I doubt anyone has lost income as a result of buying a 970. It's all about consumer rights and the advertised specs, which weren't accurate, particularly for those who bought two or more 970s (which I was planning to do) with the intention of using SLI to push pixels on screen(s) above 1080p. That's the issue for nVidia. If they had stated right from launch that the card would lose performance if VRAM usage went above 3.5GB, and people still bought it, then they'd be fine.
  • Magictoaster - Thursday, January 29, 2015 - link

    Litigation has everything to do with damages. This was the point I was trying to make. Your rights as a consumer are to have the card returned and your money refunded. If you want to bring suit, class action or otherwise, against Nvidia, there need to be damages. If you have no damages, you have no case. If Nvidia gave you a hard time and wouldn't issue an RMA/refund, you could argue they are acting in bad faith and ask for treble damages (a punitive tripling of any damages awarded). Again, you still need to prove that you were damaged by Nvidia's misrepresentations.

    Seeing as Nvidia made no false representations (Nvidia's listed specs only state the amount of VRAM, not ROPs, not partition size, not performance scaling, etc.), you would be hard pressed to even prove that Nvidia deliberately misrepresented the specs of the card.

    Legal action is based on damages: you have to show that Nvidia misrepresented their card, that you relied on that representation, and that as a result you suffered damages. If you can't show that, then don't mention litigation.

    With regards to your consumer rights, ask for a refund, you are probably entitled to it. You are not entitled to a free upgrade, a pile of gold, a unicorn, or any other non-sense.
  • Magictoaster - Thursday, January 29, 2015 - link

    Just so we are clear, I'm not a lawyer and this is not legal advice.

    I did work for a law firm, and I have seen countless cases where people sued "on principle," or to "correct the system," and every single one of them was a loss or dismissed at considerable expense to the plaintiff.
  • Elixer - Thursday, January 29, 2015 - link

    Looks like it was too good to be true.

    While some people got help, others are getting the shaft.
  • AnnonymousCoward - Thursday, January 29, 2015 - link

    It could just as well have 1024 ROPs and 2GB L2; who gives a shit?
  • M1cha3l - Friday, January 30, 2015 - link

    Excellent article! Is there any chance you could add or link an explanation of ROPs and SMMs?

    Thanks :D
  • Fishman44 - Friday, January 30, 2015 - link

    This is a big deal. The most disturbing thing about this story is that Nvidia knew, and took the calculated risk that it wouldn't get noticed.
  • Ballist1x - Friday, January 30, 2015 - link

    I guess the question is now:

    Is AnandTech going to change the way they review graphics cards to avoid this kind of debacle in the future?

    Maybe test the memory bandwidth, test with memory usage, etc., instead of trotting out the exact same synthetics every time, claiming that there were some unknown results, and then not revisiting the tests, as happened with the GTX 970 launch?

    Was AnandTech complicit?
  • Nfarce - Friday, January 30, 2015 - link

    I go on vacation for a week, then come back and catch up on my tech news, and THIS happens! In any event, EVGA Superclocked 970 owner here. I have been extremely happy with the card running 1440p on all my games. Even with the reduced specs, I still would have bought it over the 980 for $200 less. I did not see the 10-15% better performance of the 980 being worth the 55% increase in cost, especially when I can safely overclock the already factory-overclocked card to within a few frames per second of the 980.

    But yes, I am not happy with Nvidia's massive fall down here. If anything, I have diminished respect and trust for the company. It will not go forgotten.
  • Dal Makhani - Friday, January 30, 2015 - link

    You just said that the card is great for you, so why have less respect and trust in them? Contradictions don't help arguments. If anything this will only affect users who are maxing out VRAM, and that's probably only a few owners, since most are probably on 1080p or 1440p without cranking modded textures/effects in games.

    PR issues like this aren't really a big deal, because companies make mistakes all the time. This is far from major, and I can see miscommunication like this happening between parties at large companies all the time.
  • piiman - Saturday, January 31, 2015 - link

    " if anything this will only affect users who are maxing out VRAM and thats probably only a few of most owners who are probably on 1080p or 1440p without cranking modded textures/effects on games."

    Oh well, if that's all it hurts... oh wait, that would be me. I bought two 970s to do just that, but guess what?
    And somehow, since you don't think it affects you, it's OK? VRAM use is going up and will go higher. I didn't buy these cards to play last year's games, but next-gen games.
  • Nfarce - Saturday, January 31, 2015 - link

    Thank you piiman. I failed to mention that while I'm happy TODAY, that doesn't mean I won't be affected TOMORROW. And this is WAY beyond a PR issue as Dal claims. WAY BEYOND. It's a complete misrepresentation of the card's stats, whether by accident or intentional. And it is extremely hard to believe that the engineering, marketing, and management teams all collectively MISSED the wrong specs officially released by Nvidia for the card. REAL hard.

    So while I'm happy *currently* in my usage, there is a good chance I may have problems with games released this year and next year, something I did not anticipate as someone who skips at least one and sometimes two refreshes or generations of GPUs. That's not a contradiction in my beliefs as Dal also wrongly claimed.
  • Nfarce - Saturday, January 31, 2015 - link

    Let me restate: while I likely would have bought the 970 over the 980 even with reduced stats, I would definitely have rethought the purchase decision for long term usage.
  • Dal Makhani - Tuesday, February 3, 2015 - link

    You guys are overreacting; you will be fine in next-gen games, especially since piiman has two of them. So what if you have to reduce settings a bit so you don't hit the frame buffer limit? It's not a big deal. While I am upset to hear that Nvidia flat out lied, we are all people, and if marketing and engineering misrepresent something, it's just business as usual, because all companies face situations like this.

    HOWEVER, I think Nvidia should somehow make this up with game codes or some sort of step-up program where they pay part of the bill for users to upgrade to a 980 if they are unhappy with their purchase. Loyalty should be kept through a response; I am in no way saying they should just let this go. I would bet a fair bit of money that your 970s will still last as long as you thought they would.
  • inolvidable - Friday, January 30, 2015 - link

    I leave you here an interview with an engineer from NVIDIA explaining everything: http://youtu.be/spZJrsssPA0
  • piiman - Saturday, January 31, 2015 - link

    LOL But what would Hitler say?
  • peevee - Saturday, January 31, 2015 - link

    Admit it: the memory bus width that is actually usable at full speed is 224 bits, not 256 bits.
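
    For anyone who wants to sanity-check that, here's a rough back-of-the-envelope sketch (assuming the corrected figures that have been circulating: GDDR5 at 7 Gbps per pin, eight 32-bit controllers, with the last 0.5 GB hanging off a single one):

```python
# Back-of-the-envelope GTX 970 bandwidth math.
# Assumed figures: 7 Gbps GDDR5, 8 x 32-bit memory controllers,
# with the final 0.5 GB served by just one controller.
GDDR5_GBPS_PER_PIN = 7.0  # effective data rate per pin

def bandwidth_gbs(bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s for a bus of the given width."""
    return bus_width_bits * GDDR5_GBPS_PER_PIN / 8  # bits -> bytes

print(bandwidth_gbs(256))  # 224.0 GB/s -- the advertised full 256-bit bus
print(bandwidth_gbs(224))  # 196.0 GB/s -- the 3.5 GB "fast" segment
print(bandwidth_gbs(32))   #  28.0 GB/s -- the 0.5 GB "slow" segment
```

    And since, as reported, the two segments can't be accessed at the same time, touching the slow segment drags the effective figure down even further.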
  • InsidiousTechnology - Saturday, January 31, 2015 - link

    There are times when a class-action lawsuit is prudent, and this is one of them. All those people who mindlessly cry about lawsuits fail to realize they help deter clearly deceptive practices... and with Nvidia this is not the first time.
  • FlushedBubblyJock - Saturday, January 31, 2015 - link

    Can we sue AMD for worse at the same time?
    http://www.anandtech.com/show/5176/amd-revises-bul...
  • aliciakr - Monday, February 2, 2015 - link

    Very nice
  • Orange213 - Monday, February 2, 2015 - link

    I've got an EVGA GTX 970 SSC and I am totally fine with the card's performance. However, this is an issue that NVIDIA should take more seriously, but obviously they don't care too much about it, which is sad... Anyway, I am not switching to AMD, but what they are doing is pretty low.
  • Ballist1x - Tuesday, February 3, 2015 - link

    So no follow-up from AnandTech on probably the biggest scandal to hit the GPU market in the last couple of years?

    No opinion or viewpoint on how Nvidia has handled the fiasco, and no continued testing to understand the memory implications?

    I am losing my faith in AnandTech... Why not take a stance? Or are they afraid to bite the hand that feeds them?
  • Ballist1x - Tuesday, February 3, 2015 - link

    Is AnandTech now deleting my posts for asking where the follow-up is and what Anand's stance is on this misinformation?
  • Ranger101 - Wednesday, February 4, 2015 - link

    Following their astonishingly quick (and completely erroneous) absolution of Nvidia, AnandTech is no doubt keen to sweep this under the carpet as soon as possible, especially as legal action is now pending... Lol Anandtech.
  • paulemannsen - Wednesday, February 4, 2015 - link

    It's a 3.5 GB card. End of story. And this WILL become a problem in the future for some. While I expected that kind of behaviour from firms like Nvidia or AMD, I didn't know AnandTech would chime in so blatantly and take their readers for fools. I'm deeply disappointed.
  • matcarfer - Thursday, February 5, 2015 - link

    I will post an analogy that everyone should understand, and we will all arrive at the same conclusion:

    You buy a car that's supposed to reach 224 km/h no matter how many people sit in it (the car has four seats). Problem is, this car can only reach 224 km/h with three adults and one child in it. With four adults, it struggles, behaving like a slower car.

    See the problem? Nvidia should have told us; they didn't. If we (and the review sites) had known this, people planning to max out VRAM wouldn't have bought this card. It's false marketing, plain and simple, and everyone who got a 970 should have the opportunity to return the card and/or receive compensation for this.
  • wolfman3k5 - Thursday, February 5, 2015 - link

    You guys really should check this out: https://sqz.io/gtx970
  • Azix - Friday, February 6, 2015 - link

    I am wondering how this will play out later on when driver support for the card lags behind. Really disappointed in both Nvidia and AMD. If AMD's cards weren't so power hungry, I'd never have gotten cheated by Nvidia, because I'd rather have AMD's more solid feature set than Nvidia's "hey, we got a different way of doing ambient occlusion!" software features.
  • Stas - Saturday, February 7, 2015 - link

    That was the first thing to come to mind. The 8800 GTX doesn't work well for me with drivers released in the past 9 months. I feel like the 970 will be a dead card by the time the next models have settled.
    I'm quite skeptical of yields being the reason behind this. There is a very small chance that one section of L2 or ROPs is bad while the rest is good. I bet Nvidia purposely castrated the 970 to create a favorable market for the 980.
  • LazloPanaflex - Sunday, February 8, 2015 - link

    Damn dude, still rocking an 8800GTX? Um, anything you buy now is gonna be a massive upgrade, LOL
  • loguerto - Saturday, February 7, 2015 - link

    Such a shame ...
  • fuckNvidia - Monday, February 9, 2015 - link

    Just because you are happy being lied to doesn't mean everyone else should be. Just because you believe NVIDIA made a mistake doesn't mean it's OK that they falsely advertised the GPU. What can NVIDIA gain out of this? A high volume of sales and profit from false advertising, which is fraud. So yeah, I want my damn money back for both my 970s. Thanks, anandtech.com, for supporting false advertisement.
  • Oxford Guy - Saturday, February 14, 2015 - link

    The 8800 GT has twice the VRAM bandwidth of the 970's bad partition: 57.6 GB/s versus 28 GB/s. It also doesn't have the weird XOR contention problem.

    Midrange card from 2007...
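
    The arithmetic backs that up; a quick sketch (8800 GT figures assumed from memory: 256-bit bus, GDDR3 at 1.8 Gbps effective per pin):

```python
# Comparing a 2007 midrange card against the 970's slow segment.
# Assumed specs: 8800 GT = 256-bit bus, GDDR3 at 1.8 Gbps per pin;
# GTX 970 slow segment = 32-bit, GDDR5 at 7 Gbps per pin.
def bandwidth_gbs(bus_width_bits: int, gbps_per_pin: float) -> float:
    return bus_width_bits * gbps_per_pin / 8  # bits -> bytes

gt_8800 = bandwidth_gbs(256, 1.8)   # 57.6 GB/s for the whole card
slow_970 = bandwidth_gbs(32, 7.0)   # 28.0 GB/s for the 0.5 GB segment
print(f"{gt_8800:.1f} vs {slow_970:.1f} -> {gt_8800 / slow_970:.2f}x")
```

    Roughly 2.06x, so "twice the bandwidth" is, if anything, slightly understating it.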
  • ElGuapoK20 - Monday, February 16, 2015 - link

    As a process engineer, I can tell you what probably happened: salesmen misinterpreted what the engineers said; the engineers saw the semi-published information and tried to correct the salesmen; the salesmen decided they were too far in and nobody would notice; and since engineers aren't supposed to be involved with sales, they couldn't do anything except say, "OK, then."

    Yeah, been there, done that. I don't believe it's a big conspiracy at all, because it's not; just misinterpretation and then laziness.
  • Oxford Guy - Wednesday, February 18, 2015 - link

    Nonsense. This flawed design was purposefully chosen by Nvidia executives. No one just says "Sure, we'll let the engineers come up with a design that provides 28 GB/s of bandwidth and XOR contention, just because".

    Nvidia made a big mistake by using this flawed design and was able to hide it from consumers for months.
  • Oxford Guy - Wednesday, February 18, 2015 - link

    And what kind of engineer is going to come up with such a flawed gimped design in the first place? It's always management that comes up with these hare-brained schemes. Apple is famous for it, like when it gimped various computers (e.g. Apple IIgs) to prevent them from competing with favored models. Apple's hare-brained scheming with the Apple II vs. Mac line (trying to make the Mac a success at the cost of the Apple II line) cost them the schools market which they had totally dominated.

    No engineer who is competent enough to get a job at a high-profile tech company is going to say "Let's invent a way to create XOR contention and cut down VRAM bandwidth to half the speed of a 2007 midrange card for a $300+ product." Never ever. Engineers aren't the type of people who see virtue in hobbling things and designing awful flaws. It's the antithesis of engineering which is about creating something better than what came before. Management is the one who comes up with these schemes because it's not about product quality but about profit.
  • Oxford Guy - Wednesday, February 18, 2015 - link

    When OCZ started selling Vertex 2 SSDs with half the NAND chips (and the reduction in performance and capacity that went with it), that was the engineers not communicating well with the executives, right?

    If you believe that I have some excellent swampland in Florida for your perusal.
  • P39Airacobra - Monday, June 22, 2015 - link

    Nothing to see here move along please! Pay no attention to that man behind the curtain! https://www.youtube.com/watch?v=YWyCCJ6B2WE
