Why do workstation graphics cards cost far more than equivalent consumer graphics cards?


Solution 1

It's primarily market segmentation to allow price discrimination. Businesses that make money from work done with these cards have different requirements than gamers do, and Nvidia and AMD take advantage of that by asking them to pay more.

There are some minor differences that create this rate fence. For example, the Quadro / FirePro models use different drivers which prioritize rendering accuracy over speed. On the Tesla models, ECC RAM is a selling point for server farms, and Nvidia claims higher reliability for 24/7 operation.
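
(As an aside, software can check at runtime whether a card actually has ECC turned on. Here's a minimal sketch using the CUDA runtime API; querying device 0 is just an assumption for the example.)

    #include <stdio.h>
    #include <cuda_runtime.h>

    int main(void) {
        cudaDeviceProp prop;
        cudaError_t err = cudaGetDeviceProperties(&prop, 0);  /* device 0 assumed */
        if (err != cudaSuccess) {
            fprintf(stderr, "cudaGetDeviceProperties failed: %s\n",
                    cudaGetErrorString(err));
            return 1;
        }
        /* ECCEnabled is 1 when ECC is active (typical of Tesla parts), 0 otherwise */
        printf("%s: ECC %s, %.1f GiB memory\n",
               prop.name,
               prop.ECCEnabled ? "enabled" : "disabled",
               prop.totalGlobalMem / (1024.0 * 1024.0 * 1024.0));
        return 0;
    }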

The company I work for develops GPGPU-accelerated software. Our server suppliers will only sell us Tesla (or GRID) systems. I know that if I buy a 1U server with 3x K40 cards, it won't melt in my client's data center. So I'm willingly paying triple the price for my cards. I imagine anyone buying a Quadro card for business has the same rationale.

Solution 2

It is pretty simple, and certainly not conspiratorial. It has to do with basic economics and finance.

Let's start with the CAD software. CAD software programmers, and that programming community in general, are pretty inwardly focused. To be sure, the math behind the user interface is very complex and absorbs a huge amount of resources to develop. There is an effort to make the software more usable and user friendly, but that input is usually driven by company insiders and the IT people at the customers they usually interface with, not actual users (CAD designers). CAD designers generally are not computer geeks, and they complain among themselves, not to people who can change things, like the aforementioned IT people, who are not likely to pass on concerns anyway.

That is the back story. So spending time to make their graphics interface more generic is not high on CAD developers' agenda; it would be a huge investment of time and dollars. Doing so would allow standard drivers to interface efficiently with the software, but the structure of their business, and their customers' business, will probably never make this a priority. Ever. This is why SPECviewperf V12.0.1 results vary so wildly across card/software combinations, making it almost impossible to pick a 'beast' card for multiple software packages unless you spend ridiculous amounts of money.

As for the card makers themselves, well, developing drivers ain't cheap: millions of lines of code and thousands of man-hours of debugging. Developing a driver for each software package is out of the question; there are not enough users to justify that expense. So they come up with something that kinda works with all CAD. For the large-volume packages, like AutoCAD, they may have a patch that boosts performance, but most have to make do with a one-size-fits-all driver, which compromises more for some packages than others.

Then they get to certify with the software makers; oh joy. This is a long, arduous, very expensive process, where the hardware providers make a lot of driver changes and kiss a whole lotta butt. Accommodating one software package while making sure they don't jack it up for the other packages is a whack-a-mole game that's almost impossible to imagine.

ECC memory in "pro" cards is actually not overkill. With the software packages having such touchy tessellation, and designers making crappy models, bit flips are not that uncommon, and ECC eliminates a lot of graphics-generated crashes (but not all of them). Other than that, the hardware is extremely similar to consumer cards, as far as I can tell.

Then you have to take all of this expense and cram it into 10% of the volume of their consumer cards, add a reasonable premium for going through all of that crap, and VOILA, you have a quite expensive video card. If you do some research and have only one CAD package, you may find a consumer card that blows away the high end (e.g., CATIA V6 / AMD R9 290X) for a quarter of the price, but it's not likely.
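
(To make the volume point concrete, here's a rough back-of-envelope sketch. Every dollar figure and volume number in it is made up purely to show the shape of the math, not actual Nvidia or AMD costs.)

    #include <stdio.h>

    int main(void) {
        /* All numbers below are made-up assumptions for illustration only. */
        double fixed_cost      = 50e6;                    /* driver development + certification, $ */
        double consumer_volume = 1000000.0;               /* consumer cards sold */
        double pro_volume      = 0.10 * consumer_volume;  /* "pro" cards: ~10% of that volume */
        double hardware_cost   = 300.0;                   /* per-card silicon/board cost, $ */
        double pro_premium     = 1.5;                     /* markup for validation, support, margin */

        /* Fixed costs are spread over far fewer pro cards, then a premium is added. */
        double consumer_price = hardware_cost + fixed_cost / consumer_volume;
        double pro_price      = (hardware_cost + fixed_cost / pro_volume) * pro_premium;

        printf("consumer card ~ $%.0f, workstation card ~ $%.0f\n",
               consumer_price, pro_price);
        return 0;
    }

Even with those made-up inputs you land at a 3-4x price gap before any exotic hardware enters the picture, which is roughly the GeForce-versus-Quadro spread people complain about.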

Those are my thoughts, anyway. I read a lot of stuff from sellers of CAD cards that is a load of poo, and I thought I would add my two cents.


Updated on September 18, 2022

Comments

  • bwDraco
    bwDraco almost 2 years

    An Nvidia GeForce GTX 780 Ti costs $700, while a Quadro K6000 costs $4000—yet they use the same underlying GK110 core!

    The same can be said for other workstation GPUs from both Nvidia and AMD.

    What exactly does this price difference pay for with a workstation GPU? It is my understanding that they have specially-tuned drivers for CAD and other intensive business applications, sacrificing speed in gaming applications for greater accuracy and performance in such business software, but this by itself can't explain the cost difference. They may have more memory, and often of the ECC type, but that still can't explain a nearly sixfold difference.

    Would hardware validation explain the difference? I suspect it goes like this: among the GPU chips that test as usable, 30% go into a high-end consumer card, and 68% go into a slightly cheaper consumer card; the other 2% go through even deeper validation, and the few that pass get put into a workstation card. Could this be the case, and is this why they're so expensive?

    • Ƭᴇcʜιᴇ007
      Ƭᴇcʜιᴇ007 over 10 years
      "certification with over 200 applications by software companies" - Big $$$/time-soak there, I'd bet.
    • Nullpointer42
      Nullpointer42 over 10 years
      I'd guess one factor would be smaller sales volumes, which would lead to higher prices.
    • Synetech
      Synetech over 10 years
      Those are professional products. Professional products are always more expensive than consumer goods because they are meant for commercial use, which means they are expected to be used to make money. That’s also why a corporate license for a program is more expensive than a personal license.