It's called the Titan Z, it has 2x GK110 chips with 12 GB of VRAM, and the price is... 2999 USD/EUR...
What in the actual flapjacks?
The GTX 590 used to cost 600 USD/EUR. The 690 went on sale at 1000 USD/EUR. And now this... I wonder if the next dual-GPU card will be 6000 at this pace... Soon people will need to take out mortgages to buy one, lol.
Even Apple products look inexpensive next to them now. At least the iPhone doesn't cost twice as much with every successive generation.
I don't see what the big deal is, I just bought a couple of them. Get a job already.
J/K, I don't think my car is worth that much money; I couldn't afford that even if I sold a kidney.
They would not be making these cards if there weren't people buying them. And those people will probably buy two and put them in SLI.
Fortunately you don't need one at all to have a good 3D experience. It's like those kids putting 30 grand into car modifications: it won't get them to work any faster than my $3000 truck.
Can't blame NVIDIA for this.
They made it for supercomputing (i.e. researchers/scientists/engineers);
it just happened to be good for gaming as well.
The GTX 7## cards are the more mainstream ones.
You have the Bitcoin (and other altcoin) miners to thank for the spike in high-end GPU prices. High-end cards are getting snapped up almost as quickly as they can be manufactured. This gives both NVIDIA and AMD welcome shots in the arm, but doesn't help us regular folks whose lives don't revolve around bitcoins...
Except I read that people don't mine with NVIDIA? Or at least not until Maxwell.
Bitcoin is well suited to AMD GPUs, and Litecoin seems to work reasonably well on recent NVIDIA GPUs (recent AMD is still tops, though). You'll probably get more bang for the buck with AMD, but NVIDIA works as well. Plus some people like NVIDIA's gaming performance better.
https://litecoin.info/Mining_hardware_comparison
Also, since AMD GPUs have gone up in price at the various vendors, NVIDIA doesn't need to hold their prices down, as people looking for cheaper alternatives to, say, the R9 290X will be looking in NVIDIA's direction. If NVIDIA's vendors CAN get more money/profit margin, of COURSE they will try to do so...
They make Lamborghinis, but you don't have to buy one.
As somebody said, I doubt that these cards are aimed at gamers. It can make sense to write certain classes of programs (generally those that are highly parallelizable) to run on GPUs instead of CPUs, and NVIDIA has this big CUDA framework for doing exactly that:
http://en.wikipedia.org/wiki/CUDA
I would assume that cards at this kind of price point are for that market - you build a supercomputer with these at a research facility.
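For anyone who hasn't touched CUDA: the idea is you write a "kernel" that runs across thousands of GPU threads at once, copy your data to the card, launch it, and copy the result back. Just as a rough sketch (a made-up vector-add example, not from any real project), the basic shape looks like this:

#include <cstdio>
#include <cuda_runtime.h>

// Each GPU thread handles one element of the arrays.
__global__ void vecAdd(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;                  // one million elements
    const size_t bytes = n * sizeof(float);

    // Host buffers
    float *ha = (float*)malloc(bytes), *hb = (float*)malloc(bytes), *hc = (float*)malloc(bytes);
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    // Device buffers
    float *da, *db, *dc;
    cudaMalloc((void**)&da, bytes); cudaMalloc((void**)&db, bytes); cudaMalloc((void**)&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements.
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    vecAdd<<<blocks, threads>>>(da, db, dc, n);

    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %f\n", hc[0]);           // expect 3.0

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}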
It's a very, very high-end card. And you're acting surprised it's not aimed at the mass market?
My point is that previous generations of this card were "very, very high-end" too, in their day, but they were not 3000.
They're asking 5x more for the successor to my current card (GTX 590) than I paid for that card 3 years ago; shall I not act surprised? I guess those 590s were aimed at the mass market back then?
On the topic of this being for a supercomputer at a research facility: NO, the Tesla cards are meant and sold for that purpose. I don't know the exact numbers, but I am inclined to believe that a single Tesla K20X (or whatever it is called) has higher 64-bit performance than two of these chips combined (and increased 64-bit support is the main reason the Titan is priced higher than the regular GeForce 780/780 Ti)... while not being that much more expensive... Not to mention Teslas have things like ECC memory, which Titans don't have, and I am sure that matters for serious scientific work...
If they had called this a Tesla, I would not say a word and they could ask 10000 EUR for it. I would just accept that it's aimed at that research-related market, where a different kind of money flows than in my business. But if they didn't make it a Tesla, they shouldn't have made it a Titan either (people who need 64-bit are going to prefer a Tesla anyway; the original Titan may have had its niche, since it was 4x cheaper than a regular Tesla, but this one is NOT). It should have been a regular GeForce priced as a regular GeForce.
We do a massive amount of GPU computing at our company (we have an entire GPU cluster). The problem for NVIDIA is that we, and everyone else, have discovered that we can get performance from the consumer-level cards that's just as good as from the high-end professional ones. They've recently modified their driver EULA to try to prevent this, but it doesn't surprise me that they're also addressing it by simply increasing prices and/or rebranding.
That being said, the product page does talk about using it for gaming, which is a little baffling.
Workstation cards are even pricier... the FirePro W9000 has a suggested retail price of $3999, and there's no word yet on what the W9100 will weigh in at...
http://www.anandtech.com/show/7901/amd-announces-firepro-w9100
The consumer-level cards are just as good at single-precision computing as Teslas. If you don't need double precision, and a lot of CUDA apps don't, then a GeForce is actually the preferable choice, since you get the same performance many times cheaper.
I personally use my 590 for some archviz rendering with Octane Render, which, to quote one of its devs, uses "half precision". So I definitely don't need a Titan or a Tesla, hence my current annoyance.
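To show what that single vs. double precision point actually looks like in code, here's a rough sketch (a made-up AXPY kernel, just for illustration): the source is identical for float and double, so the only real difference is how hard the card throttles 64-bit throughput. GeForce parts typically run FP64 at a small fraction of FP32 speed, while Titan/Tesla parts run it much closer to full rate; the exact ratios vary by generation.

#include <cuda_runtime.h>

// Same kernel, templated on the arithmetic type. The code does not change
// between float and double; only the hardware's FP64 throughput does.
template <typename T>
__global__ void axpy(T alpha, const T* x, T* y, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = alpha * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    float  *xf, *yf;
    double *xd, *yd;
    // Buffers are left uninitialized; this sketch only cares about throughput, not values.
    cudaMalloc((void**)&xf, n * sizeof(float));  cudaMalloc((void**)&yf, n * sizeof(float));
    cudaMalloc((void**)&xd, n * sizeof(double)); cudaMalloc((void**)&yd, n * sizeof(double));

    int threads = 256, blocks = (n + threads - 1) / threads;

    // Single precision: runs at full rate on consumer GeForce cards.
    axpy<float><<<blocks, threads>>>(2.0f, xf, yf, n);

    // Double precision: identical code, but speed depends on the card's FP64 ratio.
    axpy<double><<<blocks, threads>>>(2.0, xd, yd, n);

    cudaDeviceSynchronize();
    cudaFree(xf); cudaFree(yf); cudaFree(xd); cudaFree(yd);
    return 0;
}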
Yeah, I'm not a CUDA guy myself, but most of the people I know who do this stuff end up using GeForce-line cards. As somebody did point out, though, my understanding is that things change if you want to do double-precision calculations. But I'm no expert, since I don't actually do this kind of work.