The Titan line-up, the ONE thing that definitely needs to die, has a new offspring. And it's another bastard child: a "hurrdurr workstation" price for a glorified gaming card. I weep for humanity and for the people who "praise" this bullshit behaviour.
>>46848693 >hurrdurr shill As someone who owns and loves his 780ti, fuck you. The Titan line-up is fucking disgusting and terrible for everyone who "just" wants a high-end gaming GPU. It'll be advertised as "the ultimate gaming graphics card!!!!" And as soon as people question the 1000 USD pricetag they will suddenly call it a developer card because muh double precision and muh framebuffer! It's a glorified gaming card that they sell at a HUGE markup because people are willing to pay for it. And the worst part is that it'll hinder ACTUAL gaming GPUs. I would have loved to buy a 6GB 780ti, but wait, they never released that because it'd be the exact same card as the 1000 dollar Titan Black just without the double precision! Can't have that!
>>46848635 Lel, then their next generation of midrange cards comes out a year later to stomp out the Titan, so you end up wasting 500 USD on a glorified card. It's like buying Beats: half of what you're paying for is the name.
>>46848635 wow, what will probably be a $1500+ gpu, only for AMD's inevitable $1000 gpu to blow it out of the water 2 or 3 months later. >but muh extra $20 a year on electricity with AMD
>>46848954 No, double precision is irrelevant in those tasks. For scientific computation it might work somewhat, but it will NOT work as a developer card because it lacks the proper workstation drivers and the certifications that come with them.
>>46848978 "Over clocked" is a useless figure, you might have a cherry picked sample of one GPU and a crappy sample of the other. Stock clocks or a slight OC (like the factory OC that many GPUs come with) are acceptable, but basing benchmarks on the silicone lottery is bullshit. And at stock (or with the factory OCs) an R9 290x and a 980 trade blows at 4k. Maybe the 980 wins by a couple percent on a larger number of games, but in the end it's irrelevant. GM 200 will not offer double the performance of a 980, and an R9 295x2 will most definitely not be outperformed by a Titan X at 4k in games that scale properly with crossfire.
>>46848760 You seem highly upset. Big chips are very expensive to produce as a general rule and ship few units. They are always expensive. These big chips happen to be of interest to people with more money than sense that want to play video games as well. You are in the exact same market with your GTX780TI, please don't delude yourself because you can't afford the next price bracket. Previous Titan cards have shown there is quite a market for compute cards as regular graphics cards. This is not /v/.
>>46848948 the fuck are you talking about? If anything it shows Nvidia is trying to one-up AMD by releasing this monstrosity before the 3xx series is released. NV knows they fucked up with the 970 (and most likely the Shield box too), so the TX is their ace in the hole.
>>46849041 As if. The majority of Titans were used as simple gaming cards, just like their marketing suggests. People bought it because it was the best gaming GPU available, not because of double precision or compute performance. And yes, I'm pissed that they try to upsell me a "workstation" card (which ironically doesn't actually function as one) for gaming, with that silly price-tag on top of it. >this is not /v/ You're literally discussing a product that is designed to play games; deal with the fact that many people on /g/ play games.
>>46849081 Again, you're playing the GPU die lottery: if a GPU's die doesn't fall within an acceptable range of tolerances, it's either binned for laptop use by shaving off cores and ROPs, or just thrown away.
The same applies to Intel's and AMD's processors. I had an i5-2500k that OC'd to 5.0 GHz on a Hyper 212 fully stable, but other people may not be so lucky, and others may be luckier and able to go further while staying stable.
>>46849137 The chip the Titan used was also used in other products. There are quite a few workloads you would normally want a Tesla/Quadro-branded variant for that you can get away with on a Titan-branded variant.
>>46849266 interesting. i noticed that a screenshot of my game on my phone had artifacts compared to the same screenshot on my pc. so, disregarding price per performance, is there some benefit in getting a Titan X vs a GM200 gaming card?
>>46849324 I remember when you could take a 680, flash the firmware, and change one of the capacitors or resistors in the bottom left of the card near the PCI-e connector, and you would have the same thing as a (then current-gen) K6000 or something
You could take a 560 Ti and do the same thing to get a K4000, except you didn't need to physically change anything, just flash the firmware
>>46849345 after the 500 series, Nvidia installed what is basically "physical DRM" by making one of the capacitors on the GPU rated for fewer farads than the one found on the K6000. They were otherwise exactly the same card, so if you were good with a soldering iron you could easily take a $400 680 and get a $4000 K6000, or whatever they generally went for.
>>46849371 it's not impossible that he's done a few cycles in the past but i think he's natural now. he's not that muscular, it's very easy to maintain with a non-shit diet and lifting for 30-60 mins once every few days.
>>46849210 Except that you're stuck with gaymen drivers without the proper certifications. You can't just pop a gaming GPU into a workstation or use it for scientific computation like that; many programs flat out won't work without those certifications while others might work poorly. The drivers and the software support are half of what you're paying for with a workstation card. You'd be ridiculed and thrown out if you showed up with a Titan at a big-time animation studio.
>>46849081 I wish the silicon lottery wasn't a thing and that we could buy a GPU or CPU based on a grading system. For example, say the range is 1500 to 1600: 1500 to 1525 is grade D, 1525 to 1550 is C, 1550 to 1575 is B, 1575 to 1600 is A, and oddballs that OC higher could be something like A+.
Anyone else agree? I personally would always go A+.
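As a toy sketch of what that grading could look like in practice, a simple threshold lookup would do; the 1500-1600 range and letter cut-offs are just the ones suggested above, not anything a vendor actually uses:

```python
def bin_grade(max_stable_clock_mhz: float) -> str:
    """Grade a chip by its highest stable clock, using the cut-offs suggested above."""
    if max_stable_clock_mhz < 1500:
        return "reject"          # below the advertised range entirely
    if max_stable_clock_mhz > 1600:
        return "A+"              # the oddballs that clock past the rated range
    for ceiling, grade in ((1525, "D"), (1550, "C"), (1575, "B"), (1600, "A")):
        if max_stable_clock_mhz <= ceiling:
            return grade

print(bin_grade(1540))   # C
print(bin_grade(1612))   # A+
```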
BTW I still wouldn't go Nvidia. I was an Nvidia fanboy but I saw the light, and I plan to go 390X Crossfire 4k in my next build.
>>46849552 We're not talking about Photoshop here, there are proprietary 3D modelling and animation programs that get used there that literally won't work without the certificates. It might be shitty DRM but that's what you're paying for in the end. A Titan is NOT a proper workstation card, it might be able to do similar tasks at adequate levels, but it lacks the support and the proper drivers to actually fill that role.
And today we’re excited to announce an expansion of that partnership with NVIDIA providing all UE4 developers with not just binary but C++ source access to the CPU-based implementation of PhysX 3.3.3, including the clothing and destruction libraries, through Epic’s Unreal Engine repository on GitHub. This means that the entire UE4 community can now view and modify this PhysX code alongside the complete C++ source code for UE4. Modifications can be shared with NVIDIA, who will review and incorporate accepted submissions into their main PhysX branch, which then flows into future versions of UE4.
>>46849992 i don't think there's a UK equivalent except being a scumbag and buying a bunch of chips at the store and trying them all with careful resealing of the packages, or being slightly less of a scumbag and selling the less good chips as open-box on ebay. shipping should be pretty cheap and they ship to many places (even though they don't list them all on their site yet) but you might get hit with customs fees if you're unlucky. i haven't bought CPUs specifically but i've ordered a bunch of stuff and as long as you stick with USPS you should be fine (UPS for example are stricter with customs fees).
>>46850096 >these workstation cards that required billions of dollars of R&D and fabrication are easily worth thousands of dollars each for professional applications >b-but i just want my gaymes >ok we'll give you a gaymen card for a few hundred bucks >this is somehow evil
>>46850202 Do you have to be such a good goy? I buy the best product, and the best GPUs are AMD's. I plan to go 390X Crossfire in my next build, 16GB of the best RAM possible, and a 4790k or better with an IB-E cooler.
If there is the option I will go OLED Freesync 21:9 HDR 4k and I would be willing to spend £2k / $3k or under for that monitor.
>>46850202 >company which is hardly worth over a few hundred million >capable of spending BILLIONS all my wats? you marketer shills are really bad at math. how much do you faggots get paid to defend your company against every post?
>>46850741 the gtx 980 with 4 GB vram and a 256-bit bus is as good as if not better than the 290x with 4 GB vram and a 512-bit bus so i'd say it is perfectly reasonable. nvidia also has 7 GHz memory while amd only has up to 5.5 GHz memory.
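For anyone who wants to sanity-check the bus-width argument: raw memory bandwidth is just interface width times effective data rate per pin. A quick back-of-the-envelope calculation using the figures quoted in the post above (advertised effective rates, not benchmark numbers):

```python
def mem_bandwidth_gb_s(bus_width_bits: int, effective_rate_gbps: float) -> float:
    """Raw memory bandwidth: (bus width in bytes) x (effective data rate per pin)."""
    return bus_width_bits / 8 * effective_rate_gbps

# Figures quoted above (effective GDDR5 data rates)
gtx_980 = mem_bandwidth_gb_s(256, 7.0)    # ~224 GB/s
r9_290x = mem_bandwidth_gb_s(512, 5.5)    # ~352 GB/s (290X ships at 5.0 Gbps, so ~320 GB/s stock)

print(f"GTX 980: {gtx_980:.0f} GB/s, R9 290X: {r9_290x:.0f} GB/s")
```

So on paper the 290X has the wider pipe; the 980 leans on things like delta colour compression to close the gap, which is why raw bus width alone doesn't settle the argument.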
>>46850805 Yeah, essentially the 980 might be on par with or better than a 290X if you want to play Minecraft at 1080p 9000 fps, but if you actually want to use it at the kind of resolution it's designed for, the 290X blows it out of the water.
>>46850805 >>46850873 >In most of the games tested GeForce GTX 980 SLI matched the same gameplay experience as AMD Radeon R9 290X CrossFire. This was surprising considering single-GPU GeForce GTX 980 is able to outperform single-GPU AMD Radeon R9 290X. >GeForce GTX 980 SLI was on par, equal with AMD Radeon R9 290X CrossFire in performance, most of the time. http://www.hardocp.com/article/2014/10/27/nvidia_geforce_gtx_980_sli_4k_video_card_review/11
>>46850948 Yeah, I do feel quite shit as well for how I treated AMD fans who were just trying to help. They would, for example, recommend me a 295X2 which cost £20 / $30 less than Gigabyte 970 SLI, and I gave them the whole LOL OVERHEET HOUSE FIRES POORFAG SPEECH.
>>46852806 >>46852844 we're apparently already using 4 GB considering the gtx 970 fiasco, so we'll be using more than 6 GB soon. the gaming version of the titan x will of course have "only" 6 GB vram. the 12 GB is for "prosumer" usage rather than gaming.
>>46852884 I doubt we will use much more than 4GB of vram any time soon, simply due to the console limit. The textures aren't going to need more memory, only the buffers and partial renders. The roughly 5GB limit on consoles pretty much caps how much a PC would need. This doesn't include modding, but 4k textures are pretty much standard now and there isn't much gain in going higher.
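To put rough numbers on why texture budgets don't explode as fast as people expect, here's a back-of-the-envelope calculation. The sizes assume uncompressed RGBA8 and a typical 4:1 block-compressed format; real games mix formats, so treat it as illustrative only:

```python
def texture_size_mib(width, height, bytes_per_texel, with_mips=True):
    """Approximate GPU memory for a 2D texture; a full mip chain adds roughly 1/3."""
    base = width * height * bytes_per_texel
    total = base * 4 / 3 if with_mips else base
    return total / (1024 ** 2)

# A single 4096x4096 "4K" texture
print(texture_size_mib(4096, 4096, 4))    # ~85 MiB uncompressed RGBA8
print(texture_size_mib(4096, 4096, 1))    # ~21 MiB with 4:1 block compression (e.g. BC7)
print(texture_size_mib(8192, 8192, 1))    # ~85 MiB at 8K -- 4x the cost for one more step up
```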
The real turning point for me personally was the drivers not being updated for the 780 Ti, to the point where it's now losing in benchmarks to the 290X.
I can understand a manufacturer not keeping driver updates a priority for something old that sold for cheap, like a fucking 460, but not having drivers for a 780 Ti? Literally last generation's card, and a card that people spent $500-700 on?
That proves to me beyond any doubt that any Nvidia card I purchase will have no longevity.
I'm seeing a lot of threads with people who have had 760s and 770s burn out, and I'm also wondering why none of them stop to ask why those cards died after such a short lifespan. Does it have to do with their bad coolers? Their excessive clock speeds?
>>46854336 it doesn't try to; the game engine realises there isn't enough vram and starts streaming data between system ram and vram over PCIe instead. if that fails, the game will probably drop to 0 fps every time it fails to find data in vram.
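A rough sense of why spilling into system RAM hurts so much: PCIe 3.0 x16 tops out around 16 GB/s (closer to ~12 GB/s in practice), versus a couple of hundred GB/s for on-card GDDR5. A quick illustrative calculation; the 500 MB transfer size is made up for the example:

```python
def transfer_ms(megabytes: float, gb_per_s: float) -> float:
    """Time in milliseconds to move `megabytes` of data at `gb_per_s`."""
    return megabytes / 1024 / gb_per_s * 1000

# Fetching 500 MB of texture data that didn't fit in VRAM
print(transfer_ms(500, 224))   # ~2 ms from on-card GDDR5 (GTX 980 class)
print(transfer_ms(500, 12))    # ~41 ms over PCIe 3.0 x16 -- more than two whole 60 fps frames
```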
>>46853941 >I'm seeing a lot of threads with people who have had 760s and 770s burn out dude those cards (well, the chips and design) are three years old at this point. games are just becoming more demanding as time moves on.
>drivers not being updated for the 780 Ti to the point where it's now losing in benchmarks to the 290X a 16 month old card is a bit weaker than amd's best single-gpu offering? boo-freaking-hoo.
>>46855304 You mean a 16 month old card is beaten by a 15 month old card. The 900 series, even with the latest drivers, is also a 6 month old card being beaten by a 15 month old one, and the 390X, when it's released, will blow the 900 series away. Don't forget the 295X2 is the most powerful dual GPU card, likely still more powerful than this Titan, and there's a 395X2 planned for release.
>>46855830 that doesn't say the OC'd boost clock does it? and i was talking about performance you fuckwit, not about clock speeds.
>AMD Radeon R9 290X Creamed
>You don't have to look long at our results today to see that the AMD Radeon R9 290X is crying out for help. The AMD Radeon R9 290X is currently AMD's flagship single-GPU, just like the GeForce GTX 980 is NVIDIA's flagship single-GPU. Other than adding more GPUs, this is as fast as it gets in single-GPU form from both AMD and NVIDIA. Yet, it looks like the AMD Radeon R9 290X is lagging severely compared to what NVIDIA currently has on the table.
>The GeForce GTX 980 is making the AMD Radeon R9 290X look like last-generation technology, which you can argue it is since its launch nearly 1.5 years ago. We are shocked how far behind the Radeon R9 290X is falling from the newer GeForce GTX 980. It seems to keep falling further and further behind in every evaluation we write! Chalk some of that up to the clocks we are seeing, but drivers are certainly part of that equation as well.
>The ASUS ROG Poseidon GTX 980 Platinum has surely laid the smack down on the AMD Radeon R9 290X. We were using a very high factory overclocked customized and expensive AMD Radeon R9 290X GPU based video card today. The Sapphire Vapor-X R9 290X Tri-X OC debuted at the same price as the ASUS ROG Poseidon GTX 980. However today the Sapphire can be had for about $400. The SAPPHIRE card let us overclock the AMD Radeon R9 290X to its highest frequency we've ever achieved consistently.
>This is the AMD Radeon R9 290X at its best, at its absolute highest performance potential on air. However the AMD Radeon R9 290X is slower in every game, not just by a little, but by a lot. A highly overclocked R9 290X cannot keep up with a factory overclocked GTX 980, or a manually overclocked GTX 980. It would be even worse if this were a stock, default clocked AMD Radeon R9 290X. It is time for AMD's next generation, because Hawaii (R9 290/X) just got old.
>>46850317 mantle was originally supposed to be open source, but later AMD decided to put their work into DX12, glNext and Vulkan, all three of which basically utilize Mantle code. Vulkan is probably Mantle's true successor, but the others draw on it too.
>>46848635 It's kind of neat I guess, but I can't help but feel like this was bad timing on Nvidia's part. We are on the verge of a tech shift away from GDDR5 in favor of stacked DRAM technology like HBM, which may offer 2-4x the memory bandwidth and possibly more with time and further development. This card will be rendered obsolete within a year of its inception. If the card was like $600, $750 tops, it might be acceptable, but leaks say about $1300 and I feel at the absolute minimum it'll be $1000, so it just isn't worth the money.
Whether you choose to get the 390X or wait for Pascal is up to you but regardless of what company you shill for I feel like the Titan X is a bad buy.
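To put the "2-4x the memory bandwidth" claim above in perspective: peak bandwidth is just interface width times per-pin data rate. A rough comparison using first-generation HBM figures (4 stacks, 1024-bit each, around 1 Gbps per pin; treat these as ballpark numbers, not a spec sheet for any particular card):

```python
def peak_bandwidth_gb_s(bus_width_bits, rate_gbps):
    """Peak memory bandwidth = interface width (bytes) x per-pin data rate."""
    return bus_width_bits / 8 * rate_gbps

print(peak_bandwidth_gb_s(384, 7.0))        # ~336 GB/s: Titan X class GDDR5 (384-bit @ 7 Gbps)
print(peak_bandwidth_gb_s(4 * 1024, 1.0))   # ~512 GB/s: first-gen HBM, 4 stacks of 1024-bit @ ~1 Gbps
```

First-gen HBM works out to roughly 1.5x a 384-bit GDDR5 setup; the 2-4x figure assumes later stacks and higher pin rates.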
>>46855870 The thing to realize is most sites just used a stock cooled 290X and called it a day http://www.tomshardware.com/reviews/radeon-r9-290-and-290x,3728-5.html
The 290X Tri-X (marketed as OC, not OC'd by reviewers) runs at a 1010 MHz base clock; it's the closest to what you'll get out of a base-clocked 290X with proper cooling. You will note it's on par with a stock GTX 780 Ti.
>>46849250 But cooler. Show me your 290 under 50 degrees under load; mine is at 1.55 GHz.
Also the Titan X will be overpriced; it won't have more than 50% more performance (the 960 has half the 980's specs and performs exactly 50% worse). I'll keep my 980 and save my money for a proper VR headset until the 2016 Pascal era.
>>46855995 >He posted relative performance at given clock speeds. no he didn't. the hwbot stuff is pure autism, it's only about the clock speeds.
>One of the most biased tech "news" sites I've ever visited. right... >AMD Radeon R9 295X2 CrossFire, or QuadFire as it is commonly called, provides the best gaming performance we have ever experienced. Two of these AMD video cards, costing $3,000, offers up the absolute best gameplay experience. You will be able to take all your games to the highest possible settings on a single-display, or sub-4K Eyefinity configurations. Even at the very demanding 4K resolution you will be able to enjoy a high-end gaming experience with the highest graphics settings, and this is what is important. http://www.hardocp.com/article/2014/04/29/amd_radeon_r9_295x2_crossfire_video_card_review
>>46856112 >HardOCP is literally the only site that agrees with you maybe because sites for casuals like you only compare stock clocked cards. you have literally nothing besides your butthurt to base the claim that they're biased on. they do the tests like they say they do them and they report the results as they get them.
>>46856278 ...which doesn't agree with other sites. The R9 series handily beat the 7xx series (before the 780 Ti, which is really just an OC'd 780, and could be beaten by an OC'd 290X) on nearly every site, yet HardOCP always showed Nvidia winning. The bias is obvious to anyone who isn't a fanboy.
oh great, another $1000+ card so all the fanboys can orgasm and all the reviewers can get bought out. And AMD will go out of business. I don't know if I even care any more. Enjoy your 5% increase a year for a $1000 gfx card once there is a monopoly. Look at what Intel has done: they've increased their desktop chips by like 20% total over two generations.
>>46856321 GPU memory is used to hold texture, model, and frame information. As geometry complexity, texture size, and resolution size increase, more memory is required.
However, 2 GB of memory will do you no good if your card isn't capable of physically putting information into it fast enough.
As an example of how crucial memory speed is, the GTX 970 debacle with its 3.5+0.5 GB of memory was caused by that last 0.5 GB being accessed at 1/7th the speed of the rest of the 3.5 GB. When that last 0.5 GB of memory requires access, your entire application will start lagging terribly. This can also sometimes result in visual glitches.
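A rough way to see why even occasional hits to the slow 0.5 GB segment hurt: if some fraction of memory traffic lands in the slow partition and accesses to the two partitions can't overlap, the effective bandwidth is a harmonic-mean-style combination of the two. The ~196 GB/s and ~28 GB/s figures below are the commonly cited ones for the 970's two partitions, used here purely for illustration with a deliberately simplified model:

```python
def effective_bandwidth(fast_gbs, slow_gbs, slow_fraction):
    """Average bandwidth when `slow_fraction` of the traffic goes to the slow segment."""
    return 1.0 / ((1 - slow_fraction) / fast_gbs + slow_fraction / slow_gbs)

# GTX 970: ~196 GB/s for the 3.5 GB partition, ~28 GB/s (1/7th) for the last 0.5 GB
for frac in (0.0, 0.05, 0.125):
    print(f"{frac:>5.1%} slow-segment traffic -> {effective_bandwidth(196, 28, frac):6.1f} GB/s")
```

Even 5% of traffic hitting the slow segment drags the average down to ~150 GB/s in this model, and traffic proportional to capacity (12.5%) roughly halves it, which matches the "entire application starts lagging" description above.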
>>46856371 even in a monopoly they would have to compete with themselves to get people to upgrade their existing graphics cards. and ARM SoCs would catch up if they slacked off. intel has still made significant gains in performance, power efficiency and iGPUs.