Why do AMD owners constantly get fucked over these days with PC games? We always get the short straw compared to Nvidia owners.
Nvidia always seems to pull way more shady shit, trying their darndest to get a monopoly on the market. I bet they encourage devs to put out unoptimized pieces of shit so people buy their latest graphics cards.
>mfw have 7950's in crosshousefires
>mfw keeps my room warm
>mfw 60fps everywhere
AMD always beats Nvidia when it comes to price/performance ratio, OpenCL beats Cuda in the majority of supported applications out there, and AMD's last gen high end GPUs beat Nvidia's "next gen" Maxwell cards when it comes to high resolutions like 4k/multi-display gaming.
Nvidia can only compete by buying out developers with their Nvidia Gameworks crap and using the power of marketing to make it seem like their GPUs are worth the premium price.
Both AMD and Nvidia pull shady practices, but Nvidia is the greedier of the two.
I've always disliked nvidia's proprietary shit
>works on both kinds
>only works on nvidia
>works on both kinds, part of the DP standard
>works only on nvidia, costs $100
>AMD always beats Nvidia when it comes to price/performance ratio
Not necessarily. People forget things like noise, heat and power consumption are also performance related factors that must be considered.
>GTX 980 is 0.5FPS better than the R9 290X
>in an Nvidia optimized Ubisoft title
>"hurr it's worth paying the extra $300+ just because it says Nvidia on it"
>inb4 drivers make it worth the extra $300
Because making a game run well (bug free) at a good framerate takes more work than using some 3rd party middleware.
nVidia has more cash to throw at devs to make sure their games run properly.
enjoy your inferior shit tier hardware poorfag :^)
>using synthetic benchmarks
why not list actual games then?
>tfw so many unoptimized games
you don't get it, without AMD Nvidia would have no reason to innovate, since no competition means no incentive.
PC gamers could either suck it up and pay $600 for a rebranded last-gen GPU or go back to consoles.
Well, nVidia is basically unsupported on Linux without tainting your kernel with their proprietary blob. So no, nVidia's support doesn't really compare to AMD's, which is basically plug-and-play with any kernel from the last 5 years.
AMD just doesn't have as much of the market share as Nvidia.
Which is a shame really, because competition is only ever good for the consumer.
That's one thing most Nvidia fans, myself included, tend to forget. Without AMD there would be no one to stop Nvidia from gouging everyone.
yes, AMD even has a much better open source support community than Nvidia.
Nvidia is the king of proprietary shit and bad support; you'd know that if you'd fucking paid attention to technology for the past decade.
>How long until we start seeing what we were hoping to see?
The studios that produced launch games are now pretty well-versed. Games getting announced around now should be a good deal better than those developed in tandem with the hardware.
That's a capitalist myth. People won't buy luxury shit that's overpriced even if they can't get alternatives. They wouldn't sell enough product to maintain the price structure.
>inb4 some rant
Naw, save your time m8.
It's probably just due to design differences. AMD has an incredibly inefficient architecture, in that they throw almost twice as many cores at a problem, but each one is incredibly specialized, whereas nvidia designs around much fewer cores that can do anything they're assigned.
AMD's design ends up being a lot cheaper to make, and if a game doesn't need more cores for a specific task than they allot, they come out ahead. The second a game leans heavier on one type of core than another, Nvidia blows them away, because the AMD card is sitting there with literally unusable sections of hardware.
But AMD does have a big market share when it comes to GPUs.
AMD CPUs have fallen behind in raw performance, but that hasn't stopped many mobile companies, and even Sony and Microsoft, from asking AMD to make a shitload of APUs for devices like the next-gen consoles and loads of laptops out there.
It's going to be increasingly hard for them to sabotage performance on Radeon parts, given that literally every console on the market now runs one. Anything not developed as a PC exclusive is going to be tuned aggressively for AMD APU.
I have a 280x currently, but I wanted to upgrade to a GTX970 in Jan/Feb as the only 'AMD' game coming out in the next few months is GTAV, and even that would perform better on a 970.
I asked about it in a nvidia thread and literally everyone told me that it was 100% worthless and a waste of money.
What do you think?
So.. 2016/17 releases should be leaps and bounds better than the current next gen crop? Any indication of what we can expect (view distance, map size etc)? I would like to know what we should be seeing in a couple years.
I just wish /pol/ wasn't dead at the hands of moot
I miss fat shaming fridays already
nvidia pays developers to purposely make their game play worse on amd cards.
It's dirty play but legal. Take the PhysX bullshit: even AMD processors and cards can run PhysX with some tweaking in games like Borderlands, but AMD won't officially make it work because of legal bullshit they don't want to get involved in.
nvidia just has grey-area practices. Not ethical, but effective at brainwashing gamers into believing "The Way It's Meant to be Played".
Architecturally it's going to be about systems involving tight synchronization of the CPU/GPU cores. You have a shit-ton more bandwidth into the GPU than you could get on a PCIe bus or something. Expect more dynamic set dressing. Megatexture-like approaches will finally not look like shit.
Not everybody wants to play only Dragon Age at good levels though. It's a shame that everybody around the internet recommends the 970 and not just /v/.
Are you from /g/, trying to pretend you know tech?
It's called the /v/ reality distortion field. This board is filled with shit-eating morons who don't keep up with anything. Every time a game-specific performance issue arises, a driver update which addresses it is quickly published.
AMD is about 30% of the discrete consumer GPU market.
They're up to 25% of the professional market, up from about 5% just a couple years ago.
Though when it comes to compute clusters AMD has about twice the market share of Nvidia.
Nvidia is dominating the consumer discrete GPU segment.
Thanks anon, that's good to know. I was scared we wouldn't get any new real tech this gen, just improvements.
>Megatexture-like approaches will finally not look like shit.
Carmack ahead of the curve again.
no one wants to play shitty call of duty and unoptimized ubisoft trash either you Nvidia shill
>Nvidia is dominating the consumer discrete GPU segment.
Too bad literally every console is AMD now.
Even if nVidia was 100% of PCs, it would still be a minority in gaming hardware.
Carmack was always ahead of the curve with MegaTexture, but he was too focused on the 'what if' to see the reality: everything else around it simply wasn't capable of making the megatexture system shine like it should.
Nobody really wants to download and install 1TB of textures, after all.
Because single-threaded performance matters in the real world. If it's synthetic benchmarks or we live in a fairytale world where programmers make every program multi-threaded, then AMD > Intel.
If only that shit were true.
Why does AMD shill on /v/ of all places? Seems like a waste of time when they could be spending that time improving their products to stay competitive. I guess poor people need GPUs too though.
Nvidia doesn't really deserve their position in the market though; whenever they so much as have a glimmer of a new piece of technology, they artificially lock it to their platform.
I mean yes, I'm running a 970 right now because AMD's drivers are still complete shit with multi monitor, but I don't like Nvidia as a company.
In every game the 290x beats the 970; the 980 wins, but just barely, with only a 4-6 FPS advantage.
But I guess to you that small FPS increase is surely worth the extra $300 for the GTX 980.
>I mean yes, I'm running a 970 right now because AMD's drivers are still complete shit with multi monitor, but I don't like Nvidia as a company.
I realize that, but look at what you're saying.
>Well this product is better but they're a bunch of shitheads because they out-compete
The solution isn't not to buy Nvidia, it's for AMD to go
>Oh shit, they have better stuff, we better work harder
If they don't they lose. Welcome to capitalist Earth.
guys i need a second opinion
an r9 290 sapphire here is a whopping 420 fucking dollars
and the gtx 970 zotac is 350 dollaroos
am i being rused by my country?
It seems like every game I have bought this year has performed like horse shit on my card when compared to nvidia.
With more and more games coming out with baked-in 'Gameworks' features, it feels like next year is going to be awful for AMD users. Killing Floor 2, Batman, and The Witcher 3 will all use baked-in Gameworks.
Nvidia cards seem to get a whole lot better support and drivers too, as well as a ton of features that I have to jury-rig to get working even moderately okay on my AMD card.
Everyone keeps saying it isn't worth it right now, but next year just seems like I am going to be forced to upgrade at some point to nvidia.
GTAV was rumored to support Mantle
The noticeable differences are worth it to so many people that it's constantly recommended over the 290x. Anonymous people on /v/ are the only ones I've seen recommending the 290x.
No offense but go show some creds and recommend it, AMD could use your help.
The thing about bad amd drivers is that... it's a false narrative that nvidya fanboys like to spread and circlejerk to. It's not true at all.
It's just misinformation that gets passed around and around like a cheap whore. People start thinking it's true without ever trying it for themselves.
I actually feel that amd has more stable drivers than nvidya. But saying that on /v/ just makes a shitload of nvidya fanboys start shitposting me.
Why even attempt to get a monopoly? Do you really want the government forcing you to split your company into multiple smaller companies?
The problem is that the 970 is even worse than the 290x at higher-than-full-HD resolutions.
It makes no sense to get one right now. Maybe for a short time at release, before AMD matched prices. But right now, just nah.
I fuckin trusted that Gaming Evolved program to 'optimize' BF4 for me and it made it fail to even launch.
You're just in luck.
>it's a false narrative
Speaking of that, the justification for buying shittier (but admittedly cheaper) AMD cards fits nicely into that description.
I have a 270x and I'm getting 60FPS in the majority of my games with everything on high or max settings at 1080p.
It's a great mid range card, if you wanted more you should have gotten a 280x or 290.
>puts settings on 8x MSAA and vsync
>why can't i run this on 60 FPS
I have that card and unless it's some unoptimized shitgame I can play it at near max @ 60 FPS 1080p
And with near max I mean everything on max except motion blur, vsync and AA tuned down to 4x or 2x.
>The thing about bad amd drivers is that... it's a false narrative
Not really. Even the Omega drivers released lately are bugged as fuck and produce black screens and shit. I can tell because I'm an AMD user, and I never had problems with drivers until I got these. Also, despite the higher framerate I get with my card, games always seem to run like shit.
GF100 specifically, not the whole "400 series"; the GTX 460 used GF104, which solved the heat problems.
>GF100 specifically, not the whole "400 series"; the GTX 460 used GF104, which solved the heat problems.
Yea well it happened.
You can say a lot of things about AMD, but they never burned down your house or killed themselves because lol no failsafe against heat
Nah, that was not a lie. The older ones did have issues where they would overheat and start smoking slightly. It doesn't happen with their newer cards, but people just haven't forgotten their older shit. Sadly the media never spoke about this much.
I'm sorry, but you don't seem to understand libertarianism.
No libertarian would give a shit if a monopoly establishes itself by providing a superior service to all the other competitors. If that company ever gets complacent about quality, then in a free market there are no real restrictions stopping a startup competitor from shaking things up.
The problem comes from government or law enforced monopolies, or the inverse, little companies using the government to attack the big successful companies on the basis of "unfairness". That is when the monopoly is created not on merit but on corruption, and that is when the consumer suffers. There is no downside to the consumer if a monopoly forms entirely through succeeding in the free market.
that's like my GTX 560ti. I can't justify $200/$300 just to turn up AA and put shadows on max.
AND IT'S A 500 SERIES FOR FUCKS SAKE!
I swear the last couple generations of cards have been a scam.
Also, AMD Gaming Evolved games always run really well, while Nvidia "The Way It's Meant to Be Played" games are almost exclusively shitty-performance games.
They just design a reference PCB and the chip. The rest is up to the different manufacturers.
>AMD is switching focus to APU market.
They aren't switching focus to anything from anything. They're still making enthusiast-level and high-end workstation GPUs. They have a brand new x86 core arch coming in 2016 that is going head to head with Intel's Skylake.
APU is just a marketing term. Intel's Core i series chips with HD graphics are functionally the exact same thing, the only difference being that intel has had a much better core arch.
I seem to recall Carmack himself mentioning that the original files were actually in the 1TB ballpark.
For obvious reasons that won't be feasible for a long long time, but I don't doubt you'd get a decent effect with a fraction of that.
I think that ultimately the biggest issue with its application was the fact that it was a blanket over the entire game rather than being used in certain areas. There was a lot of stuff you saw up close that had a disproportionate lack of detail to everything else as a result.
> it's a false narrative that nvidya fanboys like to spread
My HD5770 and HD7850 would disagree with you. They crashed quite a lot, and to run a third monitor the HD7850 needed to be underclocked by 5MHz at all times. Not to mention the cursor bug STILL EXISTS. It's a damn shame, because for the money they were pretty tidy pieces of kit, and I've still got them running in other systems in my house.
My main complaint is that it's stifling to have technology which will only run on one particular card; by limiting the userbase for a technique or feature, it can never be used in a meaningful way beyond simple fluff. Most of it is inoffensive because it literally is fluff, but there's also things like hardware PhysX which a lot of games could benefit from.
Do Nvidia GPUs work better with Intel CPUs? I've heard AMD CPUs are shit, so I want to upgrade. Back when I built my PC (in 2007) it was better to go AMD GPU and AMD CPU. All I want is the crisp Witcher experience (and to record my LoL games).
>there's also things like hardware PhysX which a lot of games could benefit from.
Hardware PhysX is a sham. Their software-mode code is compiled without any optimization passes and uses - no joke - 80-bit x87 FPU instructions for everything (while of course truncating every result to 32 bits).
> My HD5770 and HD7850 would disagree with you. They crashed quite a lot, and to run a third monitor the HD7850 needed to be underclocked by 5MHz at all times.
You're probably one of the very few who has issues like that.
Yes, and AMD literally had no drivers. Keyword: "had"
It doesn't stop AMD shills from denying it, though. Oh, and also keep on with le nvidia housefire meme when their own cards reach 95 °C under normal use while green cards are approaching arctic temperatures.
Lately, Intel CPUs have consistently had better performance than AMD CPUs, but Intel CPUs are generally more expensive.
Nvidia and AMD GPUs are neck and neck in terms of performance; getting one or the other largely depends on what games you intend to play, since each may have gimmicks or be optimized better in some games than the other.
>this talk about AMD and NVIDIA
>not realising masterrace matrox cards
Matrox makes better cards than AMD and nvidia
>Lately, Intel CPUs have consistently had better performance than AMD CPUs, but Intel CPUs are generally more expensive.
Intel makes the fastest high-end CPUs and the cheapest low-end CPUs. If you want value in most of the price range, AMD wins.
I was using a 9600GT for nearly 5 years and my life was a living hell with those shit drivers.
I moved on to a R9 270x and I would kill myself if I had to go nvidia again.
Of course, I'm always in a little doubt when I say this because I was also using the shitpile that was Vista while I had the 9600GT but I don't think that's an excuse.
>80-bit x87 FPU instructions for everything (while of course truncating every result to 32 bits)
>mfw i can understand what he's talking about
it's a good feel
It was fun growing up with all the crazy early 3D accelerators.
>ATI 3D RAGE
Sucks that we've just got our little Christmas-themed duopoly these days.
To ask that question means you're already not worth answering to.
You pick whatever card performs on the level you want for the cheapest price. Taking into consideration warranty and delivery times.
I would recommend AMD low and mid-tier cards over Nvidia but I don't really have experience with high tier cards.
I'm so happy I chose a 7870 over a 660 last year, and I'm even happier now that I can crossfire it with my R9 270x.
I'd tell you to look into the history of high performance SoCs, but you have no idea what any of these words mean anyway.
I hate this board. Truly.
>is to keep nVidia and Intel prices in check
Ha, that's hilarious. Maybe at one point in time by stealing Intel's processor name and confusing people into buying their product, but nowadays they have absolutely nothing on the giants that are Nvidia and Intel (I will admit that ATI before its current management was many times more competent; they had a chance). AMD is lapping up the budget/poor-casuals market with their CPUs and the budget-gamer market with their GPUs; they simply do not have the revenue to compete 1:1 with either of them.
The reason Nvidia and Intel are the giants they are isn't any sort of foul play; it's that they are just that good as companies. There are no barriers to entry into the market, yet you don't see companies trying to compete with them. It's because other companies simply can't keep up with the rate of innovation either company is pushing out. How are you supposed to build a fabrication plant, or come up with a process, so you can compete when by the time you finish it, it's already outdated compared to what they are releasing? "Good job, you built a plant that can construct 22nm chips, too bad Intel is already rolling out 14nm."
There is absolutely nothing wrong with a monopoly in the sense that it is the only company in the market. Competition doesn't need to physically exist, only the threat of it does, and as long as regulators stay away from the computer industry we will have innovation for decades to come.
>I'd tell you to look into the history of high performance SoCs, but you have no idea what any of these words mean anyway.
It's mostly because ATI was still independent. If AMD had owned ATI before the X360, it would have been a good deal for console makers.
>tfw 290 Tri-X for 260€ on Amazon
had to fight every fiber of my being to not order it
I'm waiting for the new cards and that's final
>The reason Nvidia and Intel are the giants they are isn't any sort of foul play
I can't count how many times Intel has had anti-trust suits against them in Europe and the US.
It's a set of FPU instructions, and only Intel maintains hardware support for them because they're not used by anything. AMD handles them through software.
Skyrim only used a little X87 code, and that was never the issue with its poor performance. The engine's terrible threading is the issue.
Microsoft purchased the design of the GPU from ATI. They own the Xenos GPU in its entirety. This is how they managed to produce the latter Vejle SoC in the 360 Slim.
>mfw i5 4690k @ 4.5ghz
>mfw 16gb of ram
>mfw gtx 970 @ 1500MHz/4000MHz
>mfw minimum 60fps in ubishitgames in fullhd and ultra details
feels good to live with parents, not having to spend my income on actual living.
The 8087 was the floating-point coprocessor for the 8086. The part names evolved similarly, ending with the Pentium when they just built in floating-point as standard on all parts.
The instructions to control the x87 are still supported, but are ridiculously slow on modern systems. They all work internally at 80 bits of precision (32-bit is standard, 64 is used in scientific code, 80 is unheard of). They have a strange register file organized as a stack. They only operate on one set of data per instruction.
Real modern code uses one of the SSE (Streaming SIMD Extensions) instruction sets. They control a modern floating-point unit, with a flat register file, 32- or 64-bit precision, standard rounding and under/overflow rules, and vectorized instructions for doing most of the relevant math multiple times faster.
SSE has been around for 15 years. Every 64-bit processor has at least SSE2. There's no reason to ever use x87 instructions unless you're trying to intentionally kneecap performance.
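To make that concrete, here's a minimal C sketch; it's mine, not from any of these posts, and the compiler flags mentioned in the comments are gcc-specific and just an illustration. Plain float math on x86-64 already compiles to scalar SSE ops (mulss and friends); you'd have to go out of your way with something like gcc -m32 -mfpmath=387 to get the old x87 stack machine, which computes at 80 bits internally and truncates on every 32-bit store, exactly the behavior described above.

#include <stdio.h>
#include <xmmintrin.h> /* SSE1 intrinsics, available on any x86-64 compiler */

int main(void) {
    float a = 1.0f / 3.0f, b = 3.0f;

    /* Plain C: on x86-64 this compiles to a scalar SSE multiply (mulss). */
    float plain = a * b;

    /* The same multiply spelled out with explicit SSE scalar intrinsics. */
    __m128 va = _mm_set_ss(a);
    __m128 vb = _mm_set_ss(b);
    float sse = _mm_cvtss_f32(_mm_mul_ss(va, vb));

    /* Under -m32 -mfpmath=387 the "plain" line would instead go through
     * the 80-bit x87 register stack and truncate to 32 bits on the store. */
    printf("plain: %.9f  sse: %.9f\n", plain, sse);
    return 0;
}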
Oh absolutely, but I remember a few early Nvidia revisions to it could be modded to run on ATI hardware, and then Nvidia later even blocked it off if the presence of an ATI card was detected in the system (which prevented me using my 8800GT as a physics processor for shits and giggles).
The whole post-AGEIA PhysX deal is nothing but bullshit all around.
The downclock thing, probably, but I know that the cursor bug is a common one. Also worth remembering is how using Flash hardware acceleration would cause the card to get stuck in 2D clocks because of another driver fault, though you could likely attribute half of that to '>flash'.
>Maybe at one point in time by stealing Intel's processor name
If you don't have the money for it then sure, go with the i5. But i7s give you maximum performance without going full retard for extreme processors. If you have a decent budget for a PC you should definitely just go with whatever the highest tier, non-extreme i7 is and never have to worry about it.
>Oh absolutely, but I remember a few early Nvidia revisions to it could be modded to run on ATI hardware, and then Nvidia later even blocked it off if the presence of an ATI card was detected in the system (which prevented me using my 8800GT as a physics processor for shits and giggles).
>The whole post-AGEIA PhysX deal is nothing but bullshit all around.
I will never forgive NVIDIA for the crap they pulled back then.
>There are no barriers to entry into the market
>tfw 290X for $207 on ebay, free shipping, no tax
>tfw it used to be a scrypt mining card
>tfw i don't give a shit
>tfw i've run countless hours of OCCT GPU data failure / artifact tests
>tfw i've been gaming for two solid weeks at 100% GPU load
>tfw card is fine
Nvidias graffix suite is just more useful for devs and they can get moar graffix easier on nvidia.
AMD needs to at least try; TressFX was a single step, and then they failed miserably even at that.
AMD has historically always been the "better value" option that's cheaper but worse. But these days, Nvidia is being even more sketchy with Gameworks etc. Intel did the same on the CPU side, except they actually got caught bribing and doing other illegal stuff.
AMD is worse, but it's still better for everyone if they exist. Competition is always better.
They didn't "close."
Most of them were absorbed by ATI or Nvidia.
Matrox is still operating independently, though they just produce niche cards using Nvidia or AMD dies
PowerVR is still alive and well with Imagination Tech
Qualcomm has their Adreno GPU IP
Vivante is cornering super low power segments
ARM has their Mali GPUs that are used by a dozen different chip makers
Samsung is creating their own GPU from scratch
intel has their HD graphics
There are at least a dozen independent firms out there still producing their own unique GPU IP, they're just focusing on embedded and mobile devices rather than desktops.
How is that even legal? You buy a cheap NVIDIA card which is advertised to support PhysX. They then go and disable that function since you have an AMD card installed as well. Didn't you already pay for PhysX?
They failed, dude. Any new tech sees heaps of startups fail until there's a clear market leader or two. The last example of this was when the bubble burst on internet stocks. Shit, even Henry Ford had competition in the car market initially.
I would buy an AMD GPU if they weren't such fucking power-hungry fucks.
Compare the GTX 750 Ti, 970 and 980 to their AMD equivalents in terms of performance, then look at their power requirements. Yeah, I'm not getting a new PSU just to feed an inefficient piece of shit.
I don't think anyone in the business really wants to provide the hardware for consoles. It's a lot of work for not much money, which is why AMD begrudgingly has to. I mean, they push 50 million units over the span of like 8 years, but the real money is in tablets, phones, notebooks, etc. Tablets and phones especially, where hardware is deprecated faster. A single phone can push over 15 million in a quarter. Everyone needs a phone and upgrades get re-bought yearly.
I think that's why Nvidia pushed so hard towards Tegra, which pretty much got blown the fuck out so hard they had to resort to suing Qualcomm and Samsung. They want mobile money, because that's where the money is actually at.
AMD has always been the cheaper brand. In the last few years they focused more on mobile processors and chips for the new consoles rather than paying devs for increased optimization on their hardware. It is known.
>AMD has historically always been the "better value" option that's cheaper but worse.
When it was ATI yeah the FX5800ultra totally beat the Radeon 9700pro hurrr and the P4 was Better than the Athlon XP
Ultimate combination was GeForce FX + Intel P4 amirite guise?
An anti-trust lawsuit for what? There has been no point where Intel has lobbied regulators except for possibly trademarks for their processor names, and that's hardly a problem at all.
I could set up my own fabrication plant and start selling processors to compete with Intel and no one would stop me; I would be very bad at it, but I could still do it. This is unlike the automotive industry, where you have to get permits and go through red tape just to get your car on the road, and then have to get a dealership to sell your car, since by law the manufacturer can't sell them themselves. Or ISPs, where local governments will not permit you to start your own ISP and force you, once again, through permits and red tape, while simultaneously letting companies like Comcast, who have lobbied the municipalities, get around that red tape. Why do you think Google Fiber is asking the cities themselves for special rights to go ahead and lay down their service? Because they want to avoid all that bullshit red tape. Those are real barriers to entry; the electronics industry has hardly any, the only ones being dealing with the FCC over the radio frequencies of the hardware, which is not much of a problem in the computer industry.
>I could set up my own fabrication plant and start selling processors to compete with Intel and no one would stop me,
Intel would, because they hold patents that are essential to make an x86-compatible part.
This. AMD was allowed to use x86 since there couldn't be a monopoly, and since AMD invented x86-64 they had to license it to Intel as well.
You can't just go ahead and start making x86 cpus lol.
AMD made AMD64, which is an extension to X86.
Intel adopted the instruction set after Itanium failed, and both companies basically had each other by the balls for the better part of a decade.
VIA got a license from Intel and had it extended almost indefinitely after the FTC slapped Intel's shit for their dirty dealings.
Now the issue isn't so much X86 licensing as it is no one wants to bother with X86 any more.
Horrible software support, and focusing on gimmicks without actually promoting/supporting them properly; see Hydravision, Crossfire, TressFX and Mantle.
On the other hand NVidia offers driver-based AO, hardware-based video capturing of gameplay, working drivers on release day for AAA titles and generally less issues with them.
I just got a new rig with a GTX 970.
Should I have gotten the R290x?
>working drivers on release day for AAA titles
and this is not even an AMD Evolved title
unless it's a 'The Way It's Meant To Be Played™' title, Nvidia has worse drivers than AMD
HBM does not significantly lower power consumption of a GPU. It draws less power than GDDR5, but this isn't a significant portion of a card's power draw regardless.
High performance 20nm bulk GPUs are not happening.
The R9 390X is going to be a large 28nm die. They'll push on perf/watt with better adaptive powertune, and maybe an SOI process if GloFo is the foundry.
Maxwell isn't magically any more efficient than Hawaii; Nvidia is just keeping power down with their memory subsystem and adaptive power delivery. In high shader-utilization workloads, Maxwell is barely any more efficient than Kepler, and performs nearly as poorly.
>An anti-trust lawsuit for what?
They were bribing companies like Dell, HP and other companies that sell fully built desktops to use only Intel processors and not AMD. They were caught doing it, too.
Like 10 years back, these desktop companies would only offer Intel desktops. European and US regulators slapped them with anti-trust suits. Intel lost and ended up paying AMD a shitload of cash.
do i need to change motherboards now?
Around 2005 to 2009, companies like HP, Dell, Gateway and more would only offer Intel processors with their desktops, because they were being bribed to buy only Intel processors and not AMD. AMD went to court over this in Europe and the US. It ended with Intel settling out of court in 2009 and paying AMD $1.25 billion. Now companies like Dell and HP do offer AMD processors.
>Why do AMD owners constantly get fucked over these days with PC games? We always get the short straw compared to Nvidia owners.
Only when it comes to games/publishers Nvidia has a contract with, and they're usually fucking disgusting dog shit like Ubisoft games that run badly even on Nvidia cards.
Otherwise games run great on AMD cards, Alien for example, and AMD cards have a way better price-to-performance ratio. AMD isn't trying to bring that childish console-warrior shit to PC, unlike Nvidia, and they open-source all their special shit.
So yeah, I think you have to be an idiot if you support Nvidia, the same kind of person that buys Beats headphones, Apple shit, a PS4, or AssCreed.
Well, Intel knew they would lose in court... so they just settled out of court and paid AMD $1.25 billion.
>oh wow, I better get hyped for new DRAM that has literally no tangible benefit in a desktop
>I should buy it just because its new!
The effects of memory scaling from 1600MHz to 3000MHz on Haswell are virtually nonexistent. Minor frame variations are all within margin of error. DDR4 does literally nothing for gaming. I will never understand why people here thought it was something to get excited about.
Cuz Nvidia moneyhats like crazy while ATI plays it cool and delivers driver upgrades that make games run better a few weeks later.
I don't want to know how much they paid Kojima so MGSV gets 5 more frames on Nvidia cards. Hell, an R9 290X gets fewer frames in that game than a fucking 670. Nvidia is disgusting as fuck.
>The effects of memory scaling from 1600MHz to 3000MHz on Haswell are virtually nonexistent. Minor frame variations are all within margin of error. DDR4 does literally nothing for gaming. I will never understand why people here thought it was something to get excited about.
AMD has no APUs that use DDR4.
Their future Zen based chips have on package HBM to feed the IGP. Again, DDR4 has no tangible benefit in a desktop.
DDR4 only has use in enterprise where the reduced power consumption and increased density add up across dozens of server racks.
AMD's focus is on brute force. They draw massive power loads and use thousands of simpler cores. As a result they dominate in terms of FLOPS (which is why bitcoin miners use them). They are absolutely excellent at simple tasks such as shading in pixels or basic AA. Consequently, they suffer with more complex calculations such as fancier AA methods or vector calculations. They move pixels on the screen very fast; they just don't do it intelligently.
Nvidia, on the other hand, is the exact opposite, focusing on maximum efficiency. They use much less power and a few thousand cores, but those cores are smart. They end up struggling on pixel shading and the like, but they breeze through complex calculations like no other (which is why PhysX exists: the cores are smart enough to do those physics calculations). Nvidia also has a team dedicated to optimizing their drivers for certain games, which is a huge plus.
It all depends on what settings you are going to run games at. If you run them on medium or low then AMD is going to dominate, but on high or ultra Nvidia will come out ahead (depending of course on the features the game uses and at what settings it uses them on).
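For anyone who wants that idea in code form, here's a rough, hypothetical C sketch of the kind of embarrassingly parallel, simple-math workload being described (the function and names are made up for illustration, not anything from a real driver or game). Every iteration is branch-free and independent, so a GPU can hand one element to each of thousands of simple cores; the moment you add heavy per-element logic, fewer-but-smarter cores start winning.

#include <stdio.h>
#include <stddef.h>

/* saxpy: y[i] = a * x[i] + y[i]. Pure multiply-add, no branches,
 * and no iteration depends on any other, so it scales with raw
 * core count and FLOPS rather than per-core smarts. */
static void saxpy(size_t n, float a, const float *x, float *y) {
    for (size_t i = 0; i < n; ++i)
        y[i] = a * x[i] + y[i];
}

int main(void) {
    float x[4] = {1, 2, 3, 4};
    float y[4] = {0, 0, 0, 0};
    saxpy(4, 2.0f, x, y);
    printf("%.1f %.1f %.1f %.1f\n", y[0], y[1], y[2], y[3]);
    return 0;
}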
I literally just met someone yesterday who bought a GTX 980 over an R9 290x just because "muh shadowplay" and "muh physx".
I seriously hope that wasn't you, /v/.