>/v/ overwhelmingly prefers nvidia cards
>/g/ overwhelmingly prefers AMD cards
explain this shit
freedumbs mainly I think.
Nvidia is pretty much Linux's archenemy, whereas AMD is much more cooperative.
That said, for some reason nvidia cards get 10-bit output on linux but amd cards don't, so I'm thinking about switching for my next purchase.
/v/ is about gaming.
Nvidia cards are better with gaming.
/g/ is about budget computing.
AMD cards are cheaper "graphics cards" for dick wheels who can't afford intel chips with onboard graphics.
Yes, I know this reasoning only holds water if you assume people who are on /g/ aren't on /v/.
Because if they were, they'd support Nvidia like the rest of /v/.
>/g/ knows how computers work
>/v/ doesn't know how computers work
Hence the people who don't know how computers work buy nvidia, because they don't understand how they're getting fucked out of their money for inferior hardware
>mfw the fastest card for 4k gaming is the 295X2 and it's half the price of the Titan Z
oh but congratulations on spending more money than me, that just makes you a retard fanboy
>Nvidia cards cant render video (shitty drivers LOL)
>Nvidia cards cant run OpenGL
>Nvidia cards cant run on linux (SHITTY DRIVERS AGAIN!)
>Nvidia cards cant solve complex mathematical algorithms to mine a crypto-currency
literally the only thing an nvidia card can do is run a DirectX game and nothing else, hence why the luddite faggots on /v/ like them so much
if you are using a GPU for ANYTHING other than DirectX games you are using an AMD GPU every......single....fucking.....time
I giggled. I'd worry more about the average user. The rest of your machine has to be equally impressive for that card to even matter, and that's not what /g/ is about. If the faggots over at /v/ were actually worried about the highest quality, they'd do their research.
>/g/ prefers AMD cards
I've never read more potent literary bullshit in my life
It's true that /v/ shills nvidia hard though; the entire board has Mr. Huang's dick so far up its ass it's affecting their brain
well of course they do. those little retards think they are l33t haxorz, and since they think they are, they come to this board for that shit. we are a magnet for computer-illiterate retards, and those retards love them some green nvidia cock up their tender assholes
>mfw the 290x is half the price of the 980
>yet gets higher fps at any higher resolution
the real question is, why do people overpay for AMD? Are the shills on /v/ really that effective?
I've always preferred Nvidia, but thanks to AMD I made over 20k from mining in a single year so I honestly can't bring myself to hate them.
If I had bought a 5870 instead of a 470 when I built my first rig I probably would've been a bitcoin millionaire by now.
Meant to type overpay for nvidia
whoops, the shills have got to my brain :^)
I've been an AMD user for years, but I switched to NVIDIA because I prefer having a usable, stable driver that works with the latest X.org release as well as the latest kernel version
I might switch back if AMD's open source strategy shows some great success, but so far I'm happy with my NVIDIA GPU
I know that feel man
the fucking ebin face cracks me up too
This is more of a "price range" question than an "I have a huge-ass monitor and want the best resolution" question.
Even the older GTX 600s are capable of 4k output.
I read somewhere about a monitor that uses two HDMI inputs to reach 4k (the panel driven as two tiled halves stitched together), but I'm not sure if that was just snake oil, or a specific card configuration, or what. I was half asleep when I read it.
I'm running a HD 7950 DC2T right now to give some perspective. Honestly, it does most of the job fine, maybe something about 30% more powerful would satisfy me.
The main issue is the lack of 10-bit output support. It seems like you either need an Nvidia GeForce card on Linux, or else a workstation card (AMD FirePro / Nvidia Quadro), and those *literally* cost 10x as much as comparable Radeon/GeForce cards.
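For what it's worth, on the proprietary Nvidia driver 10-bit ("deep color") output is normally enabled by running the X server at depth 30. A minimal xorg.conf sketch; the Identifier/Device names here are placeholders for whatever your config already uses:

```
Section "Screen"
    Identifier   "Screen0"
    Device       "Device0"
    DefaultDepth 30    # 10 bits per channel; 24 is the usual 8 bpc
EndSection
```

Whether applications actually render at 10 bpc is a separate problem; a lot of toolkits still assume 8.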
That doesn't change anything, you stupid piece of shit.
GFX driver development is basically compiler development nowadays and AMD has proven enough times that they can't do compilers.
/g/ understands technology. /v/ is a bunch of kids who pay for proprietary software
Well, you say that. But "for 4k gaming" would imply that you spent a crazy amount of money on getting a decent 4k setup. And if you didn't, then I don't see the point of the argument, as I play on 2k
/v/ is a bunch of underinformed gamers easily swayed by Nvidia's propaganda. /g/ knows that, for cards in the same generation (7000 series vs 600 series, R9 series vs 700 series, etc.) AMD usually wins on anything that's not 720p with 4096x FXAA. However, I will admit that for someone buying a card right now the 900 series is hands down the best option, as they're cheaper than the R9 series yet perform equal to or better than R9 cards while using less power.
>buy a NoVIDIA card years ago instead of an AMD one like I always do
>stop using winshitter since I'm no longer retarded
>completely useless, can't even pass it through to virtual machines on my VT-d enabled system
I made the mistake once, don't do it. AMD > shitvidia
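For anyone checking whether passthrough is even feasible on their box, a minimal sketch (mine, not from the thread) that lists PCI devices by IOMMU group via sysfs; for clean VFIO passthrough the GPU and its HDMI audio function should sit in a group of their own. It prints nothing unless VT-d/IOMMU is enabled in firmware and on the kernel command line:

```python
# List PCI devices grouped by IOMMU group. Standard sysfs paths; the
# output (if any) depends entirely on your hardware and kernel config.
from pathlib import Path

groups = Path("/sys/kernel/iommu_groups")
for dev in sorted(groups.glob("*/devices/*")):
    # path looks like .../iommu_groups/<group>/devices/<pci-address>
    print(f"IOMMU group {dev.parts[-3]}: {dev.name}")
```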
You don't buy hardware, you buy game optimisation. A friend has a better AMD card than my Nvidia one, but still gets lower fps and lag in the same game with the same settings. If you are not gaming, buy AMD. Cheaper, and freedom.
I prefer whatever card is best for the price at the time I am buying and will serve my needs, maintaining cross-platform compatibility.
The last two times that's been Nvidia.
If AMD has better offerings when I buy another card, I'll go with them, but unlike seemingly everyone else I don't have pointless brand loyalty... The AMD driver situation in Linux has been a major turn off for me however.
Regarding graphics accelerators for PCs, ATI mostly cooperates with the free software movement, while nVidia is totally hostile. ATI has released free drivers.
However, the ATI drivers use nonfree microcode blobs, whereas most of nVidia's products (excepting the most recent ones) work ok with Nouveau, which is entirely free and has no blobs.
AMD might be more cooperative, but I care more about performance than software freedom, and Nvidia's proprietary drivers offer that far beyond either the open or proprietary offerings from AMD.
If I could have performance AND freedom that'd be nice, but... AMD seems like it's getting better, maybe in a few years.
It's true. Nvidia works poorly on Linux. See http://www.phoronix.com/scan.php?page=article&item=amd_nvintel_316&num=1 for benchmarks.
(No, no one cares about the KDE skinned Windows box you get by installing non-free drivers)
It's simply marketing and heavy shilling on gaymen forums, and maybe on /v/ itself. I still remember when the 7970 was released and people were shitting on it because of the price. Four months later nvidia released one of the shittiest cards made in years, the GTX 680, which cost more than the 7970 and performed about 10% worse than the older AMD card. Surprisingly the card sold well, and there were even people who traded their 7970 for a shittier card with a factory overclock, promoted as "uber ultra nvidia turbo boost". When things like that happen I think nvidia fanboys are as bad as applefags, maybe even dumber.
nvidia shills anger me
>7970 comes out
>wipes the floor with nvidias current offerings
>nvidia shills spam nothing but "TOO MUCH MONEY MUH COST TO PERFORMANCE RATIO"
>skip to now
>AMD cards cheaper than nvidia while offering better price/performance
>nvidia shills spam nothing but "LOL POORFAG"
this shit makes me mad yo
but of course when you bring up that the best graphics card is a 295X2 they'll tell you it doesn't matter because "nobody games in 4k or 2k"
Except that nVidia still has a pretty good price/performance ratio, first with the 660 and now with the 970, which also has pretty much the best performance-per-watt of any high-end graphics card, bar none.
I don't mean overheating, I mean high temperatures, which means unnecessary heat being dumped into your case unless you have water cooling. Even so, that leaves your options as: (1) spend more on additional cooling, or (2) make other parts do additional work to cope.
Not saying the card is necessarily bad, but the symptoms of using it are definitely unfortunate.
/g/ can't even afford to buy an operating system, which is why they use $20 decade-old computers that can't run anything but a free version of linux. of course they will go with the cheaper option with tons of software bugs, so they can brag about how much time they spent 'fixing' it.
/v/ is full of children using laptops with shitty intel integrated graphics, playing mobas and praising le gaben while thinking they are part of 'le master race'. if they actually had mommy and daddy buy them a computer, they watched 50 youtube videos instead of making their own decision, and went with nvidia.
Dear god, somebody on /g/ with a brain.
Why fanboy corporations? Buy whatever is best your for your needs at the current time. It's up to you to decide which card is best for you.
Ah yes, the latest Nvidiot marketing spiel to justify why the 970 costs 50% more than a 290 for the same performance.
>muh $5 a year extra leccy bill
Are you a fucking idiot? A water-cooled card and an air-cooled card put out the same amount of thermal energy, even if the water-cooled card is at 60C and the air-cooled card at 80C. Temperature just reflects how efficiently that thermal energy is being carried away.
If you think another 40 watts of thermal energy in your case will kill you, better not stick in that hard drive RAID, those things put out 12 watts each!
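To put a number on that extra power draw, a back-of-envelope sketch; the usage pattern and electricity price here are my assumptions, not anything from the thread:

```python
# Rough yearly cost of an extra 40 W of GPU power draw.
extra_watts = 40
hours_per_day = 4        # assumed daily gaming time
price_per_kwh = 0.12     # assumed USD per kWh

kwh_per_year = extra_watts / 1000 * hours_per_day * 365
cost_per_year = kwh_per_year * price_per_kwh
print(f"{kwh_per_year:.1f} kWh/yr, ${cost_per_year:.2f}/yr")  # 58.4 kWh/yr, $7.01/yr
```

So the "muh $5 a year" jab is about right for gaming-only use; run the card at load 24/7 and it's closer to $42/yr under the same price assumption.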
>implying they have even that
And the work that puts a recent nvidia card at 60 is the same work that would put a recent amd card at 90. And no matter what bullshit you try to spew, when given the option between more or less heat, and more or less energy consumption, it is better to pick the cooler, more efficient option.
The 970 is the best value on the market right now and by a fucking landslide. Especially if you go two-way SLI.
It's up there with the 9800GT and the 560Ti in the Nvidia hall of fame
>and the 290x is the best videocard for 4k gaming
>Nvidia cards cant run OpenGL
I remember a few years ago nVidia's OpenGL performance absolutely shit on AMD's, like AMD would have a slight DirectX advantage but nVidia would rape them in OpenGL every time.
Did this trend reverse? When did that happen?
Ok, so I shitposted in 2 threads now because I need an answer badly, and it looks like I went to all the wrong threads, so I will ask here. I need an honest opinion. I have 2 graphics cards to choose from: a Sapphire R7 250 1GB GDDR5 Boost Lite, and an Asus GT 740 OC 1GB GDDR5. I have read some reviews and to be honest the difference is minimal (at least to my knowledge, and from what I saw it's around 3%). My question is: which card would perform better in graphics design and in newish games? No need to post other cards etc., I just need an opinion between these two. Thanks in advance for answers
>goes out of their way to laser out the chip so they mass fab the same piece but sell you a gimped version for more
>drivers that burn the cards
>proprietary shit like mlaa that runs like crap, with the sole purpose of putting it into their engines so incompetent devs use it. they don't care that it runs like shit even on their own cards, as long as it runs even worse on amd
>more proprietary shit like g-sync rather than using the existing standard freesync
>shills everywhere, like that faggot durante on neofag
>refuse to implement mantle because their cards are shit without their proprietary software and they don't want to compete directly with amd on an even ground
>still supporting nvidia after knowing about all this shit
shiggy diggy. the only thing they have over amd is the linux drivers, and even then, amd's proprietary drivers are still usable even with dual monitors and displayport daisychaining -- i know, because i use it