Going to be picking up a monitor soon, a bit unsure what I should be looking for. Obviously 1920x1080 and I would like a 120hz refresh rate, but I don't know what else I should be looking for. Price range is 200-300.
Hmm, ok. So is 120hz really worth it? Or should I go for the best picture available?
My apologies, dollars.
I suppose you can add "at least" before that 1920x1080.
>My apologies, dollars.
oh that narrows it down
/g/ is retarded tonight apparently.
OP, either get the BenQ XL2411Z or the Asus VG248QE. They're both TN panels, since IPS tends to have more input lag; you're trading slight picture quality (namely color accuracy, which isn't truly important unless you're working in print media, as I understand it) for the high refresh rate and 1ms response time. Both are excellent for muh gaymen. Or go with something like the Dell here:
This is the monitor I plan to buy. Not 1920x1080, but the panel is so nice.
The way I see the TN vs IPS debate is:
>How often are you going to be consuming content above 60hz?
Relatively rarely in anything other than indie games.
>How often will you be viewing colour on your monitor?
Pretty much all the time.
>Obviously 1920x1080 and I would like a 120hz refresh rate
>still buying 1080p monitors
If your GPU can play games at 120fps, then it can sure as hell play them at 2560x1440.
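To put rough numbers on that (back-of-the-envelope only; raw fill rate obviously isn't the only bottleneck, shader and bandwidth costs matter too):

# Raw pixel throughput comparison: 1080p@120 vs 1440p@60.
# Fill rate alone; ignores per-pixel shader cost, memory bandwidth, etc.
def pixels_per_second(width, height, fps):
    return width * height * fps

px_1080_120 = pixels_per_second(1920, 1080, 120)  # 248,832,000
px_1440_60 = pixels_per_second(2560, 1440, 60)    # 221,184,000
print(px_1080_120 / px_1440_60)                   # 1.125

# 1080p@120 pushes ~12% more pixels per second than 1440p@60,
# so a GPU that manages the former should handle the latter.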
You're kind of buying at a point in time that's somewhat of a grey area. We're getting close to the point (~2015/16) where 2560x1440 starts to become the standard. That said, 1080p is perfectly fine.
I'd recommend an AOC 2752 series monitor.
27" (which everyone here will hate, of course), 1920x1080, 2ms delay. Great for gaming, really. I couldn't do less than 27" again.
The reason people think they can discern these things is that back when everyone used CRTs, the difference between 60 and 70 fps was incredibly noticeable; up to ~67-77 Hz, the human eye can see the screen redrawing. With LCD panels, things aren't shown the same way. An LCD still updates line by line, but the previous drawing does not fade away, because unlike a CRT, where the electron guns fire, momentarily light a phosphor, and then move on, an LCD (or other flat-panel equivalent) _keeps_ the previous line displayed until it is told specifically to redraw that scanline. Nothing disappears the moment the electron gun has moved past the line. There is no electron gun, after all!
At 67-77 Hz (with the _EXTREMELY_ rare outlier) the human eye stops being able to see the flicker. Most people buying their flat panels didn't know this, but placed value on higher refresh rates because they assumed the panel would work the same way a CRT did in that respect. For non-sudden changes (like a jeep driving down a dirt road) the human eye can't detect much difference above around 30fps. But for sudden changes (flashes, that one Pokemon episode that caused all those seizures, etc.) the eye detects the change much faster, because that's how we evolved. In fact, most people's reaction time upon seeing something is around 1-3 whole seconds, so doubling the FPS does not really change much.
It's a placebo effect.
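If you want the frame-time arithmetic behind that (trivial, but it makes the scale concrete):

# Time between refreshes at common rates. Below the ~67-77 Hz
# flicker-fusion range a CRT's phosphor visibly pulses between redraws;
# an LCD holds the previous frame instead, so there's no flicker to fuse away.
for hz in (60, 70, 120, 144):
    print(f"{hz:>3} Hz -> a new refresh every {1000 / hz:.1f} ms")
# 60 Hz -> 16.7 ms, 70 Hz -> 14.3 ms, 120 Hz -> 8.3 ms, 144 Hz -> 6.9 ms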
Above 14 inches, a screen with the same dot pitch as a 14-inch monitor (like the one that fits on my desk) will display more pixels.
I chose 14 inches because that's the size of my computer's CRT.
I have an LCD and a CRT; I play movies and games on the CRT and I do work and shit on the LCD. Depends what you're looking for.
If you have a 27-inch monstrosity (like my TV in the basement) with the same dot pitch as my 14-inch CRT, you can fit an incredible number of pixels into that space.
Unfortunately, CRT TVs usually have a very coarse dot/stripe pitch. The finest CRT I can think of is the Sony Trinitron F520, which had a .22mm stripe pitch and a 20" viewable diagonal, which works out to just over 1800 stripes. I think the FW900 has more stripes, but only because it's larger (its pitch is actually coarser).
Aperture grille CRTs have stripes rather than dots; that's what makes them aperture grilles rather than shadow masks. So an aperture grille CRT like a Trinitron has a stripe pitch, not a dot pitch.
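That "just over 1800 stripes" figure is easy to sanity-check (quick sketch; assumes a 4:3 tube and that the quoted 20" is the viewable diagonal):

# Horizontal stripe count from viewable diagonal, aspect ratio, and stripe pitch.
MM_PER_INCH = 25.4

def stripe_count(diagonal_in, pitch_mm, aspect=(4, 3)):
    w, h = aspect
    width_in = diagonal_in * w / (w**2 + h**2) ** 0.5  # horizontal screen size
    return width_in * MM_PER_INCH / pitch_mm

print(round(stripe_count(20, 0.22)))  # ~1847 -- "just over 1800" checks out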
For someone who plays a lot of games, which is the better monitor?
>want a new monitor (have an AOC 29-inch 21:9 IPS that I regret buying because lots of vidya have issues with 21:9)
>decently priced 120-144hz monitors in 27''
>also 1080p and 27 inch don't go well together
>decently priced 2560x1440
>most are IPS with garbage refresh rate and that famous IPS glow
>check for 120+hz screens with 1440p
>overpriced as fuck
The only panel that fits my needs is that Asus Swift something that costs $800 because it has G-Sync.
Taken with a handheld camera, a magnifying glass, and my Trinitron 100SF. It does indeed use stripes, but it also uses empty scanlines to differentiate between pixels. At 800x600 it's noticeable with the naked eye and looks like playing a SNES.
I don't want to buy some fagleap or any other shit panel coming from those bastards.
Korean 1440p monitors were only popular because they were way cheaper than any similar monitor from a proper brand at the time.
No, not really. If I get really close I can see pixels, but that's to be said of any monitor.
I'm really impressed, myself. Looks great. The only issue, and I think it's driver-related, is that the screen acts like it's changing inputs for a brief second whenever I start full-screen 1080p video, be it YouTube or VLC.
Shit, even then, 60FPS is perfectly smooth. I'm an avid PC gamer and I'd rather have a nice, clear, detailed screen than over 60FPS. That said, I put enough money into my PC that I rarely get under 60FPS either.
You also get smoother motion for cursor movement, scrolling, etc.
Plus the ability to play 24 FPS, 30 FPS, and 60 FPS video without changing refresh rates. Most monitors aren't even capable of playing 24 FPS video properly because they don't support any refresh rate other than 60Hz, even though 48Hz and 24Hz are lower.
I really hope Nvidia and AMD step their shit up, because at the rate GPUs are evolving, we won't be able to get a decent framerate at 4K anytime soon.
That's why 1440p still sells better, and why the price difference isn't that much.
A 24fps video will look pretty much the same on a 60Hz and a 120Hz screen; the 120Hz screen just refreshes the same still image twice as many times as the 60Hz one.
So I'd rather take the one that looks nicest.
Your statement would be true if you replaced 24fps with 30fps, but 24fps video can't even be displayed properly on a 60Hz display, because 60 is not an even multiple of 24. The difference is quite noticeable under certain circumstances.
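For anyone who wants to see the mismatch, here's a minimal sketch of the frame repeat pattern (the uneven 3:2 cadence is the judder):

# How many refreshes each video frame is held for at a given refresh rate.
# 60/24 = 2.5 is not an integer, so frames alternate between 2 and 3
# refreshes (3:2 pulldown) and motion judders; 120/24 = 5 exactly, so
# every frame is held for the same amount of time.
def repeat_pattern(refresh_hz, video_fps, n_frames=8):
    pattern, shown = [], 0
    for i in range(1, n_frames + 1):
        target = int(i * refresh_hz / video_fps)  # refreshes elapsed by frame i
        pattern.append(target - shown)
        shown = target
    return pattern

print(repeat_pattern(60, 24))   # [2, 3, 2, 3, 2, 3, 2, 3] -> uneven hold times
print(repeat_pattern(120, 24))  # [5, 5, 5, 5, 5, 5, 5, 5] -> perfectly even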