Does /g/ like 16:10? Also general monitor thread.
Yes, but /g/ is so full of shilling and plebs wanting to feel good about their purchases that preferring 16:10 is now largely seen as an elitist position.
> 16:10 > 16:9 > 21:9 = 4:3, although 1200+2560+1200x1600 PLP with 2x 4:3s is god tier of last generation
People generally want more horizontal space, but they're ashamed they have to chop vertical space off to be able to afford larger monitors.
I have used many different monitors, and have found that height increases on small monitors are very useful. As the height starts to break beyond around ~16" though, it starts being difficult to read text without increasing font size because of physical constraints on how close you can comfortably sit. At that maximum height, I then find more width useful for displaying more documents at once.
So, I would prefer a 16:10 at the cheapest and smallest screen, 16:9 as it gets larger, and ultrawide aspects like 21:9 if you can afford buying a screen that maintains a comfortable minimum height.
Tl;dr find a screen height that makes text documents feel nice and buy as wide of a screen as you can afford. (assuming you're a programmer obviously)
I've fallen for 21:9 and I don't think I can go back now. Past the point of no return.
Although when 4K becomes easier to drive and makes more sense in general, I may switch but even then I dunno. Having this much screen real estate without multiple monitors is an amazing thing.
That golden ratio stuff sounds pretty neat, but the high-res 16:10 monitors are way too expensive.
A 2560x1600 screen will cost you at least $1000; you can get a 4K model for less.
I'm in the market for an external monitor for my rMBP (which is 16:10), which obviously also has to be 16:10 in order for my wallpapers to match. However, I can't seem to find any cheap 1920x1200 screens around 21-23''.
I don't think I'll be giving up my FG2421 for a long time. It'll suck missing out on 4K but at least there's supersampling.
I want to pick this up when it hits $300 again. What do you think?
I use a 32 inch Samsung television as a monitor
LE32C530 from the 2010 model year
come at me
Fuck I love my pb278q
Only complaint is a single stuck pixel the day I started using it.
then you're retarded for buying a monitor that is gimped for everything but what you do. these super accurate color monitors are for nothing but editing. that's the only area they excel. they suck for everything else.
>these super accurate color monitors are for nothing but editing
This is complete bullshit. Color accuracy is important for virtually anything, although gaming benefits the least from it.
not him, but by "super accurate color monitor" do you just mean not completely washed out TN garbage?
pb278q is not overkill at all, and it's not "super accurate color" it's just a standard 1440p PLS
So you want your monitor to be.. bright? Have you considered swapping out the backlight for a car headlamp or cinema projector bulb or something?
I assure you, they will be quite bright and “striking” in the most literal of senses.
>Color accuracy is important for virtually anything
wrong, absolutely wrong. the only people who need full adobe RGB coverage are editing professionals. for anyone else, it's just another means of showing off on the internet. spend your money on something that will better benefit you. for example, if you game get something with more muh hz and less muh response time.
>bright and striking
>attracting attention by reason of being unusual, extreme, or prominent
Jesus, I'll even do color for you.
Is the visual perceptual property corresponding in humans to the categories called red, blue, yellow, and others. Color derives from the spectrum of light (distribution of light power versus wavelength) interacting in the eye with the spectral sensitivities of the light receptors. Color categories and physical specifications of color are also associated with objects or materials based on their physical properties such as light absorption, reflection, or emission spectra. By defining a color space, colors can be identified numerically by their coordinates.
Can you put two and two together now?
Am I being fucking trolled? Do you not know the difference between gamut and accuracy? Where the fuck did I even mention AdobeRGB?
Shit monitors will obscure all sorts of details and make simple tasks like browsing a huge pain. Even if I simply turn off the calibration on my monitor and set it back to its default gamma tables, I fucking cringe every time.
The accuracy is so far off from what a true response should look like that it messes with pretty much everything. It's like using shitty headphones that amplify and distort the bass.
Would you argue that Skullcandy and Beats are fine as long as you don't do any mastering on them?
I'm using two IPS panels, get the fuck off my case.
So you want your monitor to attract attention by reason of being unusually, extremely and prominently bright? You want the numerical stimulus values of your monitor's output to be unreasonably high?
Both of these are achieved by turning up the backlight intensity. So in other words: You want a bright monitor. Good job!
>spend your money on something that will better benefit you. for example, if you game get something with more muh hz and less muh response time.
I wanted a 23+ inch 2560x1440 monitor that wasn't a tn. As I stated, refresh rate isn't the biggest deal in the world to me, especially considering I don't play shit like cawadoody. I had no interest in buying the swift, especially after seeing that it was a tn panel, and all of the qc issues it seems to have.
I fail to see how I'm showing off. As far as monitors go, I could've spent a lot more than what I did.
>I don't know how adjectives work
NOW WAIT JUST ONE FUCKING MOMENT.
He said, and I quote, “vibrant colors”. He did not say “better”. He said “vibrant”.
I am, quite specifically, trying to figure out just *what the fuck* exactly he thinks a “vibrant” color is supposed to be.
>I am, quite specifically, trying to figure out just *what the fuck* exactly he thinks a “vibrant” color is supposed to be.
A color better than found on a typical tn panel, a color that is of better quality than others.
How dense are you?
1. Take a picture
2. Saturate it to hell
3. Put it beside the original picture
4. Note the obvious differences
Wow that was hard. Not even mentioning anti-glare coatings muddying up your picture even further.
Wait, so a “vibrant” color is a “better” color? Actually, how is a color even “better” than another?
Is “green” better than “blue”?
So by “vibrant” he means “oversaturated”? If that's what he was trying to say, why didn't he use that term in the first place?
>I like oversaturated colors
Because it would make him sound like a fucking retard.
you don't get what im saying. i'll explain it again. even the poor colors of a tn will suit 99% of users fine (yourself included) because you are not a professional.
get the new asus that they showed off at ces. 120hz 5ms qhd ips. $600. don't be retarded like the other poster i'm talking about and buy some sort of super accurate color monitor like a dell ultra $1000+ with shit hz and response time and use it to fucking game.
>What the fuck does this even mean?
Vibrance is similar, and related to saturation.
More specifically: how saturated the non-saturated pixels are compared to the saturated pixels.
>Wait, so a “vibrant” color is a “better” color? Actually, how is a color even “better” than another?
It's usually more pleasing to the eye.
>So by “vibrant” he means “oversaturated”?
In fact when you have vibrant colors you can reduce the overall saturation and still have a pleasing picture.
Oversaturating is what people tend to do who can't get vibrant colors.
They are basically raping the color because they can't make them nice.
Yes, 30" 2560x1600 was the best available monitor for the better part of a decade, while 2560x1440 is pleb and the new 2560x1080 "ultrawide" is trash.
I weep that we won't be seeing 4096x2560 or even 3840x2400 monitors to any real extent, though I will be OK with mere UHD if it ever comes in 40"-45" IPS.
>TFW playing elite dangerous in surround 3x1080p
Truly a glorious feeling. Any more pixels and it'd compromise fps too much, but I easily get 60fps+ on ultra settings and it's an amazing feeling like you're actually surrounded by the cockpit.
>It's usually more pleasing to the eye.
But that's subjective.
Video displays should be objective. There's no way you could possibly argue against that statement.
Whoa, that's a weird definition. Anyway, assuming that all displays want to be able to reproduce a gray tone, we have a fixed lower bound for saturation.
So if the definition is inverted, wouldn't that mean that a high vibrance = lower difference between low saturation (gray tone) and high saturation (spectral color), and thus mean “lower upper bound on saturation”?
Based on this definition, it would seem like a more vibrant display device therefore has a *smaller* gamut than a less vibrant one.
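If you take that inverted definition literally — boost the muted pixels, leave the saturated ones alone — a vibrance adjustment looks roughly like this sketch. The `s * (1 + amount * (1 - s))` weighting is my own illustrative choice, not any standard formula:

```python
import colorsys

def add_vibrance(r, g, b, amount=0.5):
    """Boost muted colors more than already-saturated ones.
    Channels are floats in 0..1; the weighting is illustrative."""
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    s = min(1.0, s * (1 + amount * (1 - s)))  # grays stay gray, s=1 stays put
    return colorsys.hsv_to_rgb(h, s, v)

print(add_vibrance(0.5, 0.5, 0.5))  # pure gray is untouched: (0.5, 0.5, 0.5)
print(add_vibrance(1.0, 0.0, 0.0))  # fully saturated red is untouched too
```

Note that blanket oversaturation, by contrast, would push everything — including skin tones and grays — toward the gamut edge, which is exactly the "raping the color" effect described above.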
>buying anything lower than 4k
The UP2414Q is only $700 cucks
The UP2414Q is also on the upper end of the spectrum for 24" 2160p monitors.
As discussed in this very thread, the entry level is much lower. Consider whether you truly need AdobeRGB coverage, 8+2 bit AFRC and BG-r backlighting.
>Video displays should be objective.
And what is "objective"?
Should a picture taken under tungsten light appear yellow when displayed in your home? - or should we mimic the human eye's ability to adjust for color temperatures?
Should shadows be as dark as they are in real life or lifted like the human eye sees them?
I like my 16:10, and am likely to continue to try and buy them, when I eventually change these. I've seen newer ones with smaller bezels, but I'm in no hurry to change.
more screenspace for no additional desksize.
vertical screenspace is handy for many websites, as well as any coding. Most layouts are still up/down layouts rather than left/right.
>And what is "objective"?
Depends slightly on the specification you're implementing, eg. ITU-R Recommendations BT.1886, BT.2020 and BT.2035, EBU Tech 3320, or equivalent SMPTE standards.
Alternative definitions exist, for example sRGB or AdobeRGB. Professional monitors will therefore generally implement multiple of these modes as “color profiles”.
>Should a picture taken under tungsten light appear yellow when displayed in your home?
Chromatic adaptation is not the job of a fucking display device, it's the job of your CMM. If you want to adapt the white points in a particular photo, fucking go ahead. That's completely outside the scope of this discussion, which is about display accuracy. Display accuracy has a technical interpretation - in that the API for a display is a 3-tuple of numerical values ranging from 0 to 2^n (where n is the bit depth of the monitor, typically 8).
We are concerning ourself primarily with the mapping of these triples to the (physical) tristimulus characteristics of the resulting color on the display, as measured with a colorimeter or similar device. There are monitors which are more or less accurate in their ability to achieve a native reproduction.
(Note that this function is also dependent on the viewing angle, so when you're talking about the “color accuracy” of a display, you have to remember that we're implicitly including “over all typical viewing angles”)
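To make the measurement side concrete, here's a minimal sketch: accuracy error as the distance between the tristimulus values a triple *should* produce and what the colorimeter actually reads. Real metrics (CIE delta-E) compute this in a perceptual space like CIELAB; plain Euclidean distance in XYZ is only a stand-in for the idea, and the readings below are hypothetical:

```python
import math

def reproduction_error(target, measured):
    """Crude accuracy metric: Euclidean distance between the tristimulus
    values a display *should* produce for a given input triple and what
    a colorimeter actually measures. CIE delta-E does this properly in a
    perceptual space; this is only a sketch of the concept."""
    return math.dist(target, measured)

# hypothetical XYZ readings for the 8-bit input triple (255, 0, 0)
print(reproduction_error((41.2, 21.3, 1.9), (40.8, 21.5, 2.3)))  # ≈ 0.6
```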
>Should shadows be as dark as they are in real life or lifted like the human eye sees them?
An ideal display device would map them to be as dark as they are in real life. Of course, since this is not practical in the real world, specifications typically have a certain defined tolerance for black level brightness - but these are still seen as deficits (errors) of the display device.
That's pretty standard, but the DPI (~100) is on the lower end of what I consider acceptable.
Personally, for a desktop monitor I would advise targeting something with ~150 dpi, perhaps even higher (eg. if you want more clarity in small CJK fonts)
For reference, 150 dpi ≈
- 1920x1200 at 15"
- 2560x1600 at 20"
- 3840x2400 at 30"
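Those reference points are just Pythagoras — for anyone who wants to check the arithmetic:

```python
import math

def dpi(width_px, height_px, diagonal_in):
    """Pixel density: diagonal in pixels divided by diagonal in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# all three quoted pairs land at ~151, i.e. the ~150 dpi target above
print(round(dpi(1920, 1200, 15)))
print(round(dpi(2560, 1600, 20)))
print(round(dpi(3840, 2400, 30)))
```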
Scaling, son. In general, the way elements are scaled on a 4k screen gives you the same space as you'd have on a 1600x900 screen. It will look a lot better, but the workspace isn't as full as it could be.
You could use it without scaling, but your eyes would hate you.
if you get 4k at 24" you can basically have 1920x1080 in "retina" form.
Whether your operating system will scale stuff properly is a question. If anyone has really good experiences with any Linux distro or Windows 10, I'd like to see screenshots if possible. I have a hackintosh setup running on a fairly powerful desktop purely because it's the only mainstream OS I've found with really good high density display support. Mostly that's because they've been pushing that for nearly 3 years now, but the point remains that Linux distros and Windows generally still seem to be behind.
that's 'retina' at about 22 inches. I sit about 28ish inches away (my monitors are not 'retina' at that distance, and in any case my vision is marginally better than 20:20)
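The "retina" distances people quote fall out of the common 1-arcminute-per-pixel rule of thumb for 20/20 vision — sharper eyes push the distance out, which is why the quoted numbers vary. A sketch under that assumption:

```python
import math

def ppi(w, h, diag_in):
    return math.hypot(w, h) / diag_in

def retina_distance_in(density_ppi):
    """Distance (inches) at which one pixel subtends 1 arcminute.
    The 1-arcminute criterion is a rule of thumb, not a hard limit."""
    return (1 / density_ppi) / math.tan(math.radians(1 / 60))

# e.g. a 24" UHD panel (~184 ppi) under this criterion
print(retina_distance_in(ppi(3840, 2160, 24)))  # ≈ 18.7 inches
```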
CJK fonts are a reasonable reason tbh.
At the same size, perhaps. Depends on the size involved, also. The two factors that I feel work against this are:
1. If you're moving to 4K, it would make most sense to move to a larger screen diagonal, eg. 30" - thus giving you more work space.
2. Due to the higher DPI, you can use smaller fonts than you might otherwise be comfortable with, and thus get more effective working area that way.
Personally, I was moving from a 24" 1080p-ish monitor to a 31" 4K monitor, which is a huge increase in effective working area.
>If anyone has really good experiences with any Linux distro or Windows 10, I'd like to see screenshots if possible.
Sure, have an example
they're 'retina' at ~36" and I sit about 28" away (and in any case, my vision is marginally better than 20:20)
They're perfectly fine for what I use them for (non-photo/film editing). Sure, I'd like higher res, but there wasn't a reasonable option for that at the price I was willing to pay at the time.
affordable 120hz 4K monitor when?
Anything not terminal-specific? I do a lot of stuff in the terminal but I'd also like to use these monitors as extensions to my laptop, meaning I'd like to use a web browser and all that other shit pretty fluidly.
They're pretty good monitors. I wouldn't worry about maxing ppi, but then that depends on what you're doing with them.
There is the newer U2415, which is also 16:10. I couldn't tell you anything about the monitor price/value market at the moment as I haven't been looking, but I'm aware of their existence, and you might want to look into them, depending on price.
Yes, we do like 16:10. I use 16:10.
I hope I can get a 1:1 monitor for less than $2000 sometime, though. It'd be nice to have as much height as width.
For now, I can drive my CRT from 1995 at 2560x1920i at 90hz, which is pretty nice.
(interlaced video modes are awesome!)
What would you (the ones familiar with them) recommend between these two dell monitors, u2412m vs u2414h?
I can get the 12m for 60% of the new one here with 2 more years of warranty, or the 14h for around 10% more than the 12m with a year of warranty.
Have an old 1680x1050 tn Philips
And here's an example of a Gtk+ application
Gtk and Qt etc. are obviously much worse at scaling than text-only programs, which have the benefit of simply needing to increase the font size to compensate. (In fact, stuff like Xft seems to automatically detect my monitor DPI)
Can't show you an example of a web browser, since I run my web browser on a dedicated monitor in portrait mode, which has a much lower DPI.
The problem with scaling anything that isn't text is that you have to use vector graphics or upscaling for everything that's an image. For stuff like Qt or Gtk+, using vector graphics is reasonable - but on the internet, it's not like we can make the thumbnails on 4chan be any less bitmapped than they currently are.
The logical consequence is that you need image upscaling, and that means almost everything under the sun will invariably implement something terrible like bilinear scaling.
I do, which is why I have everything forced to Terminus
> being this new at typography
Terminus is not a good looking font. It is a clear font for certain use-cases where monospace is desirable (and there are other, far better fonts for that), and that's all. You're using it all over the place and it looks like shit.
> being this new at typography
I don't really give a shit about your hobby.
>Terminus is not a good looking font.
I also don't really give a shit about your subjective opinion.
>It is a clear font for certain use-cases where monospace is desirable
It's the best pixel-mapped monospace font I have found that works nicely all the way down to very low sizes.
>(and there are other, far better fonts for that)
You say this yet you don't name any examples
>You're using it all over the place
Yes, I am aware.
>and it looks like shit.
I don't really give a shit about your subjective opinion.
You seem to care a lot given how defensive you are. Typography isn't my hobby, giving spergs constructive criticism is my hobby.
Terminus is objectively bad looking. Its value is utilitarian. You are using a monkey wrench as a soup spoon and that is why it looks bad.
I didn't think there was any real difference until I picked up a dell workstation. Holy fuck it's amazing. It completely changed the way I work on things: I went from always having my main window fullscreen and information/various things on a second off-to-the-side TV, to almost entirely running everything side by side on the main display and reserving the TV for like, a movie or chat windows. It's awesome having reference material side by side with whatever I'm working on and not feeling near as much like I've squished everything in.
Thanks for the input. I do quite like the colour reproduction of the 12m, and the 14h should be just a bit better everywhere except screen real estate, but the thing that's mildly annoying me is the grain due to heavy coating on the 12m. Supposedly the 14h has a less aggressive coating, but my friend says it stops being an issue when you stop looking for it, so yeah, there is that.
Will probably go for the 16:10, but would value additional experience/advice regarding these two.
>do you care about fonts?
>"being this new to typography [protip: that's fonts]"
>I don't really give a shit about your hobby
okay so in the non-autistic world this makes you a hypocritical asshole.
Courier and Unifont are the best monospace. Unifont if you need really good unicode support. Courier if you like classics.
(btw, most linux TTY's (ctrl+alt+f*) use unifont. That's what it looks like.)
Thats a sweet monitor.
Mine (that I get 2560x1920 on) is a lowly 100SF that I got free.
Play with X11 video modes - your CRT controller will probably (if it's at all like mine) prevent catastrophic failures :D
I've done all sorts with mine in xrandr.
The w900 seems to have a clock limit of around 350mhz as I can only get it to operate at 2560x1600@59Hz, but I do have another 4:3 CRT that will go all the way at 2560x2048@53Hz
100SF, I'll remember that. Thanks anon!
>4k is useless.
Correction: 27"-30" UHD is useless.
I'd rather keep my ~100 ppi and have 2x more space over my 2560x1600 instead of slightly less area and 50% greater sharpness.
8k at 40"+ is where it will be at in like 5 years I hope - 200 ppi and a fuck ton of physical space.
> inb4 muh vidya and muh GPU being abused like a Filipino hooker
>It's mostly not wanting to change from the 90's. I can't stand 16:10.
16:9 for muh cinematic experience is for poor NEETS who cant into a real TV or who play shitty console ports on their PCs.
The big issue is that so-called widescreen resolutions are vertically shortened versions of 16:10/4:3 resolutions, not horizontally elongated.
1920x1200 (WUXGA) arguably was a wider version of 1600x1200 (UXGA), but everything since then in 16:9 or 21:9 should really be considered "short screen".
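The "short screen" point is easy to see if you hold the width fixed and let the ratio set the height:

```python
# hold width at 1920 and see what each aspect ratio leaves you vertically
# (integer division floors, hence 21:9 landing on 822 instead of 822.86)
for rw, rh in [(4, 3), (16, 10), (16, 9), (21, 9)]:
    print(f"{rw}:{rh} -> 1920x{1920 * rh // rw}")
# 4:3   -> 1920x1440
# 16:10 -> 1920x1200
# 16:9  -> 1920x1080
# 21:9  -> 1920x822
```

Same width, progressively less height — which is exactly the "vertically shortened" framing above.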
>16:9 for muh cinematic experience is for poor NEETS who cant into a real TV or who play shitty console ports on their PCs.
All games suck on 16:10; this is not limited to AAA console ports.
>40" monitor seems a bit extreme don't you think?
I have a 20"-30"-20" 1200+2560+1200x1600 PLP right now and would rather have a single ~45" with the same ppi, no bezels, and 35% more vertical space/25% less horizontal space.
Using Terminus on GUI applications is a cardinal sin. In these cases a humanist sans-serif font like FreeSans is advised.
Using Terminus on CLI applications is a good deed: a consistent monospace font of small enough a size on a minimal interface is helpful.
The problem I have with Unifont is that it's strictly a bitmap font for that size: the actual definitions that are used would be hell to transfer into any other set of dimensions.
At least the standard xterm font and Terminus have different point sizes to allow for larger screens and better detail when needed.
> 29" 21:9 AOC monitor on the bottom
> 23" 16:9 ACER on the top
best screen setup i have ever had, perfect for gaming with maps/tips on the top screen or a music player on the top screen and a couple pages open on the bottom, also got it with bitcoins so i dont care.
Oh, right, FreeSans isn't humanist: Droid Sans, Roboto, and Ubuntu are.
If you must use a monospace font for GUI apps (which I contend you don't need to), try Inconsolata.
Then let me rephrase myself: You are a fucking dumbass piece of shit for using a bitmap font for a GUI application. Take my advice and switch to something pretty on large-enough displays or beat it you sick bastard.
> implying TV content on a workstation is what even matters
If 16:9 resolutions had a higher pixel count than corresponding 16:10 resolutions, they'd be better for computer monitors, but they don't, so they aren't.
End of story, chucklefuck.
Hope you don't have to scroll down too much on your manlet display to read this.
best pic i have, moved since then though.
Why would you ever want to use anything other than a bitmap font? I can never fucking read anything else, it's either blurry as all hell or full of rainbows.
But go on then, show us your superior font setup and I will see how much it makes my eyes bleed.
I have 3 of those, using my computer gives me boners.
As for the stuck pixel:
Often times you can massage it out by pressing on your screen with your thumbnail and rocking it back and forth.
So based on it not even being mentioned, I'm going to go ahead and assume G-Sync is a waste of money at this point? I'm in the market for a monitor soon, and I was looking at them, but from what I've read, the Asus ones have a really high failure rate?
G-Sync is a module that nvidia puts in monitors that you have to pay $100 extra for. Its only job is stopping AMD cards from using the adaptive refresh rate on the monitor.
Adaptive refresh rates are really good though. So if you get a G-sync monitor and use a graphics card that the module doesn't gimp then the frame syncing is really nice.
Freesync does the same thing but without the whole proprietary module thing. It also doesn't work without support from the graphics card itself, and that list is really short.
The technology is really good, but it's not mature yet.
Get it if you want to be an early adopter.
Get a freesync monitor, but you have to be prepared to get a 290/290X or one of the 300 series that supports it.
Nvidia could support freesync on their cards if they wanted, there's no cost for them to do so.
Will we EVER get OLED monitors /g/?
I'd trade in my 1440p Ultrashit in a heartbeat if they would be available, even if they would only be 1080p at 60hz.
I'm sick and tired of IPS glow, I just want some decent fucking black levels.
2560x1600 is the sweetspot now
anything lower just looks like shit
anything higher would require a lot more screenspace to actually use as a computer monitor
these monitors with over 3k resolution and less than 40inches are dumb as fuck
you cant read shit properly with them, the pixel pitch is too small
the market is going into full retard mode
Was looking into a 4:3 1600x1200 LCD monitor to replace my current secondary monitor.
Thinking about this. http://www.necdisplay.com/p/desktop-monitors/lcd2190uxp-bk?type=support
I've never heard of PVA. Is it any good?
>21:9 = 4:3
Nigga, that's just delusional. I use 21:9 like it's a holy temple for programming.
Maybe if you had like four 4:3 monitors side by side they could be considered equal to a 21:9.
Please enlighten me /g/. Between 1080p 144fps and 1440p 60+fps, would one require significantly better hardware to push said numbers, or would it require more or less the same?
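Raw pixel throughput gives a first-order answer, though it's only a rough proxy — per-frame CPU and draw-call costs don't scale with resolution:

```python
# pixels per second each target pushes, a crude proxy for GPU load
def throughput(w, h, fps):
    return w * h * fps

px_1080p144 = throughput(1920, 1080, 144)  # 298,598,400 px/s
px_1440p60  = throughput(2560, 1440, 60)   # 221,184,000 px/s
print(px_1080p144 / px_1440p60)  # 1.35 — 1080p144 pushes ~35% more pixels/s
```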
>calling it UI scaling
>not DPI scaling
>still implying DPI scaling is not shit
>using twice the amount of pixels to produce a fraction of the working space
phones get away with it because they are designed for facebook moms
Hi, I'm currently rolling with one Dell Ultrasharp U2312HM 23" and I'm planning to get 2 more monitors. What should I get?
Since this model is discontinued, what are my options?
I don't want much difference between the monitors, and I also need suggestions about 3x wallmounting them.
I don't think 16:9 is as much of a problem after getting a 1440p monitor. Plenty of real estate vertically.
1080p never felt really good for work and browsing.
Monoprice has announced an overclockable 2560x1600 27" if that's your thing.
Since the SQT is on autosage... Sorry for asking here.
As I use a 32" TV(Sony KDL-32U2000) as a monitor for a HTPC I have noticed the following:
>The screen is 1366x768
>The VGA clocks at 1280x760 60Hz max
>HDMI supports 1080i 1920x1080 at 60Hz but GPU only manages 30Hz
>Custom resolutions on VGA still let the TV render it at 1280x760
>Custom resolutions on HDMI at 1080i and 30Hz max.
Is the TV a shit or is it the GPU?
I have no problems living with the restrictions, but if the spec sheet says it can, then why can't it?
because it supports 60i, the gpu either puts out 30p, or puts out 60i but still refers to it as "30Hz".
basically, 1080i at 60Hz means 30 full frames a second, so your gpu is confused and puts out 1 of the 2 options above. interlacing is disgusting and has no reason to exist anymore.
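The field/frame arithmetic in question, spelled out:

```python
# 1080i at "60 Hz" means 60 *fields* per second; each field carries
# half the scanlines (odd or even), so the full-frame rate is 30 Hz —
# which is what the GPU ends up reporting
field_rate_hz = 60
lines_per_field = 1080 // 2   # odd and even lines alternate between fields
frame_rate_hz = field_rate_hz // 2
print(frame_rate_hz, lines_per_field)  # 30 540
```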
As said, the panel's resolution is 1366x768, but the available resolutions go as far as 1920x1080. I could possibly do even more, since it apparently downscales the input to whatever the TV gives.
I don't know myself.
Bought an Ultra Wide (21:9) monitor..
>mfw only mainstream games support the resolution without a great deal of modifications
>mfw half of my movies are hard coded 1080p which means black borders..
Just got my Dell U2515H 1440p IPS monitors and this is glorious. Sooo much better than any TN panel I've ever used, took less than 10 seconds for my old screen to start looking blurry. Kinda wish it was 16:10 like my old monitor but whatever
>my 10 year old CRT doesn't have that problem