Can someone please explain this HDMI shit to me? Why are we replacing what has worked perfectly for decades?
HDMI carries video and audio over one cable; that's its main advantage. Otherwise, people will argue that keeping the signal purely digital is better than converting between analogue and digital.
It's much easier for analog signals to pick up noise, especially if they run close to power operating at a different frequency, which can introduce a ground loop. HDMI in particular has sync redundancies built in and sends an uncompressed video signal anyway, so the quality point is completely moot. If anything, VGA can at best match HDMI, not beat it, since both signals originate from a digital source in the first place.
Analog inferior, digital stronk
Displayport > HDMI > DVI > VGA > S-Video > Composite
>he still willingly uses VGA
>he actually defends it
It was fine in the days of CRT monitors, but now that everything is digital, it only makes sense that the signal sent to the monitor would be purely digital as well. Otherwise, you'd be going from digital to analog back to digital again.
That said, I still use a CRT monitor as a secondary, so VGA still has its uses for me.
Am I the only one who's never seen any noise or artifacts using VGA? With HDMI, however, it's rare that there isn't some form of noise/artifacting somewhere on the screen after a few months of continuously plugging and unplugging the thing. 2/3 of the HDMI ports on my TV don't even detect a connection anymore, but that might just be the shitty TV.
>are you trying to teach me? I am an engineer
>An Appeal to Authority is a fallacy with the following form: Person A is (claimed to be) an authority on subject S. Person A makes claim C about subject S. Therefore, C is true.
Ya, and I'm an engineer too. Pic related.
HDMI can do more than just carry a video signal; it can also be used for networking (HEC) and other things.
Plus, it's a digital signal with data integrity measures, so it's basically an improved DVI.
Because monitors didn't have DRM built in, and this was problematic for Hollywood.
Also, since we've mostly transitioned to LCDs, it makes sense to use a digital connection and avoid an extra conversion.
But don't use the HDMI crap.
Use the open DisplayPort instead.
>Linux usually tries to early-adopt new technology standards, especially when their developers willingly work with them or give them information so they don't have to be reverse-engineered
> FUCK DEM FREETARDS HERPA DERP
If you really are an engineer, that's embarrassing. How do you not understand the benefits of digital over analogue? A digital signal does suffer from noise, but it can be error-corrected provided the signal carries some redundancy. An analogue signal gets displayed as-is, so you see the noise if noise is present. A digital signal may also pick up noise, but the error correction is applied before display, so you still get a perfect picture. If there is too much noise the signal fails entirely and you get no picture at all. In practice data is sent in small packets and decoded packet by packet, so when the decoding of a packet fails you get a faulty patch on your screen (e.g. when your satellite dish is blown around in the wind).
The advantage is that once you have a signal with, say, less than 30% noise, it can be corrected back to a perfectly clean signal. That is why digital is fundamentally better than analogue, although certain digital standards are poorly implemented (e.g. HDMI).
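To be clear, HDMI's TMDS video stream isn't actually forward-error-corrected (only the audio/aux data islands get ECC, and proper FEC only shows up with HDMI 2.1), but the general principle is easy to demo. Here's a toy repetition code with majority voting in plain shell; nothing HDMI-specific, and the bit strings are made up for the example:
# each bit is transmitted three times; the receiver takes a majority vote per group of three
recv="1 0 1  0 0 0  1 1 0"   # bits 1,0,1 sent as 111 000 111, two bits flipped by noise in transit
echo "$recv" | tr -s ' ' '\n' | paste - - - | awk '{ if ($1+$2+$3 >= 2) print 1; else print 0 }'
# prints 1 0 1 -- the original bits recovered exactly despite the corruption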
Wrong, that is not the reason
When buying HDMI cables, I would recommend the best HDMI cable possible.
Cheap cables will cause data loss.
You are absolutely being rused. No one would buy that silly AudioQuest Diamond HDMI cable. Everyone knows that you should at least get solid silver conductors and carbon fiber shielding.
Displayport is royalty-free, which HDMI is not. Displayport doesn't have an obnoxiously huge connector, which DVI has.
Displayport can do tricks like connecting multiple monitors daisy-chained to one port. It also tends to support higher resolutions than HDMI, but only because devices are very slow to adopt the newer HDMI standards.
Pretty much. Most business environments that don't need anything better (most don't) don't give a shit. Most colleges in the US still use it too. I work part time for my college and we deploy brand new projectors all over campus. We hook every laptop and desktop up via VGA despite them having two HDMI ports, presumably because VGA is cheaper (also, unlike with DP and HDMI, the clueless English teachers can't figure out how to pull the connector out and then come bitching to us), and there's no real need to care when the monitors are only 1080p. They're real common too, which is nice. That's the same reason we still use Composite/RCA for all the DVD/VCRs.
Can someone please explain this automobile shit to me? Why are we replacing what has worked perfectly for centuries?
>$3,000 hdmi cable
>not the $14,000 hdmi cable
Haha stay poor pleb
HDMI has a few advantages:
- digital: while VGA signal quality can be very good, this is still better
- audio transport: no need for other cable(s) to carry audio, great for things like media centers and Blu-ray players
- CEC: a communication system that lets HDMI devices talk to each other, so you can control a media center with the TV's remote (the TV passes things like "play" to the media center through the HDMI cable) or tie power states together (say, have your Blu-ray player turn on when the TV turns on and off when it turns off); see the cec-client sketch below
In many places VGA is still fine, and it even keeps the advantage of longer cable runs.
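If you want to poke at CEC from a PC, libCEC's cec-client will do it, but only on hardware that actually wires up the CEC pin (most desktop GPUs don't; a Raspberry Pi or a Pulse-Eight USB adapter does). A rough sketch; addresses may differ on your setup:
echo "scan" | cec-client -s -d 1        # list the devices on the CEC bus
echo "on 0" | cec-client -s -d 1        # power on the TV (logical address 0)
echo "standby 0" | cec-client -s -d 1   # put the TV into standby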
I just got a GTX 750 Ti. It has no DVI-I output, only DVI-D, so I can't use a DVI-to-VGA converter (my monitor only has VGA). I connected to the VGA port on the card and it looks like shit: waves on the screen, colours washed out. So stick with DVI or HDMI; VGA is shit.
There's a reason why the vast majority of servers ship with VGA only - it's more reliable, more widely available (crash carts and KVMs), and POSTs faster (inb4 xFire and xServer c2t bullshit). For those of us who have shit to get done, reliability is way more important than
> muh sharp pixels
XP isn't reliable anymore, which is why it's shit.
This was added in recent drivers, but for some reason it still defaults to limited range.
Don't they know most people won't know to check this shit or what it even means, and they'll think they're stuck with shitty washed out colours?
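For what it's worth, on Linux the Intel X driver at least exposes this as an output property, so you can force it by hand. A sketch, assuming your output is called HDMI-0 and your driver actually exposes the property:
xrandr --output HDMI-0 --set "Broadcast RGB" "Full"   # full 0-255 range instead of limited 16-235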
DRM is the annoying side effect.
HDMI can be extended with various data integrity checks and has good symmetric bandwidth.
Also, CEC is a useful extension, and HEC seems potentially useful too.
>Displayport can do tricks like connecting multiple monitors daisy-chained to one port
I'm pretty sure you can do the same with a DVI splitter. Granted, that is only two or three displays, but still.
>New laptops arrive for our users
>They only have a mini HDMI (wasn't me who ordered or checked the specs)
>We had JUST barely made them understand how you connect a VGA cable
>Now we can start all over with the mini HDMI
God fucking damn it..
There are many more advantages to HDMI over VGA.
HDMI cable runs can be extended more or less indefinitely with repeaters; VGA suffers ugly degradation the longer the run, not even mentioning the analogue interference most cables pick up even at lengths over 3 m.
VGA in practice maxes out around 2048×1536 @ 85Hz (and good luck finding a monitor that can display it).
HDMI 2.0 supports up to 4096×2160 @ 60Hz and is a continually evolving standard, so that will only improve as display technology improves.
VGA's colour depth is limited by the card's DAC rather than guaranteed by the standard, while HDMI carries a full 24 bits per pixel, with multiple colour profiles such as sRGB as well as 4:2:2 or 4:4:4 YCbCr.
HDMI is a much more compact and easy-to-use connector too. How many times have you flipped a VGA connector over a bunch of times while blindly trying to connect it? And fucking screws to secure it? It's not even designed to be hot-pluggable, for fuck's sake.
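If you want to sanity-check that 4K figure, the raw pixel data fits comfortably in HDMI 2.0's budget. Quick back-of-the-envelope in a shell:
echo $(( 4096 * 2160 * 60 * 24 ))   # = 12740198400 bits/s, ~12.7 Gbit/s of raw pixel data (ignoring blanking)
# HDMI 2.0 signals at 18 Gbit/s aggregate, ~14.4 Gbit/s effective after 8b/10b coding, so it fits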
Back in 1995 we had analogue monitors which needed analogue signals. Our computers produced digital signals; the VGA card had to convert them to analogue.
Now we have digital monitors. Sending them digital signals, and not converting d->a->d is the big idea.
On the subject of HDMI: why can't my laptop project over HDMI?
Other laptops seem fine, and I checked with different cables.
It's an HP laptop with an Intel Core i5 CPU and an Nvidia GPU, running Windows 7.
It's a new laptop, and it only has the HDMI port.
When I plug in the cable, the screen turns off and comes back on, flickering like it does when the resolution changes.
Why don't TVs have DVI ports?
It's literally going to be the same damn signal as an HDMI cable. Hell, DVI can even carry audio (I put an HDMI-to-DVI adapter onto my GPU and plugged an HDMI cable in from the TV and it transmits audio across it).
TVs need DVI ports, there's no sense not to have them.
>buy my first graphics card with a DP connector
>plug in DP and took it up to my old monitor
>six months later, new monitor
>pull out DP like it was an HDMI plug without pushing the button
>rekt my card
>can't use based DP again
Think of it like this: VGA is satellite TV.
It's fine when it's a clear day out, but if things get cloudy it creates problems.
HDMI is like playing a movie straight off your laptop over a cable: as long as the cable is connected it's just transmitting data, not a signal that can gradually degrade.
It's HDMI-0 here.
Do I need to run this every time? Or is there a way to make it the default?
Also, it's duplicating the main screen; is there a way to make it an extension?
>do I need to run this every time? or is there a way to make it the default?
your DE might have a tool to configure this, in which case the setting should be remembered and restored
in any case you can also add the line to your session startup, with any DE or without one
>Also, it's duplicating the main screen; is there a way to make it an extension?
xrandr --output HDMI-0 --right-of HDMI-1
(again, adjust to suit, i have to guess here)
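and if you want it to stick across logins, one option is to append the same line to a startup file, assuming your setup actually sources ~/.xprofile at login (some use ~/.xsessionrc or the DE's own display settings instead):
echo 'xrandr --output HDMI-0 --right-of HDMI-1' >> ~/.xprofile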
>am i the only one who's never seen any noise or artifacts using VGA?
You get more of that as the resolution goes up. At 1920×1080 it's super noticeable when the left and right thirds of the screen are blurry.
Some dirt-cheap cables have issues with HDCP. I have two of them, they're the cheapest possible and are unusable on anything with an HDCP signal and the output dies if it has to switch over. If I use a decent cable, it's suddenly fine. You don't need placebo cables though.
About this: sorry, I was looking at the pic when I selected the post to reply to.
Hint for anyone running VGA:
Go to this website:
Make the pattern fullscreen and make sure it's at a 1:1 zoom with the pixels. Then run your monitor's auto adjust feature.
VGA connectors worked great, yes, but I've also seen plenty of VGA signals pick up interference with those micro-squiggly lines.
I think resolution caps out at a certain point, too. I might be wrong.
Can DRM be silently enabled in HDMI? I mean I get that it's there but I don't get if it's *always* explicitly there or only in cases where a manufacturer has installed HDCP or whatever it is.
Have you tried reading the fucking manual? Or do you want someone here to fix your simpleton problem from nothing but a couple of your shitty half-explanations? This is why you will never succeed in life.
>I think resolution caps out at a certain point, too. I might be wrong.
I think it's more of a soft cap than a hard cap, because it's analog. Depending on cable length, interference, etc., you can probably push it pretty high; it would just look really shitty. I've never tried it myself though.
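You can ballpark where the soft cap comes from. The visible pixel rate for the usual quoted maximum:
echo $(( 2048 * 1536 * 85 ))   # = 267386880, ~267 million visible pixels per second
# add typical blanking overhead and you need a ~350-400 MHz pixel clock,
# which is right at the limit of most cards' RAMDACs -- hence the soft cap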
If you can't figure out how to plug a cable into a computer you have no business operating one.
Take a shit, fuck the teachers. If they really need to connect, they will figure it out.
VGA was created for old analog monitors. When you use VGA, the computer's digital signal is converted to analog and sent to your monitor; a modern LCD then has to convert it back to digital to display it. Every conversion means some loss of quality.
With HDMI, the computer's output is sent to the monitor digitally and displayed as-is. There is no conversion, which results in higher quality.
VGA works fine at low resolutions because image quality isn't much of an issue on 800x600 screens. At higher resolutions, however, the degradation becomes visible.