>has almost no advantage over more commonly used hdmi or even dinosaur vga
>apparently 6 gorillion different types of dvi connectors so it's not even universal
>video card manufacturers still put this ugly thing on their cards even though most monitors that have a dvi port also have an additional vga port
Seriously fuck this thing!
DVA is shit
Until recently it had significantly more bandwidth than HDMI.
Doesn't matter anyhow, USB-C is the future.
vga is only good for the dreamcast
I haven't seen a DVI port on video cards in years
All of them have HDMI and DP or Mini-DP ports now
The only reason it's still used is that most monitors still use it, since only gaymers buy monitors that would need anything HDMI offers
I still see VGA used quite often among prebuilts, since the ones that come packaged with a monitor usually use the cheapest monitor possible
DVI will become more standard among normies within the next few years
But DVI will not be phased out by HDMI or DisplayPort for several years
Reminder that HDMI has DRM built in, and Display port can support 4 outputs per port
DisplayPort master race
It's robust and sturdier than that piece of shit HDMI. I still use it on both of my displays, fight me. Also it was here before all of those stupid display interfaces. I'll be using DVI-D until I die.
iirc nVidia only did away with DVI on some Pascal cards; most other cards have DVI-D, but pretty much everyone did away with DVI-I support.
RX 400 and 500 series only have DP and HDMI ports
I even have a 7770 and it only has one DVI port, one HDMI, and 3 Mini-DP
hdmi has licensing fees and hdcp drm
It never got a foothold in any industry because VGA works perfectly fine for 15-20 inch monitors, and anything bigger just got HDMI, which appeared fairly quickly after DVI. A regular 24 inch Samsung I got a week ago has HDMI and VGA inputs, and I had to get it because my new vidya card no longer has a DVI-I output, just DVI-D. So basically VGA never died. It's probably cheaper than the digital standards
it could reach above 2560x1440x120hz back then
Only the reference Polaris cards didn't have DVI, and I know reference 1080 Tis don't either. Otherwise DVI is still fairly common on cards, it's just normally only one port now as opposed to two ports plus 1 HDMI + 1 DP like it was 2+ years ago.
>buff as fuck connection that screws in
>every time i move my pc the hdmi-connected screens flicker as the cable moves
>dvi dont give a shit
>dvi stronk
>dvi carry all what hdmi carry plus analog vga
i like my DVI, but im getting a displayport monitor soon anyway as modern cards slowly ditch DVI, and it can do 4k on a single cable
All of the new generations of GPUs have gotten rid of DVA so I don't know why you're bitching about it.
Display port is terrible. It's just as delicate as hdmi, perhaps more so. The number of times I've seen people complain about display problems only for it to turn out the DP cable needed to be lifted to take the weight off the port is absurd. At least DVI is easily secured, and with USB-C the cable-weight-to-port-strength ratio is much better.
>RX 400 and 500 series only have DP and HDMI ports
Then what's this DVI port doing on both my RX 580 and my RX 460?
see
DVI serviced my 2560x1440 monitor at 60hz very nicely for years thank you.
DVI is GOAT. Screws in. Larger and more robust than HDMI. Doesn't transmit audio. Often better built as in I've had less DVI cables fail than HDMI.
Funny enough DVI can carry audio if your graphics card supports it. Found this out when audio started coming through my monitors speakers and the only thing plugged in was power and DVI.
...
DVI is built like a tank. You trip on that cable and you can bet your ass the entire video card is being ripped from the case. Nowadays these cheap shit hdmi cables are barely thicker than my penis. A fucking toddler can bend that weak-ass chinese shit.
That's almost the point
The cable is meant to break rather than breaking what the cable is plugged into
>no advantages over vga
Get a load of this noob.
HDMI audio is laggy garbage. Optical is superior
TOSlink is only good for short runs. Coaxial is superior.
In my country monitors come only in VGA or HDMI; DVI-D monitors exist but are really rare. DisplayPort is only on the high refresh rate monitors or those multi-monitor walls for broadcasting and shit.
nothing wrong with dvi. even vga is fine and it does not support drm.
DisplayPort should become standard. It has a lock like DVI and VGA, can send two different signals to two monitors over one cable, does 4K at 60Hz and 1080p up to 240Hz. DP is dope.
these new tiny connectors are garbage. they are always so loose that the cable will fall out if it's touched.
My threadly question:
How the fuck does one get high-bandwidth VGA out of a post-2013 or whatever GPU?
I have a monitor with 130kHz/160Hz bandwidth (1600x1200@100Hz) and no idea how to get a 400 MHz DAC wedged between a DVI-D/DP/HDMI port and a VGA cable.
Are the only DP->VGA dongles out there still complete shit that will leave me stuck at 60 Hz?
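For anyone wondering why a beefy DAC is needed: the required pixel clock can be ballparked from the mode. The ~30% horizontal and ~5% vertical blanking fractions below are rough CRT-style timing assumptions, not exact VESA numbers:

```python
# Rough pixel clock estimate for an analog (CRT-style) video mode.
# Blanking fractions approximate typical full-blanking GTF/DMT timings;
# they are assumptions, not exact VESA figures.
def pixel_clock_hz(h_active, v_active, refresh_hz,
                   h_blank_frac=0.30, v_blank_frac=0.05):
    h_total = h_active * (1 + h_blank_frac)   # pixels per line incl. blanking
    v_total = v_active * (1 + v_blank_frac)   # lines per frame incl. blanking
    return h_total * v_total * refresh_hz

clk = pixel_clock_hz(1600, 1200, 100)
print(f"1600x1200@100Hz needs roughly {clk / 1e6:.0f} MHz pixel clock")
```

That works out to roughly 260 MHz, far beyond the 165 MHz a single TMDS link carries, which is exactly why a ~400 MHz class RAMDAC converter is needed and why cheap 60 Hz dongles don't cut it.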
DVI-D > VGA
>apparently 6 gorillion different types of dvi connectors so it's not even universal
literally three variants, all with specific applications
DVI-D is digital only, used in
>ports on graphics cards that output only digital/don't output analog on that port
>digital input port on display devices
>DVI-DVI cables
DVI-I is digital/analog, used in
>ports on graphics cards that, in addition to digital signal, also support analog signal over the same physical port
>so you don't need a dedicated VGA/D-SUB port for analog signal on the card
DVI-A is analog only, used in
>DVI-VGA cables and passive DVI-VGA adapters that connect an analog monitor with VGA/D-SUB to a DVI-I port on a graphics card
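The variant rules above reduce to one simple check. A sketch (the function and table names here are illustrative, not from any real API):

```python
# Which signal types each DVI connector variant carries,
# per the variant list above.
DVI_SIGNALS = {
    "DVI-D": {"digital"},
    "DVI-A": {"analog"},
    "DVI-I": {"digital", "analog"},
}

def passive_vga_adapter_works(port_variant):
    # A passive DVI->VGA adapter only breaks out the analog pins,
    # so it works only if the port actually carries an analog signal.
    return "analog" in DVI_SIGNALS[port_variant]

for port in ("DVI-I", "DVI-D", "DVI-A"):
    verdict = "passive VGA adapter OK" if passive_vga_adapter_works(port) \
              else "needs an active converter"
    print(port, "->", verdict)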
Dual link DVI actually has more bandwidth than most HDMI ports out today. Also DVI works better out of the box than HDMI: if you have an Nvidia or AMD GPU, your color settings are fucked over HDMI until you change them manually in the driver control panel. HDMI simply sucks for computer monitors, it's only good for TVs. If you're on 1440p/60, use dual-link DVI; for 1440p/144 or 2160p/60 use DisplayPort. Don't use HDMI unless you have to.
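That bandwidth claim checks out on paper. A sketch comparing nominal TMDS pixel-clock limits at 1440p; the blanking model is a rough CVT-reduced-blanking style approximation (an assumption, not exact timing math):

```python
# Max refresh rate a given TMDS pixel clock supports at a resolution.
# Blanking model: ~160 extra pixels per line, ~3% extra lines
# (rough CVT reduced-blanking approximation).
def max_refresh_hz(h_active, v_active, pixel_clock_hz):
    h_total = h_active + 160
    v_total = int(v_active * 1.03)
    return pixel_clock_hz / (h_total * v_total)

# Nominal maximum TMDS pixel clocks per link/spec.
LINKS = {
    "single-link DVI / HDMI <=1.2": 165e6,
    "dual-link DVI":                330e6,
    "HDMI 1.3/1.4":                 340e6,
}
for name, clock in LINKS.items():
    print(f"{name}: ~{max_refresh_hz(2560, 1440, clock):.0f} Hz max at 2560x1440")
```

Dual-link comfortably clears 1440p/60 while HDMI before 1.3 (same 165 MHz clock as single-link DVI) does not, which matches the advice above.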
>>has almost no advantage over [...] dinosaur vga
being digital is not enough of an advantage to you?
The fuck are you on about?
The problem with display port cables is that most of them are chinese garbage, and since DP is much less popular than HDMI the problems never get fixed. I had to do a bunch of research to find out who the OEM for Dell DP cables was so I could buy one that doesn't have connection issues. It is insane to have a standard where 50% of what's on the market will not actually work.
>DVA
the fuck are you talking about there's no such thing as DVA, stop making shit up
>DVA
>>apparently 6 gorillion different types
There are exactly 3 types of DVI connectors. But yes, I was really pissed off when I discovered that not all DVI are DVI.
Actually there are five types. There's DVI-A (analog only), DVI-D (digital only) and DVI-I (both signals.) The latter two can be either single link or dual link (dual link adds a second TMDS link for double the bandwidth.)
>no advantage over vga
Are you blind? I immediately notice the difference moving from VGA to DVI on any monitor 1680x1050 or higher.
Winky face
OP probably has a 1366x768 monitor or similar, or he'd notice how visibly soft VGA gets at 1080p and above. The quality looks much shittier, and I don't understand why people complain about no VGA on high end monitors.
another one of those "DVI doesn't carry audio" posters.
DVI was superior to HDMI. It didn't get updated like HDMI was, since our overlords gave us DP instead.
>DVI will become more standard among normies within the next few years
No. Spoken like someone who doesn't understand what he's talking about.
>doesn't know that the video signal on HDMI is literally DVI + DRM
I have had this monitor for so long that it has seen 4 different gpus. the 6800 ultra, the 5770, the 280x, and now the 1060 6gb
1080x1200 with no valid reason to piss away $300+ on a sidegrade at best.
because when every other port fails, vga still works.
because very old hardware usually has either some specialty pinout or a vga out, and so on.
If the cost is minimal, which it is for vga, I would rather see one there than nothing at all. Especially on high end shit, I expect to see every single commonly used format ever made.