*refuses to die*

There is nothing wrong with VGA

I blame projectors, as well as HDCP keeping digital switches expensive

this is like every enterprise projector. FML

but it is dead. The GTX 10xx series no longer supports it, and neither do AMD's RX 400/500 series cards

Baby, VGA is gone. The best GPU you can natively use VGA with is the GTX 980ti

It's long dead. Just legacy now.

This shit will still be around as long as shitty office "just werks" computers and projectors are around.

I can't count on my hands the number of Thunderbolt/HDMI-to-VGA converters I see daily.

>The best GPU you can natively use VGA with is the GTX 980ti
Did the 980ti actually have a RAMDAC? Wasn't it just a converted DP/HDMI port internally?

My Thinkpad only has VGA to connect to my monitor. :(

>Requests to rewire conference rooms and classrooms have been knocked back 3 years in a row

Unless all our projectors die at once we'll be VGA forever

why is that a bad thing?

Because Sup Forums wants VGA to die. :(

*blocks your path*

FPBP.

Even if it was a converted port it still uses a RAMDAC either way; you need something to convert the digital signal to analog.
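The conversion step itself is conceptually tiny. Rough sketch, assuming the nominal 0.7 V full-scale VGA video swing (my numbers, not gospel):

# Toy model of the DAC step for one color channel: 8-bit code -> analog level
FULL_SCALE_VOLTS = 0.7            # nominal VGA video swing into a 75 ohm load

def dac(code):
    """Map a digital 0-255 sample to its nominal analog output in volts."""
    return FULL_SCALE_VOLTS * code / 255

print(dac(0), dac(128), dac(255))  # 0.0 V, ~0.35 V, 0.7 V
# The "RAM" in RAMDAC is just the palette lookup sitting in front of three of these.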

It's OK to be VGA

Even if VGA weren't bad in itself, companies realised that making high-quality cables was a waste, so anything that supports it nowadays is only half-baked and the signal degradation is bad.

why does Sup Forums want VGA to die?

also, don't be sad. we love you!

...

I never understood the “it just werks” meme.
I only browse Sup Forums every once in a while.

en.wikipedia.org/wiki/RAMDAC

Educate yourself. It's only a RAMDAC if it has direct access to the video card's RAM; an external converter chip is just a DAC.

lmao get that shit outta here

He doesn't know what the fuck he's talking about. Just werks is a meme that refers to any dumbed-down machine that trades features for "user experience" and/or looks. Ex: anything Apple shits out.

DVI-D has no advantage over HDMI (besides a cable lock I suppose) or Displayport. DVI-I is the superior legacy port.

Let's convert my digital image from my digital pc to an analogue signal and when it reaches my digital monitor I'll convert it back to digital again at a reduced quality

that's dvi-d dl, which is superior to any amount of shitty hdmi you want to throw at it. micro dp for life though.

How come there are so few HDMI 2.0 switches on the market?

They are not even more expensive; you just have to search around to be sure it supports 4K@60Hz.

It's already dead lol

dvi-d dl is a joke

In what way? You have much lower bandwidth and can't carry sound.

Because royalties. HDMI needs to die already. Free, open standards are the way. DisplayPort is the future.

...

China still has to catch up but once the production lines start rolling there will be nothing but HDMI 2 switches

as if Chinese manufacturers care about royalties, and as far as TVs go HDMI is forever going to be around

>Not the R9 290X

>as far as TVs go HDMI is forever going to be around
Why? What can it do that DisplayPort can't?

VGA will die when every projector in every company and school has HDMI, which according to my calculations will be shortly after standard consumer laptops are no longer 1366x768

DP bends pins easier than HDMI
you have to idiot-proof everything

Market monopoly
I can see "gaming" TVs eventually phasing Displayport into the market and eventually needing DP to HDMI converters as a best-case scenario

> he doesn't do 50 foot hdmi runs then use one of these to display porn on a 15 inch CRT

m.aliexpress.com/item/32722161675.html?pid=808_0000_0109&spm=a2g0n.search-amp.list.32722161675

No DisplayPort?

>DisplayPort is the future.

No, Thunderbolt over USB-C is.

We haven't even fully adopted USB-C yet. With current Intel exclusivity and the time it takes to adopt things the world may very well end before we get there.

HDMI has been around longer, and from the beginning it was meant to be easy: just one plug and you have digital video and audio. DisplayPort didn't come until 2006, and by that time HDMI was the de facto standard on TVs

TV makers can choose to include DisplayPort; they just don't want to spend the extra money to implement it, since for 99.9% of people it provides no tangible benefit, even gamers, as DisplayPort doesn't have any latency advantage. I'd be in the 0.1% that would like the benefits of DisplayPort, as HDMI 2.0 can't do 4K@60Hz with 10-bit color and 4:4:4 chroma
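Rough numbers if anyone wants to check that, assuming the standard CTA 4K60 timing (4400x2250 total, 594 MHz pixel clock) and HDMI 2.0's three 6 Gbit/s TMDS lanes with 8b/10b coding:

# Does 4K@60Hz 4:4:4 fit in HDMI 2.0? (assumed standard figures)
PIXEL_CLOCK_HZ = 594e6                    # CTA-861 timing for 3840x2160@60
LINK_DATA_BPS = 3 * 6e9 * 8 / 10          # 18 Gbit/s raw minus TMDS coding overhead

for bits_per_channel in (8, 10):
    needed = PIXEL_CLOCK_HZ * 3 * bits_per_channel    # three channels per pixel at 4:4:4
    verdict = "fits" if needed <= LINK_DATA_BPS else "does not fit"
    print(f"{bits_per_channel}-bit: {needed / 1e9:.2f} of {LINK_DATA_BPS / 1e9:.2f} Gbit/s -> {verdict}")

8-bit squeaks in at ~14.26 of 14.4 Gbit/s of payload; 10-bit needs ~17.8 and doesn't.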

Yeah. It'll likely slowly fade and be remembered as that Apple tech that nobody else really used, like Firewire was in the 00s.

USB-C is on AMD, TB3 isn't.
that said, wouldn't a PCIe 3.0x4 addon card with an intel controller work for TB3?

It would technically, but no one has bothered to make a universal one. Asus made a Thunderbolt card, but only a select few of their motherboards worked with it

I'm pretty sure current AMD 3XX boards don't support newer Intel technologies like Optane and TB3 for royalty reasons. Thunderbolt without Royalties could be baked into the Zen+, Zen 2, or Zen 3 chipsets, but that's wishful thinking. Cards are expensive to make and most people are fine with the theoretical 10Gbps cap of USB 3.1.

It's fine. If people weren't stupid and there were demand for it, the connector could be smaller, and it can definitely take 2560x1600@60, but a lot of RAMDACs are fucking trash and can't do it without being fuzzy. We didn't really need anything past VGA, except DisplayPort.
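Numbers-wise it checks out, assuming the usual reduced-blanking timing for that mode and the 400 MHz rating cards of the era advertised:

# 2560x1600@60 over analog VGA: needed pixel clock vs. a typical RAMDAC rating (assumed figures)
PIXEL_CLOCK_MHZ = 268.5        # reduced-blanking timing commonly used for 2560x1600@60
RAMDAC_RATING_MHZ = 400        # typical spec on late VGA-capable cards

ok = PIXEL_CLOCK_MHZ <= RAMDAC_RATING_MHZ
print(f"need {PIXEL_CLOCK_MHZ} MHz, rated {RAMDAC_RATING_MHZ} MHz -> {'in spec' if ok else 'out of spec'}")
# In spec on paper; in practice the analog filtering and the cable decide how fuzzy it ends up.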

HDMI is trash, and so is DVI.

TB3 is becoming royalty free and relatively open this year, I think next year will be incredible as far as its inclusion in devices.

Thunderbolt is just PCIe over a cable; you can make a Thunderbolt card that works with AMD and even uses an Intel controller, it's just that no one wants to make one. TB is going royalty-free anyway

DVI-I is the better version of VGA because VGA is either completely useless or completely necessary depending on usage while DVI-I can at least always run a second monitor
High-end boards usually don't even have more than one USB-C right now so good luck with that

USB always adopts like this

that's because you work at Best Buy and walk through the cables-for-dummies aisle on the daily

>tfw DVI-I died before VGA did

To be honest it's not like desktop even needs Thunderbolt over USB-C anyway. It sucks that no one wants to add it, but there's nothing you're missing out on, since if you need something to go through PCIe you just buy a PCIe card

By the time Thunderbolt gets fully adopted we might not even be using cables for high-bandwidth devices like VR headsets, and possibly even monitors, if they find a good way to do it
That was weaker than "poozen" or the usual Intel nicknames

TB3 is mostly aimed at external add-ons to systems that have no internal room. Vive is getting wireless (WiGig I believe) soon apparently.

Achtually
DVI can indeed carry sound interleaved with video. Only certain sources support it, and the receiver may need a DVI-to-HDMI adapter

Something like an external video card will almost definitely still need a wired connection, but hard drives and displays probably aren't going to be sensitive enough to need that type of connection.
You'll basically have a PCIe port but less useful since few people use anything other than a video card and those who do don't do it externally.
Correct, but even when it is a solution it isn't an optimal one

DVI has a better connector, HDMI's got more bandwidth. HDMI's just DVI+Ethernet+I2C+audio so if HDMI can do it there's really no reason DVI couldn't with a spec upgrade.

On that note, would it be possible to write a driver to use a free HDMI port as an additional Ethernet port?

>doesn't support 4k

It supports above 4K UHD, you just use two cables.

I think it would be a nightmare either way and I'm not sure if a video card would be capable of that since the obvious solution is to get a networking card.
If thunderbolt ever does take off and it gets supported by GPUs then possibly.

and halve the amount of ports on my GPU? no thanks

I never said it was ideal.

Can't count how many times VGA saved my ass when hdmi/dvi signals went to shit.

Why do you want it to die?

>Brand new GPUs don't support it! It's dead!
This is what you sound like.

Back to the topic at hand, VGA is going to be around for at least 15 more years for compatibility with servers. I could be wrong, but most Xeons don't have integrated graphics; the motherboard's management controller handles the graphics, and those output VGA. That said, some Xeon E3 v5 SKUs do have integrated graphics, but a quick Google search didn't tell me what outputs boards pair them with (HDMI, VGA, etc.). Probably both of those and DP.
The point is VGA is going to be around for a long time yet to come, and there's nothing wrong with that. VGA is good.

because soyboys have a huge hate boner for backwards compatibility and connectivity options in general
personally i blame apple for this brainwashing

Except it isn't, since the standard says you're only allowed 165 MHz per link (330 MHz pixel clock total on dual link), so you're forever stuck at sub-4K: 3840x2160@60Hz needs a pixel clock well above 500 MHz.
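Quick arithmetic on that, using the TV-style 4K60 timing (4400x2250 total) since it's the easy one to remember:

# Dual-link DVI pixel clock ceiling vs. what 3840x2160@60 actually needs
DVI_DUAL_LINK_MHZ = 2 * 165                       # 330 MHz max pixel clock
pixel_clock_mhz = 4400 * 2250 * 60 / 1e6          # = 594.0 for the CTA-861 timing

print(f"4K60 needs ~{pixel_clock_mhz:.0f} MHz, dual-link DVI tops out at {DVI_DUAL_LINK_MHZ} MHz")
# Even aggressive reduced blanking (~533 MHz) is still way past 330 MHz.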

Hey don't shit on VGA. My Agilent VNA has a VGA out so I can hook it up to an external monitor which is a lot nicer than trying to use the built in early 2000s LCD screen in the instrument. Barely readable even at max brightness, nothing like modern panels. Would have been better off with a CRT.

Sup Forums was a mistake.

>It's still displayport
Are you a brainlet?

Who the fuck are you quoting

Nah, it's just expensive and no one wants to deal with it.
Maybe once it's royalty free ASMedia will add it to the AMD chipsets.
Also protip: A320/B350/X370 AMD chipsets are actually made by ASMedia.

You :)

I never said that, you dumbass

Sure you did, I have it on good authority from this post

What's the difference between VGA and GPU?

yes yes, keep using VGA, mwahah

They would rather he buy a new monitor

VGA could mean multiple (related) things.
- A connector/cable
- The basic "lowest common denominator" standard for drawing things to the screen

A GPU is the physical piece of hardware which handles all of this stuff.

It's a lot easier to generate VGA signals than hdmi stuff.
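The classic 640x480@60 timing (textbook numbers) shows why: three counters and a 25.175 MHz clock and you're done, whereas HDMI needs 10x-rate TMDS serializers and encoding.

# Standard 640x480@60 VGA timing: visible + front porch + sync + back porch
h_visible, h_front, h_sync, h_back = 640, 16, 96, 48     # 800 pixel clocks per line
v_visible, v_front, v_sync, v_back = 480, 10, 2, 33      # 525 lines per frame
pixel_clock_hz = 25.175e6

clocks_per_line = h_visible + h_front + h_sync + h_back
clocks_per_frame = clocks_per_line * (v_visible + v_front + v_sync + v_back)
print(f"{clocks_per_line} clocks/line, {clocks_per_frame} clocks/frame, "
      f"{pixel_clock_hz / clocks_per_frame:.2f} Hz refresh")   # ~59.94 Hz
# A small microcontroller or CPLD can bit-bang those counters; TMDS you cannot.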

Mine only has HDMI..

Fucking screen resolution on the main screen renders other screens nearly useless, especially when using MS Office malware. Fucking microshit.

All of this

What are you talking about? RS-232 is the future.
Pic related refuses to die.

VGA is a standard from 1987. It allowed resolutions up to 640x480 with 16 colors (or 320x200 with 256), as well as a text mode with 80x50 characters, custom fonts, and 16 colors. It didn't offer any 3D acceleration, since it came from a time when 3D was very primitive, not triangle-based, and done on the CPU. It was a good step up from EGA, which came before. It also introduced the very forward-looking all-analog 15-pin plug that is only now disappearing from computers.

VGA commonly refers to the display connector. The same plug was used for display adapters and monitors for SVGA, XGA, SXGA, etc. -- advancements that brought higher resolutions and more colors.

A GPU is a hardware 3D accelerator, the thing you know today, that you plug into a PCIe slot, connect your screen to, and can't play games properly without.

*blocks both your paths*

Ew. Ick. Yuck. Silly proprietary dongle for dual monitors on some low profile GPUs.

You always see the GPUs on ebay without the adapters, making them useless. And then you sometimes see the adapters in flea market random-cables bins with no explanation.

If you grab them they're super cheap though.

The industry wants you to buy HDMI because HDMI has DRM (HDCP).

I'm running a VGA monitor off my 1080 Ti RIGHT NOW. All it takes is a sub-$10 adapter from DP and you're up and running. Same story for my rMBP.

Please no

(:

I'm running a 1080 on 1080p and I know that's massive overkill
I hope you have a VR headset or something

You can't do any higher than 1600x1200@60hz though ;)
That's the problem - nothing has real VGA now, so you're limited to UXGA@60hz.

Hawaii didn't support VGA either, it was one of the big changes with it.
290/290X/390/390X
I don't think Bonaire supported it either, but I'm not sure on that. (7790/260X)

>active adapter
Enjoy your quality loss

As you can see I'm running 1440p @ 165Hz which is about what this card hits with my current CPU. And yes, I'm saving up for (more like convincing myself to buy) a Vive Pro.