Time to let it go?

Well, it's technically the best quality we can get from an analog signal, so it will stay alive as long as analog video does.

HDMI had too much delay ;(

How else will I get my 3840x240@120Hz modeline to work?
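For the curious, the arithmetic behind a custom modeline like that is just pixel-clock math. A minimal sketch (the blanking fractions below are rough assumptions, not exact GTF/CVT timings):

```python
# Estimate the pixel clock a custom modeline needs, in MHz.
# Blanking fractions are assumptions (~25% horizontal, ~5% vertical),
# not exact GTF/CVT values.
def pixel_clock_mhz(h_active, v_active, refresh_hz,
                    h_blank_frac=0.25, v_blank_frac=0.05):
    h_total = h_active * (1 + h_blank_frac)
    v_total = v_active * (1 + v_blank_frac)
    return h_total * v_total * refresh_hz / 1e6

# 3840x240@120Hz lands around 145 MHz -- easy for any VGA DAC.
print(round(pixel_clock_mhz(3840, 240, 120), 1))
```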

I'll never let you go baby.

It's dead for home displays (except for enthusiasts going out of their way to use old CRTs, I suppose) and has been for a few years now. It will live on in corporate a while longer due to the prevalence of VGA projectors.

tfw still use monitor that only has vga
tfw want a new gpu but they dont have analog out anymore

I saw an HDTV with it

Your screen must be sub-1080p if it only has VGA. Why would you need a new GPU?

>Worries about analog
>Crts haven't been made in 20 years

tfw vga to dp adapter

Current video interconnect cables are getting obnoxiously thick due to shielding and DisplayPort is now at the limit for physical bandwidth and is introducing lossy compression as standard.
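A back-of-envelope check of that bandwidth claim (the blanking figures and 10-bit RGB are assumptions; HBR3 lane rate and 8b/10b coding are from the DP 1.4 spec):

```python
# Uncompressed video data rate vs. what a DP 1.4 link can actually carry.
def stream_gbps(h, v, hz, bpc=10, h_blank=160, v_blank=62):
    bits_per_pixel = bpc * 3                  # RGB, no chroma subsampling
    pixels_per_sec = (h + h_blank) * (v + v_blank) * hz
    return pixels_per_sec * bits_per_pixel / 1e9

hbr3_payload = 4 * 8.1 * 8 / 10               # 4 lanes x 8.1 Gbit/s, 8b/10b coding
# 4K120 10-bit needs ~32 Gbit/s but HBR3 carries ~25.9, hence DSC.
print(stream_gbps(3840, 2160, 120) > hbr3_payload)   # True
```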

Fibre digital video cables when? Would be much thinner and much faster.

Future is parallel thunderbolt

There is zero reason to kill VGA, except if you want a digital image on 1366x768 monitors. And VGA can handle 1080 really well.

It's a 1080p.
Probably one of the last ones that had only VGA, because there is a blanking plate where DVI would go.

screw ins can suck my fucking dick, HDMI cable best cable.

You misspelled DisplayPort

Projectors

It can handle 2048x1536 really well also.

i guess you've never worked with server hardware or had a real laptop

let go, op

980 and prolly Ti has it

Pretty sure VGA projectors are way past their lifespan now. When a cheap office projector's bulb dies, companies don't bother replacing it, they just buy a new projector, all of which have had DVI or HDMI for over a decade now.

>15 pins
>monitors have over a million pixels

Oy Vey! Goyim you better start using these new cables with copy protection™ integrated! You need that HDCP for maximum fragkills, my friend!
Oh, what's that? You are one of those pirates that is still sailing the free seas without having to be cucked in a private tracker?! You hate copy protection? *gasp*
Oye, we better ramp up these VGA and DVI hate posts then!

And now unironically: I'm serious.

From my cold dead hands

>HDCP
we are going to get our HD torrent anyway

VGA is a boon for 1080p TVs; it gets past all the bullshit processing and overscan, and as long as you use a decent/short cable you'll have a hard time telling the difference between HDMI and VGA.

>headphone has 3 pins
>song library is 4 songs

Asking if a technology is dead means you are a retarded shit stain, what the fuck did you expect?
If any device you have still uses it, you are still using it.

Nah, it'll just fade out of general use like DB-25 and DB-9.

Never. I'm actually really glad my monitor has two inputs; one VGA and one DVI. I'm able to have my main workstation connected over DVI, and my Mac connected over mini display port->VGA because it allowed me to buy a cheap 2-way USB share switch instead of buying an expensive kvm switch. I just flip a switch on my km and tap the input button on my display to go from being productive to shitposting and back again.

>inb4 you realize your server only has a vga port.

VGA supports resolutions higher than 1080p, retard.

>changes interface every couple years
>requiring new cables or adaptors
fuck displayport

And so do all switches, etc. That's why good laptops have VGA in.

I LOVE VGA in the workplace. It's rugged as fuck; shit doesn't come disconnected on its own. We go through so many more HDMI cables than VGA.

I just wish DVI were being expanded further as a standard. We need MORE ruggedised cabling, not less. I'm really not looking forward to USB-C being common in the workplace; the cables are fucking trash even when built to spec.

DVI was always kind of a stopgap for when you wanted a digital signal, since VGA does pretty much the same as single-link DVI, but yeah, I'm with you on HDMI and USB-C just not being durable connectors. I've had cheap HDMI cables fail many times, but every single VGA cable I have still works just fine.

I switched to DVI a few years ago.

You could use a USB3 adapter, good luck if you want to game on it though

Use Display Port, plebs

9xx series of Nvidia cards is the last gen to do analog. Because of this I'm going to be holding on to my 950GTX for a while.

You obviously don't work in IT.
DisplayPort's locking button is a massive problem with normies. Everyone thinks it's an HDMI cable and they tend to yank it out with a lot of force or bash on it till it comes out. Sometimes this breaks the cable, sometimes this breaks the connector on the host.
Had to replace a projector because someone in the business department pulled on the thing so hard that it ripped the connector's PCB out of it. It even had a fucking label that said "press the button to disconnect".
It's too smart of a connector for the masses, my friend. Can't do that with VGA.

Most projectors are barely used so their bulbs last decades.

I think the only time most projectors get replaced is when a company moves to a new location and just doesn't bother taking their old junk with them.

That's why you buy the cables without locks if normies are going to be fucking with them

>Time to let it go?
Time for (You) to go away

If only I was the purchasing manager I would ;_;
The cables we've got really, REALLY lock it in.

HDMI to VGA adapters are cheap now, and work fine.

USB to VGA is millennial-tier retardation.
Probably only invented for that "lol one port" Applebook.

it's not (just) projectors that'll keep VGA alive, it's KVM switches in server racks. Those could be switched to DVI, DP, or HDMI, but there's no incentive to, it'd be more expensive for no benefit. An analog signal is trivial to switch and when all you're looking at is the console of a server to try and figure out why it dropped off the network then you don't care about VGA's limitations.

>If only I was the purchasing manager

What kind of beta are you?

Replacing the cables will save the company money over time.
Get your ass to the CEO and borrow his company credit card.

Those won't work because the GTX 1000 series cards don't have an analog signal in the first place.
That's where the USB3 VGA dongle comes in; it's basically a tiny graphics card.

HDMI to VGA adapters are fucking useless. Sure, if you have a VGA LCD monitor they're fine, but for decent CRTs the RAMDAC in GPUs is so much better. The max you can do with an HDMI converter is like 1920x1200@60Hz, while a full 400MHz DAC in a GPU can go the full 2048x1536@75Hz.
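Rough check on those numbers (the blanking overheads are assumptions: LCD reduced-blanking modes add ~5%, CRT-style timings closer to 30%):

```python
# Approximate pixel clock for a mode, given an assumed blanking overhead.
def clock_mhz(h, v, hz, overhead):
    return round(h * v * hz * overhead / 1e6)

# ~145 MHz: fits the ~165 MHz ceiling of a cheap single-link converter.
print(clock_mhz(1920, 1200, 60, 1.05))
# ~307 MHz: needs a proper GPU RAMDAC (a 400 MHz DAC has headroom to spare).
print(clock_mhz(2048, 1536, 75, 1.30))
```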

Most of those adapters are active and they actively convert digital signal to analog.

I suspect it's also to prevent outputs from entering sleep mode or switching to a different screen.

In the extremely unlikely scenario where you're not a system administrator this shouldn't matter though.

>Most of those adapters are active and they actively convert digital signal to analog.
Not if they're cheap like what he's suggesting

>analogue over HDMI

wut?

Fairly sure they are ALL active.
Even the $2 ones from Ali Express.

...

They're so mass-produced by a billion different Chinese factories that you can make an active adapter cheap.

nigga most monitors have 1920 pixels max

Can I use VGA for 4k at 60hz? or Do I need DVI?

Let it go Indy.

There are 340MHz ones in the wild now, only like 20 bucks too. Not enough for the full 2304x1440@80Hz on my FW900, but enough to max out all the other useful horizontal-scan-limited combos of res/Hz, like 1200p 96Hz.
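Quick sanity check on those figures (assuming ~30% CRT blanking overhead, and taking "1200p" as 1920x1200; both are assumptions):

```python
# Does a 340 MHz DAC cover a given CRT mode? ~30% blanking overhead assumed.
def fits_340mhz(h, v, hz, overhead=1.30):
    return h * v * hz * overhead / 1e6 <= 340

print(fits_340mhz(2304, 1440, 80))   # False: the full FW900 mode just misses
print(fits_340mhz(1920, 1200, 96))   # True: 1200p at 96 Hz fits
```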

I don't think there is even a hard limit to VGA.

But you'll get more and more noise the higher you go.
Personally I wouldn't use VGA for anything more than 1024x768

You need Display Port 1.2 or HDMI 2.0

>being a monitorlet

>And VGA can handle 1080 really well.
Except that it can't. I have two identical 1080p IPS screens from LG at my office and if I connect one through VGA, even with a thick cable, it has notably worse quality than DP.

DP will carry on strong. VGA is now a legacy standard that ranges from old monitors to Full HD LED flat screens.

When will consumer electronics support the superior Display Port over the dated HDMI?

IT manager at public school
We use white cables and use sharpie to write PUSH right where you put your thumb.

either your DAC is shit or the control board in the monitor translating the analog signal is shit

When Hollywood rolls over dead

I own a Dell UltraSharp with DP, Mini DP, and HDMI.
Can't get 4k 60fps over the HDMI, it's not 2.1, just 2.0.
Mini DP doesn't support 4k 60Hz.
I've been through 3 cables of regular DP to DP.
Only the cheap ones from Frys work; otherwise, anytime I load a game the signal drops. Amazing work, bulk cable companies.

KVM switches are obsolete... I never use one, I always just use the built-in IPMI and its Java remote client.

Not going to name who I work for, but I will say the bureaucracy around a company's IT that's in the Fortune 50 list is ridiculous.
I swear to god, sometimes something has to be egregiously broken for a year to get the go-ahead to swap. Love it when software-related tickets sit in the buckets for some dev in India for months... How this company got to be so big with this organization, I'll never know.

Fuck off, cocksucker.

HDMI supports 4k 60fps, you just need an HDMI 1.4a cable

Not him but I noticed noisy VGA signals all the time when they were still common.

At least in an office environment with multiple computers in close proximity VGA can't handle 1080p.

Kvm still has its uses.

And it looks like shit. I used a VGA cable on my 1080p monitor for a while, looked like shit.

I used a VGA cable on a 1600x1200 monitor for work. It was fine.

Get better eyes

Still useful sometimes, I'm using VGA and a HDMI > VGA adaptor right now to connect my tablet to a monitor (DVI connection is used by my desktop).

I also use VGA at work for my 3rd monitor, the Dell dock I have only supports 3 monitors when one of them is using VGA. I don't know if the dock/laptop output is shit or the monitor's VGA input is shit, but there's a lot of noise in this case. Still better than just 2 monitors though and good enough for text/email/communication shit.

Understand how tech works.

I guess it depends on your environment. My screen was noisy, and after a while I got this strange effect where the image on the screen was washed out to the left or the right.

Are you telling me that an analog signal won't get distorted, especially when you're pushing more pixels?

KYS

Where the fuck do I buy a displayport 1.4 cable

For what purpose? It's for 8k displays or 4k 120fps displays

I still have, and use daily, a 1440x900@75Hz VGA monitor as a secondary display for my laptop.

It really starts to depend on your card's DAC and the VGA cable you're using, since for anything over 1080p you really start to need good cables.
I used to have a 1920x1440 19'' CRT and it only had VGA. Shit was awesome, having that PPI level back in 2004, but you honest to god needed nice cables for it or it would be fuzzy.

>anything over 1080p you really start to need good cables
Yeah, like DVI, HDMI, and DP

>2004
Man, even DVI was not that common yet, and HDMI/DP were not on the radar.
VGA was (mostly) still king in the early 2000s.

why are you still trying, everything you say is dumber than fuck

>talking about VGA in 2004
>thread is about VGA being dead now

>Post was about VGA not being able to do above 1080p
>Response talks about hardware that did it, year is not relevant
Dude, read the thread.

It can do above 1080p, just not all that well

user only says it looked fine for him because that was his only option in 2004 and he had no comparison

I use 1600x1200 VGA monitors today and they look fine.

>HDCP
nice try fbi

tfw 2 1600x900 dell monitors from like 7 years ago

Get better eyes

The butthurt and screeching about VGA implies that it's still the best standard to date.

I'm still using it, I was able to get a 1280x1024 dell lcd for $25 from a school surplus outlet.

VGA supports up to 2048*1536 you cuck.