Youngfags will never know the feeling of a single upgrade doubling or tripling your system performance

>youngfags will never know the feeling of a single upgrade doubling or tripling your system performance

Is that a sound card?

i wish we could get a third company to compete in graphics and processors again

not gonna happen with intel and novidia running the show
they just buy/sue the shit out of young companies

Adreno, PowerVR, or Mali desktop GPUs would be most likely

GPUs are still improving quite a lot every gen. It's still easy to see a doubling or tripling in performance for every purchase if you just don't buy a new card every single generation

>Mexico
You have to send it back

>doubling or tripling
if you upgrade maybe every 5 years

I more than tripled my performance in my last upgrade in 2016. You just have to wait a bit longer these days to do it.

>mexico

This triggers the trumptard.

Only consumerist retards would upgrade every generation without a real reason.

what retarded reply is that
go fuck yourself

I never said that you should upgrade more often
learn to read, you horrible subhuman

the SSD i bought two years ago doubled the I/O performance and now everything runs smoother

he agreed with you, faggot

Going from a 386SX-25 to a 486DX4-100 was insane

I went from a GTX 660 to a GTX 1070. Feels good.

sounded more like he didn't understand my post

>went from a 4850 + olde AMD dual-core to this

This is roughly a 1200% increase

No, it's a Raspberry Pi

>1996
>games go from 320x200 @ 20fps to 640x480 @ 30fps thanks to the first Voodoo

>1998
>going from that to 1024x768 @ 60fps on the high-end thanks to Voodoo2 and SLI

>1999
>1024x768 @ 60 is now standard thanks to Voodoo3

How the fuck is it still taking us years to get to proper 4K?

>implying AMD is innocent in this behavior

a fraction of intel and nvidia

>upgrade 64mb pc100 to 128mb pc133
Day and night

I recently upgraded from a Phenom II x4 955 BE to a dual Xeon v3 system. I now have nearly 15x the performance in the workloads I use the most.

I also have 64GB DDR4 ECC (and can upgrade to a total of 1TB across 16 slots). Finally I have a computer that will last me a decade.

This has lasted and will last me a long time. All my games are old classics from the 1996-2004 era, so no fancy $400 graphics card is required for them. Everything else (photoshop/web surfing/e-mail/video encoding, etc.) is handled equally well. Storage needs are another matter; my server is pushing 10TB with maybe 1.80TB free.

The Voodoo 3 2000 was a kickass card back in the good ol' days.

It's a GPU, are you blind?

>Soundcard with VGA connector

fucking millennials....
he can't even comprehend that GPUs used to not have fans

this has to be bait.

even within a single generation
upgrading from a 25 to a 33 MHz 386/486 non-multiplied CPU also increased RAM and cache speed with it

hard disks improved rapidly in speed; the cost-no-object SCSI drives of one year performed like the budget IDE drives of the next

in terms of what you could get out of it at its peak, your whole system was basically worthless in only two or three years, and you felt it too. today you can still usably run modern versions of Windows, Office and web browsers on an 11-year-old Core 2 system; back then, you'd have been lucky if Windows 95 even got to the desktop before you killed yourself out of impatience trying to load it on that 386SX-25 you paid $1,500 for three years before

>How the fuck is it still taking us years to get to proper 4K?

>Voodoo5
That was their first mistake... and the last one.
We could have had 4k in 2005~10 if they hadn't fucked up.

Try to play quake at 640x480

>fucking shit

upgrade from matrox mystique 2mb to 3dfx voodoo rush 6mb

>pic related

320x200@20fps = 1,280,000 pixels a second
640x480@30fps = 9,216,000 pixels a second
3840x2160@60fps = 497,664,000 pixels a second
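For anyone who wants to check the arithmetic, it's just width x height x refresh rate; a quick sketch (Python, my addition, no benchmark data involved):

```python
# Pixel throughput = width * height * refresh rate, for the modes quoted above.
modes = [
    ("320x200 @ 20fps", 320, 200, 20),
    ("640x480 @ 30fps", 640, 480, 30),
    ("3840x2160 @ 60fps", 3840, 2160, 60),
]

base = modes[0][1] * modes[0][2] * modes[0][3]  # the 1996 baseline
for name, w, h, fps in modes:
    px = w * h * fps
    print(f"{name}: {px:,} px/s ({px / base:.1f}x the 1996 baseline)")
```

4K60 works out to roughly 390x the 1996 baseline and 54x the Voodoo2-era figure, which goes a long way toward answering the 4K question above.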

Ageism is so toxic it has nowhere to go besides bringing us all down

even if this were true, why does it matter whether they know?

fuck off you purple hair snowflake

might be mistaken, but can a 386SX run win95? i seem to recall win95 only working on 386DX and up

I hope this is b8

This post gave me aids.

Literally has 3dfx written on it in massive letters

Yeah, the SSD is this generation's Voodoo

>no hdmi
>no heatsync
>no LEDs
>no nVidia/ATi logo
yep, definitely a sound card, there aren't any other kinds of computer cards

That's arguably Nvidia's fault though; they keep releasing overpriced incremental upgrades.

I get that reference.

>heatsync
>heat
>sync

You seriously fucking think it's a "heatsync" and would have carried on with that had I not pointed this out right now

You're just as dumb, you fucktard idiot

slow clap

>240hz

enjoy your 120hz with fake doubled frames

You can as far as I know, the SX is still a 32-bit chip to the software running on it.

But it's very, very slow.

since a few weeks ago there are real 240hz LCDs

CRTs have always been 240hz capable, and you have always been able to overclock 144hz LCDs to 240-270hz using aftermarket hardware

>tfw built first PC at end of 2011
>chose an HD7870 because it seemed like best bang for buck at ~$250

>decide to upgrade last November
>paid $260 for a fury
>doubled performance at minimum, tripled it more often

I don't know what you're talking about OP

>five years
>merely doubling
OP's talking about a single generation or even a couple speed grades' worth

in 1996 that would have been going from a 286-16 with a shitty 16-bit CRT controller to a Pentium 166 with a Voodoo

GPUs are improving just fine, because of their nature (parallel computing, throw more cores on the chip and you're gucci...)

CPUs are stagnating because you can't just put 4000 cores on the chip and forget everything else, you need meaningful transistor shrinkage to make a faster chip. Transistor shrinkage is halting due to quantum barriers, there's a reason Intel is doing 3 generations of 14nm when they used to do a shrink every other generation.

I have no idea where this inequality of advancement is leading, perhaps CPUs will become deprecated in some way, maybe a new semiconductor that allows further shrinkage will be found. time will tell. Until then enjoy your skylake rehashes and +2% performance gains per generation.
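To put numbers on that cores-vs-shrinkage point, a standard illustration (my addition, not the poster's) is Amdahl's law: speedup = 1 / ((1 - p) + p/n) for a parallel fraction p of the work on n cores.

```python
# Amdahl's law: speedup from n cores when only a fraction p of the work is parallel.
def amdahl(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

# Graphics-style workload (~99% parallel) vs. typical desktop code (~50% parallel).
# The p values are illustrative assumptions, not measurements.
for p in (0.99, 0.50):
    for n in (4, 64, 4000):
        print(f"p={p:.2f}, n={n:>4} cores -> {amdahl(p, n):6.1f}x speedup")
```

With p = 0.5, even 4000 cores top out just below 2x; that's the whole story of why GPUs can keep going wide while CPUs need real transistor gains.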

>heatsync
>posts smug anime

Yep, seems about right.

This post rings true as fuck. Newfags will never know the utter pain of having to, no, actually being forced to buy a new system every few years. You're lucky these days with your 2500k and your software that is essentially the same as it was 10 years ago. Computer equipment was expensive as fuck and only the rich or very serious hobbyists could afford to buy it.

Don't forget gpus are about 20 years behind cpu development. There are still some "easy" performance gains to be had.

I don't understand why people can't just accept that eventually this was going to hit a wall and slow down, maybe even halt. Moore's law wasn't an actual law, after all, it was just an observation. You can only improve something so far.

>taking a picture of ULMB

Using an old computer is totally fine tho. I used a system with a Pentium 100 and a 1MB 2D accelerator (no 3D) until about 2003. I couldn't play most games released after 1998, but it was fine.

You can't see blur with a screenshot user. I think you are confused.

not with single upgrades

So why post it?

Because the person I replied to didn't think real 240hz monitors exist and they're all falsely advertised 120hz?

Posting a web app that detects your system resolution is hardly proof; it just shows that you're one of those guys who bought a nice monitor so they could have blurbusters bookmarked for immediate access to show people

>All my games are old classics from 1996-2004 era so no fancy $400 graphics card required for them

You don't need it for modern games either.

95% of existing modern games (let's say post-2004) will run fine at 720p and default settings on that machine and look better than the older games. No reason to upgrade seeing as you're not a graphics whore.

So what op is saying is that we should feel sad that we can keep our rigs longer, compared to when we needed to replace them every year?

Based 3Dfx.
Actual first and last company to make a real difference in 3D graphics.

>Posting a web app that detects your system resolution is hardly proof [that you have a real 240hz monitor]
>it just [proves that you have a real 240hz monitor]

>tfw Fallout 4 30fps at 900p on integrated graphics

Using Win95 on a 386 is fucking retarded. I would not even attempt it; that is a machine for MS-DOS, and that processor was simply not designed to handle a graphical environment. Can that shit even run Windows 3.1 decently?

My best friend back in the day had a 5500; it was like playing on god's own computer.

>have normie computer with no GPU
>get a single GPU upgrade
>tripled performance

lol agp

>mfw I used a Pentium 150 until 2008

Man it sucks to be poor and a millennial.

Still had my fun with it for many years. I just played tons of old games while my friends were playing on their ps2/3 or pentium4/c2d boxes; even the poorest of my friends had a pentium 3 at least.

You want to know something funny? Around 2006-7 I had saved some money to get a second-hand ATI 3D Rage XL so I could play GLQuake and some other stuff, and the piece of shit did not work so I got fucked :)

Had to keep using my Trident 9440 until my uncle had pity on me and bought me a computer (a Pentium III lol, day and night though); then, almost reaching 2009, I got a Sempron Thoroughbred, so my jumps really were enormous.

Always wanted a 3dfx card, but didn't have one.

Reaching adulthood and working, I first got a Sandy Bridge Celeron dual core, then an i5 3470. I like to only make big jumps.

A luxury; most mortals didn't have an AGP slot and had to use the shared classic PCI bus for video cards.

>thinks this is bad
I used a shitty dual core processor with an integrated GPU and 2GB RAM up until 2 years ago.

I'm ok now, but I was pretty badly off in my underage years (until I turned 20 in 2013, more or less)

I got the Core i5 like a month ago, so up until late 2016 I had the Celeron with 4GB of RAM. I had a Radeon 4670 though; when its capacitors blew up, I replaced it with a GT730 (pretty sure it's worse) that I still have.

Never had a high-end videocard unless you count the Radeon 9800 Pro, but I had it like 5 years ago because I got it cheap for my previous Sempron, not like I had it in 2003.

Never had a high-end CPU except the one I've had now for a month.

I think maybe 75% of the people here have used a laptop their whole life until getting a modern desktop 2-3 years ago

> will never know the feeling of a single upgrade doubling or tripling your system performance

I still remember that feeling. I also know that I'll never feel it again.

Probably the closest I'll ever come is going from a 500 MB/s SATA SSD to a 3500 MB/s PCIE M.2 when I upgrade my Ivy Bridge to an Icelake.

A 7x increase in disk speed plus a 20% increase in CPU speed will probably make my system feel a little bit snappier. But it won't be the kind of massive gains I remember from the 90s.

It was a shit time with underpowered, overpriced radioactive computers that couldn't do jack shit.

Any DX I've ever had will run 3.x pretty effortlessly. An SX is hit-or-miss, not really the best but perfectly usable for a typical unitasking use case.

>toxic

Anytime you find this in a post you can safely ignore it.

sure you could as long as you still ran the contemporary OS and software on it

there's no way in hell you were running XP and Office 2k3 well on that P100 though, for example, while a low-end C2D box will run 10 and 2k16 just fine for the majority of users

you could see it both ways

the way I see it, it's just commenting on how funny it is to watch people on Sup Forums collectively jerk themselves raw over 5% gains from generation to generation and whine about "how bad we have it" now

but maybe he could also just be doing that, nostalgia can be retarded that way

You need to synchronize the heat in the system evenly to obtain maximum ch'i. Basic metaphysics.

>he thinks using a modern post-moore's law slowdown dual-core chip with 2 GB of RAM is at all similar to slogging through even the simplest tasks with a 12 year old low-end Pentium that was shit even when it was new

>Sup Forums collectively jerk themselves raw over 5% gains from generation to generation and whine about "how bad we have it" now

Let retards be retards. I think it's great that today isn't like the 90s; I like the idea of buying a computer every 5 to 10 years, I get to make big jumps without going broke.

same, there's never been a better time to be into computers, new or old, than now

you can enjoy all the nicest and most fascinating pieces of the '90s at a fraction of their original cost and get much more use out of them with all the software, documentation and free support you can handle from the internet

all while your daily driver will remain useful for up to a decade or even more and decent upgrades are cheaper than ever

no way in hell I'd ever go back to the '90s, maybe if only to save some of the incredibly rare big iron and enterprise(TM) hardware that was lost to that decade

Just went from an Ivy Bridge i3 at stock clocks and a 7770 to a Skylake i3 at 4.6GHz and an RX470, felt breddy good

I went from an ATI Radeon 9800 Pro to a 7970. Pretty sure that is more than tripling.

>blame it on newer generations being younger instead of on the corrupt companies and the broken economies that support them

Sounds more like you didn't understand your own post.

>it's another "I think moore's law is actually a law and da corporashuns are just sabotaging it for some reason" episode
the laws of physics and relatively decreasing software requirements would eventually catch up, you know

>Youngfags will never know the feeling of singlecore processors
>Graphics cards so shitty they have problems with desktop graphics
Compared with my 2007 PC, I have increased my graphics performance by about 8000%.

>My first graphics card was a MagicMedia 256AV
>You can't even imagine how shitty it was
>It ran SubCulture on my 1998 laptop

yea well i'll go from a 5450 to a 980ti

more like almost 10x performance

agreed. computing wasn't really fun when you had to be constantly plugged in to AC power / hardware was incredibly expensive / google wasn't there to answer your problem and you could get stuck in a dead end / hard disks crashed if you sneezed near them and microsoft had an ungodly monopoly.

I don't understand how I survived with this shit until 2013.
gpu.userbenchmark.com/Compare/Nvidia-GTX-1070-vs-ATI-Radeon-HD-2400-PRO/3609vsm8931

>Being a closet nvidia-cuck
>This much denial

Hahaha! Wow.

>i5 3470
truly the best bang for buck CPU

i5 2500k might be a good deal because of overclocking, but you need to spend more on a motherboard to go with it, and probably replace the boxed cooler as well

>Probably the closest I'll ever come is going from a 500 MB/s SATA SSD to a 3500 MB/s PCIE M.2 when I upgrade my Ivy Bridge to an Icelake.

Protip: there is almost no real world difference between these two outside of some very niche uses.
HDD to SSD is a massive improvement in overall system performance. SATA SSD to M.2 SSD really isn't noticeable outside of benchmarks or large sequential transfers.
Just so you don't set yourself up for a disappointment.
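A back-of-the-envelope model of why, with illustrative numbers only (the sizes and latencies below are assumptions, not measurements): a typical app launch is dominated by small random reads, which are latency-bound, not bandwidth-bound.

```python
# Toy model of an app launch: some sequential streaming plus many small random reads.
# All figures below are illustrative assumptions, not benchmark results.
def launch_seconds(seq_mb, seq_mb_per_s, rand_reads, latency_us):
    return seq_mb / seq_mb_per_s + rand_reads * latency_us / 1e6

# SATA SSD: ~500 MB/s sequential, ~80 us per small read at low queue depth.
# NVMe SSD: ~3500 MB/s sequential, ~70 us per small read at low queue depth.
sata = launch_seconds(seq_mb=200, seq_mb_per_s=500,  rand_reads=20_000, latency_us=80)
nvme = launch_seconds(seq_mb=200, seq_mb_per_s=3500, rand_reads=20_000, latency_us=70)
print(f"SATA: {sata:.2f}s  NVMe: {nvme:.2f}s  -> only {sata / nvme:.2f}x faster")
```

In this toy model the 7x sequential bandwidth jump nets only about 1.4x on a launch, whereas HDD to SSD cut random-read latency by roughly 100x, which is why that upgrade felt massive.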

This. I also planned a PCIe SSD for my X99 system but don't see a reason yet. I will upgrade once 2 TB NVMe becomes affordable; I would spend around 500 €, not 1300. My 1 TB SATA SSD is also nice until then.

all locked Ivy chips can be turbo-clocked to +4 bins above max turbo, and some boards have MCE locking all cores to the max multiplier.

I just put together an ancient Asus P67 board and it is pushing a 3550 @ 4.1 on multiplier alone, 4.2 with a 103MHz BCLK.

3470: 4.0 / 4.12
3550: 4.1 / 4.23
3570: 4.2 / 4.36

(multiplier only / with 103MHz BCLK) is attainable on all Ivy chips. Up to 105MHz is stable for most, and some configurations can go higher.
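If you want to sanity-check that table, it's just multiplier x BCLK; a quick sketch (Python, my addition; the +4-bin multipliers follow from the stock max turbos of 3.6/3.7/3.8 GHz, and small differences from the figures above come down to rounding and the exact BCLK used):

```python
# Effective clock (GHz) = turbo multiplier (max turbo + 4 bins) * BCLK (MHz) / 1000.
multipliers = {"i5-3470": 40, "i5-3550": 41, "i5-3570": 42}  # after the +4 bins
for chip, multi in multipliers.items():
    stock = multi * 100 / 1000  # 100 MHz BCLK
    oc = multi * 103 / 1000     # 103 MHz BCLK
    print(f"{chip}: {stock:.2f} GHz at stock BCLK, {oc:.2f} GHz at 103 MHz")
```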

>tfw buying a Voodoo 5 when GeForce was already out
I regret wasting my dad's money like that.

Man, I remember owning one of these. Actually I had one and my dad had one too.