Remember when the 8800GT was king and this card was recommended for every new build?

Remember how it only cost $200-250 new?

I had a 9800 GT that served me some good long years. I'd probably still be using it today if it hadn't burned out.

I remember the simpler times when the personal computer wasn't as "niche" as it is now.

Building a computer today involves paying $200 for 16GB of RAM and $600+ for a "good" GPU that holds up against terribly unoptimized games.

I had a 9800GT+, which was a remake or whatever, for a long time up until a year ago.

I remember buying a 7800 GTX and paying more than $500 for it, so not sure what your point is.

Shit, I remember when the TNT was king o' the pile.
I don't recall how much it cost though, that was a long-ass time ago.

The earliest card I remember buying said "GeForce 3" on the box.

This was 2002-ish from memory.

I bought the 8800GTX at launch so the 8800GT never appealed to me. It was a good budget card though.

The GT series wasn't flagship cards.
Stop comparing to flagship card prices and you'll settle down.

I remember how my 7800GT played Crysis at 4 FPS and 102°C.
Some things gotta change.

the TNT was around $200

gpu inflation is nuts

ram inflation is nuts

ssd, hdd, monitor....

Wait...

It's almost like it's easier just to buy a prebuilt and an Android phone.

The equivalent 1080 is $550 USD, dood, that's almost a 3x increase in price from the $200-ish.

>you had 2x 8800GT 256MB
>Crysis ran at 30fps on high
>Bottom card temp 104°C

Good times Amma rite guys?

>The equivalent 1080
No, the equivalent is the 1070, and it's also a stupid discussion with mining going on.

Still more than double the price, and it's the worst-value 10xx card imho.

argh

>Remember how it only cost $200-250 new?
No, because it cost me something like 650 NZD.

>Wanted an 8800GT
>Poor as shit
>Oh man they sell a 256MB version for a lot less than the 512MB version, I'll just get that!
>Performance is complete dogshit compared to the 512MB version even at 1280x1024
why did i do that to myself

Monitors/HDDs are cheaper and better than ever.
GPU and RAM prices are ridiculous, true.

yeah but gas was literally $5 a gallon then

I vividly remember spending $600-odd AUD posted on a 7900GT the year before, only to have it die and get RMA'd.

Nvidia QA is like AMD's Radeon group's: they are both still terrible.

Most recently, my 1080 freezes if I enable ShadowPlay with DSR, or even just at all; can't overclock at all either.

The 390X's fan controller never worked.

GPUs are just as shit as they were 10-15+ years ago, probably worse.
>want to buy a basic 4K 144Hz screen that isn't marketed at a retarded gamer
>literally can't buy one
>the ones I can buy have horrible styling
Cheaper, sure, but you get shit refresh rates and other quality issues.

Remember when spending over $30 on RAM was just wasting your money?

aussie here

No, no I don't.

I haven't seen RAM prices this bad since SDRAM and DDR1.

I remember buying my ATI 4850 on release and was amazed by how much performance it had, and how cool it ran, for $199. Had it for two years before I stupidly made the decision to replace it with the awful 470. Ended up replacing that with an ATI 6950 a year later because I got tired of how hot the 470 ran. I miss single-slot cards.

You can still get single-slot cards with decent performance; they just throttle like crazy and are loud as hell. It's just impossible to cool much more than 75-100W in a single slot form factor without tons of compromises.

I have a G92 8800GT sitting right next to me.
Planning on flashing it to a 9800 and volt-modding it to get more clocks out of it.

Remember how inflation exists and the price of stuff goes up over time?
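For scale, here's a quick back-of-the-envelope CPI adjustment of the card's launch price; the index values are approximate US CPI-U annual averages, used only for a rough estimate:

```python
# Rough CPI adjustment of the 8800GT's ~$250 launch price (2007) into
# 2017 dollars. Index values are approximate US CPI-U annual averages.
CPI_2007 = 207.3
CPI_2017 = 245.1

launch_price = 250.0
adjusted = launch_price * CPI_2017 / CPI_2007
print(round(adjusted))  # roughly $296 in 2017 dollars
```

So plain inflation only gets a $250 card to roughly $300, nowhere near $550.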

>about to buy new pc in late 2007
>been saving for quite a while
>only can afford entry level HD card
>8800GTS is out providing twice the performance at the same price
>sold out everywhere
>mate working at retailer puts one behind the counter for me
>forget to upgrade psu for hungrier card, only notice once computer arrives
>"nah, it'll be okay"
>half the games i play have terrible screen tear and artifacts
>"it's nothing"
>6 months in the card catches fire and fills my room with thick black smoke
>buy 5670 and new psu - typing this on the same pc
Still kinda mad I destroyed it by not paying $20 more for my PSU.

>bought the best hardware i could afford in 2006
>512MB DDR2 RAM (bare minimum, got more later; had to compromise somewhere, and better to start with less RAM since I could always add more)
>7600GT
>Athlon 64 3500+
>new mobo
>kept my current 350W PSU, knew it was right at the limit, but hoped it would last long enough
>maybe a couple weeks in, the PSU popped
Didn't kill anything though, just got a new PSU.

> inflation
The 1080 costs twice as much now while being one step below the top card.

Oh how I loved my 9800gt

>tfw paid $40 for DDR1 back then

Remember when you didn't have to pay an arm and a leg to play something with your friends?

The cards that lasted longest with me were two overclocked 560 Tis.
Bought the second one used, like a year after the other, for pretty much pocket change. Ran everything I needed at 1080p without issues.

Remember the Quadro FX 4600 that came out at the same time and cost $2,800?

I still have and use my working 9600 GT.

What do you use it for, friend?

>tfw an RX 580 and 16 GB of 3000MHz DDR4 cost the same as a GTX 1070 Ti and just 8 GB of RAM

Which should I go with?

>Nvidia

They've been raising GPU prices for years and Sup Forums gobbles it down as if it were a gift from god. The Titan Z and now the Titan V are cards really pushing to see what chumps will pay. Hell, the Titan line in general is raising prices just so when Nvidia drops the xxxxti equivalent at a silly high price they can say "look goy, only 10% slower than the Titan for only $899.99! What a bargain, eh goy?".

AMD isn't exempt from this either, but they have been far less aggressive with such pricing tactics.

AMD costs as much as nvidia nowadays due to mining and still manages to be slower than nvidia.

Best bang for buck right now is either an nvidia card with some overclocked i5 for gayman, or a Ryzen + nvidia setup for productivity + gaming.
When a Vega costs as much as a 1080 Ti you know there's no way to defend AMD, m8.

Biggest reason for AMD: Linux.

You have missed the point entirely. Mining causing GPU prices to skyrocket is a relatively recent trend (and "slower than Nvidia" is a meaningless claim, as that blatantly isn't a universal truth). Nvidia has been raising the price of every flagship generation after generation - sometimes by huge amounts - and suckers buy into it generation after generation and actively praise Nvidia for it, precisely because Nvidia created the Titan line with an absurd price just to make said fucking of consumers feel better, since now you have something to compare it to.

I can run anything perfectly fine on linus and I'm using an nvidia card.
I don't see where this meme that nvidia doesn't work with linux comes from. Just because AMD has "free drivers" doesn't mean nvidia won't work, you know; else no one would use nvidia for research work and GPU computing.
You will need a better excuse, unless you just love going for the underdog which costs more than the top one.

I literally only moved on from an 8800GT because it didn't support the newest DirectX that vidya was moving over to.

Played Fallout 3 and Oblivion on nearly maxed settings.

>linus
kek.
I meant linux but whatever this is a funny typo.

Prices of everything are going up, my dude, and anything that uses RAM gets it even worse, like GPUs.

>buy new FX 8350 because the A8 wasn't the one
>already have an R7 260X
>literally the shittiest 500W PSU I've ever seen, came with the case but I never thought to replace it
>obviously 500W is hard for it to give out
>using PC in the morning
>PC turns off suddenly, sparks fly out the back and the room begins to smell of melted solder (like electronics class at college or something)
Literally never buy anything from CiT, ever. Fortunately all my parts still work as if this never happened, but I was down a PC for about a month.

That's why the best setup right now is a cheap AMD GPU for linux + an nvidia card for GPU passthrough in Windows.

The AMDGPU driver has been built into the kernel since 4.10, so every new distro supports AMD GPUs out of the box, while a complicated setup is required for nvidia drivers on most distros. It probably just seems easy to you because you use a distro which nvidia has allowed to install its drivers automatically (Ubuntu, Linux Mint); otherwise you'd have to go to nvidia's website, download their linux driver, open up the tar file, and then run the sh file from a tty terminal with X disabled to install it. There are many bugs in the nvidia drivers (I remember my GTX 970's HDMI ports not working for some reason), and performance is subpar compared to the amdgpu drivers, especially with Wine, because amdgpu uses Mesa, which has the Gallium Nine compatibility layer for near-native DirectX 9 performance. Not to mention nvidia's shady practice of sabotaging the open-source nouveau driver by making newer nvidia cards require signed firmware, so the nouveau team could not write their own firmware for the cards; nvidia also refused to give any documentation or help to the nouveau team. AMD, by contrast, gives ample help to the AMDGPU driver team, even having many AMD employees working full time on the open-source driver, and as a result the AMDGPU driver flourishes, outperforming both AMD's and NVIDIA's proprietary linux drivers.

I fucking despise nvidia, but damn, that's a sexy card.

I feel like the 980 Ti is going to be the new 8800GT. When overclocked to about 1500MHz or more, which almost any card can do with good cooling, it performs on par with the GTX 1080 most of the time. And it only costs $350 used right now.

>otherwise you'd have to go to nvidia's website download their linux driver, open up the tar file, and then run the sh file from a tty terminal with X disabled to install it

That's not really hard to do, and I have been using Debian at home and Fedora at work for ages.
Downloading and installing drivers has to be done if some guy is using Windows too; I don't see this as some huge issue for someone who already knows how to use linux. But I also understand some people want it to just werk without having to press any key, I guess.

It's a pain in the ass, especially when you're doing a graphical install from a live CD, which can't distribute the nvidia proprietary drivers because of Nvidia's EULA. So before booting you have to blacklist nouveau in GRUB (or you'll get a black screen, since nouveau supports no 10-series cards and never will, and only some 900-series cards), and you have to do the whole installation at 800x600 because you're using the EFI's CPU rendering for graphics, until you can download the nvidia drivers and install them. Unless you're using a very easy linux distro like Ubuntu or Mint (which I like to call Windows imitators), AMD cards are far easier to configure than an nvidia card.
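For anyone who hasn't done it, the manual route described above looks roughly like this; a sketch only, assuming a Debian/Ubuntu-style system, and the installer file name and version number are just placeholder examples:

```shell
# 1. Blacklist nouveau so the proprietary installer can take over the GPU.
#    (Can also be done once at boot by adding "modprobe.blacklist=nouveau"
#    to the kernel line in GRUB.)
echo "blacklist nouveau"         | sudo tee    /etc/modprobe.d/blacklist-nouveau.conf
echo "options nouveau modeset=0" | sudo tee -a /etc/modprobe.d/blacklist-nouveau.conf
sudo update-initramfs -u   # Debian/Ubuntu; use "dracut -f" on Fedora
sudo reboot

# 2. After reboot, switch to a TTY (Ctrl+Alt+F3), stop X, and run
#    the installer downloaded from nvidia's site.
sudo systemctl isolate multi-user.target
chmod +x NVIDIA-Linux-x86_64-390.25.run   # file name/version is an example
sudo ./NVIDIA-Linux-x86_64-390.25.run
```

Compare that with AMD, where the in-kernel amdgpu driver just loads on boot.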

It has some stiff competition from the 7970 and 290X; both cards are still great many years after release. Hawaii in particular is aging well, as it handles high(er) resolutions better than it has any right to.

Yep.
My 8800GT still beats most integrated Intel GPUs.

Remember when GPUs weren't as useful or powerful?

>3d accelerator cards

>AMD costs as much as nvidia nowadays due to mining
What 3rd world shithole do you live in? USA?

>to about 1500mhz or more, which almost any card can do with good cooling,

>overclocking a card that has possibly been burned in at its original clocks for years
>having to add a $50-100 cooler shifts the price/performance ratio a decent bit
Doesn't seem very prudent to me; I don't know that much about GPUs, though.

Most 980 Tis came with pretty beefy coolers.

The 1070 Ti has like 40% extra performance...

>q6600+8800GT
Those were the days.

Better buy a 1060 at $299, goy.

I jumped from that to an i7 960 with a 570.
In all these years I've only upgraded the GPU and the storage with a faster SSD, but I'm still using that mobo and CPU.
For whatever reason I haven't felt I'm lacking speed in anything, so I'm still using it; it's a weird thing considering it's going to be 10 years old soon.

>NVIDIA BIOS Editor
>free overclock
>native support for hackintosh
>crossflash to 9800GT
>crossflash to Quadro FX 3700
It was the GOAT.

What's with the BIOS editor?

Similar situation: bought a Q6600 with a 9600 GT in 2008, then in 2010 bought an i7 950 with an X58 Sabertooth mobo; the only things I've upgraded are the RAM and GPU.

As far as I'm concerned, since I've completely given up on gayming, this board and CPU should last forever, since programs are never going to catch up.

Besides, I bought a 2nd-hand R720 for Christmas; I'll never buy new stuff again with the ridiculous amount of power available on the 2nd-hand market.

When the fuck is someone going to seriously make full use of an i9 for something that wouldn't be better served by a dual-CPU virtual server box? Just utterly ridiculous computing power for the consumer market.

Yeah, I was lucky at the time since RAM was cheap around here and I maxed it out from the start (24GB, 2800MHz OC). I'll probably buy a cheap X5650 from eBay since it's the 6-core Xeon equivalent of that gen's i7 Extreme, but even so I always end up asking myself why I need 6c/12t if my 4c/8t can still handle anything I need.
To make it worse, the only two games I've played in the last 2 years were Sleeping Dogs and Civilization 5.

My dual 8600GT 256MB handled Crysis a bit better, boi.

Remember how the 8800GT was the LAST mainstream GPU from nvidia recommended at its launch price, before the GTX 460 1GB, which came 3 years later?

I got a 980 Ti for Christmas.

I used mine from 2007 (preordered before release) until 2015. Lovely card.
Sold it working.

I am still using the 9800GT. If not for lazy developers I could still play the newest games. (I don't mean AAA, but Overwatch or Fate/Extella should be able to run; they crash with a video card error instead.)

No driver support from Nvidia.

Yeah, and I remember paying $300 for an FX 5600 and having it run like UTTER SHIT in everything too.

Still run a 9800GT and a GTS 250 at work.
Have a 9400GT at home.

>It's just impossible to cool much more than 75-100W in a single slot form factor without tons of compromises
Aren't newer architectures supposed to be more power efficient? The RX 470 and 1060 could have single-slot cards; they consume practically the same as a 5770.
The RX series is pretty decent power-consumption-wise anyway, whereas the previous R9s were just disgusting monstrosities surpassing 250W. I would never put something that draws over 150W in my computer.

same