Tfw 280x is faster than that glorified turd Kepler 780ti. Jesus, Kepler was an embarrassment


Attached: Old.png (1320x2887, 132K)

Other urls found in this thread:

en.wiktionary.org/wiki/OWO
youtu.be/sv_RqSJF2sI
twitter.com/SFWRedditVideos

k

Novideo is slowing down the 780ti on purpose. Kepler's great besides that.

what year is this?

OwO where are all the other cards? you know, the not-for-poor-people ones?

Used to have a 750ti. Man, that shit just refuses to become obsolete.

[spoiler]I'm an AMDrone now[/spoiler]

kys owofag

Attached: guru3d-far-cry-5-benchmarks-pc-1920x1080.png (682x909, 47K)

Nah, it is just that Kepler's architectural weaknesses are being exposed (weak at shading when compared to GCN 1.0 family)

>tfw my 1070
>paid 800 NZ dollars for this
>Get BTFO by every mid range gpu released in the past 2 years

I-I'm not u-upset. ;_;

Attached: 1492110426147.jpg (796x805, 159K)

You have no idea how much I regret not going with the HD 7970 back in the day. Can't be helped, at least my Kepler card is still alive many years later.

Attached: Dark Saber.png (2053x3507, 3.76M)

It's not.

I could have told you in 2013 that you'd be better off just getting a 280X.
Has nothing to do with drivers. AMD simply had a better architecture, despite it being older, and it took until Maxwell for Nvidia to catch up and finally have something worth buying.
It's down to the lack of async compute, and very low compute for the cost, and poor memory/cache controllers. The only reason they did do well on some games of the time is that they tended to have good ROP performance and games of the time didn't use as much compute, but it was clear all the way back in 2011 that games increasingly used more compute compared to rasterization every year.

Anyone who bought Fermi or Kepler was an idiot.

I paid $525 for mine plus $90 for an Arctic Accelero. It's so nice to still be able to run every new game fine on a 7 year old GPU.

Though now I regret not getting a Vega 64. I should have known it'd age well with better-programmed, higher-compute-use games like the 7970 did. A number of games released in late 2017 and 2018 run extremely well on it; it would have been worth the $600 if I'd gotten one at that price.
I was disappointed in its fp64 performance, though, and the lack of fp10/11/11 math packing.

Kepler is short on register file and shared memory capacity, and its CUDA cores are really inefficient.

>Jew3d

Kepler still sold way better than the 200 series.
This is why AMD will never come back with another competitive architecture.
No one but miners buys them, even if they're good.

>This is why AMD will never come back with another competitive architecture.
They will, out of sheer boredom.

>280X literally beating out its successor 380X
Was the 380X just shitty, goddamn

>bought sapphire 7970 GHz 3gb pretty much when it came out
>only replaced it this year and only because it died

2027. A time of great technological innovation.

After Nvidia driver update

Attached: farcry5.jpg (1015x1567, 177K)

Don't user, the AMDfag has it hard enough, be magnanimous and give him this little 'victory'.

That is much better for Nvidia, though the 780 still looks terrible.

AMD still has much smoother framegraphs and better drivers/software, though, so I really don't care about higher average FPS.

the 380x was a 285; it has trade-offs, with games where the 280x is better and games where the 285 is better, but generally the 280x had more raw power than the 285. If I remember right, the 285 was a half step to test some Fury elements.

>280x eats over 300w of power
Is it really economically viable?

the 285 was better at tessellation but that's it.

>[spoiler]you gay[/spoiler]

>OwO
en.wiktionary.org/wiki/OWO
>Initialism of oral without: oral without; in prostitution, signifies performing oral sex on a man without using a condom.

>Get BTFO by every mid range gpu
what are you talking about?

I don't like nvidia cause their power consumption is always like 100w more than their market rival's.

Attached: 86529.png (650x500, 38K)

i share an island with you, how embarrassing

380x was an oc'd 285 which was the first gcn 1.2 card which had less shaders than the 280x but had more features/better tessellation. not surprising to see the 280x faster than the 380x at all.

>7970 STILL getting over 50 FPS in AAA games

Fucking GOAT gpu

>The Radeon HD 7970 was a high-end graphics card by AMD, launched in December 2011

Attached: 1439086513052.jpg (4500x4334, 1.09M)

farcry 5 is AYyMd optimized game

Name one Nvidia GPU that aged better than their AMD equivalent

I think the market has changed. Every little dum-dum won't be going directly to the store and buying the shiniest grabix card now; he'll google up benchmarks and decide after. That's exactly what happened with Ryzen; nobody except amdrones had actually believed in them. It would be the same if Vega wasn't such a flop in every regard.

Wait, my 660ti can still run new games?

>muh power consumption
It's not a laptop, so why would you care about how fast it eats electricity? Even as far as money is involved, it works out to pitiful amounts. I will never understand this.
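The "pitiful amounts" claim can be sanity-checked with a back-of-the-envelope calculation. The gaming hours and electricity price below are assumptions for illustration, not figures from the thread:

```python
# Rough yearly cost of an extra 100 W of GPU draw.
# Assumed: 4 hours of gaming per day, $0.15 per kWh (both hypothetical).
def yearly_cost(extra_watts: float, hours_per_day: float, price_per_kwh: float) -> float:
    # watts -> kW, times hours per year, times price per kWh
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

cost = yearly_cost(100, 4, 0.15)
print(f"${cost:.2f} per year")  # roughly $22 a year
```

At those assumed rates the 100 W gap works out to around $22 a year, i.e. a couple of dollars a month.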

>7950 and 7970 still going strong.
They used to be like 50% the performance of a 780Ti. I remember struggling to choose between a 660Ti and 7950. Whoever got a 7970 back in the day is still going strong to this day.

980 ti

$100 a year means something to poorfags. I feel a need to save to fund my alcoholism and smoking.

Do you think I can sell my broken 7970 to a museum?

Hardware Unboxed shows only like a 3% performance gain with the Nvidia driver update. Vega 64 is still much closer to the 1080ti than the 1080.

Holy fuck why would you buy a 660Ti over a 7950? Even then the 7950 was clearly better.

Maybe because they used a different CPU to test?

actually, it's not quite so simple. as i understand it, AMD creates a new architecture/core/whatever a whole lot less frequently than nvidia. kepler as an architecture is different from maxwell and pascal, and since nvidia's focus is inherently on end users with more modern cards, they have no reason to optimize for kepler and thermi. since AMD is still making drivers for the rx 400/500 series which still runs on (albeit heavily modified) GCN, the older AMD cards age far better than the nvidia cards.

Thanks for pointing it out. The fine wine shit makes sense now

Attached: aVKfoMY.png (1023x868, 160K)

there was a lot of shilling back then. At the standard 800mhz clock the 7950 was 10-15% faster but had lower minimums and worse frame pacing. AMD fixed that in about a month but it was enough ammo for nvidia shills to convince a lot of people that the 660Ti was the better choice (even though the 7950 could be overclocked to 1200mhz and btfo the 680 even back then). Luckily I wasn't one of them.

I kept waiting and hoping for AMD to step up to the plate and was dead set on getting a Vega 64, but cryptominers happened. Then I found out that NoVideo laptop graphics are the same as desktop ones this round; add that to finding a laptop with a 1070 for $999 and I joined the dark side.

Still got a 280x in my desktop which is still good outside of BE, guess I'll wait for navi to release then drop in price

Oh yeah AMD drivers were definitely bad back then, and that microstuttering problem. And that fucking cursor problem.

But nowadays AMD GPUs are on average way smoother than Nvidia ones, though Nvidia's aren't so bad that you see much stutter from the difference.
And now the situation is reversed: Nvidia's drivers and software are awful, yet you still see people shilling so hard for Nvidia.

that card is still legit for 1080p gaming, and it doesn't even require a separate PSU connector.
a great secondary GPU to have desu

>he will google up benchmarks and decide after.
Fuck no
We're talking about mostly underage normies here
They will visit their favorite tech "reviewer" like Linus shill Tipps or jayz2shekels and go with whatever they pick up, even if it's shit
It will go like
>Whoa, jayz said he prefers Nvidia
>That must be because Nvidia is better
>He must be right, look at how many subs he got
>I want to be like jayz. I want to buy Nvidia only
>Whoa, Luke benchmarked a GTX 480 against consoles and it won
>Impressive! This must have been a milestone of Nvidia engineering!
>Too bad I wasn't alive when it launched :(((((((
>Whoa, Linus benchmarked a shitty AMD gpu that isn't any worse than NVidias counterpart
>Laughable! I want to buy from a company that has never released bad products
>Whoa, jayz2cents wants to boycott AMD because of some fud he picked up somewhere
>Unbelievable! AMD must be an anti consumer corporation run by Jews
>We must boycott AMD to save gaming!
Ryzen had one point going for it: reviewers got sick of Intel after countless 2% IPC increases and new motherboards and gave Intel bad press for it.
No one is giving Nvidia bad press other than maybe one literally who tech reviewer.
The GPU market is fucked and will stay fucked

Had to play KCD on a 770 while my 1080Ti was being replaced.
Down to Medium settings, 900p upscaled to 1080p.
1080Ti comes back. 1080p Ultra.
Doesn't look or feel any different.
I'm sure it is, but it's so subtle it doesn't register to the brain.

Mong

I wish Vega wasn't so fucking expensive.

It's really laughable.
Considering both companies have very good products in the mid range.
And honestly, I don't feel that anyone has the upper hand on drivers. Maybe AMD, because of the better interface.
You're definitely not gonna go wrong choosing either for a $200-ish budget.

>Game prefers amd
Good thing this isn't the case 90% of the time

>970 besting 390x
Told you fags years ago 970 was the best

But your chart doesn't show that. Did you forget to greentext?

why would they even care for amd when 99.95% of the new cards bought for gaming are from nvidia because amd is so overpriced it's not even funny

Looks like amd got btfo as soon as nvidia updated the drivers anyway
970 beating a 390x is not what I would call optimized for amd

>that min fps
I'm an Nvidia fag, but I'd rather take a variance of 4 fps vs 15. 970 was DOA

Idk how much the 390x can overclock, but 1316 for a 970 is really low. Mine is at 1490MHz. Stock on my SSC was 1420. 970s were notoriously underclocked at stock

>750ti
1080p on low-medium mostly. It's a bit above a 1030 and well behind a 1050.

RX570 is the minimum that you really want for 1080p at 60fps+ on higher/maxed settings.
A 750ti is well behind the 1050ti and the 1050ti is still a bit behind the 7970.

1.08GHz on the 390X. They would hit anywhere from 1.12-1.2.
My 7970 non-reference was actually stable at 1.32GHz. I'd think some 390Xs can go higher than that given the more refined process, but I don't have first hand experience so I'll say 1.2-1.3 from what I've read.
Also you have to account for how AMD overclocking scales nearly 1:1. 10% higher clock is usually around 8-9% more performance. For Pascal a 10% higher clock is usually more like 5-6% more performance and Nvidia heavily gimps overclocking them. Buildzoid goes into more detail on that. I'm not sure about Maxwell.
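The scaling claim above reduces to a simple multiplication. The ~0.85 (GCN) and ~0.55 (Pascal) factors below are just the poster's rough figures restated, not measured values:

```python
# Sketch of clock-to-performance scaling: perf gain ≈ clock gain × scaling factor.
# Scaling factors are rough figures from the post, not benchmarks.
def perf_gain(clock_gain_pct: float, scaling: float) -> float:
    return clock_gain_pct * scaling

gcn = perf_gain(10, 0.85)     # GCN: a 10% overclock gives ~8.5% more performance
pascal = perf_gain(10, 0.55)  # Pascal: the same 10% gives only ~5.5%
```

So under these assumed factors, the same 10% overclock is worth roughly 3 percentage points more on GCN than on Pascal.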

I would have been okay paying $700 if I could still get one with the sexy silver shroud.

Maxwell and Pascal are really similar. Likely isn't too different.

Even amd's CPUs provide a smoother gaming experience than Intel's. They're hitting some good points but people just want 'the best'

Someone post the graph that makes it look like the GTX780ti is way ahead of the R9 290X but it's only 0.1fps

Currently running my Vega 56 with the 64 BIOS on an i7-4770K, core at 1450MHz/900mV, memory at 945MHz. Power consumption at the wall is around 350 watts, but I have a lot of HDDs in there. Taking an educated guess and deducting the CPU etc, I'd put it around the 180 watt mark.
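The estimate above is just wall power minus everything that isn't the GPU. The ~170 W for the rest of the system is the poster's guess, not a measurement:

```python
# Estimating GPU draw from a wall-power reading, per the post above.
wall_watts = 350   # measured at the wall
other_draw = 170   # CPU under load, several HDDs, fans, PSU losses (assumed guess)
gpu_estimate = wall_watts - other_draw
print(gpu_estimate)  # → 180
```

Note this folds PSU inefficiency into the guess; a proper figure would need the GPU's power rail measured directly.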

KCD isn't exactly a graphical marvel though, so I'm not surprised you don't see a difference. KCD graphics are on par with a ps2
It's the price you pay for greatness. With drivers and such it's already made great improvements in performance and power consumption. It's guaranteed fine wine tech

>180w
That's not too bad ey... Actually surprising, since the 1060 can suck out 160W.
Have you been messing with that new crimson live shit? The control it gives you over your GPU looks amazing.

Yeah good fun. I like all the overlays.

Most people are dumb I guess.
I prefer the smoother framerates of AMD CPU+GPU and the better software/drivers. Not only that, but some Nvidia GPUs really shit themselves in some games, like the 970 and 3gb 1060. Hell, the fucking 7970 roughly matches the 3gb 1060 in a lot of games. That's pathetic considering you could get a 280X for $300 4 years ago when the 3GB 1060 was $200 new.
There aren't really any games where AMD GPUs completely shit the bed unexpectedly like you see with Nvidia ones. That's a problem that I don't understand why people put up with it.

I have friends with 970s and 2-3GB Nvidia cards and they always have to go into options and figure out
>what setting do I change to make my FPS go from 40 to 70?
Like they often can't handle shadows and mundane shit.
Meanwhile my 3GB 7970 has no trouble with any specific settings so long as the AMD driver has tessellation turned down. Just set it to medium/normal or medium-high on newer games and it's fine. Don't have to fiddle with individual fucking settings trying to figure out what the busted unbalanced GPU can't handle.

>tfw clocking 55-60 fps maxed out on a sapphire 290
Hitman in DX12 gives me 80-90 fps.

Attached: vapor x 290.jpg (1000x999, 105K)

What's really fucked up about the 7970 is that it came out 7 years ago. Truly stunning stuff. Though I must say I personally haven't had any problems with the 1060 3gb so far. Playing lots of shit at max with 60fps+, though I can see its VRAM hitting walls. I've nearly finished downloading FFXV and it seems I'll have to turn the settings down to medium, or go high and fiddle around. You should test your 7970 on that, cause it seems tessellation is a big thing in that game.
Though a lot of tech reviewers are right, I think. Nvidia won and it's game over. It'll be interesting to see Navi, but I don't see why amd would bother now. You know that Nvidia GPP? They're already getting companies to downgrade and change the way they sell their products. I don't know what I'm saying, but check this, it's pretty short
youtu.be/sv_RqSJF2sI
It looks rather underhanded of nvidia to do this. They don't even have to, but you can see something is up.

Actually the in-game overlay reads 1490+MHz, so even better.

Your average Joe won't give a shit. They will just grab whatever is on the shelf that has Nvidia branding. I doubt half of them even know who AMD is.

Yeah, MSI is removing branding logos like aorus and such from AMD GPUs. Their reasoning is that amd just can't compete with nvidia, so they'd rather promote nvidia because it's 'the better product'. They say they're doing it for the customer; see my YouTube link to see what I'm talking about

The 1060 3gb is still good enough for today's 1080p gaming; though it's definitely been hurting on newer titles, it still usually manages 60 or close to 60fps.
But... that's a new $200 card. It shouldn't be just barely handling modern games. It's not going to be looking good in 4-5 years like the 280 and 280X still do.
It works, I'm not saying it doesn't. It just was never an ideal choice for anyone.

It's pretty fucked that you could get an RX470 4GB for ~$95-$125 and people still chose 1060 3gbs over them.

Hardware Unboxed did a 1060 3gb vs 6gb test. In some games, the 3gb's FPS was over 30% lower, but their conclusion was still that it's "still good". I disagree. The difference in price is less than 30% and things will only get worse in the future. Those cards aren't that old, and a card in that price range shouldn't be suffering so soon like the 970 was. It's basically a scam.
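The complaint here is that the discount is smaller than the performance deficit. The prices below are assumed ballpark launch MSRPs ($200 vs $250); the >30% FPS gap is the figure from the post:

```python
# Price gap vs worst-case performance gap for the 1060 3GB vs 6GB.
# Prices are assumed launch MSRPs, not verified; perf gap is from the post.
price_3gb, price_6gb = 200, 250
price_gap = (price_6gb - price_3gb) / price_6gb * 100  # % cheaper than the 6GB
perf_gap = 30  # % lower FPS in the worst cases, per Hardware Unboxed
worth_it = price_gap >= perf_gap
print(price_gap, worth_it)  # 20.0 False
```

Under those assumed prices you save 20% to lose up to 30%+ in the worst cases, which is the "scam" argument in one line.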

Attached: 1518391700444.png (902x726, 148K)

What games does the 970 shit itself on? I've had it for 3 years and have never seen it happen. You wouldn't be a lying amdrone, would you?

I've put my 970 under a lot of stress since its launch and I have no idea what you're talking about

RE7 with certain settings gets terrible minimum fps, like 10fps minimums. Stutters like crazy.
Interestingly, the Fury GPUs also shit themselves on the same settings in that game, yet the 7970 is comparably fine despite having 3GB of VRAM to the Fury's 4GB.

There's a ton of examples, but I can't think of them off the top of my head.
Reviewers do tend to avoid the settings that cause trouble for the 970.

You don't know what you're talking about. Like I said, I've had it for 3 years; that doesn't happen with the 970. There are not tons of examples and reviewers don't do anything special for the 970. You're making shit up.

It doesn't make any sense that they put 3gb on it. But at least it made it cheaper.
>rx470
I was planning to get the 580 the whole time, but then the mining fiasco started and I really needed to get back into gaming cause my gtx650 1gb just couldn't cut it. The 1060 will last me a bit; at least I'm not too much of a graphical autist.
That 30% loss I'm sure is due to the 10th CU being disabled (for God knows what reason) in the 3gb. It is a scam, but what are you gonna do. Nvidia's just gonna keep getting insanely rich, and it's not like their products are bad. But I can see them overpricing their next line of gpus for the simple fact that they can get away with it. Like the 1080 is supposed to be their flagship, but then the 1080ti, then the titan, then the 1070ti, then whatever the fuck else they pull out of their arses. They're overly segmenting just like Intel does with their gold, platinum, blah blah and so forth. Last year they released like 30 CPUs; pic related is just a sample from last year

Attached: b5d1ba793575.jpg (4000x2250, 955K)

What the hell, I thought my 280x was ancient enough not to like ubisoft games anymore. I guess it's only ass creed.

>3.5gb
You really shouldn't be defending that card. Buyer's remorse?
Did you get your money from that lawsuit, by the way?
Ass creed syndicate and unity were a disaster for everyone

Would have gotten a used 750ti imo.
Think you could still get them for around $75 during the mining craze.
Shit ass card with only 2gb of VRAM, but better than getting a 1030 or rx550 for more money.
R7 265 or 270X I think you could also still get for a decent price.

But yeah, understandable. At points during the mining shit you could still get 1060 3gb for around $200. It's a lot worse of a buy than the $170-$180 RX570s you could get, but those were only at that price for like 3-4 weeks.

And there it is. I knew you were just repeating the 3.5 meme and nothing you said was based on fact or documented evidence.

>r9 280 is just rebranded 7950
>7950 is faster
????

The 1060 3gb shouldn't exist anyways. Nvidia needed to push it out because of AMD's lineup of Polaris. 1060 Ti would have happened otherwise.

Yeah, I was out of playing a lot of games for 2 years. I missed out on a lot and wasn't prepared to wait out the mining craze. Don't regret it though. It'll last me like I made my 650 1gb last.
Lol, I'm not that guy you were talking about. Just pointing out that nvidia screwed their customers then just claimed 'lol, it was a marketing confusion'. You'd think someone at nvidia would have seen the ads and thought hmmm... that's not true.
I was actually sincerely asking if you got your lawsuit money though?
3.5gb though. Lol

Piece of shit rx 460 and 560 that high

HD 7000 from amd from 2011 can keep up with 2017 cards

>i don't even

No, I didn't get the money. It was like 20 dollars and it wasn't worth my time. I knew it was 3.5gb when I bought it. What people say is the card will go over 3.5 and stutter, and that's not true. The driver won't allow it to do that

>HD 7000 from amd from 2011 can keep up with 2017 cards
>>i don't even
It tells you about the dire state the tech industry is in when monopolies are allowed to thrive

>tfw put a $250 7970 in my brother's pc 5 years ago and he can still play any game he wants

Attached: nvda_stock_price_today_-_Google_Search_-_2018-03-28_22.33.22.png (634x508, 28K)

cheap gpus soon?

vega12 = vega 32/40?

They will slowly return towards MSRP, especially when it stops being cost effective to mine eth. However, expect high prices through 2018.

It's Vega Mobile, aka Vega "we don't actually fucking know what".

Yeah, I guess you're right there.

AMD made a godly GPU 7 years ago, but retards still bought more 580s/680s/780s despite them being inferior for 4 years running. Thus RTG didn't have much funding. We should already be on Navi by now if not for that.

It also goes to show just how long it took Nvidia to catch up with Maxwell.
I used to rate the 7970 as the 3rd best GPU for consumers relative to its time, after Khan (Radeon 9700 Pro) and the GeForce 6000 series. But the past few months, shit, the 7970 is actually on top. It's the best GPU relative to its time ever made.

I wouldn't be surprised if Vega12 is literally 12 CUs like how Vega11 was 11 CUs. A bit above RX560 performance but smaller dies and thus cheaper.

I do expect we'll see Vega M GH and GL, which are 32 ROPs and 24 or 20 CUs respectively (though I've read sources saying that GH is actually 64 ROPs, which doesn't make sense), as discrete GPUs. Vega M GH at desktop clocks should be roughly RX570 performance, but the die is about 30% smaller. It's actually the highest performance/area GPU in existence currently.