THANK YOU BASED NVIDIA

>Thanks for saving me £2 a year Nvidia...

DELETE THIS

>performance per dollar
has anything ever even come close to the Radeon HD 6870 in that regard?

doubtful.

In the electricity bill.
Saves a load more on the heating bill though ;)

BEND ME OVER AND FUCK ME NVIDIA I'M A FAG AND I LOVE IT

Why delete it? The new card has better perf/watt than the old cards.

Polaris will do the same thing. Although Nvidia has the performance crown, I think Polaris (the 480) might actually bring the best perf/$ and perf/watt.

9 days to go til we find out :)

>poorfag

I didn't know you guys cared so much about climate change and environmentalism. :^)

>using the smiley with a carat nose

What are you even trying to show here? That new cards are better than old ones?

If you're so rich then why do you care about your electricity bill?

>9 days to go til we find out :)
XFX was going to leak something on the 26th, but still, no press release till June.

>pay 600$ for a graphics card
>huur duur muh electricity bill i cant afford it aaaaa

ay lmao

>new cards are better than old cards

At this point, when everything is maxed at 1080p, there is very little reason to get "the best" instead of the best performance per dollar unless you're running a higher resolution.

I'm curious to see what AMD has to offer for a monster hackintosh FCP machine. If the new AMD cards perform somewhat comparably to the 1080 or 1070 but have a serious advantage in OpenCL performance I'll most definitely be buying AMD instead of nvidia.

Less energy used means less heat output, better overclocking, and smaller cards (less VRM space, smaller required heatsinks). It also potentially means ever smaller ITX rigs, since a lot of them are constrained by PSU size.

Found the guy who's not upgrading to Pascal. What a loser.

>Less energy used means less heat output

That explains why the 1080 quickly spirals into the 80s and starts thermal throttling. :^)

>Liquid cooled + non-ref cards vs ref card
nice

>Totally makes up for the burnt down house

>Not wanting a free heater in winter
fags

What's the difference between the $700 and $600 versions of the GTX 1080 and which is better?

You get to buy at $700 earlier than at $600. Founders Edition = early edition. Since this is a paper launch, they're trying to make the most money out of it. Suckers will buy it.

Founders Edition = normal edition, you fucking retard....
you pay +$100 for nothing
youtube.com/watch?v=bNCfn4y8dBw

>not using the smiley with a carat nose

,':^)

Am I the only one that's bugged by the "performance" per watt charts?

HOW THE FUCK IS PERFORMANCE MEASURED. THERE'S NO UNITS FOR THAT SHIT.

It's not even actual efficiency; nothing can be 100% efficient. It would make sense if it were calculations per watt. But this performance meme smells like a lot of bullshit.

The graphs are normalized around a central card. 100% is the baseline card, and other cards are measured by their average framerates in games relative to that card.

Then you take that and divide it by the wattage of each card, scaling the results with one card as the reference.
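That normalization can be sketched in a few lines of Python. The FPS and wattage figures below are invented purely for illustration, not real benchmark data:

```python
# Sketch of how a typical perf/watt chart is built: compute each
# card's FPS per watt, then express it relative to a baseline card.
cards = {
    # card: (average FPS across a game suite, board power in watts)
    "baseline": (60.0, 250.0),
    "card_a":   (90.0, 180.0),
    "card_b":   (55.0, 300.0),
}

base_fps, base_watts = cards["baseline"]
base_eff = base_fps / base_watts  # FPS per watt of the baseline card

# Every card's FPS-per-watt relative to the baseline (baseline = 100%).
relative = {
    name: round(100 * (fps / watts) / base_eff)
    for name, (fps, watts) in cards.items()
}

print(relative)
```

So the "unit" on these charts really is just (FPS/W) divided by the baseline's (FPS/W), which is why the baseline always sits at exactly 100%.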

This is Sup Forums dude. Nobody understands math or graphs around here.

How do you get that much better performance per watt and still get a card that overheats as much as it does?

Shitty reference cooling and boost clock.
>overheats

I can only guess that they massively downgraded the cooling solution, because evidently that cooler is about half as efficient as any other cooling solution on the market, and at least 40% worse than the Titan X's cooler.

What data are you referring to? Looks like it stays the same temp while doing twice the work to me.

>HOW THE FUCK IS PERFORMANCE MEASURED. THERE'S NO UNITS FOR THAT SHIT.

Look, guys, this faggot never heard of "frames per second".

>Less energy used means less heat output, better overclocking
Both wrong. Heat output isn't just affected by power consumption; it also depends on the chip's leakage, which can be deliberately designed to some extent. You can have a card that consumes little energy and releases a lot of heat by designing it with higher leakage (which also means it will overclock better), or you can have a card that consumes a lot of energy and still manages to stay cool (but it will be less forgiving of overclocking, and retaining heat inside the chip for longer also means a lower lifespan).
Like everything in life, the ideal approach is a balance between the two.
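The dynamic/leakage split this anon is talking about can be sketched with the textbook first-order CMOS power model: total power = switching power (alpha * C * V^2 * f) plus static leakage. Every constant below is invented for illustration, not a measurement of any real GPU:

```python
# First-order CMOS power model: total power is dynamic (switching)
# power plus static leakage. All constants are illustrative only.
def total_power(cap_f, voltage, freq_hz, activity, leakage_w):
    dynamic = activity * cap_f * voltage ** 2 * freq_hz  # alpha * C * V^2 * f
    return dynamic + leakage_w

# Two hypothetical chips: one with low leakage and high clocks, one
# with high leakage and lower clocks. Different splits, similar totals.
low_leak = total_power(cap_f=1e-6, voltage=1.0, freq_hz=1.5e9,
                       activity=0.1, leakage_w=20.0)   # 150 W dynamic + 20 W static
high_leak = total_power(cap_f=1e-6, voltage=1.0, freq_hz=1.2e9,
                        activity=0.1, leakage_w=65.0)  # 120 W dynamic + 65 W static
```

The point of the model: two cards can draw comparable total power while differing a lot in how that power is split, which is what drives the overclocking-headroom trade-off described above.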

Thanks Nvidia, maybe it is time for another mortgage.

I can't wait for AMD to eat their market this time around.

AMD already gained back 30% of the market, so it shouldn't be hard, considering AMD is going to be completely uncontested at the midrange, which is what most people buy.

Did they at least cap the frames at 60? Otherwise it would be totally meaningless, since monitors with more than 60 Hz are just a meme.

>Reference model
>More expensive than what custom 980ti sold for brand new

youtube.com/watch?v=bNCfn4y8dBw

>but muh founder's edition!

:3333333333333

Thank you based nvidia!

THANK YOU NVIDIA.

>TWICE AS FAST AS TITAN X!!!

poorfag

>4 FPS faster than 980Ti
>costs $150 more
I can afford that!

>meanwhile all AMD will have is mainstream poolaris cards with last years performance

How impressive!

I'll give it as long as I goddamn can before I upgrade.

Most cards at this point are either memes, rip-offs or straight-outta-the-loo scams.

Maybe I should've just gotten a new graphics card back when I upgraded pretty much everything else.

I just wanted a 960Ti last gen.
>nope, can't have
Will 1060 come?
Yes?
Then it's down to determining whether it's meme, shit, or both.

>you will never play GTA V at a stable framerate above 10.

>Most cards at this point are either memes, rip-offs or straight-outta-the-loo scams.

That's just Sup Forums memes though; with a card as old as yours you can pretty much pick up anything and expect a huge upgrade.

Try a 2nd-hand 970 or R9 390, for example.

>Fury almost beats 980
Whoa, I thought that thing was a power-draining sinner.

HBM helps

In that case, can we expect Vega to absolutely destroy with perf/watt?

Not absolutely destroy, since it will be a big chip, but significantly better than last year's big chips, yeah.

Who gives a shit about performance/watt though? In the end it's about performance, and it will be awesome in that department.

>Who gives a shit about performance/watt
People who don't want their houses to burn down and Europeans with heavily taxed power bills.

Take the highest utility cost in the world and calculate the break-even for paying the Nvidia premium. The card will die before you break even.

>People who don't want their houses to burn down
So properly cool your card
>and Europeans with heavily taxed power bills.
Did you ever do the calculations? Unless you game 24 hours a day, the difference is a couple of euros per year.
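The calculation is trivial to run yourself. The numbers below are assumptions for illustration: a 60 W draw difference between two cards, 2 hours of gaming per day, and 0.30 EUR per kWh as a high European rate:

```python
# Yearly electricity cost of the power-draw gap between two cards.
# All three inputs are assumed figures; swap in your own.
delta_watts = 60        # extra power draw of the hungrier card
hours_per_day = 2       # daily gaming time
eur_per_kwh = 0.30      # electricity price (high European rate)

kwh_per_year = delta_watts / 1000 * hours_per_day * 365
cost_per_year = kwh_per_year * eur_per_kwh
print(f"{cost_per_year:.2f} EUR/year")
```

With these assumptions it comes out to roughly 13 EUR per year; cut the gaming hours or the wattage gap and you're quickly down to pocket change, which is the point being made here.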

If you have money to blow on an 800 euro graphics card, you aren't going to worry about that.

have money to spend on~

when will you fags understand that the GTX 950 is the only /nvidia card you'll ever need????

>So properly cool your card
Sure, you can add all the cooling you want, but it's going to be a lot harder on a card that is not efficient. Even if the max performance is lower on the efficient card, it will be able to reach a target frame rate without you having to watercool it or listen to a fan at >90%

Housefire


>card heats up to 80-85C after 4-10 minutes in game
>it downclocks and throttles
>$700 DOLAROOSSSSSSSS

nvidia got away with 10/10 reviews kek

So what this chart is saying is that I should get a R9 290, not 290x?

GTX 950 if 1080p
R9 390 if 1440p

>960 lower than 970
Man, good thing I didn't wait for that card, what the fuck went wrong?

>can't handle 80C
GPUs should be able to hit 95 before throttling and 100 before shutoff.

Since most benchmarks in reviews only tend to last 3 minutes or less, Nvidia has its reference cards boost as high as they can go for this initial period and then throttle down because (shockingly) the card can't sustain those clocks for long. It is, however, just long enough to pass muster for a review.
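That boost-then-throttle pattern is easy to model with a toy sketch. All the clocks and durations below are invented for illustration; the point is just how much the benchmark length changes the average clock you measure:

```python
# Toy model: the card holds its boost clock for the first few
# minutes, then settles to a lower sustained clock once hot.
# Clock speeds (MHz) and times (minutes) are invented numbers.
def average_clock(benchmark_minutes, boost=1733, sustained=1607, boost_holds_for=3):
    total = 0
    for minute in range(benchmark_minutes):
        total += boost if minute < boost_holds_for else sustained
    return total / benchmark_minutes

short_run = average_clock(3)   # a 3-minute review benchmark: all boost
long_run = average_clock(30)   # a longer gaming session: mostly sustained
print(short_run, long_run)
```

A 3-minute run sees pure boost clocks, while a 30-minute session averages noticeably lower, which is exactly why short review benchmarks flatter a throttling card.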

The 290 took enormous price cuts before it went EOL and due to its performance that meant it annihilated performance per money metrics.

What is the 8800 GT of today?

>performance per watt
Only matters for mud farmers living off photovoltaics and retards buying expensive electricity

No. Maybe upcoming Polaris.

>free

hey bro my graphics card does 100%fps how much does yours?

>1060 worse than a 970
JUST

I'd be more concerned about having 24GB of VRAM on the Titan but only 1/32 DP.

It's easy to guess the performance of new Novidia cards. I'd be OK with this if they didn't charge so fucking much money for them.

1080=980Ti
1070=980
1060=970

Would that RAM even help with gaming? I very much doubt it. 8GB seems like a good cap for gaming right now, 6GB is a good soft cap, and 4GB was the standard but is limited in today's world. Is that 12/24GB pandering to 8K resolutions?

FPS divided by watts = a number.

Pick a card for the baseline.

Higher numbers mean it's more efficient per watt; lower numbers mean it's less efficient.

>12+12=24. Twice memory of old Titan
>inb4 Twice as fast as OLD TITAN
>please buy!

>Would those ram even help with gaming?

For the most part there is no downside to having more memory (under the assumption you aren't taking enormous latency hits when accessing it all). Whether you actually need that much VRAM? Unlikely for the foreseeable future, but the mere fact that the technology (GDDR5X and HBM) can scale that far without chugging down a million watts has plenty of applications for non-gaming workloads.

>8GB seems like a good cap for gaming right now. 6GB is a good soft cap, 4GB was the standard but limited in today's world.

8GB is still massive overkill until you really start hammering away at 4K, and even then good memory management means even 4GB is plenty.

>Are those 12/24GB pandering to 8K resolutions?

No because you'd still be limited by the shaders.

When it's time for HBM2, how the fuck will they put 24GB of it on the next-generation Titan? It wouldn't be nice marketing if it had less memory than its predecessor.

Taking what was $400-600 and putting it at $200-300... yeah, that's impressive.

Granted, I don't buy high-end cards, not since the 5770, when damn near everything could be maxed within 2 years, and then you can get 2-3 more years if you reduce some settings.

The only thing I won't compromise on is texture detail, and almost every card I would choose to use has 8GB now.

If you don't game, you have little use for a new card.
If you game, you have little use for anything above a $200 card.
The only thing the newer cards will offer is a new way to render a scene, so you aren't rendering 2 completely different viewpoints separately. This is what gives them the disproportionately high VR performance, and I assume it can also be extrapolated to 3D content in general.

That said, if the new method of rendering allows you to render one scene, then it's possible you will also see a fairly significant boost in multi-screen performance.

The 1080 is basically an overclocked 980 Ti ($650), put at $700.

That's nvimpressive.

AMD WINS AGAIN

EAT SHIT NVIDIA FUCCBOIS

Let's just say this doesn't bode well for Nvidia.

But we will only know how bad it is in the coming months, as the cards get benched in every way possible, and once AMD drops their cards, how much Nvidia fucked up.

>you post this ironically
>nvidia shows its investors graphs exactly like this one at conferences
oh the irony

>The best performance for your money is usually in the mid-tier cards. The higher-end the card, the larger the price increment. The priciest cards usually cost significantly more for comparatively smaller improvements in performance.
I see no reason to pay more than $200.

The UK price is almost $900. This is radical even for Nvidia.

>the power of marketing

stop being poor and get polaris when it comes out

>energy output is different from energy input because jews
What you're mistaking for magic is known as "energy efficiency". Either way, regardless of how energy efficient a GPU/CPU is, the energy it consumes turns into heat, and that heat is exactly equal to the energy consumed. Without fail, because physics. Energy consumption == heat output, regardless of brand or design.
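The arithmetic behind "energy consumption == heat output" is one line. The 180 W figure below is an assumed example draw, not a spec of any particular card:

```python
# Conservation of energy: every joule the card draws ends up as
# heat in the room, so a 180 W card (assumed figure) warms the
# room exactly like a 180 W space heater would.
power_w = 180
seconds = 3600  # one hour under load

heat_joules = power_w * seconds
print(heat_joules)  # 648000 J of heat, i.e. 0.18 kWh
```

This is also why the "free heater in winter" posts earlier in the thread aren't entirely a joke: the heating is real, it's just an expensive way to get it.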

By what logic are you excluding the leakage from the power consumption?

Leakage is a meme only relevant for phone chips

28nm to 16nm? This is the best Dear Leader can do?

Congratulations, you failed Physics.

900 series
>cut down on features for compute
>introduce new architecture that is more efficient
>release mid-end cards as high-end cards because AMD is only doing a semi-refresh of their 200 series
>call your 3.5GB card that comes on a cut-down PCB with horrible thermals a 4GB x70 model
>ignore where the API market is going; instead just pay off game makers to put in buggy code AMD can't avoid or optimized code AMD can't use
>gimp the previous generation of cards with driver updates to make your new ones look more attractive

1000 series
>ignore the need for the DX12 features that AMD will destroy with
>introduce a new lithography process to shrink the transistors
>hold an event to brainwash the public with news of very selective performance characteristics
>release mid-end cards as high-end cards because AMD might not bring anything better, according to rumours

IF AMD drops a bomb on the GPU market, the 1080s will soon drop to about 1/3 of their current price, putting that hardware where it would have belonged during more competitive generations. All this Titan business is just Ti cards they could milk for much more because of the lack of competition throughout the years.

AMD can't even afford its own employees' toilet paper. My friend works there, and he's been told to bring his own.

AMD is currently in the process of replacing all employees with Indians to cut down toilet paper costs. It's the next step in the path of destroying Intel and dominating the tech industry.

Looks like nvidiots ran out of arguments