...
HBM2 cards when?
not for another year
Lol
Like DXTC wasn't a thing for 10 years.
                     NVIDIA GeForce GTX 1080                              NVIDIA GeForce GTX 980
H.264 Encode         Yes (2x 4K @ 60 Hz)                                  Yes
H.264 Decode         Yes (2x 4K @ 120 Hz, up to 240 Mbps)                 Yes
HEVC Encode          Yes (2x 4K @ 60 Hz)                                  Yes
HEVC Decode          Yes (2x 4K @ 120 Hz / 8K @ 30 Hz, up to 320 Mbps)    No
10-bit HEVC Encode   Yes                                                  No
10-bit HEVC Decode   Yes                                                  No
12-bit HEVC Decode   Yes                                                  No
MPEG2 Decode         Yes                                                  Yes
VP9 Decode           Yes (2x 4K @ 120 Hz, up to 320 Mbps)                 No
FUCKING FINALLY
...
Posting in an epic shill thread. Keep going with those fucked graphs, boys!
Up to 10-bit HEVC (H.265) encode/decode (plus 12-bit decode) is welcome and puts it above Intel's Skylake processors, which encode/decode 8-bit.
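For anyone who actually wants to drive the new 10-bit encoder, here is a minimal sketch using ffmpeg from Python. It assumes an ffmpeg build compiled with NVENC support and a Pascal card; the file names are placeholders, and the encoder/option names are stock ffmpeg ones rather than anything NVIDIA documented on the slide.

```python
import subprocess

# Minimal sketch: 10-bit HEVC (Main10) encode through NVENC via ffmpeg.
# Assumes an ffmpeg build with NVENC support and a card whose encoder
# supports Main10 (i.e. Pascal); file names are placeholders.
subprocess.run(
    ["ffmpeg", "-hide_banner",
     "-i", "input.mkv",
     "-c:v", "hevc_nvenc",      # NVENC HEVC encoder exposed by ffmpeg
     "-profile:v", "main10",    # 10-bit profile, the new bit on Pascal
     "-pix_fmt", "p010le",      # 10-bit 4:2:0 format the encoder expects
     "-c:a", "copy",            # leave audio untouched
     "output_10bit.mkv"],
    check=True,
)
```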
holllyy shit
kek
I love these
you mad, AMDead?
IT'S OVER, AMD IS FINISHED & BANKRUPT
That this thread is always at the top of page 1 means that you need TO FUCKING THROW AWAY THE GPU PACKAGING FOR FUCKS SAKE
>1.7% GAINS
FERMI 2.0 CONFIRMED
Do shills even know what this stuff means?
A moron like you wouldn't know
T. Idiot
What does this mean???
I'm buying a 1080 on release day, so it doesn't matter hahaahha
That has to be one of the strangest and most useless graphs I have ever seen.
Using 11-28% less bandwidth than the previous card really does not mean much of anything if bandwidth was not an issue in the first place.
One only creates graphs like this if they are desperate to show an improvement over the previous generation.
It's also completely pointless for anyone with half a working brain cell, since the 4096-bit bus of a 4 GB HBM2 card rapes it to dust (rough numbers below).
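A quick back-of-the-envelope sketch of the bandwidth gap. The bus widths and per-pin data rates below are the commonly quoted specs, so treat them as assumptions rather than measurements.

```python
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bus width (bits) * per-pin rate (Gbps) / 8."""
    return bus_width_bits * data_rate_gbps / 8

# Commonly quoted figures; treat them as assumptions.
print("GTX 1080 (256-bit GDDR5X @ 10 Gbps):", peak_bandwidth_gbs(256, 10.0), "GB/s")  # ~320
print("Fury X   (4096-bit HBM1 @ 1 Gbps):  ", peak_bandwidth_gbs(4096, 1.0), "GB/s")  # ~512
print("HBM2     (4096-bit @ 2 Gbps):       ", peak_bandwidth_gbs(4096, 2.0), "GB/s")  # ~1024
```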
What's up with all this HBM hype
The Fury X had HBM and if I remember correctly it wasn't amazingly better at anything than the GTX cards
Why are people desperately waiting on something that's not going to translate into real-life performance? Not memeing, genuinely curious
...
HBM2 supposedly will have some tangible benefits. They said that about HBM though, so who knows? As always, I'll wait for the benchmarks to show up.
The Fury X did really well at 4K though, which is where I think HBM shines; at lower resolutions the cards aren't really that bandwidth-starved
DAYAM
Those graphs they posted are shit.
>vr performance
Seriously?
Nvidia has nothing this generation. Just pretty words to make their shit sell to retards.
If the leaked AotS benchmarks are legit, then even a basic Fury isn't far enough behind a 1080 for it to count as a massive goddamn upgrade
wccftech.com
This also assumes DX12 (and Vulkan, to a lesser extent) gets widespread adoption to the point where this matters, because I can't imagine phrases like "Windows 10 only" and "OpenGL successor" sound positive to publishers, who have only just started making DX11 mandatory in the last couple of years
Does HBM have ECC by default?
So what's the verdict? 20% better than 980ti?
From what I have seen, 20% at best, but overall no meaningful difference from a 980ti.
So 20% upgrade for $700.
Fucking wonderful.
buy polaris nvidia is dead
man, GTX 1080 looks worse and worse every day.
>Fast Sync
Oh thank God I didn't buy a Gsync monitor.
You can still get $400 for your Ti. But yeah.
>12-bit HEVC Decode Yes No
Thank Daiz.
rip nvidia
That graph is for kids.
finally? nobody needs 10/12 bit hevc encoding and decoding. it's a dead meme, and maxwell already has hevc decoding/encoding support up to 4k 60 fps.
no. ECC is a performance reducing meme
They're really trying to fill the gap between now and a new architecture eh?
Jesus christ how many times are these companies gonna release rebrands already.
No and stop posting about things you don't understand
GM200 & GM204 does not have HEVC hardware decoding
>No and stop posting about things you don't understand
you are projecting; it is you who does not understand what you are talking about.
>GM200 & GM204 does not have HEVC hardware decoding
all gm20x chips do, only gm107 (750ti) doesn't have HEVC encoding and decoding hardware.
en.wikipedia.org
>Introduced with the second-generation Maxwell architecture, third generation NVENC implements the video compression algorithm High Efficiency Video Coding (aka. HEVC, H.265) and also increases the H.264 encoder's throughput to cover 4K-resolution @ 60fps (2160p60).
Gotta sell it to gaymerz, right?
Pascal still looking like a complete and total disaster, no reason to upgrade from 9x series until HBM 2 versions
>mfw gaymerz think 2x perf means 2x fps
You're a fucking retard
NVENC is for HARDWARE ENCODING
GM200 & GM204 DOES NOT HAVE HEVC HARDWARE DECODING
Let's face it, systems running GM204/GM200 don't need HEVC acceleration. Those in the extreme niche who need it would do fine with a 960.
Don't even know why you guys are making a big fuss about it. It's a standard "next gen" feature. The upcoming snapdragon 820 smartphone processor will have high profile hevc decode and encoding.
Wrong, again please don't post if you don't know shit
HEVC decoding is very taxing even on the fastest CPU
>Snapdragon 820
>Upcoming
Pretty much all flagships are already using the 820
are you retarded? they always provide ASICs for encoding and decoding together. GM204 and GM200 both have ASICs for encoding and decoding HEVC.
When is that again? I've heard 6 months but that sounds too soon. I thought it wasn't coming until around Q1 next year?
>HEVC decoding is very taxing even on the fastest CPU
i'm using a core 2 duo laptop from 2008 and have no problems watching 720p and 1080p hevc video, it's not taxing at all for any semi-modern CPU.
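If anyone wants numbers instead of anecdotes, ffmpeg can measure how taxing software HEVC decode actually is on a given CPU. A rough sketch (assumes ffmpeg is on PATH; the file name is a placeholder):

```python
import subprocess

# Decode a clip to a null sink and let ffmpeg report CPU and wall-clock time.
# The trailing "bench: utime=... rtime=..." line shows how much CPU time the
# software decoder burned; compare utime against the clip's duration.
subprocess.run(
    ["ffmpeg", "-hide_banner", "-benchmark",
     "-i", "sample_1080p_hevc.mkv",
     "-f", "null", "-"],
    check=True,
)
```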
Again, you're a fucking retard
Hardware decoder is DIFFERENT from NVENC encoder
anandtech.com
>Finally, and somewhat paradoxically, Maxwell 2 inherits Kepler and Maxwell 1’s hybrid HEVC decode support. First introduced with Maxwell 1 and backported to Kepler, NVIDIA’s hybrid HEVC decode support enables HEVC decoding on these parts by using a combination of software (shader) and hardware decoding, leveraging the reusable portions of the H.264 decode block to offload to fixed function hardware what elements it can, and processing the rest in software.
probably this time next year
So what is the verdict? I have a GTX 770 right now, is the 1080 a valid upgrade?
I've read people say it's a glorified 980ti.
But I don't follow this scene too closely.
Also I'm still rocking a Core i7 920 @ 3.2. Am I going to be bottlenecked?
GM200/GM204 supports HEVC Main profile encode/decode. Only the GM206 supports HEVC high profile. To put it in stupid terms, the Titan X/980 Ti (GM200) and 980/970 (GM204) fully support 8-bit H.265.
Nobody gives a shit about chink phones
>GM200 & GM204 does not have HEVC hardware decoding
>all gm20x chips do, only gm107 (750ti) doesn't have HEVC encoding and decoding hardware.
It seems you are both partially right and partially wrong.
>The seventh generation of PureVideo HD, introduced with the Geforce GTX 960 and also included in GTX 950 and GTX 750 SE, a second generation Maxwell (microarchitecture) GPU (GM206), adds full hardware-decode of HEVC Version 1 (Main and Main 10 profiles) to the GPU's video-engine. Feature Set F hardware decoder also supports full fixed function VP9 hardware decoding.
>Previous Maxwell GPUs implemented HEVC playback using a hybrid decoding solution, which involved both the host-CPU and the GPU's GPGPU array.
post some proof that doesn't come from a corporate-owned shill website, raja.
Why not stay with x70 series if you are poor? x80 is always just a stop gap until the good cards come out and you have to sell it if you want to remain a top dog.
Did you read your own citation?
>NVIDIA’s hybrid HEVC decode support enables HEVC decoding on these parts by using a combination of software (shader) and hardware decoding
>and hardware decoding
Seems to contradict what was asserted in
>GM200 & GM204 does not have HEVC hardware decoding
I am right, GM200 & GM204 does not support HEVC hardware decoding
Only GM206 has the full HEVC hardware decoder
So many retards here don't even understand simple English
Pick up a 290/390 or 980ti like a month after the 1080 comes out.
I bought a GTX980 for $350 the day the 980ti came out from some guy who upgraded.
>if you are poor
When did I say I was poor?
>Only GM206 has the full HEVC hardware decoder
>full
Moving the goalposts there skippy.
your CPU is ancient. better off getting a 1070 or a 1060 and investing in something better on the CPU front.
It doesn't contradict anything; it's not a full hardware decoder, but you're too stupid to understand that
> x80 is always just a stop gap until the good cards come out and you have to sell it if you want to remain a top dog
>I bought a GTX980 for $350 the day the 980ti came out from some guy who upgraded.
pretty much this
>full
Yeah, no.
>Let me try and stick an extra word in the original assertion and hope no one notices!
>Damnit! They noticed. I'd better try and claim it was there all along and anyone who tried to read what I literally wrote literally is an idiot!
No one is moving goalposts
Hardware decoding means a full implementation; if you're too stupid to understand that, not my issue
>Hardware decoding means
Nice semantical argument there skippy.
Buttmad faggot that was proven wrong but can't accept the fact and has too much pride in himself
>your CPU is ancient.
Yeah but so far there hasn't been a compelling reason to upgrade. Until my CPU starts to be a limiting factor in gaming, which so far it hasn't, I don't see the reason to upgrade yet. I've thought about it multiple times. The only downside is that it runs hot.
It's not full because it only supports up to 10bit. So you're wrong there too, even after trying to move the goalpost.
>Buttmad faggot
Correction
>Damnit! They noticed. I'd better try and claim it was there all along and anyone who tried to read what I literally wrote literally is buttmad!
Buttmad faggot that can't win on facts so tries to shift the goalposts with semantics
Wait for Vega or 1080ti
1080 still isn't a 4k 60fps card. It's a future proof 1440p 60fps card
Wrong again, try again when you actually have real facts, faggot
>It's a future proof 1440p 60fps card
that's called the 980ti
>Buttmad faggot
>buttmad faggot
>Buttmad faggot
What's the definition of autism?
>Autism is a neurodevelopmental disorder characterized by impaired social interaction, verbal and non-verbal communication, and restricted and repetitive behavior.
>impaired social interaction
Clearly.
>non-verbal communication
Unquestionably.
>restricted and repetitive
Obviously.
Go take your meds and take a break from posting on Sup Forums for a while. It's for your own good. But given how you are not only suffering from autism but overly investing yourself emotionally in your posts I'm sure you're going to disregard this advice.
Yeah fucking right. That barely scrapes by as it is. Wait till UE4 gets widely adopted or Cyberpunk 2077 comes out.
I think Polaris might be a better way. We should be getting a 390x with half the power consumption for less than $300.
I was trying to do the math on whether the 1080 Ti will be a single-card 4K 60fps solution. I figured it will need to be 100% more powerful than the 980 Ti to do that. The 1080 is looking like it might be 50% better than the 980 Ti at the very best, so will the 1080 Ti need to be 50% better than the 1080? How much better than the 980 is the 980 Ti? ~50%?
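The compounding math is easy to get wrong, so here's a quick sketch. Every multiplier is a guess pulled from the thread, not a benchmark.

```python
# Baseline: 980 Ti = 1.0. All other numbers are thread guesses, not benchmarks.
gtx_980ti = 1.0
gtx_1080 = gtx_980ti * 1.5         # assume the 1080 really is ~50% faster (best case)
needed_for_4k60 = gtx_980ti * 2.0  # the post's estimate of what 4K 60 fps takes

gap = needed_for_4k60 / gtx_1080
print(f"A 1080 Ti would need to be about {gap - 1:.0%} faster than the 1080")
# -> ~33% faster than the 1080, if the 50% and 2x assumptions both hold.
```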
This post is the most autistic in the thread. And I wasn't involved in the shit flinging.
That's what AMD said they are targeting Polaris to be, a 290X/970 performer for under $150
If this card exists, it would sell a ton.
>Until my CPU starts to be a limiting factor in gaming,
it has been for a while now m8
>I'm sure you're going to disregard this advice.
Now I expect you'll claim to be a different poster, as if it's somehow plausible that a completely uninvolved anon, with no emotional investment in the argument, just decided to be an ass and insult another anon for no reason, rather than the autist claiming to be a different poster to distance himself from his own immature conduct.
Problem is, I think it's gimped; fewer CUDA cores and fewer transistors are preparation for a far superior Ti version in half a year.
I don't think the 1080 Ti can do it. If you want 4K your best bet is to get two cards now. Wait to see how well the 1070 overclocks and ideally you can get away with two of those. That's what I'm planning for right now. If the 1070 doesn't do much better than a 980 Ti then you'll have to get two 1080s, which won't be cheap, but at least you'll be set until 4K 120 Hz or 5K or whatever your next monitor will be.
GP100 only has 1000 more cores than the 1080; unless NVIDIA releases a different 600mm² die with the FP64 support cut out again, it's probably not going to be much better than ~25% over the 1080.
I'm failing to see the practical difference between 80FPS and 120FPS when my TV is 60Hz though.
"Only 1000"
1000 is a huge amount
Not the same poster but if you are gaming on a 60hz TV I don't think the problem is your video card.
per core performance was reduced in pascal, so 'only' 1000 cores isn't going to be as impactful as it was with the 980 and 980ti.
nvidia needs to tack on 1500-2000 more cores with the 1080ti if they want it to fly off the shelves.
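To put the "only 1000 more cores" point in numbers, here's a crude cores-times-clock estimate. The counts and clocks below are announced/rumoured specs taken as assumptions, and real scaling is never this linear.

```python
def relative_throughput(cuda_cores: int, boost_clock_mhz: int) -> int:
    # Crude proxy: throughput ~ cores * clock. Ignores IPC, memory, occupancy.
    return cuda_cores * boost_clock_mhz

gtx_1080 = relative_throughput(2560, 1733)    # announced GTX 1080 specs
big_pascal = relative_throughput(3584, 1480)  # GP100-class part at a guessed clock

print(f"Big Pascal vs GTX 1080: ~{big_pascal / gtx_1080 - 1:.0%} more raw throughput")
# -> roughly +20%, which is why "+1000 cores" doesn't automatically mean +40%.
```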
OK so I guess I need to upgrade my TV, my CPU, and my GPU. I'm going to drive all of the new equipment to my newly-built home in a brand new car. Anything else I need?
>OK so I guess I need to upgrade my TV, my CPU, and my GPU.
Only your TV from the sounds of things. It seems you are just trying to make it sound more extreme than it really is for some strange reason.
I don't think you get the joke buddy
>more extreme than it really is for some strange reason
Yeah, how about that.