These kinds of threads are just spam at this point, correct?
Intel owner here
You assholes said the same about the 7700k
15% increase you said
Shit was exactly the same as the last gen
>Coffee Lake
>15% improvement
>this means they've already benchmarked these chips
Basically, Coffee Lake is already in production and buying Kaby Lake at this point is fucking retarded.
...
THANK YOU BASED AMD
INTEL IS SHITTING ITS PANTS
(((((((((((((((15%)))))))))))))))
so i guess we will see laptops using coffee lake by 2020
i'm still waiting for kabylake, but the number of laptops using it is limited and most shit you find in retailers is skylake
well it's a good thing they're exactly the same except kaby lake is more power efficient
Not for laptops, Kaby Lake has Main10 HEVC decoding. Most laptops come with no dedicated GPU whatsoever so this is a welcome feature.
Press F to pay respects to AMD
>it's a good thing
as good as when nvidia drops their cards to legacy as soon as they announce the new ones. sure, we're at a point where it really doesn't matter anymore, but as Terry said, intel are jews and it costs them the same to produce celerons or xeons.
the 4k netflix thingy exclusively for kaby lake, tho
Another ((((((((((((15% faster))))))))))))))) housefire.
How to interpret intel's improved performance figures:
Divide them by 10.
So a good 1.5% increase should be expected.
>the 4k netflix thingy exclusively for kaby lake, tho
also only on windows 10 and only on edge.
It's almost 10% if you overclock the hell out of it
d-delet dis
IF you win the silicon lottery
+ delid for sane temperatures
Any 7700k does 5.0 on air and 5.2 on water
Is it just another refresh? Or is moar corez confirmed?
>Any 7700K does 5GHz on air without delidding
That's impressively not true. I don't even have a good comparison for how bullshit that claim is.
lel
they rushed it, it's not just moar coarz
it's coarz on FIRE
it's pentium all over again
coffeelake is moar coars.
if it's more cores, how would 6-core skylake x fit in so close? you pay $400 for PCIe lanes at a 140W TDP?
Kaby Lake is MOAR GIGAHURTZ
Coffee Lake is MOAR COARZ
Cannon Lake is MUH 10NM but with (confirmed by intel) worse performance than Skylake, and the desktop will only get to see it god knows when.
Choose your Intel poison.
the interesting question is how MOAR COARZ would coexist with MOAR GIGAHURTZ at reasonable temperatures
They won't.
they can't regress performance on mainstream?
it would be a much bigger regression than 4790K->6700K if they can't pull off at least 4.2GHz on all cores
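napkin math on why MOAR COARZ and MOAR GIGAHURTZ don't coexist: dynamic power scales roughly with cores * V^2 * f, and higher clocks need more voltage. sketch below, every number is a made-up illustration, not an Intel spec:
[code]
# Rough dynamic-power model: P ~ k * cores * V^2 * f.
# Every number here is an illustrative assumption, not a measured Intel figure.

def dynamic_power(cores, volts, ghz, k=4.0):
    """Relative dynamic power; k is an arbitrary scaling constant."""
    return k * cores * volts**2 * ghz

quad = dynamic_power(4, 1.25, 4.5)        # 4c at 4.5 GHz, 1.25 V
hexa_same = dynamic_power(6, 1.25, 4.5)   # 6c at the same clocks/voltage
hexa_slow = dynamic_power(6, 1.15, 4.0)   # 6c backed off to 4.0 GHz, 1.15 V

print(f"4c @ 4.5: {quad:.0f}")
print(f"6c @ 4.5: {hexa_same:.0f} (+{hexa_same / quad - 1:.0%})")  # +50%
print(f"6c @ 4.0: {hexa_slow:.0f} (+{hexa_slow / quad - 1:.0%})")  # +13%
[/code]
two extra cores at the same clocks is +50% power, which is why more cores usually means lower all-core turbo.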
Are you stupid or retarded?
Cannonlake 10nm is better than Skylake 14nm
Only the 14++ process, which is what Coffee Lake is on, is better than 10nm atm
14++ is the same process as 14. Just refined.
>it would be a much bigger regression than 4790K->6700K if they can't pull off at least 4.2GHz on all cores
Watercooling included?
>Are you stupid or retarded?
Are you?
>14++
Okay, you've answered my question.
Skylake's 14nm process is inferior to 10nm, the slide pretty much says it all
Don't post if you don't know shit
>using intel's marketing slides as fucking proof
MY FUCKING SIDES. kys pajeet.
newsroom.intel.com
Always ignore any morons that post about Intel's 10nm process, they don't know shit
Only Intel's 14++ process, which Coffee Lake uses, is better than the 10nm process
Even Kaby Lake's 14+ process is inferior to the 10nm process that Cannonlake uses
>only 14++ process which is what Coffee Lake
broadwell 14
skylake 14+
kabylake 14++
coffee 14++
meaning we pretty much know how it performs
>BUTTMAD AYYMDPOORFAG GOT DESTROYED WITH REAL FACTS AND UNABLE TO REBUT
>PAJEETS IN FULL DAMAGE CONTROL
Kaby Lake is 14+, don't post if you don't know shit
remember what happened to "1000x faster than SSD"?
>the 4k netflix thingy exclusively for kaby lake, tho
That's because of the Netflix jew, not because of Intel. Netflix does not want you to watch their videos unless you have the hardware DRM that only Kaby Lake has. They could let you watch it on whatever CPU, but they don't want to. If Netflix didn't want its ultra-AIDS DRM you could watch on pretty much anything as long as it had the required performance.
The fact that they're relying on these minuscule process refinements to get any sort of improvement, and have to keep tacking +'s onto the same node, just shows what a dead end they've hit.
BUTTMAD AYYMDPOOJEETS IN MAXIMUM DAMAGE CONTROL AS THEIR RYPOO IS GETTING DESTROYED IN PERFORMANCE & POWER EFFICIENCY
Just like that sweet 15% performance increase from skylake to kabylake.
>& POWER EFFICIENCY
unless it's 6 cores 12 threads, i really don't give a fuck
in ~2 years 4 cores/8 threads won't be able to perform properly because
holy shit guys..
I just realized..
CONSOLES ARE PUSHING FOR MORE CORES
PRAISE CONSOLES!
Hmm, I wonder whose CPUs are in those consoles. Really makes you think.
I just want at least 6 cores, with at least 32 PCIe lanes and single threaded performance better than OC'd Devil's Canyon. Ryzen disappointed on 2/3 counts, will this shit disappoint as well?
AMDs
But in the long run, for multi-core, it's a good thing.
but at least ryzen doesn't stutter
>but at least ryzen doesn't stutter***
>
>
>
>
>*** only applicable sometimes
FTFY
soooce pls
out of curiosity, what do you need 32 lanes for?
>single threaded performance better than OC'd Devil's Canyon
at least 3 more years, when 10/7nm becomes mainstream on both sides
always applicable, the 6850K/6900K don't stutter either
the 6800K does for some reason
sounds like proof that 4 physical cores aren't enough anymore.
Or cache. how much cache does the 6800K have, and the 6850K/6900K?
>No graphs
>No sources
>No proof whatsoever
even the day-0 graphs were fine for ryzen, and it performs much better today
why are you so against it? I'm not denying that an OCed 5820K, if you can get it cheap, is superior to a Ryzen 1600X
you have to ignore the TDP, but it's like, meh
the 6900K got 20MB
the 6850K got clocks to supplement its 15MB cache
the 6800K got no clocks and no cache
There are games where Ryzen performs worse in terms of frame time variance compared to Intel, no idea what you're smoking. I found this after a 2min search, didn't even know where to look. Anything beyond 16.7ms introduces some stutter if you're aiming for solid 60FPS.
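the 16.7ms figure is just the 60FPS frame budget (1000ms / 60). if anyone wants to sanity-check a frame-time log themselves, a quick sketch (the sample frame times below are invented for illustration):
[code]
# Frame budget for a target FPS, plus a crude stutter check.
# The sample frame times are invented for illustration.

TARGET_FPS = 60
BUDGET_MS = 1000 / TARGET_FPS  # ~16.7 ms per frame

frame_times_ms = [15.9, 16.2, 16.5, 24.8, 16.1, 33.4, 16.0, 16.3]

late = [t for t in frame_times_ms if t > BUDGET_MS]
print(f"budget {BUDGET_MS:.1f} ms, {len(late)}/{len(frame_times_ms)} frames over: {late}")

# frame-time variance is the usual "stutter" metric in reviews
mean = sum(frame_times_ms) / len(frame_times_ms)
var = sum((t - mean) ** 2 for t in frame_times_ms) / len(frame_times_ms)
print(f"mean {mean:.1f} ms, variance {var:.1f} ms^2")
[/code]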
>out of curiosity, what do you need 32 lanes for?
4K 144Hz SLI when those Asus and Acer monitors come out this year. I guess I should mention I actually want 32 lanes available as two x16 PCIe slots, not 32 in total with some of them split off for something like various forms of PCIe-based storage.
wrong pic.
that chart makes no sense with the 1800x vs 1700x
It's the same cpu with 100MHz more clock speed
why the fuck is it at less than half the time compared to the 1700x?
>day 0
>dx12 on nvidia
>worst performing ryzen games after primal
fyi day-1 memory latency was in the 90ns range
Because it's a shit day one benchmark before all the bugs were ironed out. Shills still love spamming them everywhere.
4ns
it's fucking nothing, but it's better than a regression
makes more sense, because the numbers on that shit don't make any sense.
captcha just asked me to find all sinks
Use legacy captcha noob.
...tell me more friendo
'cause i have no idea what you're on about
There are all kinds of threads that are basically spam but get posted on a regular basis nonetheless. The private tracker retards come to mind.
it's not 4ns, they cut about 20ns in a month
it's 4ns from the previous release tho
but hey, ANY improvement is good.
76.9 - 72.2 = 4.7
more than advertised, and all the ryzen 5 reviews were done without the new AGESA btw
Will there be a 6c/12t Coffee Lake to compete against Ryzen?
well, keep them coming, because i'm still on a 2500k, and it's getting long in the tooth
If they keep improving at this rate, by the time i upgrade it's gonna be good
Maybe even Zen2/Ryzen+ or whatever.
I have no idea, I searched for "ryzen frame time" and results like what I remembered seeing immediately came up. If you know about updated results after whatever magic updates, please do go ahead and post them. I haven't looked extensively into Ryzen after it came out since it doesn't really fit my requirements anyway (single threaded performance isn't amazing due to low clocks despite good IPC and it doesn't have enough PCIe lanes anyway, those can't be solved through updates).
Pretty sure that's DX11. It's not like it's a real argument either; NVIDIA is currently the only option for high performance, and you can't exactly buy something else because Radeon doesn't currently offer any high-end products.
For me personally, I already run NVIDIA and Vega is unlikely to be fast enough to be a compelling upgrade; I don't plan on changing anything related to graphics cards until 2018 at least, or whenever Volta and Navi come out.
>use AMD
>run out of battery
LOL !
I'm on intel/Nvidia though..
Projecting much?
>it doesn't have enough PCIe lanes
kaby 16 lanes
ryzen 24 lanes
unless you do some pro-grade video editing on 20 SSDs at home, I have no idea why you'd need more
Already mentioned why
little late here
But PCIe 3.0 x8 is still within the margin of error of 3.0 x16
So in theory, there is no performance loss going to x8 with current graphics cards.
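the napkin math behind that: PCIe 3.0 runs 8 GT/s per lane with 128b/130b encoding, so per-direction peak works out as below (theoretical peak, real throughput is lower):
[code]
# PCIe 3.0 theoretical per-direction bandwidth per link width.
GT_PER_SEC = 8e9        # 8 GT/s per lane
ENCODING = 128 / 130    # 128b/130b line-code efficiency
BITS_PER_BYTE = 8

lane_gb_s = GT_PER_SEC * ENCODING / BITS_PER_BYTE / 1e9  # ~0.985 GB/s per lane

for lanes in (8, 16):
    print(f"x{lanes}: {lanes * lane_gb_s:.1f} GB/s per direction")
# x8: ~7.9 GB/s, x16: ~15.8 GB/s -- a single card rarely saturates x8,
# which is why single-GPU x8 vs x16 benchmarks land within a couple percent.
[/code]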
you know that SLI has no benefit from 2x16?
not sure if it's even supported anywhere
...
>was ready to order my 7700 + 1070 dream pc
>suddenly ryzen
>4c isn't cool anymore guys!
>w-well, ok, but what gpu can you offer instead of 1070?
>just wait for vega
>o-ok
>vega is almost there
>but amd isn't cool anymore
>look at these new shiny volta and coffeelake
>just wait a few more months!
Fucking jews, stop this shit already! Do you want my money or not?!
>graphics cards
Meanwhile Intel chipsets support PCIe-based disks that are faster than traditional DRAM.
AMD will have it in a few years just wait.
he wanted it for SLI
Also AMD does have M.2 now.
also, who gives a fuck, as long as it's decently fast and has low response time it's good
>that 1700
Oh lawdy I need to sit down.
You know damn well I'm talking about xpoint, AMD won't support it for years and the bandwidth for it doesn't exist on their chipsets.
>15%
just like skykaby lake
best plan: get a 1070 + R5 1600 on a good motherboard with 3200 memory
then upgrade the cpu every year for four years on the cheap ($220), gaining performance without wasting money on a new board/ram
>xpoint
>faster than DRAM
wew
What's a captcha
It has faster throughput, the controllers just aren't out yet.
When is AMD supposed to push out a HEDT platform? I want to build a workstation with GPU rendering in mind.
Ryzen is a non-starter because of the lack of PCIe lanes, and the chipsets only support a maximum of 64GB RAM.
Broadwell-E/X99 doesn't seem like such a great buy either with Skylake-X/X299 around the corner.
Is AMD going to have something that competes with X299 this year?
>j-j-just wait
Thanks Intel.
Naples is going to blow the fucking doors off X99. I'll take 16c/32t any day of the week.
Optane was a huge letdown desu.
why do you think before Crysis 3 an OCed 2-core shat on the 2600K? and now i5s are doomed while the FX-8/i7 bros laugh at the fags that fell for the muh "i5 for gaming" meme.
The difference really isn't that small for SLI. SLI cards talk to each other over PCIe (and the SLI bridge of course) and at high res a lot of bandwidth is required. I've benchmarked shit myself at x16/x16 and at x8/x8 and I've seen performance increases of like 15% and SLI scaling improvements from like 60-65% to 85%. It's pretty ridiculous to think that something like motherboard features can improve performance to such a degree in modern times, but SLI can really choke on low bandwidth.
I've personally tested Witcher 3, R6 Siege, DOOM and Dishonored 2 on x16/x16 and x8/x8. The first 3 straight-up scale better and run faster (DOOM basically scales negatively on x8/x8 but it's an extreme example) and Dishonored 2 stutters way less. Other games show no improvement at all or already scale well on x8/x8, but it's not universal. Getting like 15% higher FPS out of a different motherboard (with a PCIe switch - which doesn't appear to exist for Ryzen currently) or out of CPU lanes is nothing to ignore, especially when I want all the performance I can get to aim at 4K >60FPS.
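for anyone who wants the scaling math spelled out: assuming a 60FPS single-card baseline (made-up number for illustration), 60% vs 85% SLI scaling looks like this:
[code]
# SLI scaling to FPS: the second card adds `scaling` x one card's output.
# The 60 FPS single-card baseline is an assumed number for illustration.

single_fps = 60.0

def sli_fps(single, scaling):
    """Two-card FPS when the second card contributes `scaling` of one card."""
    return single * (1 + scaling)

x8_fps = sli_fps(single_fps, 0.60)   # ~60% scaling reported at x8/x8
x16_fps = sli_fps(single_fps, 0.85)  # ~85% scaling reported at x16/x16

print(f"x8/x8:   {x8_fps:.0f} FPS")
print(f"x16/x16: {x16_fps:.0f} FPS (+{x16_fps / x8_fps - 1:.0%})")
# -> 96 vs 111 FPS, i.e. roughly the ~15% gain described above
[/code]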
Don't be ridiculous, of course it's supported.
Yeah that's great. When are the desktop chips and platform coming out?