Wow is this just not even funny anymore, what the fuck happened intel?

photoshop happened.

>its not real rreeeeeeeeeeee

>7980XE
Fucking lmao, there is literally no reason to buy one, not even the "muh sc perf" and "muh games" excuses.

>worse for gaymen and single core than ryzen
CANT MAKE THIS SHIT UP

>loses literally everywhere else
ok

web.archive.org/web/20170925171339/https://www.techspot.com/review/1493-intel-core-i9-7980xe-and-7960x/page3.html

AHAHAHAHAHAHAAHAAHAHAHAHAHAHAHAHAAHAAHAHAHAHAHAHAHAHAAHAAHAHAHAHAHAHAHAHAAHAAHAHAHAHAHAHAHAHAAHAAHAHAHAHAHAHAHAHAAHAAHAHAHAHAHAHAHAHAAHAAHAH

do they have turn-time benchmarks? fps doesn't matter that much in Civ; what matters is how long the AI turns take

It's faster overall on Vega64, regardless.

Weird.

Holy shit what's wrong with this guy..?
His conclusion sounds like this:
>Threadripper is b..better for the price. B but I w wish intel to succeed
>I L..love intel, b..but threadripper is b..better for the price
It's like he's apologizing for Intel's CPUs

Shit article.

>I'm sorry master, but I'll have to go all out.
>Just this once.

Is Vega secretly incredible in DX12? Or is this just AMD writing better drivers for Ryzen/Vega combined?

Those are some edge cases.
Vega is bad because it's unfinished.

It depends on the settings: which AA preset is used, which shadow preset, etc. Some AA methods are better optimized than others, and some are heavy and need strong single-threaded CPU performance. In this case it's the "HIGH" preset, not "ULTRA", so the game probably uses heavily multithreaded AA/shadows to get the best quality-to-performance ratio. Turned up to ULTRA, it would go for maximum quality over performance instead.

AMD's video cards definitely should be better in DX12, but the thing is the game is Civilization, whose developers are famous for fucking their games up technically. They had a fucking memory leak caused by DX that they couldn't fix, and a guy with no source code access managed to do it instead.
forums.civfanatics.com/threads/memory-fix-by-harkonnen-is-out.146309/

Ye, Firaxis are literally worse than pajeets.
Vanilla XCOM2 was so fucking BAD they literally rewrote over half the engine for the expansion.

>firaxis are pajeets because they partner with amd

how about blaming faggots like tim sweeney for why unreal engine is a flaming turd

No, Firaxis are WORSE than pajeets.
Partnership with AMD has nothing to do with them ROYALLY fucking up vanilla XCOM2.

>ROYALLY fucking up vanilla XCOM2
that's Epic's fault, not Firaxis'

it's UE3, probably the most widely used and well documented engine in the last 10 years
how in the fuck is it epic's fault if firaxis fucks things up this bad on an ancient engine

Epic's game engines are designed for Nvidia cards. Was your card by any chance an older Nvidia card (when the game was released) or an AMD card?

That's probably why.

>1080p
>""""high"""" """"quality""""

BUMP

Are you fucking retarded?

Vega gets smoked by the 1080 Ti (and usually the 1080) in other DX12 titles, so no. Even AMDrone favourite Hitman. Plus of course the 1080 Ti can overclock, whereas Vega can't.

...

I think the issue is that AMD drivers tend to have more CPU overhead than Nvidia's, so AMD cards run like shit on cheap or low-core-count CPUs, but pair them with a higher-end CPU or bump the resolution and AMD GPUs stretch their legs a bit. Nvidia GPUs tend to lose performance roughly in line with pixel count as resolution goes up, while AMD GPUs gain relative performance at higher resolutions because the driver overhead stops being the bottleneck. It's possible AMD achieves the same effect with core count.
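A rough way to sanity-check that from review numbers (a minimal sketch; the threshold values and fps figures below are made-up placeholders, not data from this thread): if fps barely moves when the resolution goes up, the CPU/driver is the bottleneck; if it drops roughly with pixel count, the GPU is.

[code]
# Hypothetical Python sketch: guess whether a GPU/driver combo is CPU-bound
# from two resolution data points. All numbers are placeholder examples,
# not real benchmark data.

def bottleneck(fps_low_res, fps_high_res, pixels_low, pixels_high):
    """Compare how much fps drops vs how much the pixel count grows."""
    pixel_ratio = pixels_high / pixels_low   # e.g. 1440p / 1080p ~ 1.78x
    fps_ratio = fps_low_res / fps_high_res   # how much fps fell
    if fps_ratio < 1.1:
        return "CPU/driver-bound: fps barely moves with resolution"
    if fps_ratio > 0.8 * pixel_ratio:
        return "GPU-bound: fps falls roughly with pixel count"
    return "mixed: partly CPU-limited, partly GPU-limited"

# Placeholder numbers only: 90 fps at 1080p vs 88 fps at 1440p
print(bottleneck(90, 88, 1920 * 1080, 2560 * 1440))
[/code]

By that logic, results where fps barely changes between cards on the same CPU point at the CPU or driver, not the GPU.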

no 7700k result.

really makes you think

there's a 7740k which is faster you autistic brazilian faggot

there is no 7740k in there

then you didnt look hard enough you dumbass

there's a 7740X, not a 7740K

fucking retarded ass nigger

he meant 7740X

>Skylake-X's changes to the cache cripple its performance in gaming
Wew lad, hope bingmesh is worth it in the server space.

>have high end top modern nvidia gpu
>game still runs like utter garbage while other UE3/4 titles that are 10 times more demanding run like butter
Firaxis are fucking worthless idiots you faggot

VEGA DRIVERS ARE GIMPING THE SWEET INTEL PERFORMANCE OH MY GOD BACKHAND TACTICS

Monkey meme is for Intel buddy, not just for anyone you think is a shill
Lurk more

More like Nvidia drivers are gimping AMD performance, retard. Check the frames for the Intel processors, they barely change from 1080Ti/Vega64.

>what the fuck happened intel?
Should be obvious

It's a joke, not a dick
Don't take it so hard, anon

autism speaks

>other UE3/4 titles run 10x better
Ark
PUBG
Batman: Arkham Knight
Aliens Marines something

Runs like complete garbage on any system. Worse on AMD.

The X models are not made for gaming. They are workstation CPUs.

no I just have a modest 1440p monitor.

Not him, but those are all pretty shitty games with worthless devs though. If you look at Epic's own games as examples, they're rather well optimized. Unreal Tournament and Paragon run real smooth.

>It's OK when intel says it

Intel gambled too much on its fab tech to overcome the difficulties of making massive monolithic chips.

The horrible yields of Broadwell-E and Broadwell-EP showed them that the future of high-core-count chips is going to be multi-die.

Skylake-X just continues the trend.

>Suddenly its no longer about muh multitasking, but gayming now matters
Amddrones never change.

Probably couldn't render the game properly on a Vega/Ryzen combo. Use your brain sometimes.

multitasking is still better on Threadripper because of the laughable base clocks on high-core-count SKL-X CPUs, unless you do a mild overclock and melt your shit if you don't delid it too, pic related

Also Fortnite Battle Royale, which is exactly the same as PUBG yet runs and looks 100x better

>Wow is this just not even funny anymore

wrong

...

>Vega was intended to go head to head with the GTX 1070 and 1080 respectively.
>Posts 1080Ti bullshit.
When the big October driver release comes it will be interesting to see what it brings to the table. Currently anyone on RTG's Vanguard program (access to alpha/beta drivers) is under NDA and can't leak anything, including any results.

>'I-It's OK when Intel shills do it' the post

>RTG niggers are NDAing testers
Gay.
Very gay.
I mean the paranoia is sorta justified since paranoia saved Eyefinity once but whatever.

No wonder AMD is so shit they spend all their money on shills

this way they can at least have an advantage out of their incompetence/low budget

NDAing the testers leads to nothing more than suspicions.

It's better than claiming something like a 30% performance increase and not being able to deliver it when the time comes. Showing hard results is better than making empty promises.

Yes, but they should be more open about the development of the wangblows drivers.

480p is hardly standard now but it's still called 'Standard Definition'
It's just a name, you absolute retard.

>no 7700K
why? cuz they know it will obliterate them.

>switch from intel to ryzen
>friends don't get it because intel has been the only gaming choice for years
>explain it's not 2011 anymore and right now ryzen is doing it better
>friends still too scared to switch and think I'm crazy
brand loyalists are a strange breed.
I've used every combination of intel, nvidia, AMD (cpu) and AMD (gpu) whenever it was convenient.

>7740X

>implying they are same

Lol the shit you people spew sometimes

brand loyalists are at best naive and just go along with whatever other people told them years ago, or they're straight-up retarded deniers who will defend their preferred brand all the way into the ground even when presented with evidence that other brands are better

>that pic
wat
How much more power is a 7900x using vs a 6900k?

a lot, broadwell-e cpus actually stuck to their rated TDP

Wait for Icelake

Hey that's pretty good