Huh? Huh

wccftech.com/amd-bids-farewell-to-crossfire-mgpu
>CrossFire was meant for DefectX 11
>DefectX 12 and further is "mGPU"
Why the FUCK couldn't AMD come up with a better moniker? How the FUCK can I differentiate between AMD's mobile GPUs and multi-GPU now? FFSFDS!

It doesn't matter since multi-GPU shit is garbage.

?

>62572511
You again, stupid uneducated shithead?
I've told you already that only the HD 7xxx line had massive problems with CFX; once AMD fully moved to the XDMA design, CFX quality jumped up BIG TIME, both frame-pacing AND scaling-wise. Stop spreading your FUD BS. It was noVideo's SLI which became much worse with time, but AMD's CFX did not.

Just ignore the stupid inexperienced newfag.

3dmark.com/compare/fs/13683022/fs/13507179

???

Isn't it just that game devs are the ones who have to code their games properly so the games support multiple GPUs? I mean, in the games that do support multiple GPUs it works wonderfully. Why blame the GPUs?

CFX was super-bad in HD 7xxx days, and it was AMD's fault, not the game devs'. It improved drastically after AMD fully moved to GLORIOUS XDMA (while nGreedia shitheads STILL, EVEN UP TO THIS VERY DAY, continue using fucking bridges and wires), but some people are SO HEAVILY BUTTMAD about the HD 7xxx CFX fiasco that they STILL cannot let it go and forget about it in this modern day and age.

You don't even know what alternative frame rendering is, do you?
Look it up. Your extra frames aren't new frames.

Nice try, but no cigar. You need to try way harder than just that, kid.

Oh, you mean the HD 4000 series CFX fiasco too? And the HD 5000 one, and the HD 6000 one.
Let's not forget Nvidia's attempts, because they were also shit.
If you don't understand AFR, you don't understand why multicard was always retarded.
Long story short, your fps went up, but frame latencies did not improve.

Think about it; why is a higher fps better? It's because, at roughly equal frame intervals, each new frame updates you about what the game is processing. Now what happens if the frame intervals are unequal? What happens if you get two frames right next to each other in 1/60th of a second, then you have to wait 1/30th of a second for the next pair? The effective frame latency is limited to the power of a single card in AFR. Your FPS goes up, but it's still just as janky as when you ran a single card.
Nvidia mixes in some smoke and mirrors by ARTIFICIALLY ADDING FRAME LATENCY to make frame delivery appear smoother, but the frame is still one that was rendered for a part of the game that happened 1/30th of a second ago; effectively you get the same frame twice.
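If you want the arithmetic spelled out, here's a toy Python sketch (the timings are made up purely to illustrate the point, not measured from any actual card): two GPUs alternating frames deliver them in bunched pairs, and "metering" only smooths the cadence by holding frames back.

```python
# Toy sketch with made-up numbers (render_time/offset are illustrative, not
# measurements): AFR output comes out in bunched pairs, and frame metering
# evens delivery out only by adding latency.
def afr_delivery(n_frames, render_time=1 / 30, offset=0.002):
    """Each GPU needs render_time per frame; GPU1 starts `offset` s after GPU0."""
    times = []
    for i in range(n_frames):
        gpu = i % 2                                   # even frames -> GPU0, odd -> GPU1
        start = (i // 2) * render_time + gpu * offset
        times.append(start + render_time)             # when the frame is actually finished
    return times

def meter(times, target_gap):
    """Frame metering: delay frames so consecutive gaps are at least target_gap."""
    out = [times[0]]
    for t in times[1:]:
        out.append(max(t, out[-1] + target_gap))      # added latency, smoother cadence
    return out

def gaps_ms(times):
    return [round((b - a) * 1000, 1) for a, b in zip(times, times[1:])]

raw = afr_delivery(8)
print("raw gaps (ms):    ", gaps_ms(raw))                 # ~[2.0, 31.3, 2.0, 31.3, ...]
print("metered gaps (ms):", gaps_ms(meter(raw, 1 / 60)))  # ~[16.7, 16.7, ...] but the frames are older
```

The average fps roughly doubles either way; the worst-case wait between genuinely new frames stays at the single-card 1/30th of a second.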

>shitting on a thread while not knowing the difference between SFR and AFR

There was nothing wrong with the HD 6xxx series; it worked perfectly fine.
And the HD 4xxx/5xxx series were ATi, not AMD. And that implementation of CrossFire was in its rudimentary, very raw stages (almost like with 3dfx, WHICH CREATED MULTI-GPU TO BEGIN WITH). SLI, on the other hand, got the situation completely in reverse: when ATi/AMD's CrossFire was mediocre or downright bad, nGreedia's SLI was either decent or good, but nowadays XDMA CFX is very good and SLI sucks major donkey balls, mainly because noVideo's R&D has their heads stuck up their asses with all those legacy bridges and wires. Sure, noVideo tried to move to wireless SLI too just recently, but they're NOWHERE even remotely close to being as good at this as AMD already became.

>frame latencies did not improve
Wrong. As already mentioned above, modern XDMA CrossFire has very low latencies, and frame-pacing issues are essentially nonexistent on the latest Radeon GPUs. It was worst on the HD 7xxx series, but those times have long passed, and nowadays Radeons actually have better multi-GPU frame-pacing than GeForce does.

>the Vega 64 lauded as *the* 4k/60 FPS GPU doesn't even get close to 60 FPS
>JUST BUY ANOTHER ONE GOYIM xD

Wow, thank you merchant, you truly are my greatest ally.
brb, buying a 1000W PSU.

It's all in the drivers. It IS a 4K 60FPS video card, but they need to release fucking fully enabled drivers. Still waiting.

Crossfire is shit because it doesn't work in windowed mode. SLI beats it hands down, working in windowed mode in practically every game. If AMD can change that, it's easily superior.

>2017
>Not playing in actual full-screen
Windows store-pleb.

>Windowed mode
Lmao

>Fire Strike
>Having anything to do with mGPU
Are you actually retarded? That shit is running Crossfire.

AYY

Ad hominems don't win an argument. I have a 290X, and it continues to impress me with its ability to stay relevant. If I could crossfire it with its high efficiency, and still have the versatility of alt-tabbing in borderless windowed mode, that would be perfection. But it doesn't, and proven versatility wins over an efficiency bench score in the real world when doing shit.

>playing in Windowed mode

Go be retarded somewhere else.

what type of mobo is that?

>borderless windowed

Tfw have to use borderless windowed mode on some older games because they'll crash if full screen loses focus

>290x
>high efficiency

...wut? Get your shit together and stop dropping so much acid, mang.

It's an older mobo with a newer daughterboard innit. Probably socket 754/939 (at least that's what [insert search engine here] tells me) on the mainboard and AM2 on the daughter.

Just look at the photo. Clearly a clown mobo made for clowns.

Serious question: why can't you just have the second GPU render every other frame?

The 1080 Ti is more like a 4K/50 FPS GPU. At least it's playable. Feels much better than my 1070.

>50FPS
>playable

It's not like it'll matter once Navi drops

Because video cards render things on the fly and, until there's a high-powered AI on board, they can't predict what the next frame will look like. They can buffer/pre-fetch only very minuscule things, while "big" graphics are rendered in real time. GPUs don't have the kind of cache and preemption abilities that CPUs and RAM have. The reason most modern YOBA games require large amounts of RAM (Star Citizen, for example, utilizes up to ~12GB at 2560x1440 with everything maxed out at 60Hz) is that they store all the required data there first, and ONLY after that send it from RAM to the GPU's memory, AFTER which the GPU's memory pool gets emptied by loading all the stored info into the work-bee processing units (textures, geometry, shaders, physics, etc). GPUs can't predict (at the current time); CPUs can.
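Rough plain-Python sketch of the point (no real graphics API, every name here is made up): the draw data for frame N+1 simply doesn't exist until the CPU-side simulation has produced it and pushed it from system RAM into GPU memory, so a second card has nothing to render "ahead".

```python
# Minimal sketch (not real graphics API calls): why a GPU can't render frame
# N+1 ahead of time -- its input data is only created by the CPU-side game
# simulation, then staged from system RAM into GPU memory, then drawn.
def simulate(state, dt):
    # game logic: physics, input, AI; only the CPU knows what the next state is
    return {"positions": [p + dt for p in state["positions"]]}

def upload_to_vram(ram_copy):
    # stand-in for a RAM -> VRAM transfer (staging buffer / copy queue)
    return dict(ram_copy)

def render(vram_data):
    # stand-in for the actual draw calls
    return f"frame drawn from {len(vram_data['positions'])} objects"

state = {"positions": [0.0, 1.0, 2.0]}
for frame in range(3):
    state = simulate(state, 1 / 60)   # frame N+1 data appears here first...
    vram = upload_to_vram(state)      # ...then gets copied into GPU memory...
    print(render(vram))               # ...and only then can a GPU draw it
```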

For me, yes. I've been playing at 4K30 on a 780 Ti for more than a year. 50+ FPS seems buttery smooth when you're used to that.

Don't expect 144 Hz 4K at all; those will only release in 2018, will still cost 2000 €, and may release even later because there is no GPU that could power them.
>Implying I would run AAA titles in 4K and play on medium to achieve these framerates

I haven't looked into Crossfire or SLI since I had my dual ATI HD 3870s. But as I remember it, it was a simple tiled renderer back then, at least in the marketing material. And it worked fine for me. I experimented quite a bit. Not every game would have it enabled, presumably because it took some programmer effort in the rendering pipeline.
Reading about the modern API, it seems more oriented towards dividing workloads rather than partitioning work. That'll surely take a lot more programmer effort; see the sketch below.
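Roughly the difference, as a plain-Python sketch (not a real API, all of these names are made-up pseudostructure): the old implicit model where the driver silently alternates whole frames, versus the explicit DX12/Vulkan-style model where the developer assigns passes to specific GPUs and schedules the cross-GPU copies, hence the extra programmer effort.

```python
# Made-up sketch (no real graphics API): implicit multi-GPU, where the driver
# alternates whole frames behind the game's back, vs. an explicit model where
# the developer decides which GPU runs which pass and when data moves between them.
def implicit_afr(frame_id):
    # driver-managed: the game submits one frame, the driver picks the GPU
    gpu = frame_id % 2
    return [(gpu, "whole frame")]

def explicit_mgpu(frame_id):
    # developer-managed: each pass is explicitly assigned to a GPU,
    # and the cross-GPU transfer has to be scheduled by hand
    return [
        (1, "shadow maps"),        # secondary GPU renders shadows
        ("copy", "GPU1 -> GPU0"),  # explicit transfer the dev must schedule
        (0, "main pass + post"),   # primary GPU finishes the frame
    ]

for f in range(2):
    print(f"frame {f}: implicit={implicit_afr(f)} explicit={explicit_mgpu(f)}")
```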

Also found this:
anandtech.com/show/1698/5

>Newest Vega 64
>Can't run Witcher 3 and Metro Redux above 40fps on max settings.

Just kys AMD, a fucking GTX 970 can run higher fps in Witcher 3 ffs.

I hate this meme

Every new series, someone says AMD fixed Crossfire. The HD 5000 series "fixed Crossfire scaling", then the HD 7000 series "fixed Crossfire frame lag", and now this.

Then you go fire up a simple game like Killing Floor, and there's no Crossfire profile at all and only one GPU is used.

And here's what I was talking about. Apparently both SLI and Crossfire did this?
So are you liars, or did they change things for the worse? I can see how some effects would be very difficult to do, and you don't utilize the cards fully because they both need to cull all the triangles. They probably even clipped all the geometry to the checkerboard pattern.

The one who should KYS is a noVideot like you, who ignores the fact that Witcher 3 is an nGreedia GayWorse cum-filled title with such "features" as HairWorks silently cranked up to x64 by default and other PhysX shit, deliberately crippling Radeons in favor of GayTeaAss. Performance was way worse when Witcher 3 first came out, so there's nothing wrong with AMD or Radeon; it's just a shitty GayWorse-gimped title. Either the upcoming fully enabled drivers will improve the situation a bit, or Navi will definitely overcome Huang's asshattering by mere brute force.

It's 4K. A 1080 would achieve similar FPS.

That's normal. Such games don't get mGPU profiles because normal people have enough performance in their 100 € desktop GPUs. I had that horrible 755M SLI. Nothing worked. I could not achieve 60 FPS in Warframe on a single card, but if I used both it stuttered like hell and crashed.

It's not a meme, just facts.
1. HD 4xxx and 5xxx were ATi, not AMD.
2. HD 4xxx and 5xxx had rudimentary CrossFire, so their poor performance can easily be attributed to the technology being raw and to sheer lack of experience on the matter.
3. CrossFire on HD 6xxx worked just fine, for its time.
4. HD 7xxx completely fucked up CrossFire; it was the worst ever back then and still remains the worst multi-GPU experience even today. HD 7xxx was very bad at CrossFire. Super-bad.
5. When AMD moved to XDMA, CrossFire quality rose greatly and frame-pacing latency dropped significantly, and it only got better and better as XDMA and the new GCN architectures matured.

Checkerboard is used only on consoles, because that's where games benefit the most from it. Checkerboard is not used on PC, because there's no need for it.

That's what crossfire and SLI do. And it sucks because that means highly variable latency.

Depends on what you mean. The modern use of checkerboarding you see now can be a multitude of things.
But what's talked about in that article is definitely PC/SLI/Crossfire related. And no, it's not a clean "needed on console / not needed on PC" split. That's not a correct blanket statement.

>means highly variable latency
Not on XDMA Radeons, motherfucker

>Why the FUCK couldn't AMD come up with a better moniker
Because they're not implementing it anymore. It's the job of the developer. Why should they give a name to somebody else's work?

Fucking AMD, I just bought an AMD CPU and a Crossfire motherboard.

They're not disabling or killing off CrossFire, they're literally just changing the name of the brand. It'll still be the same thing, just called differently.

They're doing a piss-poor job of supporting Crossfire as it is; it seems like they're going to just let it rot now.

Fucking hell, I just paid for two Vega 64s, and now they kill off xfire?

(You)

Stop spreading FUD BS, kid.

>memeing this hard

gamersnexus.net/guides/2886-crossfire-rx-580-rx-480-benchmarks-vs-single-gpu/page-3

CF is a clusterfuck right now.

>GaymurrNexass