How can anyone compete?

What?

Fun fact: pic related can be adapted to most desktops with at least a Sandy Bridge i3 and a 430W PSU and give you 200% or more of the PS4's performance.

>pictured: the two worst console libraries of all time

fucking xbone, that shit sucks, i got one as a gift in 2015 and it's still in the box.

Which one has more waifu games?

>give you 200% or more of the PS4's performance.

Is this bait? The PS4 has twice the memory of that card.

>with negative 200 percent performance in-game
Thanks PC optimization!

Easily the Playstation.
How do you like weeb games and not know that by now?

Ahhh but only about 4.5GB of it is available for vidya, and the GPU in the piss 4 does less than 2 TFLOPS of FP32 while the RX 480 does about 5.8 TFLOPS of FP32.
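For the nerds: the peak FP32 number both sides throw around is just shader count × clock × 2 ops per FMA. A quick sketch using the published specs (1152 SPs @ 800 MHz for the PS4 GPU, 2304 SPs @ 1266 MHz boost for the RX 480; these are paper peaks, not measured in-game throughput):

```python
# Theoretical peak FP32 throughput = stream processors x clock (Hz) x 2 ops per FMA
def peak_tflops(shaders, clock_mhz):
    return shaders * clock_mhz * 1e6 * 2 / 1e12

ps4 = peak_tflops(1152, 800)     # PS4 GPU: 1152 SPs @ 800 MHz  -> ~1.84 TFLOPS
rx480 = peak_tflops(2304, 1266)  # RX 480: 2304 SPs @ 1266 MHz  -> ~5.83 TFLOPS

print(f"PS4:    {ps4:.2f} TFLOPS")
print(f"RX 480: {rx480:.2f} TFLOPS")
```

That's a ~3x paper gap, which is where the "200% or more" claim comes from, give or take memory bandwidth and optimization.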

source: my ass

Also, maybe it's just me, but games like Gears of War are fucking dogshit on console or PC.

>source: my ass
Source: my experience running games above medium settings

Also, everyone always shows this CONSOLE KILLER i3 + shit AMD GPU build but I never see anyone use it in practice. Show me your specs right now or shut up.

One more thing,
>TFLOPS
Only nerds care about that shit, I'm trying to play games not have a dick measuring contest.

>>source: my ass
>Source: my experience running games above medium settings
invalid source

>>TFLOPS
>Only nerds care about that shit, I'm trying to play games not have a dick measuring contest.
Right, but what it translates to is laggy gameplay. Show me a piss 4 game that doesn't stutter like crazy
>protip: you can't

>Show me a piss 4 game that doesn't stutter like crazy
I told you to post specs and you didn't. How about you show me that first?

The burden of proof is on you, child, not me.

Is English your fucking second language? I'm asking you for YOUR PC specs, which I did in my last post that you clearly didn't fucking read before parroting your PC-is-superior religion bullshit.

If you actually own a PC that shouldn't be too fucking hard for you, unless you're full of shit (which wouldn't be a big fucking surprise)

>Get PC with the same specs for like half the price
>Stick a decent USB controller in it
>Download Steam
Wow, it's fucking nothing.

I'm still waiting for you to show me a piss 4 game that doesn't stutter and lag hard as fuck.

How about you just go back to Sup Forums and keep playing with your fisher price game box and never come back here?

Either one is fine by me.

i3? Intel's new Kaby Lake Pentium is only $65 and works fine for gaming.

>I'm still waiting for you to show me a piss 4 game
I asked first genius, wait your fucking turn. And why the hell would I even give you anything if you're already proving yourself to be a waste of time?

All you've done so far is say "ps4 has shit specs blah blah blah specs", and you won't even show me your "superior PC specs"? I'm LMAOing at your life.

xbox
>one
playstation
>four

nintendo
>SIXTY FOUR

consoles are shit since 9/11

Literally any desktop from 2016 or 2017 built with that year's hardware. How dumb are you, console peasant?

poorfag reporting in

Built the machine around the A10-5800K back in May of 2013 for around $350, then grabbed an R9 270X for about $200 in December of the same year, and a Hyper 212 Evo for $25 used/on clearance. In 2016 I found a GTX 670 at a thrift store for $25 and it works.

In 1080p I managed to play a fuckton of Dark Souls 3 on a mix of low-medium-high settings, usually 30-40 fps in most areas, solid 60 in the least-active areas, with very rare dips beneath 25.

Since getting the 670 I'm usually in the 50s everywhere but with rare drops to the 30s.

Skyrim SE (low-medium, max TAA, it looks fucking fine) with the 270X was usually in the 40s; with the 670 it's a near-locked 60 with frequent dips into the 50s.

I could probably cap all my shit off at 30 and not care, but this thing has served me at least as well as a PS4 for the majority of its time with me.

Hopefully killing it off and upgrading to a non-poorfag station soon. But for the overall $600 I paid, I am more than satisfied with this piece of shit.

P.S. Speccy is reporting the CPU temp wrong; it does that with AMD APUs from what I've read.

meant for