AHAHAHAHAHAHAHAH, AMD FineWine™ and Radeon "AgesWell™" are back, baby! Take that, you stupid green-assed shitmonkeys! And this is only the beginning, since third-party models are coming in December, and on New Year's Eve AMD will drop a Crimson ReLive Redux bomb, increasing performance across the board even further! Literally EACH and EVERY nGreediot that bought a 1080 Ti is now drowning in SALT! This is just GLORIOUS! Simply GLORIOUS! Oh fuck yeah, WHAT A TIME TO BE ALIVE, mateys!
Cameron Morris
I'm buying one soon but I can't decide on the processor. Would it be wrong to get an 8600k?
Jaxson Nelson
And it's not just 64, either. 56 gets a massive boost too, completely obliterating the piece of garbage that is the non-Ti 1080. And the 1070 and 1070 Ti are now both literally dead in the water, abso-effing-lutely DOA.
Christopher Nelson
Read this thread
Bentley Reyes
So at 1440p as far as gaming goes it makes fuck all difference with gpu bottlenecks and I should favour hyperthreading/productivity? That's what I took away from those benchmarks
Lincoln Diaz
>Beating nvidia in an AMD-optimized game
The opposite would be embarrassing
Kayden Mitchell
>1 shitty game
Liam Sanders
>If I'd wanted to "boost productivity", I'd just get Threadripper, not Coffin Fake. We're not talking about "productivity" here, though. Both the i5 2500K and i7 2600K are GAYMING CPUs first and foremost, and so is the 8700K (it's being advertised as the 7700K's successor, which is already absurd in itself since the 7700K sucks in games due to its INEPT stuttering and other problems). That's why performance in games is all that matters, NOT synthetics or anything else. And that performance difference between the two is ~8%, regardless of GPUs and settings used. That's 6 years. In 6 years Intel only managed to increase gaming performance by a measly ~8% (and that's in the best, Intel-compiler-biased cases; in many it's usually no more than ~4%). And that's with "two more cores", toothpaste-tier TIM under the lid, RFID under the lid, a hidden MINIX, and a higher price. Literal DOA garbage.
>And the most hilarious thing about that is the Witcher 3, which is Intel-compiler-fucked as hell. Witcher 3 should theoretically benefit the most from the new Intel processor, but it actually doesn't, and the difference is so negligible it's downright laughable considering Sandy came out 4 years before Witcher 3. A 2 FPS difference on a 1070 and 5 FPS on a 1080 Ti. And Deus Ex is 1.5 FPS on a 1080 Ti, while on a 1070 the 2600K actually BEATS the 8700K! EL-OH-EL! SIX COARZ, HIGHER FREECUMZEES! AYY LMAO!
Michael Bennett
Not only at 1440p, but at 1080p too. It's barely noticeable (margin of error) regardless of resolution, settings, or GPU. The increase from a 1070 to a 1080 Ti is obviously noticeable, sure, but in a comparison between identical GPU/RAM setups with a 2600K vs an 8700K it's almost nonexistent. Pic related also shows us that: >Lower "average" FPS on RyZen, but it must be noted that: a) RyZen doesn't stutter at all, while Inturd stutters constantly, b) RyZen delivers a much smoother overall experience because its minimum FPS is much higher than on Inturd, c) RyZen distributes load across cores far more evenly and accurately (which, alongside the two factors above, further improves the overall quality of the playing experience). Basically what this means is: a higher average FPS doesn't mean jack shit these days. Only frame pacing and minimum FPS matter, and both of those are way better on Zen than on Inturd. In other words, if you're buying a CPU for quality gaming, you'd have to be a total idiot to buy Inturd instead of Zen, no matter how hard the marketing tries to sway you otherwise.
>P.S. >And if it's productivity you're after - Zen still completely and utterly obliterates Inturd. This is truly a bad time to buy anything Inturd-related or branded. Just don't. Don't be a moron. Know better. Get Zen.
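The frame-pacing argument above is easy to check yourself. Here's a minimal Python sketch (all frametime numbers are made-up illustrations, not measured data) showing how a capture with occasional stutter can post a HIGHER average FPS yet a far worse 1% low than a slightly slower but perfectly paced capture:

```python
# Sketch: average FPS vs 1% lows, using hypothetical frametime logs
# (milliseconds per frame, e.g. as captured by RivaTuner/OCAT).

def avg_fps(frametimes_ms):
    """Average FPS = total frames / total seconds."""
    return 1000.0 * len(frametimes_ms) / sum(frametimes_ms)

def one_percent_low(frametimes_ms):
    """1% low FPS: the FPS implied by the slowest 1% of frames."""
    worst = sorted(frametimes_ms, reverse=True)
    n = max(1, len(worst) // 100)          # at least one frame
    return 1000.0 / (sum(worst[:n]) / n)   # mean of the worst frames

# Hypothetical capture A: mostly fast, but stutters (a 50 ms spike).
stuttery = [10.0] * 99 + [50.0]
# Hypothetical capture B: uniformly a bit slower, perfectly paced.
smooth = [12.0] * 100

print(avg_fps(stuttery), one_percent_low(stuttery))  # ~96 avg, 20 fps 1% low
print(avg_fps(smooth), one_percent_low(smooth))      # ~83 avg, ~83 fps 1% low
```

Capture A "wins" the average-FPS bar chart while feeling far worse in practice, which is exactly the point being made about average FPS vs frame pacing.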
Easton Sanders
But it runs well on everything. idTech6 is a good engine.
Colton James
...
Daniel Ross
>M-M-M-M-MEGATEXTURE!
Joseph Green
>The point is - you won't see a big difference even with a 1080 Ti. FPS will be higher on a 1080 Ti than on a 1070, obviously, but the ~8% difference between CPUs is the same across all GPUs and settings. It's downright laughable, as it shows that the i7 2600K STILL DOESN'T BOTTLENECK EVEN THE HEAVIEST OF MODERN YOBA A FULL SIX YEARS LATER, so the 8700K's "relevance factor" is literally NONEXISTENT, since "better productivity" can be had with much cheaper Zen.
Brayden Johnson
>...@wolfenstein Literally "nothing" the thread.
Daniel Jackson
Does it improve the performance in good games too?
Hunter Cook
Yes. CPU doesn't matter for jack shit for gaming past a certain point.
Kayden Cruz
...
Joseph Harris
I would actually buy an nvidia card if they had any linux drivers that don't break after a system update.
William Garcia
Not an Nvidia fanboy by any means, but wolf2 is not a good game.
Jonathan Sanders
Checked 'em. Ironically enough, this is currently going on:
Ethan Butler
thanks mate
Andrew Moore
Can you please stop shilling unfinished products?
Jeremiah Clark
noVideots are literally ETERNALLY BTFO'd, lel.
Jace Perez
...
Nathan Green
nvidia has no drivers
Nicholas Gonzalez
Really. Stop it. When AMD finishes it - come shill; but not now.
Lucas Martin
What makes it even more hilarious is the fact that the absolute majority of the professional emulation scene (coders, hackers, etc.) sits and compiles their software on Linux, WHILE largely ignoring Radeon and preferring to optimize for GayForce instead (as shown by Dolphin, Citra, RPCS3, and Cemu, for example). That's downright echo-chambering, considering the latest developments. Just imagine their faces after /Ve/Ga/ becomes fully supported on Linux (and it soon will be). They'll be BTFO'd out of their goddamn minds.
Elijah Jackson
They don't like GeForce. They like Nvidia's OpenGL implementation.
Lincoln Martin
>in one game Wow it's nothing
Levi Wilson
...
Christian Hernandez
If there's anything Nvidia does well, it's OpenGL support. It's spot-on, clean as fuck, and supports everything.
But most of the emulators you cited are moving to Vulkan, because there are things that are just not possible on OGL, and that might change the game.
David Cook
They (the scene at large) worship GayForce super-effing-hard, as evidenced by the almost-nonexistent Radeon support in Cemu and RPCS3 currently.
Brody Evans
you're trying way too fucking hard
Cameron Rodriguez
Well, if there's anything AMD does well, it's Vulkan support. Because it was Mantle once.
James Stewart
AMD's OGL implementation sucks. It's unrelated to the hardware itself, you annoying babby.
Cooper Smith
Not even trying, kiddo. Just stating the harsh truth of actual reality.
Jason Barnes
nah you definitely are. it's like reading a Sup Forums post circa 2009
Christian Wright
should i get an rx 560 2GB or a gtx 1050 ti 4GB?
i only want to play ace combat 7 in 2018.
Isaiah Hill
OGL is GARBAGE. Always was, always will be. Since they're shilling for OGL, they're wrong by default. And since they're also worshipping GayForce, they're wrong twice over. And so are you if you're defending that shit in any way whatsoever.
Levi Torres
What's the alternative on linux?
Chase Evans
RX 560 is weak as fuck and too expensive for what it is. Either go for the cheapest 570, or the 1050 Ti.
Nathan Martinez
vulkan
Joseph Rodriguez
Now it is an option. Now. Not 10 years ago, now.
Owen Reed
This shilling is so boring.
Dominic Brooks
It's Fiji XT and Fury all over again.
Does well in AMD titles and shit in everything else, and works out about as fast on average
Charles Sanders
>nah you definitely are. it's like reading a Sup Forums post circa 2009 You hit the nail on the head.
Justin Watson
...
Isaac Scott
He hit a nail on his own head
Adam Davis
>DOA
They've been out for a while now, you're dumb
Owen Turner
see this >Not an Nvidia fanboy by any means, but wolf2 is not a good game. This was like a few months ago when AotS benchmarks were a "thing". So is Wolfenstein the new "go-to" benchmark for AMD?
Jordan Butler
Indeed.
Oliver Ramirez
see
John Long
And yet they're still pushing OGL and GayForce by largely optimizing and compiling only for them on Linux, while almost completely ignoring Radeon EVEN TODAY. A literal echo chamber. I won't be surprised one bit if most of them are regular posters/readers of NeoFAG.
Jeremiah Bennett
No, they aren't. As I stated before, all those emulators are moving to Vulkan for several reasons, gaining not only speed boosts but precision improvements as well. Another fun example is paraLLEl, which uses the flexibility of Vulkan to actually emulate the N64 video chip correctly. If AMD delivers better Vulkan performance, AMD will be the future of emulation.
Lucas Miller
>1070 Barely a year >1070 Ti Less than half a month >Been out for a while now Typical buttblasted noVideot.
Colton Carter
>AMD will be the future of emulation. Sure they will. It's not like I've seen this posted everywhere for the past decade, or anything.
Aaron Cox
Go back to your echo chamber.
Justin Ramirez
I love how people always say that Nvidia wins because the 1080 Ti is still ahead of Vega 64 in many games, while completely forgetting that Vega 64 is $499 MSRP while the 1080 Ti is $699 MSRP. A card $200 cheaper getting this close, and even beating the more expensive card this often, and somehow that's not a clear AMD win? It's not like all the Nvidiots saying this have a 1080 Ti themselves anyway. I really don't understand this thinking.
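For what it's worth, the price/performance math in that post is trivial to sketch. The MSRPs below come from the post itself; the relative-performance index (1080 Ti = 100, Vega 64 = 90) is an assumed illustration, not a benchmark result:

```python
# Sketch: perf-per-dollar comparison. MSRPs are from the thread;
# the perf_index values are hypothetical placeholders for illustration.

cards = {
    "Vega 64":     {"msrp": 499, "perf_index": 90},   # assumed index
    "GTX 1080 Ti": {"msrp": 699, "perf_index": 100},  # baseline = 100
}

def perf_per_dollar(card):
    """Relative performance bought per MSRP dollar."""
    return card["perf_index"] / card["msrp"]

for name, c in cards.items():
    print(f"{name}: {perf_per_dollar(c) * 1000:.1f} perf per $1000 MSRP")
```

Under these assumed numbers, the cheaper card delivers more performance per dollar even while losing on absolute performance, which is exactly the argument being made.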
Angel Roberts
You can probably test it by now. Several emulators already support Vulkan and are faster while using it.
John Gonzalez
Vulkan support (albeit rudimentary as of now) is there, but optimizations are not. Have you been following Cemu's and RPCS3's progress for the last 8 months? They've largely been ignoring Radeon support, including drivers and the new features of any GPUs past the R9 2xx series, all while constantly improving GayForce. That's literal bias against AMD.
Oliver Campbell
Don't worry - when /Ve/Ga/ starts (and it will) owning the 1080 Ti two months from now (after third-party versions and Redux come out), buttmad noVideots will immediately shift focus to "better temps and lower power consumption". They always do that when their garbage starts losing massively. Each and every single time, lol.
Jordan Gonzalez
>Go back to your echo chamber (trying this hard to fit in on 4channel) Sure, buddy. >Several emulators already support Vulkan and are faster while using it. Which ones? I have an AMD GPU currently installed in a PC, so I could test that.
Sebastian Morris
Dolphin, Cemu, that PS3 emulator thing. Also test paraLLEl and see if you're one of the few lucky ones that can actually get proper N64 emulation.
Jacob Edwards
>Dolphin, Cemu, that PS3 emulator thing. Thanks for the info. I'll give these a try. I haven't used emus in a while, so it'll be fun.
Nicholas Hernandez
It's a trap.
Noah Carter
Xenia (the Xbox 360 emulator) also has Vulkan support in the works (it was actually THE very first emulator in the world to implement it, before RPCS3 and others started doing it), but the emulator itself is at a very early stage of development and is almost unusable (even more so than PS3 or 3DS emulation currently).
Gavin Scott
I am confused. AMD on its website says that Vega 64 Liquid needs 1000W. Anons say it is not true. So what is a sufficient PSU for Liquid Vega? One with which it won't stutter or shut down?
Oliver Morgan
>Red beats green >it's 88 For truth
Jordan King
AMD overestimates the wattage; the box for my Vega 64 Air said a 750W PSU was required, but Guru3D's testing and article said that 500W is sufficient.
Henry Myers
cheaper than a gtx 1050 2GB and around the same performance.
Ayden Brooks
This guy said that the system sometimes crashes and stutters with anything lower than 750W.
Ethan Jones
>AMD on its website says that Vega 64 Liquid needs 1000W That's 1000W for the entire system in a worst-case scenario, you dumb fuck. And it's only a "to be safe" estimate; companies always pad these numbers so they can't get sued or something. In reality, especially with all the latest developments in drivers and power profiles, it doesn't consume more than 500W even overclocked and 100% loaded - and that's counting the pump, which draws additional power on top of the fans (unlike air cooling, where only the fans draw power outside the card itself); all of that adds up to those 500W. So a 750~850W unit will be more than enough, as long as it's rated "80+ Silver" efficiency or better. Third party versions coming in December and it all will be much better. And then there's also Redux at New Year's Eve.
Jace Hall
He fucked up by using a shitty PSU with crappy efficiency and sub-par components; it's a known fact.
Noah Fisher
Well, why don't we look at the actual wattage drawn by the card itself? Looking up wattage benchmarks, Anandtech measured at most 460W total system power for a Vega 64 in BF1. When I use RivaTuner to look at statistics and usage, the highest power consumption I've seen from the card is 330W, with both the core clock and the HBM overclocked to the limit and the power plan set to +50%. I have a 750W 80+ Gold PSU and a Ryzen 1700, and I've had zero issues with the system being starved for power. Liquid-cooled Vega has higher base clocks, but I think it also caps out at 1732MHz (just like my air-cooled Vega), so it shouldn't draw much more power than mine.
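A rough PSU-sizing sketch, for anyone following the wattage argument ITT. Every per-component wattage below is an assumed ballpark peak draw for illustration, not a measurement; the point is the method: sum peak draws, add a margin for transient spikes, then add headroom so the PSU sits in its efficient band:

```python
# Sketch: back-of-envelope PSU sizing. All figures are illustrative
# assumptions - check reviews for your actual parts.

COMPONENT_DRAW_W = {                        # assumed peak draw per part
    "Vega 64 Liquid (OC, +50% power)": 350,
    "Ryzen 5 1600 (OC)": 110,
    "motherboard + RAM": 50,
    "SSD": 5,
    "AIO pump + fans": 25,
}

def recommended_psu_watts(draws, headroom=1.3, transient_margin_w=100):
    """Sum peak draws, add a transient-spike margin, then apply
    headroom so the PSU runs well inside its efficient band."""
    peak = sum(draws.values()) + transient_margin_w
    return peak * headroom

total = sum(COMPONENT_DRAW_W.values())
print(f"summed peak draw: {total} W")                                  # 540 W
print(f"suggested PSU: {recommended_psu_watts(COMPONENT_DRAW_W):.0f} W")
```

Under these assumptions the suggestion lands in the low-800W range, which is why the 750~850W recommendations and AMD's padded 1000W sticker can both be "right": one is a measured-plus-margin figure, the other a worst-case whole-system estimate.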
Wyatt Brooks
>you dumb fuck How rude. Are you compensating for something? Will 850W be enough for an OC'ed 1600, one SSD, and a Liquid Vega 64? Won't it crash on me? >Third party versions coming in December and it all will be much better. How are air coolers going to be much better than the liquid one? Especially if they all have 3x 8-pin connectors. I thought the liquid one was the best Vega you could get.
Blake Brown
2GB is below even pleb tier in late 2017, matey. 4GB is as low as you should ever go these days, and it should be GDDR5 at the very least (not DDR3).
Jeremiah Torres
>fury x
Kevin Lopez
Go with what the manufacturer says, not what some random jackass suggests. Would it probably work with a lower-specced PSU? Sure, until it needs to pull more juice. So don't listen to some rando unless you plan to undervolt, and keep mindful of your usage.
Logan Brooks
>Will 850W be enough for OC'ed 1600X, one ssd, and Liquid Vega 64? I have the legendary Corsair AX 850W (the last line before they stopped OEM'ing from SeaSonic and turned to shit), two 2TB SSDs (Crucial MX300), an i7 2600K that's been overclocked to 4.8GHz 24/7/365 for the last 5 years, an NH-D14, two 8TB Seagate enterprise-tier HDDs (ST8000NM0055), a case fan controller (Scythe Kaze Master PRO), a sound card, two Blu-ray/DVD-RW drives, five case fans, and a Liquid Vega. The whole system barely touches 800W at the most intensive loads during heavy gaming (X3 and X Rebirth maxed out at 2560x1440, Witcher 3 with GimpWorks on, FF XIV, ELITE Dangerous, BIOHA7ARD, Morrowind and Oblivion with the best mods and overhauls, GTA V with the best mods, OverWatch, Divinity Original Sin 1 and 2, Dragon Age Origins and Inquisition - all maxed out at 1440p - plus Star Citizen and a crapton more).
Connor Fisher
+15 shekels have been deposited to your account, mister salesman
Tyler Price
I heard it sometimes momentarily spikes to very high power draw, causing systems to crash. Did you experience any crashes?
James Moore
>shitenstein 2
Samuel Parker
That is nothing more than BS FUD. If there were some potential issues at launch, they've all been (or are being) ironed out with new drivers and power profiles.
Jace Butler
That's not only stupid, it's uneducated. Do you know how id makes their games? Welp, the right way. They take into account the average horsepower of current GPUs, then they build the game. No GPL'd previous version of an id engine has shown any code written specifically for any vendor. ...and guess what? The innovations made by id engines are welcomed by GPU architectures built with rendering in mind, not for running bloatware libs.
Charles Davis
>+15 shekels have been deposited to your account, mister salesman Are these amd shekels, nvidia shekels, or intel shekels? I'm not sure who I'm supposed to be shilling for nowadays.
Adrian Gray
>That is nothing more than just BS FUD. If there might've been some potential issues at start, they've all been/being ironed out with new drivers and power profiles. The manufacturer's specs are BS FUD? Do we believe AMD the manufacturer, or AMD the marketer?
John Fisher
>What is RAGE >What are "drops to ~14 FPS on the most powerful GPUs of that time, and for two more years after that"
Ayden Edwards
>sudden voltage spikes and crashes >manufacturer specs Give it a rest already.
Jace Campbell
Rage had other problems too, e.g. with textures, mostly on AMD cards. Rage also introduced new AI features to id Tech engines. Rage was a testbed for various features Carmack was experimenting with, and it was also pushed by ZeniMax as a title to recoup dev costs. If you check the timeline: Rage came first, all the issues were resolved, and after that every major title started using id Tech 5.
Ethan Richardson
>manufacturer suggests a certain wattage psu >some anon ITT tells people to just use a 750-850 watt psu Sometimes Sup Forums posters just fascinate me.
Nathaniel Adams
I really wanted to go with Liquid Vega, but it's already expensive and I don't feel like dropping $150 more on a fucking 1000W PSU. It's just one GPU - how can it consume so much?
Xavier Evans
IT FUCKING DOESN'T
Connor Perez
Vega 56 looks pretty good in a lot of benchmarks desu
Luke Rogers
>all the nonsense ITT about wattage
Parker Martin
>M-M-M-M-M-M-MEGATEXTURD!!!11
Easton Evans
Buy it if you want it that badly, and if you have issues then upgrade your PSU later. Worst case, you either have to buy a new PSU or underclock your GPU. Just note that a PSU upgrade is probably going to be needed if you buy that liquid version.
Tyler Cooper
>manufacturers always over-recommend wattage for worst-case scenario builds (like someone pairing it with a housefire Skylake CPU that draws 500W), to cover their asses and avoid dealing with people who buy a 500W PSU for a 460W GPU >some anon doesn't understand this and thinks that marketers and manufacturers are lying to him >can't even do a simple google search to look at power-draw charts of the card he's looking into
Certified 80 IQ or less. Just buy an Nvidia card then.
Grayson Cooper
>Certified 80 IQ or less. Just buy an Nvidia card then. I buy whichever brand I want, when I want, and I'll go with what the manufacturer states, not whatever some chump who is way too eager for people to buy an AMD GPU says online.