AMD ON URGENT SUICIDE WATCH
>Titan XP and next gen in same year
>* blocks ur path *
tweaktown.com
RX Vega soon right? Right?
rip AMDead
AMD is tearing up Intel but Nvidia is totally decimating them. How can they ever recover
THIS CANT BE HAPPENING
Vega finds a way.
Sad thing is, this tells us as much or more than what we already know about RX Vega, which is supposed to be released any time now.
Was gonna buy a 1080ti, is it better to wait?
Memes aside, does anyone here actually have loyalty to either manufacturer, or just buy the best one? (I'm an nvidicuck if it matters, but I hope ayyymd gives them some competition or even BTFOs them with their next release)
Mfw MSI makes the best looking gpus
So does Nvidia basically keep their new hotness under wraps until AMD even tickles the idea of coming out with a new architecture, so they can drop a bomb on the market and take the wind out of AMD's sails? I mean, since they have such a secure hold on the market, they don't need to do anything until AMD tries something.
The chip hasn't even taped out, this is wccftech tier clickbait.
And Vega still waiting™
For fuck's sake AMD!
pretty much. in the few cases where amd has had a competitive product it made nvidia drop a new card early, like they did with the 980ti.
I've had both amd and intel cpus in the past but always nvidia gpus because I need proper OGL drivers.
>proper OGL drivers
>exit a "full screen" application in linux, one third of the panel looks corrupted
Proper OGL drivers, they should get their 2D ones working first.
Yeah, it would be a manufacturing miracle to have good Volta silicon by Q3 of this year.
Fake news.
>my father is an employee of Nvidia
Isn't it late to still be working at AMD offices?
>Chinese rumor mill
>GTX 2080
>GTX 1030
You fags will believe anything if it shits on AMD.
Oh, another GP100-tier paper launch? Go fuck yourself novidea.
Nvidia already confirmed volta will be 20x0
>thinks regular mooks at Nvidia know what tapeout means
This isn't Intel or Samsung, who have their own fabs and can keep a tapeout secret. AMD's and Nvidia's chips come out of the oven at a third-party fab.
Not only did we hear fucking nothing, there's also no zauba information yet.
>said the amdrone while nervously waiting for volta
You mean Vega? And no, i have a novideo card. But yes, i will buy Navi or whatever AMD releases after Vega, at least they have drivers. Like, actual drivers.
About as valid a rumor as Vega being 30% faster than the Titan XP.
That's pretty sad.
Eleven eighty or twelve eighty sound much better than twenty eighty.
Source? No sane company announces marketing names a year before launch.
I hate all of you for making fun of Indians. We will be the head of both companies.
I doubt it, the source is fishy at best, and even if they did, it would be low-end shit like the 750 Ti was.
Furthermore the TxP and 1080ti are too fresh.
>too fresh
GP102 is pretty old.
And we only got a full GP102... a month ago?
This explains why King Poojeet wants to sell RTG to intel.
Yields are ass. That's also probably the reason Vega is so late. Big die, and they refined the process on poolaris.
What do Kyle's wild fantasies say about Kabylake-G using AMD's GPU as a MCM that's been floating around the last few weeks?
Kyle pls.
Sounds like a custom apple chip.
Sounds weird considering they can just opt for Raven Ridge.
Not if they want a powerful GPU, that's iMac and MBP territory.
Raven Ridge is still a small die, won't fit over 800 shaders there, even if they are NCU shaders
They can just order a custom design from AMD then. Remember, it's AMD who designed the current console hardware.
Not enough time for custom designs, maybe next year.
Who knows, it's all Vega-related and ayymd is surprisingly silent on its technical details. Like, what the fuck is NCU? What did they change? What is HBCC? I wish they showed us anything.
AMD's only custom design this year is Scorpio, there's simply not enough time to get a custom design for Apple when F4 Zen stepping came out literally in the middle of February. not to mention Vega which is still in clockspeed and driver hell.
>AMD releases Ryzen
>intel proceeds to shit itself and suddenly moves up Coffee Lake by 3+ months
>AMD is about to release Vega
>nvidia shits itself and starts hyping Volta release
Thanks AMD, if not for you we would be paying $1500 for a GTX 1050 and another $2k for an i3.
Scorpio is designed by MS themselves, and is fabbed on 16nm TSMC. AMD only helped with its design. But yeah, this year is surprisingly busy for AMD, so many big releases.
You'll probably get to see pictures of GV100 at GTC, but availability no sooner than Q1 2018 and that's starting with GV102
I wouldn't be surprised seeing GV106 come out sooner.
So Maxwell 2.0: electric boogaloo? I doubt it.
The rumor before this one was that Volta would be 12nm. Very reliable. Before that, it was supposed to release in May.
12nm is a meme anyway, almost no density increase. I hope Volta will have some proper hardware scheduling.
Yes
Are AMD and Nvidia competing on who releases their next gen chip first?
We had the RX 480 released 8 months ago, AMD is refreshing it with the RX 500 series, and later this year Vega will launch. Polaris will get 1 year in total on the market, then it's out the window.
Pascal doesn't even get a new series, GTX 2000 will be Volta. Not even 2 years on the market.
As a reference, Fermi was released in April 2010 (GTX 400), refreshed in November 2010 (GTX 500), and replaced by Kepler in March 2012 (GTX 600). 2 years on the market.
On the AMD side, the R9 200 series was released in October 2013 and replaced by the R9 300 series in June 2015. 2 years on the market.
That said, it is the best time to push GPU development: games are the number 1 reason people buy powerful GPUs, we can always make games with better graphics at higher resolutions, and the CPU, RAM, and storage don't really bottleneck the GPU in games.
It's literally a gold rush right now.
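Putting that cadence into numbers (a quick sketch; the dates are the ones quoted above, and the Vega launch month is my assumption since it hasn't happened yet):

```python
# Months on the market for each generation mentioned above.
# Dates are as quoted in the thread, pinned to the first of the month;
# the Polaris end date assumes a hypothetical Q3 2017 Vega launch.
from datetime import date

def months_between(a, b):
    return (b.year - a.year) * 12 + (b.month - a.month)

gens = [
    ("Fermi (GTX 400/500)", date(2010, 4, 1), date(2012, 3, 1)),
    ("R9 200 series",       date(2013, 10, 1), date(2015, 6, 1)),
    ("Polaris (RX 480)",    date(2016, 6, 1), date(2017, 8, 1)),
]

for name, start, end in gens:
    print(f"{name}: ~{months_between(start, end)} months on the market")
```

Fermi and the R9 200s both got roughly two years; Polaris would get barely over one.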
Why do you think so?
ML is growing. That's another reason for both competitors to make more and more powerful GPU's.
Volta is supposed to be more akin to Vega, i.e. somewhat major architecture change. They'll paperlaunch it as usual with GV100 that you'll never see.
yea, GPU compute has been booming since 2010
Yeah, but the P6000 was released like literally a month or so ago, that's enterprise hardware and it's not replaced fast, so that makes no sense.
Same with the Vega MI25: it was revealed in December but only released a few weeks ago as well.
Both are ML cards.
There is a sourced benchmark putting Vega at 35% faster than a 1080, so near 1080ti/XP range
It's going to be the fury for the next two years
Isn't P6000 a Quadro and Tesla is NVIDIA's pure compute line?
I know what you mean, but anyone can make pretty graphs.
>AMD only helped in design
All MS did was say "I want X performance for X watts at X resolution, can you do it?"
What did they gimp this time? Weren't Quadros FP16 or did they move that to Tesla?
Tesla has historically been the FP64 line, and the P100 is FP64.
I wonder if consumer Vega will have non-gimped compute.
Everyone would just whine about TDP if it does.
225W for 12.5 TFLOPS FP32 is too much? Sup Forums is weird.
Enable FP64 and that goes straight over 275W
And that's bad considering it could do 1/2 FP64?
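For anyone checking the math, here's the back-of-the-envelope version (a sketch: the 4096-shader count and the clock are the commonly rumored Vega 10 specs, not confirmed, and the 1/2-rate FP64 is the hypothetical from the posts above):

```python
# Back-of-the-envelope throughput math for the rumored Vega specs
# quoted above. All inputs are rumors from this thread, not confirmed.

shaders = 4096      # rumored Vega 10 shader count
clock_ghz = 1.526   # clock needed to hit the quoted 12.5 TFLOPS
tdp_watts = 225

# Each shader can retire one FMA (2 FLOPs) per cycle.
fp32_tflops = shaders * clock_ghz * 2 / 1000
fp64_tflops = fp32_tflops / 2  # assuming the 1/2-rate FP64 discussed above

print(f"FP32: {fp32_tflops:.1f} TFLOPS, "
      f"{fp32_tflops * 1000 / tdp_watts:.1f} GFLOPS/W at {tdp_watts} W")
print(f"FP64 at 1/2 rate: {fp64_tflops:.2f} TFLOPS")
```

That works out to about 55 GFLOPS/W in FP32, which is why 225W for 12.5 TFLOPS isn't crazy on paper.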
>MS designed an APU and chipset to accompany it
yeah, no they didn't. They may not have even designed the mainboard, although they do have some experience in that area. But even if they did design the mainboard, that's comparable to being handed a jet engine and strapping it to a chassis and saying "LOOK I BUILT THIS"
There's no reasoning here, higher TDP == bad, even if the ASIC power is 20-30% smaller outside of FP64.
Faggots don't know, Tahiti actually suffered for this, it was 1/2-rate DPFP.
They should just use the kike Intel version called SDP, that can be anything you want.
Kinda like Core M's rated at 4W SDP pulling over 22W under boost, or the 15W ULVs going over 40W
When you got money you can get away with everything
>new cards already
It is like they enjoy pissing off their customers.
>when you got the money
And AMD has no money. Well, at least not yet.
Since when did nvidia care about their customers
Is this real? I was thinking of getting a 2nd 1070 sometime this summer, but a Volta launch would come in handy. I would even consider the 80 depending on the price. Also, my budget depends on being able to max out ME: Andromeda.
Good point. Many things including the new titan X being crippleware show they give no fucks.
About as real as Intel's 6 cores at 5.0GHz
It'll probably be possible but that shit will pull over 300W alone, like only 3 motherboards won't spontaneously combust with it
I want to get a 4K TV first, hoping to get a 1070 FTW for around 300 € then. I'll decide when I have the money, probably September. A 1080 Ti might be an option if the prices drop, but I think my card will be worthless by then, and it is only three months old...
>Bought a 1070 for 470 € weeks before 1080s for 519 € were available
>New and we're talking about FTWs
unlikely. The instantaneous heat generated by a chip like that would make it horrendously unstable. That is to say, even if you could dissipate 300W and the chip never exceeded that range, the spikes would cause short lived voltage fluctuations and crash the system.
That's not even mentioning how difficult it would be to maintain a constant voltage across the chip in the first place.
> It's literally a gold rush right now.
Whoever puts out the first card for stable 4K 60FPS gaming at a non-enthusiast price point is going to make a shitton of money.
i was going to get a gtx 1070 in about 10-11 days, is it now a bad purchase? got a 660 atm
Don't forget that if your GPU can drive 4k, it can drive 90fps VR at 1920x1080 per eye as well. There are multiple markets to consider right now, although the VR market is still small. If the total cost for VR drops below about $1200, I think we'll see widespread adoption, especially since compelling software is beginning to be announced now.
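The raw pixel math behind that claim (a rough sketch; real VR rendering adds overhead like the distortion pass and supersampling, so the VR figure here is optimistic):

```python
# Raw pixel throughput: 4K at 60 Hz vs. VR at 1080p per eye at 90 Hz.
# Ignores VR-specific overhead (distortion pass, supersampling).

px_4k60 = 3840 * 2160 * 60       # ~498 Mpx/s
px_vr90 = 1920 * 1080 * 90 * 2   # two eyes, ~373 Mpx/s

print(f"4K @ 60 Hz:          {px_4k60 / 1e6:.0f} Mpx/s")
print(f"VR 2x 1080p @ 90 Hz: {px_vr90 / 1e6:.0f} Mpx/s")
print(f"VR is {px_vr90 / px_4k60:.0%} of the 4K pixel load")
```

So in raw pixels the VR workload is about three quarters of 4K60, which is where the "a 4K card can drive VR" rule of thumb comes from.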
you would need a platinum certified PSU and those are expensive, but it's do-able
the Radeon Pro Duo uses 500W and that is just fine.
But that plus an OC'd CPU would be questionable.
High-end X370 mobos have extra power delivery for extreme setups; the downside being you only get 2 DIMM slots because the heat output is HUGE.
If you're on a 660 then any option is good, heck even a 480 will BTFO the GTX 660.
you are really far behind user.
>tfw just bought a 1080 a couple weeks ago
guess im waiting for a 3080 then
ahh, unfortunately i can't deny being behind. i suppose the difference between a 1070 and a 2070 jumping from a 660 will be negligible anyway. i'm also unfortunately running an FX 6300, though i plan on upgrading to probably the 7600K. reckon a 1070 paired with a 6300 is a bad idea? i'll be playing at 1440p
The issue is not power delivery. The issue is that voltage does not scale linearly with frequency: as you increase a processor's frequency, the required voltage rises superlinearly, and it becomes increasingly difficult to maintain a uniform voltage across the chip. This isn't about power consumption, it's about frequency and voltage.
AMD's Fiji chips have no issue drawing that much power because they're designed to do it; they keep relatively low clocks in the 1GHz range so that voltage is easier to control. For reference, the highest-clocked GCN chips we've seen are the current RX 580s, and those don't really go much past 1500 MHz.
Since desktop CPUs can't benefit from parallelism the way GPUs do, they don't sacrifice clockspeed to run thousands of cores; a desktop CPU is designed from the ground up to run at the highest clockspeed it can while generating a manageable amount of heat and staying very stable. The 7700K is tapped out: a quad core is barely stable at 5GHz, not because of power usage but because of voltage management, so we know that adding two more cores at the same frequency would be impossible. Things might change with a node shrink, but based on all the issues we're hearing from the Intel camp about node shrinks, I don't expect much.
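What anon is describing is basically the classic dynamic-power relation P ≈ C·V²·f: power scales linearly with frequency but with the square of voltage, and since voltage has to climb along with frequency, total power blows up well past linear. A minimal sketch below; the V/f curve and the capacitance constant are made up purely for illustration, only the P ≈ C·V²·f relationship is the actual physics:

```python
# Dynamic power: P = C * V^2 * f.
# The V/f curve and capacitance below are hypothetical numbers picked
# to loosely resemble a 7700K-class quad core; only the formula is real.

def voltage_for(freq_ghz):
    # Made-up voltage/frequency curve: required voltage rises steeply.
    return 0.4 + 0.038 * freq_ghz ** 2

C = 15.0  # hypothetical switched capacitance in nF (nF * V^2 * GHz = W)

for f in (4.0, 4.5, 5.0, 5.5):
    v = voltage_for(f)
    print(f"{f:.1f} GHz -> {v:.2f} V, ~{C * v**2 * f:.0f} W")
```

Going from 4.5 to 5.5 GHz is only a 22% clock bump, but power roughly doubles in this toy model, which is exactly the wall being described.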
Don't become a corelet, get an R5.
Do you think my GTX 1070 will last me another 4 to 5 years assuming I stick with 1080P?
which R5? i'll be honest, i haven't got the biggest budget and wasn't very interested in Ryzen, but here's your chance to change my mind. why would it be a good decision to choose an R5 over, say, the 7600K?
To an extent? I buy AMD products because I consistently have fewer issues with them, because I want to support competition, and because AMD legitimately seems like the less shitty company with all their open initiatives (OpenCL vs. CUDA, FreeSync vs. G-Sync, GPUOpen vs. GameWorks, Mantle, etc.).
I still buy NVIDIA and Intel products from time to time when it makes sense, but I try not to.
>VR
>widespread adoption
We'll only see that when celebrities start releasing virtual sex slave games.
Nvidias are not made to last
Well it is my first. I had all AMD cards before this one.
The R5 1600 is the best price/performance AFAIK. The benefits are that it has far more cores and threads than anything Intel offers in the same price bracket, meaning it's significantly better at multi-tasking while not being significantly worse in single threaded workloads.
The AM4 platform (the motherboard, basically) is also scheduled to last at least until 2020, unlike Intel platforms, which usually only last two years.
on my home pc i don't really do anything 3d-related; the most stress i usually put on my cpu is a bit of video editing and photoshop. a 6600k was probably a bit overkill for this, but i hear it's a solid choice that definitely won't bottleneck a 1070. i won't run into those issues with an r5 1600, right? also, is the 1600 still a good choice considering my light cpu loads?
I can't see any reason why the R5 1600 would bottleneck that. I mean sure, compared to a 7600K or something similar it will, but it's only visible at ultra-high frame rates. It isn't going to inhibit your ability to comfortably play anything. To use an example, in DX11 Far Cry Primal at 1080p, the R5 1600 loses 14 fps compared to the 7600K, but the drop is from 108 to 94 (see the quick frame-time check at the end of this post).
Also, it's widely believed that games will continue the trend we've seen over the last few years and use more and more cores (particularly with the introduction of the Vulkan/DirectX 12 APIs, which make that easier than ever before), making the Ryzen chip the smarter long-term purchase.
As for Photoshop and video editing: video editing should benefit from the increased core count. Photoshop, however, underperforms on Ryzen because it's pretty single-threaded; I believe the same goes for similar Adobe products.
If your CPU loads really are very light, you might want to go for something even cheaper than either the 7600K or 1600. There's the R5 1400 but I haven't heard glowing things about it. Tons of options on Intel's side of course, right down to Pentiums. All in all, don't worry about CPU too much for frame rates, if you need any proof that it really doesn't matter that much, here's some Intel Skylake Pentiums benchmarked and managing to run games pretty competently when paired with a powerful GPU.
eurogamer.net
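A quick frame-time check on those Far Cry Primal numbers (just arithmetic on the fps figures quoted above):

```python
# Frame-time view of the Far Cry Primal figures quoted above:
# 108 fps on the 7600K vs. 94 fps on the R5 1600.

fps_7600k, fps_1600 = 108, 94

for name, fps in (("7600K", fps_7600k), ("R5 1600", fps_1600)):
    print(f"{name}: {fps} fps = {1000 / fps:.2f} ms per frame")

drop = 1 - fps_1600 / fps_7600k
print(f"fps drop: {drop:.0%}, "
      f"frame-time cost: {1000 / fps_1600 - 1000 / fps_7600k:.2f} ms")
```

A ~13% fps drop sounds big, but it's only about 1.4 ms per frame, which is why it doesn't really hurt playability.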
I hope that the Vega card will be the new 290X. I have a FreeSync monitor and I like AMD. I think it's impressive how the 290x still holds up. If the Vega is the new 290x it will be fantastic
So the important question stands:
How do you pronounce '2080'?
Twenty-Eighty? Or Twenty-Hundred-Eighty?
Since when does 20 come immediately after 10? This is going to be the 11xx series, dummy.
Twenty Eighty