AMD Ryzen for workstations and Serious Business™

Ryzen is out! Wow, the Ryzen 1800X looks like a great workstation chi-
>no dual CPU motherboards
>IOMMU groupings place PCIe slots in with SATA and everything else, preventing you from passing through a GPU
o-oh..

Source on the PCIe grouping? And consumer chips haven't gotten dual-socket motherboards in like 10 years.

He's talking about the morons hailing Ryzen as a great workstation chip

Can you answer my question?

>And consumer chips haven't gotten dual-socket motherboards in like 10 years
Right, but the main selling point for these chips right now is competition with the higher core Intel chips, which do have dual CPU motherboards available.
And this.

I mean, if it's no good for gaming and can't really be used for a proper workstation, then it's sort of useless.

youtube.com/watch?v=G_6rs9cBzvE

Where does that pic come from?

Also, I too entertained the idea of using an AMD CPU for a workstation, but apparently it's pointless to think about it before Naples.

>Right, but the main selling point for these chips right now is competition with the higher core Intel chips, which do have dual CPU motherboards available.
X99 motherboards don't support multiple sockets.

>youtube
Specify a time point because I ain't wasting my time watching all that

I think they do, but only on Xeons
The consumer CPUs don't support multisocket

Are you retarded? The 1800X isn't the Ryzen chip that would go into actual workstations, just as the 6950X isn't the Intel chip that fills that purpose. Go find me a dual socket X99 motherboard if you disagree.

But it is possible??

i really dont understand op's pic

>The 1800X isn't the Ryzen chip that would go into actual workstations
Ok, so what is the 1800X good for, then? People who are too 1337 for 7700K performance but too Pajeet for a Xeon setup...?

Arguably the consumer who does workstation shit as a hobby and can't afford a proper workstation, but even then I would think that using a Quadro and hardware acceleration would be a better solution. But I dunno, I'm a gaymen manchild

I'm really disappointed with Ryzen desu.
>bad gaming performance
>doesn't really make sense for a workstation
m e h

Luis is a fat cunt

This teacher is a fucking retard and needs a math lesson.

good goy

Graphic design, cleanup, video and music editing, decryption/encryption, database, compression/decompression, virtual machines, transcoding, SAP, streaming, compute, BOINC, encoding, rendering cloud, BLAS, LAPACK, sorting, compiling, hosting? Heck, even regular software decoding and browsers benefit nicely.

There's plenty of shit you can use more cores on.
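
If anyone doubts the scaling, here's a toy sketch (nothing 1800X-specific, just stock Python multiprocessing) of an embarrassingly parallel CPU job split across all your cores:

```python
# Toy demo: counting primes by trial division, chunked across processes.
from multiprocessing import Pool
import os

def is_prime(n):
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

def count_primes(bounds):
    lo, hi = bounds
    return sum(1 for n in range(lo, hi) if is_prime(n))

if __name__ == "__main__":
    # Split [0, 100000) into 8 chunks; each chunk runs in its own process.
    chunks = [(i * 12_500, (i + 1) * 12_500) for i in range(8)]
    with Pool(os.cpu_count()) as pool:
        total = sum(pool.map(count_primes, chunks))
    print(total)  # 9592 primes below 100,000
```

More cores, shorter wall time, up to however many chunks you feed it. Same idea behind -j flags on compilers and encoders.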

>Graphic design, cleanup, video and music editing, decryption/encryption, database, compression/decompression, virtual machines, transcoding, SAP, streaming, compute, BOINC, encoding, rendering cloud, BLAS, LAPACK, sorting, compiling, hosting? Heck, even regular software decoding and browsers benefit nicely.
So all things you can do with two old Xeons for 1/3 the price?

>So all things you can do with two old Xeons for 1/3 the price?
Used hardware is always better value, but it's a good thing only manchildren on Sup Forums buy used hardware and not OEMs

>but it's a good thing only manchildren on Sup Forums buy used hardware and not OEMs
>OEMs are going to buy a consumer CPU with the problems outlined in the OP
any more stupid shit you have to share with us today, 'Jeet?

>are going to
They already did.

>outlined in the OP
You haven't given a source on that yet.
And you haven't shown me a 6900/6950 in a multi-socket setup.

>You haven't given a source on that yet.
it was posted above
>And you haven't shown me a 6900/6950 in a multi-socket setup.
I'm talking about their low-end Xeons and/or what said

Serious question:

How is this a better value proposition than picking up a Quadro for under $500?

The point is that they have no incentive to upgrade to this model in many cases.
>OEMs with dual- and quad- CPU configurations
>replacing everything at great expense for a single CPU consumer chip which likely performs worse overall

>used hardware

Look ma, I got this overclocked Haswell for gaming for a fifth of the price of a 7700K, best value.

>it was posted above
Show me

Why would I use the 1800X when I can buy the E5-2670 for $50 each?

>and/or
Take your childish biases elsewhere. I'm interested in discussing hardware here, not playing some egotistical cat and mouse game with someone who is emotionally invested in a consumer product.

>that picture

lol fucking stupid kid

Because a Quadro is a really good CPU.

...

if you want dual sockets wait for Zen Opterons.

No you're not, you haven't shown me a 6900k in a dual socket setup, you haven't proven the IOMMU issue.

The only thing you've told me is that old used hardware is cheap and you're ready to sacrifice performance because you're poor.

ikr kids are so dumb

>No you're not, you haven't shown me a 6900k in a dual socket setup, you haven't proven the IOMMU issue.
Did you even read the thread? No, because you're interested in defending your position, not in discussing hardware. Just fuck off.

>The only thing you've told me is that old used hardware is cheap
>and/or
>still running with this
Either you're too stupid to comprehend what was actually said or you're being disingenuous. Maybe both.

It's simple, direct me to the post with the IOMMU issue.
It's not hard.

I have no difficulty comprehending what you think: you think used six-year-old hardware is a good substitute for a modern chip.

Which it is, but only if you're in a tough economic situation.

If the AMD kids weren't autistic computer illiterates they'd know massive multithreading isn't the job of the CPU but of the GPU:

X99 does not support more than 1 socket; C60x supports up to 4 sockets.
Currently Ryzen is a consumer-only product, so it makes sense for it to be single-socket.

>It's simple, direct me to the post with the IOMMU issue.
>I have no difficulty comprehending what you think: you think used six-year-old hardware is a good substitute for a modern chip.
>and/or
>still running with this
Either you're too stupid to comprehend what was actually said or you're being disingenuous. Maybe both.

Just going to ignore you now

I think the student's answer was more reasonable than the teacher's answer.
It's definitely possible Marty's pizza was bigger, and that could definitely account for him having eaten more

Hell, the question is asking how it's possible, not whether or not it's possible and why
I think the student's answer was a lot better
In fact I think that teacher shouldn't even be teaching if they can't see how that answer is not only clever but absolutely correct

kid's answer was the only valid answer desu
teacher's answer is obviously retarded, that's the joke
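
It even checks out with actual numbers. Assuming the usual version of the problem in the pic (Marty ate 4/6 of his pizza, Luis ate 5/6 of his; the diameters here are made up since the pic doesn't give any), a quick sanity check:

```python
from math import pi

def eaten_area(diameter, fraction):
    """Square inches of pizza eaten, given diameter and fraction eaten."""
    return fraction * pi * (diameter / 2) ** 2

# Hypothetical sizes: a 16" pizza for Marty, a 12" pizza for Luis.
marty = eaten_area(16, 4 / 6)  # ~134.0 sq in
luis = eaten_area(12, 5 / 6)   # ~94.2 sq in

assert marty > luis  # 4/6 of a bigger pizza beats 5/6 of a smaller one
```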

You posted a 20-minute youtube video? Mind telling me WHERE in that video the IOMMU information is?


Yeah, I'm too stupid to comprehend that you're arguing used hardware is better, it's only better if you have money issues.

Better call amazon and tell them they can dump their 80000 CPU nodes and replace them with GPUs, since GPUs are real good at managing storage and hosting.

American education system, everyone.

I expected more as well, but I can't complain as they didn't market it to be a gamer chip. I disagree on "bad" gaming performance, but I'm tired of arguing with people on Sup Forums.

I will probably get an 1800x in a few months to use as a home workstation, as mentioned.

I believe laptops and pure gaming/entertainment rigs should stick to i5s for now

we're talking about the 1800X

There aren't many CPU-heavy tasks you can put a GPU to work on, and even then you're likely looking at lower precision than CPUs.
Not to mention, if you don't have OpenCL/CUDA folks who're on a 7x higher payroll than regular C++ programmers, your GPU is just standing there looking pretty.

In most countries, teachers are those who could accomplish nothing with life and barely passed high school.

Yes, the 1800X is a CPU, and it's quite fine for doing work at a low cost.
Are dual-socket Xeon boards better? Yes, but they also cost way more than even the 6950X, which is already obscene in price.

>that pic

I'm not that guy, but what are you getting at?

The first post suggested that a GPU should be used for large-scale multi-threaded work over a CPU.

The reply was a sarcastic remark stating GPUs are good at handling storage and hosting (they're not).

What does mentioning the 1800x have to do with this chain?

Replying to the wrong person?

It supports ECC RAM
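
Worth noting that "supports ECC" and "ECC actually running" are different things. On Linux you can check whether the kernel's EDAC driver picked up a memory controller; a minimal sketch (assumes sysfs; prints 0 if ECC reporting isn't active):

```shell
#!/bin/sh
# Count memory controllers registered with EDAC; a nonzero count means
# the kernel is actively doing ECC error reporting on this box.
ls /sys/devices/system/edac/mc/ 2>/dev/null | grep -c '^mc'
```

You still need a board and BIOS that wire ECC up, not just the CPU's support for it.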

I highly doubt you'll have money for a 2x 32-core Naples motherboard; maybe the 16-core Snowy Owl platform will be better for you?

>we're talking about the 1800X
This is called context. It defines the scope of the discussion.
>Better call amazon and tell them they can dump their 80000 CPU nodes and replace them with GPUs, since GPUs are real good at managing storage and hosting.
The poster was suggesting that GPUs are not suitable for Amazon, which is obvious. The problem with this statement is the fact that the person asking about GPUs presented his case within the context of an 1800X discussion. The 1800X is also not suitable for Amazon's use case here.

I'm looking forward to watching you move the goalposts as most people here tend to do.

OP's pic shows that the student was thinking outside the box. He should be applauded. I bet he goes on to be a billionaire.

D. Trump

>>IOMMU groupings place PCIe slots in with SATA and everything else, preventing you from passing through a GPU
I use a Piledriver system and it works for me(tm) without any patches

>I use a Piledriver system and it works for me(tm) without any patches
How is that relevant?

your pic triggered me.

>>IOMMU groupings place PCIe slots in with SATA and everything else, preventing you from passing through a GPU
Can I get a source? I'm not finding anything on Google.

>IOMMU groupings place PCIe slots in with SATA

>google several variations of this
>all relevant results end up on Sup Forums

Really makes you think.

>new product which almost nobody has their hands on
>being a monolingual brainlet
>getting your almonds activated

foros.3dgames.com.ar/threads/914551-thread-oficial-amd-zen?s=5a92fe368c8825bc07df44e91230cc47&p=21188900&viewfull=1#post21188900

It still highlighted the difference between a CPU and a GPU when it comes to parallel workloads, which is the point of the exaggerated sarcastic remark about Amazon.

>It still highlighted the difference between a CPU and a GPU when it comes to parallel workloads, which is the point of the exaggerated sarcastic remark about Amazon.
But the point is invalidated by the fact that this person (you?) completely ignored the scope of the discussion.

>Party A suggests that object B is more suitable for purpose A and seeks a rebuttal
>Party B responds by suggesting that object B cannot fulfill the requirements of purpose B

No, the first post wasn't talking about the 1800X. He said that GPUs are more fit for heavy multi-threaded work.

The thread is about the 1800X, no debating that, but this discussion is separate. Our OP is claiming that GPUs are more fit in this "heavy multi-threaded" scenario, which they're not.

The scenario laid out by the reply is certainly better for a CPU, regardless of whether it's Intel or AMD.

Neither were talking about the 1800X. People are capable of having a separate discussion in a single thread.

I look forward to you making shit up and attempting to assert that there was more implied when the context was pretty straightforward (that our OP believes that multi-threading is only for GPUs)

>Yeah, I'm too stupid to comprehend that you're arguing used hardware is better, it's only better if you have money issues.

respectfully disagree
buying off-lease hardware is a basic tenet of many tech savvy people who do more than game or whatever, it's a bit shortsighted (and needlessly rude) to say that this is about being poor rather than making a sensible purchase
>t. sysadmin; there's a good chance that I make as much or more than you, especially when COL is taken into account

And here we witness an Intel shill so incredibly butthurt by AMD's amazing new OPTIONS that he memes used Xeon DP as an alternative workstation, knowing full well that it won't even compete with Ryzen on SC perf (which is still needed on workstations for parts of their workload), AND will offer vastly inferior energy efficiency.

Since when did Intel shills think energy efficiency didn't matter?

>trusting any kind of tech site from Niggertina

Person A didn't seek a rebuttal and there was no purpose B suggested.

Person A was spewing incorrect bullshit and person B called him out on it

Since Zen is about 40% higher perf/watt than Broadwell at higher clocks, they don't want to talk about it.
muhgyms

I want to punch that teacher directly in the face for being so stupid.

A fucking child can think better than her. Yes, that must be a woman.

>No, the first post wasn't talking about the 1800X. He said that GPUs are more fit for heavy multi-threaded work.
>more fit
More fit than what? This is a comparative statement. In this context, we're talking about the 1800X. He might be wrong on an objective basis or in certain circumstances, but in this context, he's addressing the notion that the 1800X's main value comes from its relatively large number of cores and threads relative to its price point and potential use cases (it's a consumer CPU). You can't simply ignore the entire context of the thread and thus ignore most of the relevant parameters just to make a counter-point that has little or no relevance. I mean, you can, but doing so is logically flawed.

Tell me the timestamp where they talk about the issue; I'm not going through a 20-minute video.

You're probably an autist that doesn't get the point of exaggeration and believes that every single analogy must be analyzed with the utmost scrutiny for the analogy to hold up.
The point was that CPUs and GPUs are optimized for different workloads. You can't argue about it being "within the context of an 1800x" when you can't even get the most basic facts right.

all they mention is what has been said in this thread a few times already, there is nothing more to discuss beyond that. save yourself some time

>You're probably an autist that doesn't get the point of exaggeration and believes that every single analogy must be analyzed with the utmost scrutiny for the analogy to hold up.

You're making a stupid non-point while being a smug jackass and you got called out on it so you resorted to the autist meme. Being less intelligent than the other guy isn't an excuse. You're still wrong. People above were right to ignore you.
>Most of the shitposting ITT is just one guy

Point me to it, I'm watching a diagram of the connectivity and I'm seeing nothing there.

And this is what happens. While Intel faggots on this board retreat into the exact same arguments faggots used to make, real people everywhere else will buy AMD when it makes sense for them to, which will be in most cases when building workstations, and in some cases when building high-end home productivity PCs.

Then R5 and R3 come out, and most people just get AMD for the next year until Intel can even compete.

you've been given the info, multiple sources reference it, this is a first party source, i'm not going to do your homework for you. either watch the whole vid or give up

>So personally affected by the insult that he only replies to the insult without addressing the argument that follows it
The point was, and still is, that GPUs and CPUs are meant for different workloads, even if we're talking about parallel workloads. There's no need to bring up the 1800x when we're talking about this basic fact. That was the point of the exaggeration, to show that the CPU and GPU are meant for different workloads, no additional information concerning the 1800x is needed to prove this point.

Than CPUs. He was including Intel chips. It's like you can't read. He even said CPU. He didn't say Intel, AMD, or any other brand or model. He said GPU > CPU for a type of task. Nothing more, nothing less.

A logical flaw would be making an incorrect assumption when OP was explicit about saying CPU and not AMD 1800X. If he meant something else then he should have said something else.

He made a statement. He is wrong. That's all there is to it. He hasn't stepped up to clarify either. In fact he is copy-pasting that reply in other threads as well

I see, so you don't know, you could have posted the time 10 times already instead of making excuses in your post.

>The point was, and still is, that GPUs and CPUs are meant for different workloads, even if we're talking about parallel workloads
context of the thread: AMD offering, 1800X
some relevant posts:
you can't just selectively ignore context to make a counterpoint which isn't valid in the context within which the original statement was made, that is a fallacy
brainlets pls go

>I see, so you don't know, you could have posted the time 10 times already instead of making excuses in your post.
you could have watched the entire video by now but instead you ask me to load up a youtube vid and seek around for you, how about no, fuck off retard, i don't give a shit about you or your ignorant opinions and neither does anyone else

Nope, that's a different person. I don't believe in insulting people while arguing.

There are just multiple people realizing you aren't right

s/you were/he was
It doesn't matter.

>25 IPs
>+50 posts
>still 25 IPs
>same 2 to 4 people posting
>turns into semantic bullshit and ego battle rather than discussion of hardware
k I'm out enjoy your thread

No, this is a completely separate discussion. None of us are talking about the same thing as them.

Just because someone is in a room full of people talking about hentai doesn't mean they are talking about hentai. You, me, and the others telling you you're wrong aren't talking about the 1800X. You keep trying to defend our dumbass OP, when all we said was that his statement was factually incorrect

You're right. It doesn't matter.

Our OP made shit up. He was called out for it. Then you defended him and made shit up

Watch Apple refresh their Mac Pros with dual Ryzen 1800Xs. Calling it first. Maybe Apple was waiting all along for Zen like everyone else.

but Marty's pizza could still be bigger than Luis's pizza

Not seeing anything here. PCIe and SATA get their own lanes, meaning the IOMMU (PCIe) won't interfere with SATA

Yes, but it doesn't need the 1800x as context to highlight the basic fact that GPUs and CPUs are meant for different workloads, which is the point, made in an exaggerated manner. That is a BASIC fact that doesn't need the performance of an 1800x to explain. Regardless of whether we're talking about Intel or AMD CPUs, the fact still remains that CPUs and GPUs are meant for different workloads. That was the fucking point, you autistic retard. I don't fucking care if the context is calculus or trigonometry if you can't even get your fucking algebra right.

None of these threads about ryzen are ever any good

>IOMMU groupings place PCIe slots in with SATA and everything else, preventing you from passing through a GPU
Source?

So can I run a M.2 SSD, a GPU and still have IOMMU?
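
Depends entirely on how the board carves up the groups, and rather than arguing over a 20-minute video, anyone with a board in hand can just dump them. A minimal sketch for Linux (assumes sysfs and lspci; whether your M.2, GPU, and SATA land in separate groups is exactly what this shows):

```shell
#!/bin/sh
# Print every IOMMU group and the PCI devices inside it. For clean VFIO
# GPU passthrough, the GPU (and its HDMI audio function) should sit in a
# group of its own, not lumped in with SATA/USB/NIC.
for group in /sys/kernel/iommu_groups/*; do
    [ -d "$group" ] || continue
    echo "IOMMU group ${group##*/}:"
    for dev in "$group"/devices/*; do
        lspci -nns "${dev##*/}"
    done
done
```

If the kernel was booted without amd_iommu=on / intel_iommu=on, the directory will just be empty and you'll get no output.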

>just as the 6950X isn't the Intel chip that fills that purpose
i7s aren't workstation parts, you retard. Pic related is my workstation. Keep on larping that your $1k shitbox is in the same class.

>dual socket board
>only 4x PCIe slots
>No onboard NICs
>6 CPU power connectors

>Graphic design
Photoshop performance is comparable to the 7700K, which costs less.
>decryption/encryption
Depends; in some cases it loses to the 7700K as well.
>streaming
Done on GPU.

Teachers are always retarded

Not all Photoshop or Adobe workloads are single-threaded, anon.

>only 16 PCIe lanes
>only dual channel memory