Ok I have 8/16 ready to use now what? How do I have fun with this?

just imagine how many exotic condenser toddlercon watersports visual novels you could simultaneously play with that bad boy.

well, you can certainly have fun with it if you like 3D modeling and animation.

>>60597103
>i spent too much money on equipment i dont need or am unable to utilise the power of
what do 4ch@ng?

Now leave your browser and other programs open while you play gayms and watch Intel corelets commit sudoku.

I'd buy them for VMs and multiple browsers.

Dunno about you.

Encode your whole anime folder to HEVC with Handbrake, host a modded Minecraft server, install Gentoo and enjoy Iridium's low compile times.
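
If you actually want to script that Handbrake batch job, here's a rough sketch in Python. The preset name and folder paths are placeholders, not gospel; list real presets with `HandBrakeCLI --preset-list`.

```python
import os
import subprocess

def build_encode_cmd(src, dst, preset="H.265 MKV 1080p30"):
    # Build one HandBrakeCLI invocation; preset name is an example,
    # check `HandBrakeCLI --preset-list` for what your build ships.
    return ["HandBrakeCLI", "-i", src, "-o", dst, "--preset", preset]

def encode_folder(folder, out_folder, dry_run=True):
    # Re-encode every .mkv in `folder`. x265 already scales across
    # all 16 threads by itself, so run the files one at a time.
    cmds = []
    for name in sorted(os.listdir(folder)):
        if not name.endswith(".mkv"):
            continue
        cmd = build_encode_cmd(os.path.join(folder, name),
                               os.path.join(out_folder, name))
        cmds.append(cmd)
        if not dry_run:
            subprocess.run(cmd, check=True)
    return cmds
```

Encoding files sequentially is usually faster overall than running several encodes in parallel, since one x265 job will happily eat the whole CPU on its own.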

Oh god, the 1700 is at $300.59, and I was thinking that if it went below $300 I'd pick it up. But I've heard the 7nm version is gonna be god damn amazing because of the IBM process made for high power. I'm still running an 8350, but I'm pretty sure my CPU would keep me going a couple more years unless I legitimately wanted to start doing video editing on a regular basis.

Even the 1600X would be a huge upgrade over that.

While I realise that, I was thinking of doing some video editing / compression for YouTube stuff, and I assume the extra $50 gets me a processor that would last me a while longer. Since I'm running a Noctua D14 (or whatever their ~$80 cooler with 2 fans was a few years back), I could get near 4GHz, and it seems many of the RAM compatibility issues have been solved.

anyone have some good data (in English) on single rank at ~3200 vs dual rank at 2400 (which is the highest I'm finding dual rank at)?

you realize AM4 is a multi-generation platform, yeah?
you can buy a cheap Ryzen now and get the 7nm version next year

Dual rank is EVERYTHING without tight timings.

newegg.com/Product/Product.aspx?Item=N82E16820232181

For example that one; for this specific kit I dunno how it overclocks, but I really doubt you can't get 3600 with 16-17 on all timings.

run every program at once

or run a gayme and obs at the same time and get famous

>
Mah nigga

you don't

Run some real-world applications that you actually use and some games that you actually play. Realise that the poor memory latency and other design deficiencies make for disappointing real-world performance. Realise that you should probably just play it safe and go Intel for your next build.

>Realise that the poor memory latency and other design deficiencies make for disappointing real-world performance
This is a load of shit. The main thing memory latency affects is "muh gaymes".

>Realise that you should probably just play it safe and go Intel for your next build.
Their 8 cores cost twice as much and don't perform any better.

>well, you can certainly have fun with it if you like 3D modeling and animation.

enjoy the placebo, my man.

3D modeling has almost completely shifted to GPU. I don't even use my CPU when rendering iray. It actually slows it down.

Well, I mean... why would you need more than a 1600 unless you're doing video-editing/VMs/something productive on a regular basis? Yeah, CPUs that will be released more than a year from now will be better. That's to be expected.

Most people are well aware that an individual can buy a product now, and then purchase another product in the future.

>This is a load of shit. The main thing memory latency affects is "muh gaymes"
And what do Sup Forumsentoomen do with their computers? Game, and brag about their freshman-year programming feats at schools they promptly dropped out of.

How much is intel paying you? Because if you're doing this for free, you're retarded and you should probably kill yourself.

>Anime
>HEVC
>handbrake

>video-editing
even video editing has cuda-core utilization

VMs are literally the only reason to go AMD

Stream Slime Rancher

>Nvidia graph syndrome
plz no.

You know there's a reason most people are recommending the 6-core 1600/1600X, right?

Yeah, because they can't afford the CPU they really want, so they want no one to have it

GPU video encoding is never as good quality as the best x86 encoders are. Much faster, sure, but not usable for a lot of professional content.

god NWN is shit on modern machines
it's kind of ironic that 1 ran like butter back in the day and runs like shit now, while 2 was a stutterfest on the beefiest machines at release and you barely need to throw anything of note at it today

MOAR BIBELINES!
MOAR BINGBUS!
MOAR NIGGAHURTZ!
MOAR TDP!
MOAR GIGGAWATZ!

Unless you use an Intel chipset, which changes every generational update of the same CPU.

>tfw I bought a 1700 and now I'm actually tempted to install Gentoo just to see how long everything takes

>Just buy a bentium and gib the rest of your moneyz to Nvidia, goy.

No shit. iray is a GPU render engine, what did you expect?
3D is far from being GPU-only. All of the best renderers are still CPU; Max seldom uses the GPU, Blender too, and other areas like photogrammetry are almost completely CPU.

youtube.com/watch?v=EhwOZyfjon0

>buying the 1600X

On a more serious note, the Zen architecture is a monumental achievement that completely BTFOs Intel's Core arch in every way. The only thing holding back the first generation is the process node. As soon as Zen 2 drops on 7nm finFET I will buy AMD and never look at Intel again.

VMs run like fucking hot garbage on ryzen.

Vms my dude. Rock qubes.

Why?

That issue is about to be fixed in the next update, which should be released soon

Not anymore, you'll have to find another straw to grasp at, you enormous faggot :^)

a lot of impatient faggots who don't understand that Zen is a literally brand-new, first-generation architecture and that AMD is still working out all of the bugs in the microcode and the BIOS. This is the equivalent of a first-generation Core i7 chip, so you need to be patient for every single issue to eventually get resolved.

Also, AMD just updated its DDR4 compatibility to 4000MHz. And we know from the few benchmarks run with a 4GHz 1800X and 3600MHz RAM that gaming performance is hot on the heels of an i7-7700K OC'd to 4.9GHz

Once again, Ryzen performs amazingly well as a first-gen design on an inferior process node, up against Intel's fully matured and smaller current 14nm process.

If they have this kind of performance as it stands, then when gen 2 Zen comes out with more IPC gains and better DDR4 support, along with the higher efficiency and clockspeeds of 7nm finFET, Intel should be shitting bricks right now. Especially since it's come out that their 10nm process is being delayed due to horrible yields. They are likely using 14nm yet again for the next series of desktop chips, which is really bad for them. Kaby Lake was already a slap in the face of their customers. The IPC is actually 1-2% worse than Skylake, the thermals are worse than Skylake as well, and all you get for that huge price tag is a processor with more DRM bullshit which everyone hates and 100-200MHz more clockspeed at stock and OC.

Shitpost on a Burmese underwater carpet weaving forum.

Intelfags are literally all just desperate buyer's-remorse fags trying to secure their own purchase decisions; that's their only motivation for shitposting. Just think about how hilarious that is for a second. At least an AMDfag can claim they care about market balance

All I hear is

>AMD released a shit product
>b-but they'll p-patch it
>w-wait for version 2.0 it'll be better

Intel released a shit product back in the day.

It was called the "core i series" and it was riddled with problems.

They've been using it for what, 10 years now just making tiny adjustments here and there?
They still have issues.

AMD releases a completely new arch that has 0 optimizations from devs, and motherboard manufacturers dropped the ball because they figured Zen would flop; some manufacturer actually stated they were more worried about their supply of z210 mobos or whatever shit Intel has.

God when intel flops out a new arch because they HAVE to if they want to compete I'm gonna laugh at all the intel shills bitching about it being new and thats why it has issues.

LMAO yeah that's why Intel is panic releasing DOALake-X and InfernoLake-X

>Intel
>Just wait 'till 2020, guise.

Assuming an annual release AMD will be able to cockslap them with 3 generations before they can produce anything new.

They are objectively fucked and may have to trash current plans and invest into R&D instead of silly diversity quotas.

What

install gentoo (serious answer)

They will either have to make an even smaller and more power-efficient design with better multi-threading than AMD, or they will have to elongate the pipeline and make a clockspeed monster pushing 6-7GHz and beyond while somehow not turning into a fucking housefires meme.

Based on how well Zen's single-thread performance scales with faster memory speeds, it's kinda silly to push the more-GHz meme at this point.

When a 4GHz Zen with 3600MHz memory is almost at parity with your 4.9GHz chip at 3200-3600MHz memory, you need to rethink your chip design at a fundamental level.

Some of it is their better caching and prefetch capabilities, but a decent chunk of the performance is simply AMD having a better SMT design than Intel, which has done almost zero innovation on SMT since Sandy Bridge.

So you've got an Intel core which pushes higher single-thread, and a Zen core which pushes higher multi-thread alongside other Zen cores. A single Intel core still speeds ahead, but when the application needs 4-6-8 cores with hyperthreading, suddenly AMD closes the gap while being more power efficient.

> with better multi-threading than AMD
They'll have to drop the Ringbus design for that.

>elongate the pipeline and make a clockspeed monster pushing 6-7GHz and beyond while somehow not turning into a fucking housefires meme.
Ha, they tried going that route once. It's called Netburst. Its cancelled successors, Tejas and Jayhawk, were supposed to be precisely that, and they were forced to drop them because 2.8GHz, 50-stage chips were drawing more power than an overclocked Pentium D Emergency Edition.

>AMD having a better SMT design than Intel which has done almost zero innovation on SMT since Sandy Bridge
AMD's core is 6-wide and has separated the integer and floating point pipelines into 2 separate blocks each with their own dedicated scheduler. Intel's core is 4-wide(? not sure), INT and FP pipelines all feed from a single scheduler.
AMD gets better SMT scaling simply because it can hurl more resources at the threads at any given time.

>Kaby Lake was already a slap in the face of their customers. The IPC is actually 1-2% worse than Skylake
Source: my ass.

Play games, get the same performance as the Intel and slightly less, or you can encode some stuff and encode faster than the Intel or you can make another CPU thread where I feed bait biters.

No benchmarks with DDR4 4000MHz on X370 yet.

Enjoy better performance and smoother gayming than Intlel's housefires.

you now fell for the pajeet shill meme.
time to shitpost from your new poo CPU.

other than that, no one really knows.

> checked benchmarks, reviews
> he knows what he is talking about

> call him retarded
Yes, this is Sup Forums.

Run games on a regular 60Hz monitor, where my GPU is the limit.
Run programs that eat 10-20% of the CPU power.
Run a video in the background as a podcast that eats 1-3%.
Run a game and realize that I have the power of 2 computers in one: doing all that shit in the background AND playing a game at the same time doesn't touch the game at all.

Once my 1.0.0.6a BIOS comes out I'll get a minimum 3% boost across all applications (there were no non-beta 1.0.0.5 BIOSes), along with a memory uptick of around 1000MHz, and that's if I don't dick around with overclocking to further boost CPU performance.

and AMD's platform is doing this in under 4 months, while Intel took over a year to knock the kinks out, at best.

Run 3200+MHz CL14 memory based on Samsung B-die and life is good.

we have 3600 ones though, but no one believes them because they seem too good to be true, and 3600 is near impossible to get outside of a few very specific builds.

>elongate the pipeline and make a clockspeed monster pushing 6-7GHz and beyond while somehow not turning into a fucking housefires meme.
Tried time and again.
This always fails.

I've built a 1500X, GTX 960, MSI B350M and 16GB of Corsair Vengeance LPX 2666MHz RAM, and it runs like shit. FPS in games drops by 50 and comes back in a pulsing fashion, every ~1 second. It's unplayable. I've tried everything and can't seem to get it to work.

RAM also won't run at 2666MHz. I'm tired of waiting for shit to get fixed in BIOS.

topkek

AMD fan here and it's true

but I'm still buying AMD over Intel

fuck intel

run lots of shit in background
switch between programs momentarily

You should have enough ram for this.

...

Related question: I read the other day that the Ryzen 5 1400 boxed fans are supposed to be "extremely quiet" (at least at idle). Any Ryzen owners here who can confirm or deny this?

>Want to look up comparisons of the 1600 vs 1700 in things like Premiere and Lightroom
>Every single benchmark video is for gaymen
Why can't gaming kiddies just stick to Intel? And why the fuck are there so many benchmark videos for CPUs for video games? CPU benchmarks in gaymen only matters if you have a GPU bottleneck which 99% of people don't have unless the only game you play is stuff like CSGO at 1024x768

>CPU benchmarks in gaymen only matters if you have a GPU bottleneck
1) Try reviewing the shit you wrote.
2) There is no abstract X bottleneck, every game is unique. Unless you are generalizing autist ofc.

That is because reviewing CPUs for serious business doesn't get clicks like gaming reviews do.

>1) Try reviewing the shit you wrote.
Do you know how to read? This is what happens when the GPU is the limiter and not the CPU, it makes no difference in games unless you play at low resolutions

>How do I have fun with this?
Return it

If it gets better with CPU change (Intel->AMD) it is not GPU bottleneck by definition, fucktard.

>960
Found your problem.

Nvidia drivers are shit on Ryzen in specific situations, ask them directly to fix it.

reddit.com/r/Amd/comments/64r0hj

um, hate to break this to you, but AMD's 4-core non-APUs suck dick, as many games require more than 2 cores, and currently the interconnect is completely dependent on RAM speed.

Then you have the distinct pleasure of Nvidia no longer giving you feature updates, so it's unlikely your 960 will ever perform well on it.

Here's hoping it's more AMD's problem than Nvidia's, and the BIOS helps you.

Also, just in case anyone asks: the current 4-cores are salvaged 8-cores or gimped 8-cores; judging by yields, more likely gimped. The APU 4-cores will only have 1 CCX, so there is no cross-CCX latency.

R5 1400 is one CCX.

No it's not. All Ryzen chips are two CCXs with varying amounts of cores disabled. It is a huge factor in why AMD can sell them (relatively) so cheap. AMD, as of right now, only actually manufactures a single chip (the 1800X); everything else is cut down from that.

1400 is one CCX whether it is binned or not. That's why it has 8MB cache, not 16.

1500x is two CCX, 1400 is one.

>1500x is two CCX, 1400 is one.

No, the 1400 is two CCXs; it just has more cache disabled. There are no single-CCX Ryzen chips in existence at this point.

You've only been hearing that from Intel fags.
Ryzen is better than anything Intel has to offer at the moment.

Tried to ask yesterday if getting a 1700x from my 6700k is worth it
Was told I was baiting

Need advice
I have two 1080s and I game at 4k. Use a lot of photoshop

ok ill research it

It may not be worth it if you are not feeling limited. If you are: yes, it is worth it.

Or overclock yer Intel.

Not limited at all
Never go above 30 percent usage in gayms
I was told that Ryzen is aimed towards people who haven't built their computers yet

fucking dumbass, every consumer Ryzen CPU has 2 CCXs, and specific CPUs have specific cores disabled among them
1 CCX contains 4 cores and a certain amount of cache

the 1500X is 2+2 (both CCXs enabled) while the 1400 is 4+0 (only 1 CCX enabled); that's the reason for the cache difference

When I asked my insurance company if my i7-7700K was covered by house insurance they increased my premium by over 600%.

>posting easily verifiable wrong shit that has already been refuted
kill yourself

Can anyone prove it one way or the other with a source? I would really love to know

Do you need 5960x for 400 dollars?

Isn't the 1700x better?

It is irrelevant information unless you are coding a program so tightly tied to the hardware that Infinity Fabric latency has to be accounted for.

No it's not you dolt, stop shitposting

Look, just because you don't like the fact that the 1400 is 2+2 doesn't make it any less true. The whole point of Ryzen is to scale, and that is the method AMD has chosen. Now, the Raven Ridge dies will probably be 4+0 due to the inclusion of an iGPU, but as it stands right now all Ryzen chips are made up of two CCXs.

hardocp.com/news/2017/03/16/amd_ryzen_5_processors_core_ccx_allocation

AMD said a while ago that all current CPUs are 2 core complexes. Anyone saying that this one isn't two core complexes needs to back that up with facts.

Any program that uses more than 2 cores is affected by latency between cores. For video games, when the game doesn't use more than 4 cores but is split between 2 core complexes, the impact on frame rate is about 20%. Someone ran a benchmark a while back that was hammering the absolute fuck out of the CPU. He ran it in dual-core mode and swapped the 2nd core until he found ones that were higher and lower performance, indicating one was on the same core complex and another was on the other. The difference between them was somewhere between 15 and 20%, but since the benchmark only put out absolute values it was 20 frames vs 16 frames. Or maybe 20 vs 18, I forget; it's been quite a while since I've looked into this.
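
If anyone wants to repeat that kind of experiment, here's a rough thread ping-pong timer in Python. It measures scheduler wake-up latency rather than raw cache-line latency, and the core numbers in the docstring are assumptions; check your actual topology with `lscpu -e` before pinning anything.

```python
import threading
import time

def pingpong_latency(iters=20000):
    """Average one-way wake-up latency between two threads.

    Run the whole process pinned to two cores, e.g.
    `taskset -c 0,1 python bench.py` (likely same CCX) vs
    `taskset -c 0,4 python bench.py` (likely opposite CCXs),
    and compare the numbers. Core numbering varies by system.
    """
    ping = threading.Event()
    pong = threading.Event()

    def responder():
        for _ in range(iters):
            ping.wait()   # wake up when main thread signals
            ping.clear()
            pong.set()    # signal back

    t = threading.Thread(target=responder)
    t.start()
    start = time.perf_counter()
    for _ in range(iters):
        ping.set()
        pong.wait()
        pong.clear()
    elapsed = time.perf_counter() - start
    t.join()
    return elapsed / (2 * iters)  # seconds per one-way hop
```

Interpreter and scheduler overhead dominate the absolute number, so only the relative difference between the two pinnings is meaningful.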

In terms of absolute performance across everything, Intel is currently better, but not by much.
In terms of compatibility across everything, Intel is better, but by the end of this month or next that gap will probably close to a very minute margin.

Personally I'd rather go AMD right now, so long as you're getting a 6 or an 8 core, but that's just me. You could technically get more peak performance out of an Intel i7 than you would out of the AMD 1700, but the 8-core AMD lets you have so much more shit going on in the background without affecting the program you're currently working in that it's really hard to justify that 20% higher peak performance.

HEDT stuff

Molecular Dynamics, Material Analysis and DFT: LAMMPS, CHARMm, Amber, NAMD, GROMACS, abinit, bigdft, CCP4+COOT, Warp3D

MATLAB/Octave/Maxima/SAGE scripts

multiphysics or FEM/FEA jobs like COMSOL, ANSYS, PETSc-FEM, Code_Aster, Elmer, libmesh+MOOSE, CalculiX, ngsolve, FEniCS

CFD via OpenFOAM, Gerris, Code_Saturne

CAD: NX, FreeCAD, Solidworks, CATIA, Creo, solvespace, OpenSCAD

Modeling: Blender, Maya, Cinema4D, 3DS Max

Photo/Video Editing: Natron, NUKE, Premiere Pro, AfterEffects, Sony Vegas, PS, GIMP, Krita, Inkscape, Illustrator, FireWorks, Darktable

Rendering: Rhino, Handbrake, POVRay, luxrender, ffmpeg

Virtualization: Docker, QEMU/libvirt, Dolphin, VBox, VMWare

Just a few end-user programs, not even including server and router services, compilation/development, VCS/SCM, cryptocurrency, distributed computing, and ofc gayming and multimedia uses. Learn something new and get real work done.
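
And if you just want to watch all 16 threads light up, a minimal Python sketch that fans a CPU-bound job (naive prime counting, purely illustrative) across every logical core:

```python
import multiprocessing as mp

def count_primes(bounds):
    # Naive prime count in [lo, hi); deliberately CPU-bound.
    lo, hi = bounds
    total = 0
    for n in range(max(lo, 2), hi):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            total += 1
    return total

def parallel_prime_count(limit, workers=None):
    # Split [0, limit) into one chunk per logical core
    # (16 on a 1700) and count primes in parallel.
    workers = workers or mp.cpu_count()
    step = -(-limit // workers)  # ceiling division
    chunks = [(i, min(i + step, limit)) for i in range(0, limit, step)]
    with mp.Pool(workers) as pool:
        return sum(pool.map(count_primes, chunks))
```

On Windows/macOS, put the `Pool` call behind an `if __name__ == "__main__":` guard, since those platforms spawn rather than fork worker processes.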

If you're not fucking with us, and I honestly believe you are, going to the 1700 would be a sidegrade for the most part.

I can personally set up my 1700 with 20% CPU load in the background while I start up a game to kill time, and it feels like I'm not even using the CPU to do something fairly intense in the background.

On the topic of Photoshop: it's a heavily single-core program, with some of its components using 2 cores, so your 6700K is going to be better than the 1700 in it. Now, if you want a bunch of shit open at once without it affecting your game at all, then sure, AMD's probably a good option. However, you have an i7 Skylake; upgrading from Skylake would be fucking retarded if you're not completely maxing out the CPU. Ryzen's successor will probably be worthwhile to you, and it may even be worthwhile once the new BIOS update comes out, but just in terms of performance I wouldn't recommend an upgrade.