So I'm actually fairly impressed with the R7: pretty good TDP, and the 16 threads at 3+GHz are nice as well.

I'm thinking of upgrading my severely underpowered media server from an old budget-build Pentium G3258 to a new R7 1700X.

Would this actually be worth doing, or would an i7 on the 1150 socket be a better idea? The most important function of the server is video encoding, so I will need as much CPU power as reasonably possible for around 400 bucks or so.

I really hate the i7 since it's just a memed-out quad core, but it would be cheaper to just buy the chip and swap it. However, if Ryzen really would be that much better, I wouldn't mind dropping an extra hundred bucks on a new mobo for it as well.

Can any of you guys who have one of these CPUs comment on the FFmpeg encode speed? Specifically, the encode rate for 1.5 MB/s 1080p video using libx264?

I've looked around the web, but none of the benchmarks cover the kind of test I want; at best they're encoding a 2-minute 720p clip in HandBrake or some garbage, which isn't even indicative of the same thing. My videos take about 10-20 hours to process now and I'm looking to greatly reduce that number.
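
For reference, something like this is the shape of what I'm running now; the filenames and exact flags here are just placeholders, not my actual script:

import subprocess

# Rough sketch of the current job: a bitrate-targeted 1080p libx264 encode.
# "input.mkv"/"output.mkv" are placeholders; assumes ffmpeg is on the PATH.
subprocess.run([
    "ffmpeg", "-i", "input.mkv",
    "-c:v", "libx264",
    "-preset", "veryslow",              # slow preset, hence the 10-20 hour runtimes
    "-b:v", "12M",                      # ~1.5 MB/s, sized for the slowest LAN link
    "-maxrate", "12M", "-bufsize", "24M",
    "-c:a", "copy",
    "output.mkv",
], check=True)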

Intel users, feel free to chime in as well.

Anything that uses MOAR CORES is going to do better with R5/R7. You don't even need to see specific benchmarks for your workload to know that.

I don't know, you tell me.

>1.5MB/s
Can you please not do this? Just use 22 or 16 CRF and stop pulling arbitrary bitrates out of your ass.

Also use the fast preset. I've done tons of encoding tests and that one gives you the best balance of encoding speed and compression efficiency.
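
Something along these lines, to be concrete (just a sketch with made-up filenames, assuming ffmpeg built with libx264):

import subprocess

# Constant-quality encode: CRF instead of a fixed bitrate, with the fast preset.
# Lower CRF = higher quality and a bigger file; 16 is close to transparent at 1080p.
subprocess.run([
    "ffmpeg", "-i", "input.mkv",
    "-c:v", "libx264",
    "-preset", "fast",
    "-crf", "22",
    "-c:a", "copy",
    "output.mkv",
], check=True)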

>x265 Encoding Benchmark
You fucking retard. That encoder is still unoptimized as fuck. Just show me x264 results

If that's accurate, that's actually pretty good. How did AMD send Intel into the trash so fast? It's surreal.

Also, one more question: do you think the difference between the R7 1700 and the R7 1700X is actually consequential?

I have to do that for my LAN; otherwise the files would be lossless rips. The slowest link on my Wi-Fi is just a hair over 1.5 MB/s on average.

It's not like I'm seeding this stuff to people.
Besides, the veryslow preset makes the video perceptually lossless at that rate anyway.

>If that's accurate, that's actually pretty good. How did AMD send Intel into the trash so fast? It's surreal.
It wasn't fast at all; people forget that Zen had been in the works for at least 5 years.

Yeah, but why the hell is Intel just releasing Skylake 5 times? Are they serious?
I refuse to believe that Intel knew nothing about the performance of AMD's new chips.

>media server
What do you need 16 threads for?

>The most important function of the server is video encoding, so I will need as much CPU power as reasonably possible
Can you read?

They didn't even start on their new arch until last year. It should be out by now.

Where have you been for the past 5 years? HEVC now has 50% improved compression efficiency over H264.

See:
builds.x265.eu

Jim Keller

Nah, just get the 1700 and OC it to 3.8GHz. Most report stable temps in Prime95 with the stock cooler at that frequency.

1.5MB/s or 12Mbps is a fuckload of bandwidth m80. CRF 22 would work perfectly, hell even CRF 16 on 1080p movies would work on that.

Finally, yeah, veryslow is better than the fast preset, but only by like 10%. Just don't.

I like H265 but I am stuck on H264 since a lot of my smaller integrated devices only have hardware H264 decoding.

Does the power consumption go above the 1700X if you overclock it that high? If so, I'd rather just pay the difference for the 1700X since it's gonna be running like 24/7.

>1.5MB/s or 12Mbps is a fuckload of bandwidth m80
I'm encoding raw Blu-rays down to like 10GB; it's fine, I've got tons of disk space. I'm picky about picture quality, and this is practically lossless as far as I can tell, except it's a third of the file size. Most raw Blu-rays have a bitrate of like ~4MB/s.

I don't mind waiting on it because I only have to do it once per file, and I have like 5 PCs I can use to encode along with the server, so most of it is done overnight. If I were worried about speed I would just use NVENC or something, but that looks like absolute shit, so I'd rather not; I can wait the extra few hours.
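
If I ever did cave on speed it would look something like this; just a sketch with made-up filenames, and it assumes an NVIDIA card plus an ffmpeg build with h264_nvenc:

import subprocess

# Hardware H.264 encode via NVENC: much faster than libx264, worse quality per bit.
# Needs an NVIDIA GPU and an ffmpeg build with the h264_nvenc encoder enabled.
subprocess.run([
    "ffmpeg", "-i", "input.mkv",
    "-c:v", "h264_nvenc",
    "-b:v", "12M",                      # same ~1.5 MB/s target as the software encode
    "-c:a", "copy",
    "output_nvenc.mkv",
], check=True)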

>I like H265 but I am stuck on H264 since a lot of my smaller integrated devices only have hardware H264 decoding.
Well that fucking blows. Even $100 android phones have HEVC hw decoding now.

>Does the power consumption go above the 1700X if you overclock it that high? If so, I'd rather just pay the difference for the 1700X since it's gonna be running like 24/7.
Really depends on how much you can undervolt them, which is very unpredictable.

>I'm encoding raw Blu-rays down to like 10GB; it's fine, I've got tons of disk space. I'm picky about picture quality, and this is practically lossless as far as I can tell, except it's a third of the file size. Most raw Blu-rays have a bitrate of like ~4MB/s.
CRF 16 is pretty much 90% lossless to human eyes. You should at least try it out. An arbitrary bitrate only ensures a steady bit rate, but CRF ensures consistent quality across ALL source types and resolutions.

>I don't mind waiting on it because I only have to do it once per file, and I have like 5 PCs I can use to encode along with the server, so most of it is done overnight. If I were worried about speed I would just use NVENC or something, but that looks like absolute shit, so I'd rather not; I can wait the extra few hours.
For 10% better compression efficiency?

>For 10% better compression efficiency?
10% of 15GB is 1.5GB; that adds up when you've got hundreds of movies and shows to store.

Desu, all this "oh, these have trouble streaming" is a load of shit; my 6700K at 4.6GHz (1.35V) can handle a 3.3MB bitrate just fine with a ton of stuff in the background.

Ryzen needs:

>special fast ram
>special mobo to support your special fast ram
>special overclock that caps at 4ghz in order to perform
>special aio cooling for your overclock because anything past 1.35v turns into a housefire

And after all this, it still loses in single threaded tasks and is only around 20% faster in multi.

Wait for ryzen 2 or 3

I have never seen so much bullshit piled up into one post at one time.

I get 1-3GB for 2-hour-long 720p movies with CRF 22 HEVC using the fast preset. Your movies have way too much bloat, man.

But still, yeah, 10% is a lot over a large collection, but is it really worth the wait? I mean, what does a 500GB HDD cost nowadays, $20?
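
For reference, the kind of command I mean; sketch only, filenames are placeholders, and it assumes an ffmpeg build with libx265:

import subprocess

# 720p HEVC at constant quality: downscale, CRF 22, fast preset.
# scale=-2:720 keeps the aspect ratio and an even width for the encoder.
subprocess.run([
    "ffmpeg", "-i", "input.mkv",
    "-vf", "scale=-2:720",
    "-c:v", "libx265",
    "-preset", "fast",
    "-crf", "22",
    "-c:a", "copy",
    "output_720p.mkv",
], check=True)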

We're not talking about playing video games in your mom's basement all day long, timmy.

literally, all are facts.

Look man, it doesn't really matter; I'm not worried about the intricacies of x264 encoding flags. I can worry about that later.

Right now I'm more interested in the hardware, because no matter what, a Pentium is not going to cut it anymore, so all of this is moot until I replace it.

Get the 1700.
3.5GHz is the sweet spot.
3.7GHz is the second sweet spot.
After that it will suck power.

...

>literally, all are facts.
Working hard for those intel ©®™ bucks©®™ are we, timmy?

this has to be bait

or maximo autismo

Whatever lets you sleep at night. These are just facts.

We've had Ryzen-tier performance for almost 3 years, and the new Ryzens are actually not that cheap.
FX at least was cheap.

> I really hate the i7 since it's just a memed-out quad core
Memed out? Modern i7s are nice CPUs. As is Ryzen.

You'll be able to do this with either, so buy either.

doki, pls stop.

Stop posting please

Except Ryzen has twice as many threads and more L3.

i7s are basically a ripoff and Intel knows it; they just don't care. They'd rather keep shoving bigger GPUs into the package instead of adding more cores or hardware to increase performance.

I have a house full of Intel and even I know this.

literally, can't prove me wrong.
>wow ryzen matches this almost 3 year old xeon

>special overclock that caps at 4ghz in order to perform
>special aio cooling for your overclock because anything past 1.35v turns into a housefire
What is with Intelfags and all the projection lately?

Hmm, well then yeah man, the R7 1700 is probably a good idea. Yes, the 1700X could undervolt better, but there's just no way to know for sure.

Also, I would highly advise you to upgrade your shit to have HEVC HW decoders. Having 4TB instead of 8TB worth of video files is a huge difference.

Doki uses like 28 CRF H264 8-bit video. Fuck them.

>Also, I would highly advise you to upgrade your shit to have HEVC HW decoders
In good time I will, as they break or make me rage enough to break them.

>Doki uses like 28 CRF H264 8-bit video. Fuck them.
Yea, everybody knows YIFY is the best anyway.

delid this

doesn't matter if the 7700k is a housefire.

Ryzen just doesn't overclock, due to architectural reasons, and once you do, all that efficiency goes in the trash.

>YIFY is the best anyway.
They managed to somehow be worse than Doki. Literally just 2-pass ABR 720p movies at ~750kbps (~1,500kbps for 1080p). They don't even use CRF.
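
For anyone who hasn't seen it, 2-pass ABR at that bitrate amounts to roughly this; a sketch with made-up filenames, not their actual pipeline:

import os
import subprocess

# Two-pass average-bitrate encode: pass 1 analyzes, pass 2 hits the target bitrate.
# ~750 kbps for 720p is the kind of starved target being criticized above.
common = ["ffmpeg", "-y", "-i", "input.mkv", "-c:v", "libx264", "-b:v", "750k"]
subprocess.run(common + ["-pass", "1", "-an", "-f", "null", os.devnull], check=True)
subprocess.run(common + ["-pass", "2", "-c:a", "aac", "output.mp4"], check=True)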

It doesn't overclock due to process limitations, you brainlet.

Dude STFU, you're giving us all a headache.

> Except Ryzen has twice as many threads and more L3.
That's cool and all, but it doesn't nearly translate into twice as much performance.

The overall deal between Intel and AMD is ~the same for the same price.

> They'd rather keep shoving bigger GPUs into the package instead of adding more cores or hardware to increase performance.
Sure, there was nothing to compete on but the GPU before AMD finally released useful chips in the midrange/high end again after years. Intel certainly acted accordingly.

But you're not going to change US law so that these monopoly games stop with US companies, right? Be prepared for every US company to act that way.

It was just a prank bro.

I was joking

>The overall deal between Intel and AMD is ~the same for the same price.

lol.

You don't really need a super fast CPU for a LAN streaming server, as you can just trade off encoding time for bitrate.

Oh, it kinda sounded condescending lol.

But seriously though, fuck YIFY. I would be somewhat okay with their releases if they at least used a CRF of 28 across the board. Every time you play a YIFY rip you're gambling on either getting fair-looking video or playing glaucoma simulator.

woah user where'd you get episode 10 from

OP wants to do a 1-time encoding to share on many devices and is worried about file sizes.

1600X, 1080p crf17 veryslow

then why did he specify a bitrate?

Read the thread jesus.

Thanks, that's pretty helpful

because he dumb lol

Stop lying.

Stop lying.

Stop lying.

Stop lying.

Stop lying.

Fuck this, I'm just gonna burn all my shit and collect stamps or something.

t. OP

>no iGPU
Am I the only one who is saddened by this?
I know, I know,
more power for the CPU,
but hell, I loved clean builds with Intel,
and I don't need a gaming or 6-monitor GPU.

It was amazing: running two monitors on an i5 6600K, my whole fucking PC with 6TB of HDD storage and 500GB of SSD storage consumed 35W at idle.

Now I have to put power-hungry dedicated GPUs in my case that can't even downclock when running two monitors :[

If you're that poor then use a G4560. It has an iGPU.

It's called an APU when AMD does it, kiddo.

Ryzen APUs are not on the market yet.

I'm waiting on Ryzen+Vega APUs so I can build the ultimate HTPC.

Noice.