RAM Performance

How did we go from this....

.... to this?

The games in the first graph were made to work on old-gen consoles, which had at most 512 MB of lower-clocked RAM, while Fallout 4 was made for current-gen consoles, which have 8 GB of GDDR5 RAM. I assume you can connect the dots here.

...

Yes, it turns out buying expensive ricer overclocked gaming RAM is the best gaming performance boost a system can get, and is totally not a scam anymore

Fuck

I know the feel :(

also
>1200 MHz CL17
at least yours wasn't expensive

ddr3-1333 is hilariously slow though.

Well it mostly means that 1333 DDR3 is far slower than what the game was designed for.

According to the benchmarks I've read, 3200 CL15 seems to be the sweet spot. FO4 is the only game that benefits, very slightly, from higher speeds up to 3600; every other game stops scaling in the 2400-3000 range.

hey guys! ooh big ramdisks, huh? alright! welp, see ya later!

I want that. What CPU? I'm getting a 5820K soon and I'm worried about whether it can run 32 GB at 3200 MHz (4 x 8).

What, you mean the slow shitty RAM?

No, 32 GB
16 would be more than enough, but 32 is perfect for experimenting with DDR4 ramdisks

>if a higher amount of data has to be transferred, faster RAM will have an advantage over slower RAM
I see no problem.
If only a few hundred MB have to be transferred, of course the RAM won't be the bottleneck.
But these days games are all open-world games with something like 6 GB of RAM consumption.

I have 16 GB of Corsair DDR3 @ 1600. Is it worth throwing it in the trash and buying 2400? I think my 4770K system is being slightly bottlenecked by memory.

Enjoying my 32 GiB of quad-channel 1600 MHz CL11 over here

>ramdisks
tmpfs > ramdisk, but I guess your shitty OS doesn't have that. I mainly use mine for ARC anyway, which works much better than whatever hacked-together ramdisk garbage winfags like to try to make their VIDYA GAEMS load LIGHTNAN FAST
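
If you want numbers instead of feelings, here's a rough Python sketch of the write-throughput gap. The paths are assumptions: /dev/shm is tmpfs on most Linux distros, and on some distros /tmp is itself tmpfs, so point the disk path at something you know is disk-backed.

import os
import time

TMPFS_PATH = "/dev/shm/bench.bin"   # tmpfs on most Linux distros
DISK_PATH = "/home/user/bench.bin"  # hypothetical disk-backed path; adjust

def write_throughput(path, size=256 * 1024 * 1024):
    blob = b"\0" * size
    t0 = time.perf_counter()
    with open(path, "wb") as f:
        f.write(blob)
        f.flush()
        os.fsync(f.fileno())  # force it out of the page cache for the disk case
    elapsed = time.perf_counter() - t0
    os.remove(path)
    return size / elapsed / 1e6  # MB/s

print(f"tmpfs: {write_throughput(TMPFS_PATH):.0f} MB/s")
print(f"disk:  {write_throughput(DISK_PATH):.0f} MB/s")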

Exactly. Despite GIMP's excessive use of it, I want to find out if ME3 loads instantly from a ramdisk.

>1200
it's 2400, because that's how ddr works

Oh, so Speccy is just reporting the number inconsistently with how every single manufacturer reports it?

Got it, Speccy is a piece of shit I guess

I still wonder why Speccy can't detect VRAM and RAM properly 2 years after release. DDR4 isn't that rare anymore.

Just windows things I guess. inxi gets it right.

winfags can't have nice things

A little warning here.

You really should run Fallout 4 capped at 60 fps; don't run it uncapped.

youtube.com/watch?v=r4EHjFkVw-s
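
The underlying problem (going by the video, and assuming the usual Bethesda engine behavior) is that the engine advances game logic by a fixed step per rendered frame instead of scaling by real elapsed time, so above 60 fps everything runs fast. A minimal sketch of the difference:

# Minimal sketch of frame-tied vs. time-scaled game logic. The numbers
# are made up; the point is that a fixed per-frame step runs the
# simulation faster as the frame rate rises.
SPEED = 300.0        # units per second the designer intended
TUNED_FPS = 60.0     # frame rate the fixed step was tuned for

def simulate(fps, seconds=1.0, frame_tied=True):
    pos, dt = 0.0, 1.0 / fps
    for _ in range(int(seconds * fps)):
        if frame_tied:
            pos += SPEED / TUNED_FPS   # fixed step per frame
        else:
            pos += SPEED * dt          # scaled by real frame time
    return pos

print(simulate(60))                      # 300.0 -- as designed
print(simulate(144))                     # 720.0 -- runs 2.4x too fast
print(simulate(144, frame_tied=False))   # 300.0 -- correct at any fps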

I thought this hadn't really been a thing since the DOS days.

>I thought this hadn't really been a thing since the DOS days.
Ever heard of Skyrim?

youtu.be/AqDOefJc7a4?t=19

it's like the GTA ports all over again

>tfw 800mhz ddr2 while playing fallout 4

>GDDR5 RAM
nigga what

Consoles use what traditionally would be VRAM as system and GPU memory.

There is a performance impact to this as GDDR5 is not meant to be used as general system memory.

The PS4 uses GDDR for its RAM.
>Memory 8 GB GDDR5 (unified)

At least the PS4 does. I believe the XBONE is different.

This also happens in Quake 3 (you know, the game everyone must stroke their cocks to), except the magic number isn't 60.

>playing fallout 4
¿

why don't PCs use GDDR5 instead of DDR3 and DDR4 too?
GDDR5 is sooo much faster

What CPU was the first pic from? Because in your second pic all you showed were pleb-tier chips with dual-channel RAM. For fuck's sake, old-as-shit i7 920s at least have triple-channel memory.

is it 30 fps?

OK I managed to overclock my DDR3 @ 1600 to 2200. I can't quite get to 2400 for some reason. I probably need to mess with the timings but I am shit at this stuff. Anyone have any experience overclocking memory?

Because it has HUGE latency and power draw

Is 20 GB of RAM (2x8 + 1x4) worse than only 16 GB (2x8) in terms of speed?

Doesn't the latency affect how video cards use the data?

Of course it does, but for them bandwidth matters more than the latency.

It's more prone to bit flips than regular DDR. That's why it's used for video memory: one wrong pixel in a frame is not a big deal. However, using GDDR as system RAM is riskier. I think Sony decided to cheap out.

Also happened with Dark Souls 2: 30 fps was the cap on consoles, but the PC port allowed up to 60, which caused hilarious problems such as weapons wearing out twice as fast, among others.

As long as they're all the same speed and CAS latency it should be alright, though I'm not sure how running a kit in dual channel plus a stick in single channel would affect things.

I'm not that guy.

System RAM: good for many small chunks of data.

VRAM: good for a few, very big chunks of data.

125 in Quake Live's case, meaning you can't even make full use of 144 Hz. Pros have no issue playing like this, but I'm just saying.

Depends.
Does it all use the same frequency/timings? If not, all the RAM will run at the lowest common settings, i.e. 2x8 GB 2400 + 1x4 GB 1600 will give you 20 GB at 1600.
How many memory channels does your system have? On a dual-channel system, you'll end up with 16 GB dual plus 4 GB single channel (if you slot it right), or 8 GB dual plus 12 GB single channel (if you put both 8 GB sticks in one channel). On a triple-channel system, you'd have 12 GB triple channel plus 8 GB dual channel. See the sketch below.
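
To make the dual-channel cases concrete, here's a toy model. It assumes Intel "Flex Mode"-style behavior: whatever capacity is present on both channels interleaves as dual channel, and the remainder runs single channel.

# Toy model of asymmetric channel population (Flex Mode-style behavior
# is the assumption): capacity mirrored across both channels interleaves
# as dual channel; the leftover runs single channel.
def channel_split(channel_a, channel_b):
    dual = 2 * min(sum(channel_a), sum(channel_b))
    single = sum(channel_a) + sum(channel_b) - dual
    return dual, single

# 2x8 + 1x4 GB, slotted right: 8 GB in one channel, 8+4 in the other.
print(channel_split([8], [8, 4]))   # (16, 4) -> 16 GB dual, 4 GB single

# Same sticks, both 8 GB modules crammed into one channel.
print(channel_split([8, 8], [4]))   # (8, 12) -> 8 GB dual, 12 GB single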

It reports the right clock frequency, but DDR memory transfers data on both the rising and falling edges of the clock signal, so it behaves like conventional memory at twice the clock speed.
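
The arithmetic, as a sketch (the 64-bit channel width is the standard DDR3/DDR4 assumption):

# DDR arithmetic: I/O bus clock vs. data rate vs. resulting peak bandwidth.
bus_clock_mhz = 1200        # what Speccy reports
transfers_per_clock = 2     # DDR: rising + falling edge
channels = 2                # dual channel
bytes_per_transfer = 8      # 64-bit channel width

data_rate = bus_clock_mhz * transfers_per_clock   # 2400 MT/s, the "DDR-2400" figure
peak_bw = data_rate * 1e6 * bytes_per_transfer * channels / 1e9

print(f"{data_rate} MT/s, {peak_bw:.1f} GB/s peak")   # 2400 MT/s, 38.4 GB/s peak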

tl;dr bethesda can't into vydia development and have begun to use system RAM for vRAM, which you should literally NEVER do.

GDDR5 is basically DDR3 but tuned for GPU access patterns (higher burst throughput, higher latency)

On your CPU you don't really care about throughput, because you aren't saturating that 100 Gbps with your word processing program; what you ARE suffering from is the latency introduced by all those indirection fetches that modern bloated enterprise langs come with.
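
A crude way to feel the difference between latency-bound and bandwidth-bound access (a sketch only; numpy is assumed, and Python's interpreter overhead blunts the absolute numbers, but the gap still shows):

# Dependent loads (a pointer chase through a random permutation) pay a
# full memory round trip per step; a sequential sum streams data that
# the hardware prefetcher can pipeline.
import time
import numpy as np

N = 1 << 24                                   # ~16M entries, far beyond cache
perm = np.random.permutation(N).astype(np.int64)

idx, t0 = 0, time.perf_counter()
for _ in range(2_000_000):                    # each load depends on the last
    idx = perm[idx]
print("dependent loads:  ", time.perf_counter() - t0, "s")

t0 = time.perf_counter()
total = perm.sum()                            # independent, sequential loads
print("sequential stream:", time.perf_counter() - t0, "s")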

>It reports the right clock frequency, but here's why it's wrong
You can't justify away this shit when every single tool, manufacturer, BIOS, etc. in the world uses the “doubled” frequency numbers and only your special snowflake piece of shit tool reports something completely different

For memory bandwidth, yes. It will only run in single channel.

>on a dual channel system, you'll end up with 16gb dual and 4gb single channel
lol no

>reports something completely different
you mean the actual frequency?

>you mean the actual frequency?
‘actual’ is a strong word to use for the bus frequency when the ‘actual’ rate at which you transmit data words is 2400 MHz

>when the ‘actual’ rate at which you transmit data words is 2400 MHz
lol no
en.wikipedia.org/wiki/Word_(computer_architecture)

What are you trying to imply?

I never said you only transmit a single octet per transfer. Sure, they're bursted together; the point is, you get one transfer per clock edge.

Stop trying to shitpost and just accept that the data rate (transfers per second) is what matters in practice, not the underlying bus clock

And this is the reason why EVERYBODY uses this figure, except speccy. Go defend your shitty winbabby software elsewhere please

you've repeated your retarded ideas so much that you actually believe them, don't you?

Daily reminder that anything over 800mhz is a meme.

(this shit should really be running at 800mhz, what the fuck is happening here?)

Whether the engines were poorly optimized before or are poorly optimized now, who knows?

God you're retarded

It could simply be that the DDR3 hardware was way ahead of its time in terms of bandwidth. It's been how many years now since DDR3 became common? And only now do we have software that attempts to make use of it. Though one can muse about how well Fallout 4, in this example, is actually using the bandwidth.

Mind telling me why, Mr. Smart?

You don't know how Speccy reads RAM speeds, yet you claim slow-ass DDR2 is fine

No, it's that they compared shitbox-tier CPUs with dual-channel memory to a system with an unspecified processor which probably had triple or quad channel.

"Probably"?

anandtech.com/show/4503/sandy-bridge-memory-scaling-choosing-the-best-ddr3/2

That's the test system in question. You'll find the image OP posted in the Graphics & Gaming part of the review. Looks to me like a similar class of hardware, with the same dual-channel limitation.

here's your (you)

go find me a program, website or article on the internet that does not refer to DRAM speeds using the doubled data rate