Just 6 years ago, 6gb of ram was considered "exceptional"

Will my 16gb look obsolete in 6 years?


>Best build back then: $1340
>Best build now: $3992

In reality, Falcon just started getting paid to shill overpriced, bad hardware.

Been using 4GB for 5 years and I'm fine.

>Just 6 years ago, 6gb of ram was considered "exceptional".
>Will my 16gb look obsolete in 6 years?
Yes

Right? I don't play games or want ramdisks or anything fancy, though, so I'd still be happy with 2GB.

No you wouldn't. Win7 with any decent browser will exceed 2GB in 5 seconds flat.

Of course he's a paid shill now.

He built his entire guide around NVIDIA cards now that they've released.

His goal isn't to make you aware of better options but to sell what he's paid to sell.

If I didn't run virtual machines and Chrome wasn't a memory-hogging piece of shit, 2GB would be plenty for casual usage. Most midrange smartphones and tablets don't have more than 1GB, and you can browse the web just fine. The only reason games require 6-8 gigs of RAM nowadays is the new console generation, and devs being lazy pieces of shit.

What are ya, some 32-bit pleb?

...

>he fell for the 16 GiB ram meme

Literally none of that is true. With only 2GB of RAM, the OS (all of them do this) would not be able to cache any data, so everything would feel much slower.
Midrange smartphones and tablets typically have 2GB of RAM, with some from makers such as Motorola even having 3GB. Only very low-end phones still have 1GB or less.
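For what it's worth, the effect being described is easy to see: the OS keeps recently read file data in free RAM (the page cache), so a second read of the same file usually doesn't touch the disk. A quick Python sketch (the file name and size here are arbitrary, picked just for illustration):

```python
import os
import tempfile
import time

# Write a scratch file, then read it twice. The second read is normally
# served from the OS page cache (spare RAM), not the disk -- this is the
# caching headroom a 2GB system doesn't have.
path = os.path.join(tempfile.mkdtemp(), "scratch.bin")
with open(path, "wb") as f:
    f.write(os.urandom(16 * 1024 * 1024))  # 16 MB of junk data

def timed_read(p):
    start = time.perf_counter()
    with open(p, "rb") as f:
        data = f.read()
    return data, time.perf_counter() - start

cold_data, cold_t = timed_read(path)   # may hit the disk
warm_data, warm_t = timed_read(path)   # usually served from the page cache
assert cold_data == warm_data
print(f"cold: {cold_t*1000:.2f} ms, warm: {warm_t*1000:.2f} ms")
```

(The "cold" read may already be cached too, since the file was just written; on a real workload the gap is much more visible.)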

>tfw building a workstation with 128gb RAM just to piss off Sup Forums

>the OS (all of them do this) would not be able to cache any data
What fucking data? What data do you need to cache in RAM other than the OS itself for running an OS?

>but how the fuck are you going to use 4K textures and simultaneously keep the memory usage below 500MB?
YOU'RE NOT, THAT'S THE POINT.

You don't HAVE to use 4K textures and a hardcoded draw distance of a billion miles. Add an option for it, and let people play the game with lower resolution textures if they want to. Fallout 4 for instance has no right to require 8 gigs of RAM in order to function when it looks worse or on par with Skyrim. A good engine adjusts to the hardware it's running on, and scales from Quake 2 graphics to a photorealistic multi-monitor 120 FPS 4K scene that needs a render farm.

Until there's a major shift in 3D technology and we're all using voxels or something, devs have no excuse other than incompetence or laziness.
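A minimal sketch of the kind of scaling being argued for: pick asset quality from the hardware budget instead of hardcoding it. Everything here (the tier table, the function name, the VRAM numbers) is invented for illustration, not taken from any real engine:

```python
# Hypothetical quality scaler: (minimum VRAM in MB, texture edge length).
# Checked from the highest tier down, so a bigger budget gets bigger textures.
TEXTURE_TIERS = [(8192, 4096), (4096, 2048), (2048, 1024), (1024, 512), (0, 256)]

def pick_texture_size(vram_mb: int) -> int:
    """Return the largest texture edge length the VRAM budget allows."""
    for budget, size in TEXTURE_TIERS:
        if vram_mb >= budget:
            return size
    return 256  # fallback for anything below the lowest tier

assert pick_texture_size(8192) == 4096   # high-end card: 4K textures
assert pick_texture_size(512) == 256     # old card: still playable
```

The point isn't the table itself, it's that the decision is data-driven: shipping one more tier costs a row in a list, not a rewrite.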

You could probably use the 10GB that you've never used in your current PC.

You cache the recently used programs and files. Do you open Excel often? Keep that shit in RAM so that it opens instantly.
>Making your textures high quality is lazy development
Man, you should let EA know. I'm sure that they would love to stop being so lazy.
Fallout 4 uses 2-3GB of RAM on my system.
And let's see you create this god engine. But you obviously can't, because you don't even understand caching.

>You cache the recently used programs and files.
I'm pretty sure I don't.
>Keep that shit in the RAM that way it opens instantly.
Or get an SSD.
>I'm sure that they would love to stop being so lazy.
They wouldn't, as long as there are people who buy their lazily made games. And there are, because buying 16 gigabytes of GAMER RAM is the obvious solution.
>Fallout 4 uses 2-3GB of RAM on my system.
I have 4 gigs of RAM, tried playing FO4 with virtually nothing running on my computer. Even killed explorer.exe and most services, my total RAM usage was a few hundred MBs. Fallout 4 starts paging like fucking crazy even though it barely uses any of the available RAM, because it just EXPECTS to have 6+ GBs available even when it doesn't use any of it. That was on release, no idea if they patched it since then. I'm willing to guess no.
>And let's see you create this god engine.
Yeah I'll just write a new fucking engine from scratch in front of you to prove a point. Being able to adjust graphics settings used to be a standard people expected to find. Console ports and rushed triple A games optimized for a single platform are the reason we don't, not because it's impossible.

No, it's because the best stuff right now is an Intel extreme enthusiast platform/socket, which obviously costs more (see LGA1366, LGA2011).
X99 motherboards are LGA2011-v3, aka the new extreme socket, and are priced as such.

I know you're full of shit because I just upgraded from a 7 year old build with 4GB of ram and it was painful to use with just Chrome open and a few tabs.

Well that's weird because the box I built in 2009 has 4GB and handles a few (or more than a few) Chrome tabs just fine. Are you sure you're not just a moron?

>hes a paid shill now.
>some people actually believe this

No idea what you're on about.

In 6 years, even you'll be obsolete OP

>A good engine adjusts to the hardware it's running on, and scales from Quake 2 graphics to a photorealistic multi-monitor 120 FPS 4K scene that needs a render farm.

Give me an example of a game that does this.

There aren't any, that's the problem. Source comes close because it's competently optimized for low-end hardware, but it's kinda mediocre compared to "next gen" games with a billion shaders and FOUR KAY TEXTURES. Even Unity, which is supposed to run on phones and cheap tablets, runs like shit on anything that isn't a recent mid-range device or a proper desktop.

Also, while it's not really an engine or a major title in terms of flashiness (the art style carries most of it), WoW is a pretty good example of what you can do to optimize your shit to run on 2004 hardware. It's seriously not that difficult, turn some effects off and render less shit.

Is there an updated guide?

Or the classic.

youtube.com/watch?v=JDUcy2IkyQM

How many tabs do you have open? It's just 1337 gaymers thinking system RAM makes their games faster or some shit.

16GB today is cheaper than 4GB was 6 years ago. Once there is a proper use for all that RAM for a layperson (and not just more firefox tabs), then more RAM will be useful. As it is, 8GB is the new norm, and we are not at the "16GB is normal" stage yet.

VMs.

You don't understand what Superfetch is, user. Superfetch is when Windows boots up and preemptively caches programs.
When you close Chrome, it stays in your RAM until that memory is needed for something else, at which point it is freed.
And RAM is far faster than an SSD.
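The behaviour described above, closed programs staying cached in RAM until that memory is needed for something else, is essentially least-recently-used eviction. A toy Python model (class name, program names and sizes are all made up for illustration):

```python
from collections import OrderedDict

# Toy model of a standby cache: nothing is evicted until the capacity
# limit is hit, and then the least recently used entry goes first.
class StandbyCache:
    def __init__(self, capacity_mb: int):
        self.capacity = capacity_mb
        self.entries = OrderedDict()  # name -> size in MB, oldest first

    def load(self, name: str, size_mb: int):
        if name in self.entries:
            self.entries.move_to_end(name)  # recently used: keep it hot
        self.entries[name] = size_mb
        while sum(self.entries.values()) > self.capacity:
            self.entries.popitem(last=False)  # evict least recently used

cache = StandbyCache(capacity_mb=2048)
cache.load("excel", 400)
cache.load("chrome", 900)
cache.load("game", 800)   # over budget: "excel" is the one evicted
print(list(cache.entries))
```

Unused capacity costs nothing here, which is the "unused RAM is wasted RAM" argument in miniature; the open question is whether the cache ever actually gets a hit for your workload.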

>You don't understand what superfetch is, user.
You what?
>When you close Chrome
I don't.
>And RAM is far faster than an SSD.
Yeah, by fucking microseconds.

Seriously though, in addition to being terribly designed and wasting both RAM and disk space, Superfetch is fucking useless for most people. There is no scenario in which I open Excel, do something, close it, and want it to stay in RAM. If I'm working on something all day, then Excel stays open. Same with my web browser, I never close it, because I always need it. If I open a program meant for "productivity", it stays open until I finish what I wanted to do, after which I shut down or hibernate my computer without using that program again for weeks.

When I unload something from memory or close a program, it's because I want it to fucking stay closed. Buying 16 gigabytes of RAM just so you can preload every application you could possibly use is the very definition of wasting money.

8GB already feels like the minimum today, in both PCs and notebooks (for video rendering, e.g.).
2x8GB is around 50€ today, which would even be a worthwhile upgrade from my current 4x2GB in the desktop.

>mfw my first ever graphics card had 16MB of video memory but was fine with 3D desktops in Ubuntu, and that was nearly all that made me happy in my life back then

Calm down, mate.
Superfetch is shit, which is why everyone disables it. But how is it a waste for a recently used program to stay in memory unless that memory is needed for something else? Unused RAM is wasted RAM, so you might as well use it all in a way that has the greatest chance of helping the user while having the least impact on the user experience.
And why do you hibernate? Sleep suspends to RAM while hibernation suspends to disk. It's a waste of write cycles on your SSD.

4GB is still perfectly enough if you don't game or render on your computer.

8GB is still the de facto amount of memory a computer should have.

16GB is overkill and you will need to have a specific application if you want to justify having that much RAM. Most users, including gamers, just do not need it.

The thing is, processors haven't come all that far in the last 6 years. The jump in the 6 years before that was a lot larger. In 2010 we were just getting used to the i5/i7 sweeping the table. Go back 6 years from that and processors still had only one core, with the P4 having hyperthreading. At that time most P4s were still 32-bit and AMD was leading the business.

The game changed again when Intel released the Core 2 series, which finally brought multi-core processing to the masses. I'm not saying AMD was worse back then, but the Q6600 is a legend. A lot of processors came after it, but really it held up right until the 2500K/2600K came. And those two processors are still perfectly fine for pretty much all kinds of use, especially if you overclock them.

So what I'm trying to say is, in 6 years, processing power hasn't really come that far, excluding perhaps the most extreme Intel stuff that's not sold at a viable price to consumers. A 2500K is still enough for almost everyone.

But what about a Pentium 4 in 2010? It could in no regard compete with the Q6600 or even the C2D line. It was entirely outgunned without any hope.

So before, when we saw more rapid advancements with CPUs, the amount of RAM needed grew with it. But since the progress has slowed down, the need for more RAM simply is not there. That's why 16GB is overkill.

It's not a waste for it to happen, obviously; it's a waste of money to BUY more RAM for that explicit purpose. People who buy 16 (and in the past 8) gigs of RAM with the delusion that it'll make their computer faster are stupid. Even in an extreme case where you load up 50 tabs, cache every application you use and run a virtual machine, there's no way a casual user COULD utilize 16 gigs, ever.

Unless you're using a DAW with a billion samples loaded at once or editing raw 4K footage (both of which get cached to disk anyway), you'll just waste money. Even if it's for "future proofing" or whatever, by the time browsers and games require 16 gigs minimum, we'll be on DDR6 or whatever, at which point you'll have to buy new modules regardless.

tldr, buy 4 gigs, get another 4 if you need it later. Replace those numbers with 8 a few years from now, if Chrome and Windows keep getting more bloated.

Oh yeah, generally when I turn off my computer it's because I don't return to it for 10-12 hours, so it's probably a waste of electricity to leave it in sleep mode. I usually turn it off unless there's a ton of shit open and I'm in a hurry or something. I used to do it by default back when my OS was on a regular hard drive since Windows takes ages to do a cold boot, not so much nowadays. Although writing a few gigs of data once a day probably doesn't hurt the lifetime of an SSD too much. I've had this cheap 60 gig piece of shit for a year now, and it works fine.

2009 was 7 years ago

Exactly, it's ridiculous.

He was always a paid shill. Nothing changed.

google.com/search?q=2016-2009=

>16gb
>not obsolete now

I've had 16GB for 5 years. If you don't have at least 24GB now, you are doing something wrong.

>6gb ram is obsolete now
What the fuck is wrong with you people?

>Games will NEVER use 6 cores
How right we were...

Not really. Plenty of gaymes make use of six or more cores these days. It's only really become a thing in the past couple of years though, at the same time as we finally ditched dual cores.

>It's over NVIDIA is finished

Meanwhile AMD is declaring bankruptcy any day now.

>8GB today already feel and seem like the minimum today
>tfw you begin to fall for the 16GB ram meme

Intel is going to create a new socket with 6-channel memory, where at least 20GB of DDR4 is needed.
overclock3d.net/articles/cpu_mainboard/intel_skylake-e_will_use_new_lga_3647_socket_and_use_6-channel_ddr4_memory/1

I think you are just jealous.

That is an exaggeration; I rarely broke the 1GB RAM usage mark, but eh, that was one year ago. Now I use Debian and my RAM usage always sits at 512MB tops (100MB idle, thanks to LXDE and gvfs).

Back then, every game ran on a single core. Today, most AAA titles require a dual core. And with DX12 set to be mainstream in a couple of years, CPUs with more than four cores may start to get utilised.
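The gist of why extra cores only help when the engine actually splits its work: independent chunks can be scheduled across all cores, while serial work can't be. A rough Python sketch (the chunk workload is a made-up stand-in for per-frame work like physics or AI):

```python
from concurrent.futures import ProcessPoolExecutor
import os

def simulate_chunk(n: int) -> int:
    # Stand-in for one independent slice of per-frame work.
    return sum(i * i for i in range(n))

def run_parallel(chunks):
    # Schedule independent chunks across however many cores exist;
    # the result is the same as running them one after another.
    with ProcessPoolExecutor(max_workers=os.cpu_count()) as pool:
        return sum(pool.map(simulate_chunk, chunks))

if __name__ == "__main__":
    chunks = [100_000] * 8
    assert run_parallel(chunks) == sum(simulate_chunk(n) for n in chunks)
    print("cores available:", os.cpu_count())
```

A game that keeps everything on one thread gets nothing from a six-core chip; one that breaks work into chunks like this scales with whatever the scheduler gives it.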

>Will my 16gb look obsolete in 6 years?
Uhhhh... that's nearly obsolete now.

>may
And this "may" is enough for you to recommend the Broadwell-E CPUs in your guide, which get worse FPS than a 6700K in every current game?

>Broadwell-E
>recommend