/gfx/ - Grafix

Is it possible to run a modern AAA game in 8k at 60 fps?
Let's say Battlefield 1 had an 8k option, for instance.

What cards would my PC need and how many of them?

You would need multiple Titan X cards, each with more than sufficient cooling and even then your processor would probably overheat.

The tech just isn't quite there yet, but it would be a fun experiment to attempt if you had the money.

2 x 1080ti crossfire should do just fine; should even be able to reach a solid 30fps in 16k, minus tessellation.

>spending thousands of dollars to fry a processor attempting to hit 8k 60fps in we wuz field
>not giving that money to a good cause

>8k
I love this meme.

all lowest, maybe:

youtube.com/watch?v=211Vdi4oC9o

t. 970 owner


How does it feel knowing that more and more new games can't even hit 50fps with your meme card?
Soon you'll be in the same boat as the console plebs.

Don't say we didn't warn you.

>2x1080ti
Eh, might be able to render 8k, but the FPS would probably be all over the place. It might PEAK at 60, but I can't imagine it sitting there.

I own a 1060.
More than enough to play anything that comes out.

>I own a 1060.
>More than enough to play anything that comes out.
For now.
By 2018 the 1060 will be a Silky Smooth 30fps card

wtf i hate my 970 now...

>Own the best 770
>still run everything at 60 on medium/high

relly makes u think

There's literally nothing wrong with smooth 30fps though. You only need 60fps for competitive multiplayer.

With a screen of that resolution you'd be much better off running the game at either 2160p or 1440p, both of which a 4320p (8k) screen would treat as essentially native resolutions.

That's the real advantage of 8k screens in the near term. Every recent standard resolution for games or content is available via simple integer scaling.
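
To make the integer-scaling point concrete, here's a rough Python sketch (the resolutions are just the common ones being discussed) that checks which render resolutions map onto a 7680x4320 panel with a clean whole-number factor:

# Which common render resolutions scale to a 7680x4320 (8k) panel with a clean
# integer factor, i.e. every rendered pixel maps to an exact NxN block of panel pixels.
COMMON = {"1080p": (1920, 1080), "1440p": (2560, 1440), "2160p": (3840, 2160)}
PANEL = (7680, 4320)

for name, (w, h) in COMMON.items():
    sx, sy = PANEL[0] / w, PANEL[1] / h
    clean = sx == sy and sx.is_integer()
    print(f"{name}: {sx:.2f}x horizontal, {sy:.2f}x vertical, integer scale: {clean}")

1080p, 1440p and 2160p all come out as clean 4x, 3x and 2x scales respectively, which is why an 8k panel can treat them as essentially native.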

>8k
retarded

Even 4k is retarded unless you play on a 50 inch screen.

They can't even currently do 4k at 60fps. You need a supercharged PC like the PS4 for that.

t. poorfag

...

>Even 4k is retarded unless you play on a 50 inch screen.

B-But muh HDR

Frying processors to push the limits is a better cause than feeding children.

In my opinion 40" is about the sweet spot for 2160p as far as computer monitors go. Just a bit sharper than your typical 28" 1440p monitor but with a much larger usable area.

48" is where you start pass beyond typical monitor sharpness and start to see it creeping into screen door territory.

>being this much of a faggot
My 760 still runs most games on medium-high at 1080p/60fps.

>medium-high
What's even the point?

>My 760 still runs most games on medium/high
Yes, and how does it feel to have less visual fidelity than a PS4 pro with something you probably spent $800+ building?

I'll wager that GPU performance doesn't double in the next 10 years.
I've been saying for a while that we could see another one or two cycles before Nvidia starts selling GPUs in pairs or as 3-4 slotters.

If you think you're going to be able to run 8k any time soon in the future with dynamic lighting/shadowing and high res textures you're going to be sorely disappointed.
If you want to look at the future of computing look at the present of batteries.

This. Also VR is a flop that won't be feasible in any meaningful way for at least another 100-200 years. Cancer is impossible to cure. Interstellar travel will never happen. We'll never have photorealistic graphics. We've pretty much reached the limits of our knowledge, there won't be many more scientific discoveries that will change how we look at the world. Also these horseless carriages will never take off.

>We've pretty much reached the limits of our knowledge

oh i wonder if anyone's ever thought this before
like 300 years ago or something haha wouldn't that be funny

The real question is what the point is of running a game with textures designed for 1080p at 8k.
What are you trying to achieve with it?

All of your statements have truth in them except for cancer research, which is actually moving forward at a rapid pace.
Oh wait, shouldn't we be reaching the singularity right about now?
How's the internet of things going?
Hmmm...
Luckily we have nanotechnology and flying cars now!
See I can do ridicule as well.

8k screens are inevitable as it becomes financially unreasonable for 100ppi or lower screen fabs to continue operating.
You don't have to actually run all games at native resolution though, lower resolutions will work just fine on them.

>I'll wager that GPU performance doesn't double in the next 10 years.
You're almost certainly wrong about that. GPUs parallelize far better than CPUs, so as long as Moore's law continues they'll continue to get faster. The real story isn't at the high end though, it's at the low end. AMD's next-gen SoC chips should basically remove the point of buying anything under a 480 or a 1060.

You can't even play modern games at 4k/60 on most rigs. Wait until the next generation of GPUs next year.

Stacking more cores doesn't constitute a performance increase.
If your graphics card doubles in size there is no performance increase.
Is that hard to understand?
The only way to meaningfully increase GPU, or for that matter processor, performance at this point is to increase the size of the machine (give or take a few tweaks taking place in the next generation).

>Stacking more cores doesn't constitute a performance increase.
GPU upgrades literally work by stacking more processing units

I just find it funny how you, a person who likely has zero involvement in scientific research or even basic scientific knowledge, make assumptions about how the future of technology will go based on what little current information you have, as if scientific progress were linear rather than entirely non-linear.

Also, your equivalence between computation in general and battery technology innovation in particular is completely false. While there is plenty of interest and investment in what you very generally refer to as "computation", battery research doesn't get nearly the amount of investment "computation" gets, and the problem is only exacerbated by battery research suffering from multiplicity: there are so many different concepts for innovation that whatever investment exists is spread far too thin.
With that being said, there is much greater interest in battery research now, as electric vehicles become a larger, more viable market.

And finally, setting aside your ridiculous wager that GPU performance won't double in the next 10 years (GPUs coming out this year already show a 30-35% increase in performance), AMD is working on technologies that could not only double but multiply GPU performance many times over, and it is projected to be viable within the next 5 years.

>Stacking more cores doesn't constitute a performance increase.

With CPUs this is true, at least as far as handling single threaded tasks is concerned.
GPUs on the other hand deal with highly parallel tasks, and MOAR CORES is actually exactly the solution. GPU transistor counts and GPU performance have advanced together in lockstep.

As long as transistors keep getting smaller, or we keep coming up with novel ways to fit more into a die like 3D stacking, GPUs will keep getting more powerful.
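
One way to make the "MOAR CORES works for GPUs" point concrete is Amdahl's law; a minimal Python sketch, with the parallel fractions (0.50 for a serial-heavy CPU workload, 0.999 for per-pixel shading) picked purely for illustration rather than measured from anything real:

def amdahl_speedup(parallel_fraction, n_units):
    # Amdahl's law: overall speedup when only part of the work can run in parallel.
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_units)

for units in (2, 8, 64, 4096):
    cpu_like = amdahl_speedup(0.50, units)   # workload with a big serial chunk
    gpu_like = amdahl_speedup(0.999, units)  # almost perfectly parallel, like shading pixels
    print(f"{units:>4} units: CPU-like x{cpu_like:.2f}, GPU-like x{gpu_like:.1f}")

The serial-heavy workload tops out around 2x no matter how many units you throw at it, while the nearly-parallel one keeps scaling into the hundreds, which is why piling on shader cores has kept working.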

Transistors physically cannot get any smaller than the coming generation, and it will be many decades before we see them shrink further, if at all.

t. retard

Except this time they can't make them smaller, so they have to increase the size of the GPU to increase performance.
In previous generations there were more cores and they got smaller, which meant a performance increase.
Moore's law is dead. This is a fact.
This means your GPU won't increase its performance. In the next few decades the upgrades to your GPU will be solely feature based (as in Nvidia's PhysX and other goodies that specialize in a certain aspect of performance tweaks).
As said, transistors can't keep getting smaller, which is why I stand by my bet that GPU performance won't double in the next 10 years.

>nvidia
>crossfire

Your mind literally cannot comprehend innovation beyond the most obtuse frame, can it? AI integration, efficient binning, quantum processors, and HBM are all things you would never have anticipated in a million years, and yet they've contributed much to performance beyond just decreasing transistor size, which is also still being worked on.

>In the next few decades the upgrades to your GPU will be solely feature based
Maybe if people like you were in charge. Thankfully actually innovative minds are in charge of innovation.

Every time someone has claimed transistors can't get any smaller, they've gotten smaller anyway thanks to new fab processes or new materials.

If we really do hit a wall, then like I said, stacked layers will become a thing. Even if each layer in the stack is only 75% of what you'd get from a normal die, two layers would get you the 50% increase Moore's law requires, and then you just have to keep going taller.

I should note we're already doing this with memory. That's how HBM works.
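
The per-layer arithmetic above as a tiny Python sketch; the 75% figure is just the number assumed in the post, not a measured one:

def stacked_performance(layers, per_layer=0.75):
    # Relative throughput of a stacked die versus one normal full-speed die,
    # assuming every layer delivers the same fixed fraction of a normal die.
    return layers * per_layer

for n in (1, 2, 3, 4):
    print(f"{n} layer(s): {stacked_performance(n):.2f}x a single normal die")

Two layers at 75% each already gives 1.5x, the 50% jump mentioned, and every further layer adds another 0.75x, heat and cost permitting.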

There are already prototypes smaller than what we have now. The problem is making them easier to manufacture and/or cheaper.

And making them smaller is not the only way to improve CPUs/GPUs. One example is stacking chips, which is comparable to "cores innovation" in terms of performance increase.

Why do PCfats care so much about benchmarking?
Don't you get bored of playing Battlefield or Crysis or whatever you are benchmarking your videocards with? Do you even have fun anymore?

4k and 8k are memes, 12k 244fps is the endgame you should be waiting for

Don't care, I own a PS4 Pro for exclusives. I'll gladly trade some graphical fidelity for 60fps and being able to play first-person shooters like a normal human being with a mouse and keyboard.

>AI integration, efficient binning, quantum processors
Now if only they could make technology out of memes.
Actually it's the opposite.
Moore's law was a given ever since microchips were invented.
It's only recently that it's become widely known that transistor size has reached a wall.

I'm actually hoping that the industry moves away from Silicon eventually once they hit the scaling limit (likely

>intel is selling the same shit because they have monopoly
>this means we have reached the limits of technology

>AMD doesn't exist
I will inform my intel overlords immediately.

PCfats =/= Benchmarkfags

When your competitor isn't actually competitive, you have a monopoly. Only once AMD is competitive can true technological progress occur.

Duopolies aren't enough either; you're going to need three major companies to have progress

Is it fair to say that progress will stall out a bit within the next decade?

Wouldn't this help consoles stay relevant?

Nothing wrong with 15 either. Just give your eyes 15 mins to get adjusted and you'll be loving dem vidya experiences as much as any 144 fps experience.

>game needs 2Tflop GPU to run game at 1080p and good graphics 60fps

>game needs 8Tflop GPU to run game at 4K and same graphics 60fps

>game needs 32Tflop GPU to run game at 8K and same graphics at 60fps

Consoles have adapted their standard "generation" model of releases to make more sense in the current era.

The PS4 Pro is a PS4, which is an optimized locked-down PC, with upgraded architecture. The Scorpio is similar but for the Xbone (although significantly more powerful than even the PS4 Pro).
This trend will continue for the foreseeable future.
The big advantage is that backwards compatibility on consoles is pretty much guaranteed from here on out.

>game is poorly optimized
FTFY

I'm just saying 1080p -> 4K needs 4x stronger GPU

1080p -> 8K needs 16x stronger GPU

though it's more complicated than that

but it holds true for pretty much every GPU task that ends up on the screen
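
The pixel arithmetic behind those numbers, as a naive Python sketch (it assumes required throughput scales linearly with pixels shaded per frame, which as said above is a simplification; the 2 Tflop 1080p baseline is the one from the greentext post above):

# Naive estimate: required GPU throughput scales with pixels per frame,
# everything else (frame rate, geometry, simulation) held constant.
BASE_PIXELS = 1920 * 1080
BASE_TFLOPS = 2.0  # 1080p/60fps baseline from the post above

for name, w, h in [("1080p", 1920, 1080), ("4K", 3840, 2160), ("8K", 7680, 4320)]:
    scale = (w * h) / BASE_PIXELS
    print(f"{name}: {scale:.0f}x the pixels -> roughly {BASE_TFLOPS * scale:.0f} Tflop")

Which lands right back on the 2 / 8 / 32 Tflop figures in the earlier post.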

Too much sauce