Will technology ever advance to the point of being able to create digital worlds indistinguishable from reality, at least on a limited scale?

no

It already has: you can simulate a single atom to some degree of fidelity to real life. Congrats.
Also not video games. Saged.

The only way to make fiction indistinguishable from reality is to never show reality, Matrix style.

Yes. When science advances to the point of using our brains as hardware for software, it will happen.
Play the waiting game until then.

>ywn see these blast-from-the-past shocker ads for SSDs because of the asian price-fixing cartel

What else would such a technology be used for other than video games?

>what would a technology that can simulate the entire universe be used for besides microtransaction dispensers
geez idk maybe science? or maybe literally fucking everything

everything? or fucking everything?

...

Where do you think you are?

first one then the other

Yes we will
But it's gonna be like The Matrix with furries and anime girls

Will technology be able to? Maybe. Will humans be able to? Doubtful. Current technology could support an A.I. that can pass the Turing test but I'll be damned if human programmers can create that A.I.

>it already has you can simulate a single atom to some degree of fidelity to real life congrats.

No you can't. You can't even properly describe an atom.

Most likely, though we probably won't be able to do it in real time anytime soon.

The current technology revolution is almost over. We take the march of technology for granted, but we are in a very specific period for the world to be this way, and I don't imagine it will last much longer.

Right now the bottleneck to technological advancement is just clever engineering. Once we reach the point where we need fundamental paradigm shifts, progress will be much slower for a while.

The issue is money and resources. What good is better technology if it takes tens of thousands of man hours to do anything with it? Video games *already* cost too much to make, which is why AAA titles have to sell like 12 million copies or risk bankrupting the publisher. And that's not even getting into the limitations of silicon, which we might be bumping up against.

>Might be

There is no might about it. We already pretty much know the theoretical limits for the absolute best silicon that could ever be made and still function.

That's why it's hilarious that we're making things like 8K TVs when games will never run at 8K and look any better than current-gen games.

>8k TVs
This is another thing. At what point does a camera and TV display reach the upper limits of what the human eye can distinguish? What's the human eye's "resolution"?

>At what point does a camera and TV display reach the upper limits of what the human eye can distinguish?

Smartphones have already reached this point. You cannot see the individual pixels on a phone with a 4K screen, for example, or even on the 1080p screens of small phones. That is also why shit like making the Switch 1080p is a waste. All it would provide is slightly better AA for a huge performance hit.
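
A rough back-of-the-envelope check on the phone claim, assuming the usual ~1 arcminute figure for normal acuity and a hypothetical 5.5-inch 16:9 phone held about 12 inches from your face (those specific numbers are just assumptions for the sketch):

```python
import math

def pixel_arcmin(diagonal_in, width_px, height_px, distance_in):
    """Angular size of one pixel in arcminutes (square pixels, small-angle approximation)."""
    pixel_in = diagonal_in / math.hypot(width_px, height_px)   # physical pixel pitch in inches
    return math.degrees(pixel_in / distance_in) * 60           # radians -> arcminutes

# Hypothetical 5.5-inch phone held about 12 inches from the eye (assumed numbers)
print(pixel_arcmin(5.5, 3840, 2160, 12))   # 4K panel -> ~0.36 arcmin
print(pixel_arcmin(5.5, 1920, 1080, 12))   # 1080p    -> ~0.72 arcmin
# Both come out under the ~1 arcmin usually quoted for normal acuity,
# which lines up with "you can't see the individual pixels".
```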

>What is the human eye's resolution

Meaningless metric for eyes because they don't work like screens.

I guess a better way to phrase it would be: at what resolution will a display look like you're looking out of a window? The required resolution is obviously lower for smaller screens, so say 80"? That's probably the largest TV people would reasonably have in their homes.
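
One rough way to put a number on that, assuming ~1 arcminute of acuity and a couch distance you pick yourself (the 8 feet and 5 feet below are just assumed examples, not anything official):

```python
import math

def required_resolution(diagonal_in, distance_in, acuity_arcmin=1.0, aspect=(16, 9)):
    """Pixel counts at which one pixel subtends the given acuity (small-angle approximation)."""
    pixel_in = distance_in * math.radians(acuity_arcmin / 60)   # smallest resolvable size at that distance
    diag_units = math.hypot(*aspect)
    width_in = diagonal_in * aspect[0] / diag_units
    height_in = diagonal_in * aspect[1] / diag_units
    return round(width_in / pixel_in), round(height_in / pixel_in)

# 80-inch TV viewed from 8 feet (96 inches); the distance is just an assumed example
print(required_resolution(80, 96))   # -> roughly (2500, 1400): ~1440p is already at the limit
# Move the couch to 5 feet and it climbs toward 4K territory
print(required_resolution(80, 60))   # -> roughly (4000, 2250)
```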

>which is why AAA titles have to sell like 12 million copies or risk bankrupting the publisher

Because the CFOs of all the AAAs are dumbasses who feel it necessary to spend 50x the production budget on marketing.

>Meaningless metric for eyes because they don't work like screens.
WRONG

The visual resolution of the human eye is about 1 arc minute.

At a viewing distance of 20, that translates to about 170 dpi (or pixels-per-inch / PPI), which equals a dot pitch of around 0.14 mm. LCD monitors today have a dot pitch of 0.18 mm to 0.24 mm.

A 30 monitor with a 16:9 aspect ratio would be sized around 26 x 15. To achieve 170 dpi, it would need a resolution of 4400 x 2600 pixels.
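​
That arithmetic roughly checks out if the viewing distance of 20 is taken as 20 inches (the post doesn't give a unit, which is exactly what gets argued below); a quick sketch:

```python
import math

ACUITY_RAD = math.radians(1 / 60)      # ~1 arcminute, the figure quoted above

distance_in = 20                       # assuming the "20" means inches
pixel_in = distance_in * ACUITY_RAD    # smallest resolvable feature at that distance
dpi = 1 / pixel_in
print(dpi)                # ~172 dpi, close to the 170 quoted
print(pixel_in * 25.4)    # ~0.148 mm dot pitch, close to the 0.14 mm quoted

# A 30-inch 16:9 monitor is roughly 26.1 x 14.7 inches
width_in  = 30 * 16 / math.hypot(16, 9)
height_in = 30 * 9 / math.hypot(16, 9)
print(round(width_in * dpi), round(height_in * dpi))   # ~4490 x 2530, in the 4400 x 2600 ballpark
```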

Motherfucker.

It entirely depends on the distance. If you are 20 feet from a 1080p 80-inch TV, for example, you cannot see the individual pixels.
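
That checks out under the same ~1 arcminute acuity assumption; one pixel of an 80-inch 1080p panel at 20 feet subtends only about half an arcminute:

```python
import math

# 80-inch 16:9 panel at 1920x1080, viewed from 20 feet (240 inches)
width_in = 80 * 16 / math.hypot(16, 9)       # ~69.7 inches wide
pixel_in = width_in / 1920                   # ~0.036-inch pixel pitch
arcmin = math.degrees(pixel_in / 240) * 60   # small-angle approximation
print(arcmin)   # ~0.52 arcmin, under the ~1 arcmin acuity figure,
                # so the individual pixels shouldn't be resolvable from there
```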

i mean we're currently inhabiting one so i guess the answer to your question is yes

>Calls me wrong
>Starts to spout completely asinine bullshit

You can't measure resolution in arc minutes retard.

>Viewing distance of 20

Yes but I view 100. (You are not even using units of measurement)

>that translates into 170 DPI

DPI is a measurement of exactly how many dots you can fit in an inch. Even if you provide a distance (such as arbitrarily saying the number 20), you are STILL improperly characterizing the eye, because you don't measure how you see based on the number of photons that enter a given space.

>A 30 monitor with a 16:9 aspect ratio would be sized around 26 x 15. To achieve 170 dpi, it would need a resolution of 4400 x 2600 pixels.

All you are trying to do is figure out how many pixels you need on a display, at a given viewing distance, so that you can't see the individual pixels. This has absolutely nothing to do with the question of "what is the resolution of the human eye," which is an asinine question (which is why you gave an equally asinine answer).

20 is FEET my nigga. I'm American.
>You can't measure resolution in arc minutes retard
>retard
Wew lad, how many chromosomes are you on? Did you not know you can convert from degrees to radians? Then you multiply the angle in radians by your viewing distance to get the arc length, and the same conversion works for minutes/seconds/etc.

Jesus christ did you sleep through elementary school?
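
For what it's worth, the conversion itself is easy to run both ways; here is 1 arcminute worked out at 20 inches and at 20 feet (pure small-angle arithmetic; which distance the original post meant is the thing being argued):

```python
import math

ARCMIN_RAD = math.radians(1 / 60)   # 1 arcminute in radians

for label, distance_in in [("20 inches", 20), ("20 feet", 20 * 12)]:
    size_in = distance_in * ARCMIN_RAD       # arc length = distance * angle (small angles)
    print(label, round(1 / size_in), "dpi")  # 20 inches -> ~172 dpi; 20 feet -> ~14 dpi
```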

You can convert it to whatever the fuck you want; it doesn't change the fact that resolution is measured as a 2D matrix of pixels, you idiot.