Nvidia is now pushing cloud-based gaymen

>$25 for 20 hours on a VM with access to a GTX 1060
>$25 for 10 hours on a VM with access to a GTX 1080
>8 or 4 hours free on sign-up (depending on which configuration you want to use)
marketwired.com/press-release/nvidia-expands-geforce-gaming-to-millions-more-pcs-and-macs-nasdaq-nvda-2186326.htm
Uh, guys. I thought games were supposed to be the hardest mainstream computer use case to push to the cloud, and that companies wouldn't seriously try it any time soon. If this is viable, then how long until there are cloud-based alternatives for every mainstream desktop/laptop use and the average person is using a terminal for everything?

This has been possible for years.
You're an idiot.

That's literally been a thing for years; the only issue it had like ten years ago was bandwidth/ping which has been mostly fixed now.

Everyone still insists that gaming will be the main thing stopping mainstream computing from moving to the cloud, every time the idea gets brought up. Yet what Nvidia is offering now costs less per hour than many new AAA games do.

Is it possible to divide up the resources of a GPU between multiple VMs yet, or does every VM get a dedicated GPU with this?

Won't latency be a bitch though?

>less than 300 hours playing games before you'd be better off buying a 1080
What a fucking joke
Who will use this?

>>less than 300 hours playing games before you'd be better off buying a 1080
It depends on how many games you spend those 300 hours playing. If you mainly play brand-new AAA games averaging 24-48 hours of gameplay each (depending on graphics settings), you'd just be breaking even with what it costs to buy the games at launch, and I've seen plenty of games that don't even have that much gameplay. Also, if you're using the service to play on a laptop on the go, you might never put in enough money to break even against a gaming laptop.
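Back-of-envelope math, assuming a ~$600 street price for the 1080 (my number, not from the announcement):

```python
# Break-even math for the GTX 1080 tier.
# Assumptions (not from Nvidia's announcement): the card costs ~$600 retail,
# and we ignore the rest of the PC, electricity, etc.

CARD_PRICE_USD = 600.0          # assumed GTX 1080 street price
RATE_USD_PER_HOUR = 25.0 / 10   # $25 buys 10 hours on the 1080 tier

break_even_hours = CARD_PRICE_USD / RATE_USD_PER_HOUR
print(f"Break-even vs buying the card: {break_even_hours:.0f} hours")
# -> 240 hours, consistent with the "less than 300 hours" figure above

# At 24-48 hours per AAA game, that's roughly 5-10 full games' worth
# of playtime before renting stops making sense.
for hours_per_game in (24, 48):
    print(f"{hours_per_game}h/game -> {break_even_hours / hours_per_game:.1f} games")
```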

You're going to need some serious bandwidth before live-encoded image quality reaches the point where using a 1080 for the highest settings is even worth it. Going by the quality the NVIDIA encoder on a GTX 1080 pushes out with Steam In-Home Streaming, for instance, 30 Mbps is most definitely not enough for 1080p 60 FPS to look "visually lossless". 50 is closer, but you could probably notice differences even above that. Who the fuck is going to have that sort of bandwidth "on the go" in a reliable fashion? I don't live in the US, but as far as I can tell that sort of bandwidth isn't even available on all home connections.
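For scale, the bits-per-pixel math behind those numbers (bitrates as quoted above; the raw 24-bit RGB figure is just a reference point, and "visually lossless" has no hard cutoff):

```python
# Rough bits-per-pixel math for 1080p60 streaming.

WIDTH, HEIGHT, FPS = 1920, 1080, 60
pixels_per_second = WIDTH * HEIGHT * FPS            # ~124.4 million px/s

raw_bps = pixels_per_second * 24                    # 24-bit RGB, uncompressed
print(f"Uncompressed 1080p60: {raw_bps / 1e9:.2f} Gbps")

for mbps in (30, 50):
    bpp = mbps * 1e6 / pixels_per_second
    ratio = raw_bps / (mbps * 1e6)
    print(f"{mbps} Mbps -> {bpp:.2f} bits/pixel (~{ratio:.0f}:1 compression)")
# 30 Mbps leaves the encoder ~0.24 bits per pixel to work with;
# 50 Mbps gets you ~0.40, which matches the "50 is closer" impression.
```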

Latency is also going to be an issue: unlike local streaming options such as NVIDIA's own or Steam's, the servers won't be on your local network. Streaming of any kind adds a fair bit of latency on its own; Steam, which feels quite responsive, adds roughly 100 ms of button-to-pixel latency, similar in practical terms to double-buffered VSync at 60 Hz. That alone can be noticeable, depending on the game, and stacking extra network latency on top will start feeling quite sluggish. I guess it will be less noticeable when you're not using a mouse; analog sticks will probably mask it to a degree, since they're sluggish and imprecise anyway. Competitive MP is off the table regardless.
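A toy latency budget, taking the ~100 ms streaming-stack figure from above and adding hypothetical network round trips (the RTT values are made-up round numbers, not measurements):

```python
# Toy button-to-pixel latency model. The ~100 ms streaming-stack overhead
# is the Steam in-home figure quoted above; RTTs are illustrative only.

STREAM_STACK_MS = 100   # capture + encode + decode + display, per the post

def total_latency(network_rtt_ms: float) -> float:
    """Streaming overhead plus one network round trip (input up, frame down)."""
    return STREAM_STACK_MS + network_rtt_ms

for rtt in (0, 20, 50, 80):   # 0 = LAN streaming baseline
    print(f"RTT {rtt:3d} ms -> ~{total_latency(rtt):.0f} ms button-to-pixel")
# Even a decent 50 ms round trip pushes you to ~150 ms total, which is
# where a mouse starts feeling noticeably sluggish.
```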

>It depends on how many games you spend those 300 hours playing
You seem to be under the impression that the games are free on NVIDIA's streaming shit. They are not; you're expected to use your own Steam account and such. You're still buying the games.
>With a few clicks, they can connect to their own GeForce GTX virtual PC, install their favorite games from popular digital game stores -- like Steam, Battle.net, Origin, Uplay and GOG -- and start playing.
>Gamers can enjoy games they already own on the stores mentioned above, as well as purchase new games as soon as they're available.
So you pay the sub for the VM and pay for your own games too.

This has been tried before, and it failed hard. It will fail hard again, because no one wants this shit, and it will always be a bad deal for anyone who spends more than an hour a day playing games.

>bandwidth/ping which has been mostly fixed now.
And where do you live, South Korea? Fuck you, that shit has barely improved!

I'm in Canada, and the connection I pay $40/mo for is the same speed it was back when Napster was around, in the same city. Nothing has changed; higher-bandwidth plans are available at higher cost, but I'm still getting 15 Mbit.

Dedicated GPU.

Same thing I was wondering

>ping which has been mostly fixed now

fug, when did we invent faster-than-light internet

>canada
nobody gives a fuck about you thirdworlders

Would making it so GPUs could be shared between VMs be possible with just driver/firmware updates? It seems like it would really lower costs if they used high-end cards and divided the resources up between multiple people who bought the lower-tier service.
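Quick hypothetical math on why sharing would lower costs (every number here is made up):

```python
# All numbers hypothetical: amortized per-seat hardware cost when one card
# is split between several VMs versus dedicated to a single customer.

CARD_PRICE_USD = 600.0      # assumed high-end card price
LIFETIME_HOURS = 10_000     # assumed useful service life of the card

base = CARD_PRICE_USD / LIFETIME_HOURS   # $/hour for the whole card
for tenants in (1, 2, 4, 8):
    print(f"{tenants} VM(s) per card -> ${base / tenants:.3f}/hour per seat")
# Per-seat hardware cost drops roughly linearly with the number of tenants,
# assuming the card's resources can actually be partitioned cleanly.
```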

Could you start your own private service for this? Buy a GTX 1080 and then rent it out for certain hours of the day to customers through your own private VM service?

>Is it possible to divide up the resources of a GPU between multiple VMs
AMD has this tech

The majority of gaming will eventually be on thin clients.

It is inevitable, because it can be so profitable and allow such fine control for hardware companies and software producers.

Issues with stream quality, input delay, etc. don't really matter to the majority of consumers.

The question now is how long we have until traditional computing and gaming go away.

How the fuck will Nvidia and AMD make money if they're not selling physical products?

Is it done by having multiple GPUs on one card or by sharing a single GPU? I know Nvidia has their K series, but those just have 2-4 GPUs on a single card.

>if they're not selling physical products?
The companies running these services will still need hardware to run them on; it just won't be consumer grade. That is, if the situation ends up playing out that way. Then you get the choice between a thin client, or a datacenter-surplus rackmount server plus a modified thin client.

They'll still sell physical products, just not to end users. Anyway, I think it's going to be a long, long time until something like this becomes the standard way of playing video games. As far as competitive multiplayer goes, I'd say it's dead by default due to the significantly increased input lag, and potentially also the limited stream quality, which matters in games where accurately seeing a few pixels of an enemy's head can be important.

enjoy your lag

>how can I make more money off of renting something to someone

>sharing a single GPU
This

amd.com/en-us/press-releases/Pages/amd-unveils-worlds-2015aug31.aspx

Will the fabs have to move to renting the dies?

No, because product quality could suffer.
This is basically a further step in people being too stupid to build their own computers.

yes

Plot twist: you need a GTX 1080 to decode the data from the GTX 1080 VM.

Nvidia GRID cards can do that in conjunction with VMware vGPU (available on ESXi 6 Enterprise Plus). Expensive shit though, with not-that-great performance in my experience, at least with the K1.

Who fucking cares.

That pricing makes no sense; I can rent a Titan from Amazon for a buck an hour, and it works worldwide.

Also, unless Nvidia fixed the latency issue (which they didn't), this is trash; you need 30 ms max ping for that shit to feel responsive.
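Taking that claimed $1/hour Amazon figure at face value (it's this poster's number, not a verified AWS price) against Nvidia's announced tiers:

```python
# Comparing Nvidia's announced per-hour rates against the poster's claimed
# $1/hour Amazon GPU rental. (Real AWS GPU instance types and prices vary;
# $1/h is the claim above, not a verified quote.)

tiers = {
    "GeForce NOW, GTX 1060 tier": 25.0 / 20,    # $25 per 20 hours
    "GeForce NOW, GTX 1080 tier": 25.0 / 10,    # $25 per 10 hours
    "Amazon GPU instance (claimed)": 1.0,       # poster's figure
}

for name, usd_per_hour in tiers.items():
    print(f"{name}: ${usd_per_hour:.2f}/hour")
# The 1080 tier is 2.5x the claimed Amazon rate, before you even get to
# the ping problem mentioned above.
```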

Isn't this what OnLive did?