Why aren't people making programs to utilize moar cores?

It's weird to think we have GPUs with thousands of processing units, yet anything beyond a quad core on a PC is treated as overkill.

Name of the short-haired cutie? I wanna cuddle and talk about poetry with that fine woman.

Mayli
Purest girl you'll ever meet :^)))))

>using the smiley with a carat nose

It's because she's done two Facial Abuse videos and he wishes to indicate a sly ruse.

Deviant.

you can't have a baby in a month by getting nine women pregnant.
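That quip is Amdahl's law in a nutshell: the serial part of a job caps the speedup no matter how many cores (or mothers) you add. A quick sketch, with a made-up 10% serial fraction:

```python
def amdahl_speedup(serial_fraction, n_cores):
    """Maximum speedup when serial_fraction of the work cannot be parallelized."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_cores)

# With 10% serial work, even absurd core counts cap out near 10x.
for cores in (1, 2, 4, 8, 1024):
    print(cores, round(amdahl_speedup(0.10, cores), 2))
```

The 1024-core line lands at about 9.91x, so the last thousand cores bought almost nothing.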

Because most programmers are shit

Look, some operations cannot be parallelized, exactly like that anon said.
And GPUs have so many cores because each one can work on a different object in the environment.

Because designing parallel code is hard.

There are issues with shared access, data races, and synchronization. If you don't know exactly what you are doing, you get nothing done correctly, and most coders are fucking clueless.
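A minimal sketch of the kind of race being described: several threads incrementing a shared counter. The read-modify-write on `counter` is not atomic, so without the lock the threads can interleave and lose updates; with it, the answer is deterministic.

```python
import threading

counter = 0
lock = threading.Lock()

def worker(n):
    global counter
    for _ in range(n):
        with lock:       # remove this lock and updates can interleave and get lost
            counter += 1

threads = [threading.Thread(target=worker, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 400000 with the lock; often less without it
```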

Fucking love this gif

GPUs aren't suitable for most general processing tasks.

What this guy said.

And this is important too; it's pretty much the reason most non-CPU-intensive tasks aren't worth parallelizing. Parallelizing also introduces overhead, and if that overhead is greater than the speedup, it's not worth doing.
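The overhead point in back-of-the-envelope form (all the numbers here are made up): model parallel runtime as the work split perfectly across cores plus a fixed coordination cost, and small tasks come out slower than just running serially.

```python
def parallel_time(serial_time, cores, overhead):
    # crude model: perfect split of the work plus a fixed coordination cost
    return serial_time / cores + overhead

# A 10 ms job with 5 ms of thread/IPC overhead on 4 cores:
print(parallel_time(10.0, 4, 5.0))   # 7.5 ms: a modest win
# A 1 ms job with the same overhead:
print(parallel_time(1.0, 4, 5.0))    # 5.25 ms: much slower than just doing it serially
```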

>GPUs aren't suitable for most general processing tasks
Actually, they are. If you have some big chunk of data you need to process, you can implement it on the GPU for a huge profit.

The problem is, you have to implement it. As in, actually design the algorithm yourself, because there's almost nothing published in this field, even in terms of raw whitepapers from some European PhD. Then you have to know how to insert the GPU part into the bigger workflow to actually get your profit.

For people who use C# or Python, or even your regular C++ OOP coder, this is nigh impossible.

That's not her, you idiot.

You may want to learn what general processing actually means.

please let there be a video where she REEEEEEEEEEEEEEEEEEs with this thing

I don't get paid for that shit desu.

GPUs can only perform a very limited set of tasks. AFAIK the best way to summarize it is that they can only do the same thing at the same time, just with different data.

That's helpful for graphics, because you're shading thousands of polygons/triangles per second, but not so helpful for general processing, where one core might do this and another one something completely different.
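"Same thing at the same time, different data" is the SIMD model. A CPU-side toy of the idea: one shared operation (a gamma curve here) mapped over many pixel values; a GPU runs exactly this kind of map, but across thousands of hardware lanes at once.

```python
def gamma(v):
    # the single shared operation every lane would execute
    return v ** (1.0 / 2.2)

pixels = [0.0, 0.25, 0.5, 1.0]       # different data, one value per "pixel"
shaded = [gamma(v) for v in pixels]  # a GPU does this map across thousands of lanes at once
print([round(v, 3) for v in shaded])
```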

they are though

Parallel and concurrent programming is more difficult than the average code monkey can handle.
There are tools more appropriate than OOP, but those too are beyond your average programmer.
Instead, people jump on node.js etc.

>falling for the moar cores meme
>not realizing that, except for niche applications like gaming, rendering, or VR, an i3 suits most people just fine
>needing to justify your purchase of an Intel Core i7 Extreme Edition

We need more cores. Not just dozens but hundreds of thousands. Massively parallel processing needs to get better and more common so we can actually use it for tech that is within consumers' reach.
I'm thinking real-time path-traced games, for example. For that to work in real time, you essentially just "add moar cores".
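Path tracing is the textbook "embarrassingly parallel" workload: every pixel is an independent computation, so it scales with core count almost linearly. A toy of that structure (`trace_pixel` is a stand-in for actually firing rays; threads are used here so it runs anywhere, but for real CPU-bound work you'd use processes or a GPU, since CPython's GIL serializes threads):

```python
from concurrent.futures import ThreadPoolExecutor

WIDTH, HEIGHT = 4, 2

def trace_pixel(xy):
    # stand-in for firing rays through one pixel; any pure per-pixel
    # function works, which is why this parallelizes so cleanly
    x, y = xy
    return (x + y * WIDTH) % 256

coords = [(x, y) for y in range(HEIGHT) for x in range(WIDTH)]
with ThreadPoolExecutor(max_workers=4) as pool:
    image = list(pool.map(trace_pixel, coords))

print(image)  # same result as [trace_pixel(c) for c in coords], just spread over workers
```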

Sauce?

I would love real-time ray tracing to be possible. It adds a quality to the images, no matter how simple they are, that simply can't be reproduced by all the hacky tricks most game engines implement. The simplicity is amazing too, and it calls for extremely specialized hardware.

Anyway, real-time ray tracing is already possible on current hardware; the problem is that it generates very noisy images, so you'd want to lower the resolution to something like 720p. But right now everybody goes apeshit over 2160p gaming, which is nine times the pixels of 720p, so the noise is three times more apparent, which is not what you want.
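The nine-times-the-pixels, three-times-the-noise figure follows from Monte Carlo statistics: per-pixel error scales as 1/sqrt(samples per pixel), and a fixed ray budget spread over nine times the pixels leaves each pixel a ninth of the samples. (The 9M-rays-per-frame budget below is an arbitrary illustrative number.)

```python
import math

def relative_noise(samples_per_pixel):
    # Monte Carlo standard error scales as 1 / sqrt(N)
    return 1.0 / math.sqrt(samples_per_pixel)

budget = 9_000_000                 # rays per frame, fixed by the hardware
pixels_720p  = 1280 * 720          #   921,600 pixels
pixels_2160p = 3840 * 2160         # 8,294,400 pixels (9x as many)

noise_720p  = relative_noise(budget / pixels_720p)
noise_2160p = relative_noise(budget / pixels_2160p)
print(round(noise_2160p / noise_720p, 2))  # 3.0, i.e. three times the noise
```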

youtube.com/watch?v=BpT6MkCeP7Y
This is a video rendered in real time on two Titan Xs. It's super noisy while moving (it integrates successive frames, which is of course not really desirable), but it shows off nicely the photorealism this kind of rendering provides. The noise needs to be lowered, but a little of it actually plays nicely with the photorealistic look (and when you think about it, photos are noisy too, for the exact same reasons).

>that video
Heh, funny, I just took a whole course on advanced graphics taught by the guy who made Brigade.
He talked quite a lot about his ray/path-tracing engines; super fucking interesting.

You really don't have a clue what you are talking about.

Maybe artificial intelligence will fix this problem

yes it is

>you can't have a baby in a month by getting nine women pregnant.

Thank you for explaining this without the use of fucking cows and burgers.

She did more than one?