Multithreading

A few days ago, another user made a thread about multithreading in video games

That in turn inspired me to make another thread on multithreading, this time focusing on multithreaded Linux programs. The only ones I could find after some googling were a few compression programs like xz and lbzip2.

So why is this such an underdeveloped area on Linux? Why don't all programs get the updating needed to bring them into the 21st, multi-core century? It's dumb that we have all these CPU cores just being wasted because of shit 90s code.


It's difficult to program for, and one or two CPU threads is plenty for most things to run smoothly.

Don't forget that most people only have 2 or 4 CPU threads; not everyone has an 8350 or an i7.

But the number of people with only one thread is virtually zero any more. You'd need a CPU from around 2007 to still be on a single thread, and not many people use a computer that long.

I just checked, and it's more like 2004, when the first dual cores became available.

Many people still only have 2 threads though.

My mom is using an A4-7300, for example. My neighbor is using a G3260. My multimedia laptop has a Turion X2.

Besides, even though single-thread processors are rare, why would a program that runs smoothly on a single thread need to be optimised for more?

Funnily enough, if you have a copy of Windows, open Minesweeper and check how many cores it uses :^)

I was still using one until about a month ago; we're still out there, user.
Programs need to be made to automatically detect the number of threads present, and also let users set the number to use.
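That detect-then-override pattern is simple on Linux. A minimal sketch in C: query the online CPU count with `sysconf`, but let the user override it via an environment variable. `NTHREADS` is a made-up name for this example, not any standard convention.

```c
#include <stdlib.h>
#include <unistd.h>

/* Pick a worker-thread count: honor the (hypothetical) NTHREADS
 * environment variable if it is set to a positive number, otherwise
 * use the number of CPUs currently online. */
static long pick_thread_count(void)
{
    const char *env = getenv("NTHREADS");   /* user override */
    if (env != NULL) {
        long n = strtol(env, NULL, 10);
        if (n > 0)
            return n;
    }
    long online = sysconf(_SC_NPROCESSORS_ONLN);
    return online > 0 ? online : 1;         /* conservative fallback */
}
```

This is roughly what tools like xz do with `-T0` ("use all cores") versus an explicit `-T n`.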

>only 3
2 would still do the job for most multi threaded programs.
>why optimise
Because, thinking long-term, the more effort we put in now to have programs take advantage of multiple cores, the less work will be needed in the future and the less technological progress will be wasted. There's no way we're ever going back to single-threaded programming; it's best to look forward.
I don't have Windows, but I guess you're going to say it uses multiple cores/threads.

>open minesweeper and check how many cores it uses :^)
0?

>0
>minesweeper runs on magic dust

That must be some ancient af CPU, user.

I'm playing it and Task Manager says 0% CPU usage. It's taking more power to make this post than to run Minesweeper on my PC.

8 cores, IIRC.
And multithreading is the future; it's just catching on a little slower than it should.

For most normie tasks like web browsing, though, 2 threads is plenty, especially with the IPC of a modern Intel processor.

Are you mentally challenged?

Are you?

Well, not all tasks can be multithreaded effectively, and supporting more threads adds complexity. If the application is quite old, adding threading might also require an extensive rewrite.

LGA 775

Yep, that's ancient. Why did you upgrade, out of curiosity?

You answered your own question...

Lightweight applications like a text editor shouldn't use multiple threads unless there's a clear benefit. In applications like that, multithreading for any reason other than responsiveness (e.g. having editing handled by a different thread than saving) will mostly just add overhead. If you're searching through a 10 KB file, it would most likely be faster single-threaded. If it's 10 GB, it would be faster multithreaded, but that's just not a common use for a text editor.
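The split described above (parallel scan of a big buffer) is straightforward to sketch with pthreads. This is an illustrative toy, not any real editor's code: it counts occurrences of a byte by giving each thread a slice of the buffer, then summing the per-thread results. On a 10 KB input the thread-creation cost would swamp the work, which is exactly the point made above.

```c
#include <pthread.h>
#include <stdlib.h>

/* Per-thread work description: one contiguous slice of the buffer. */
struct chunk {
    const char *data;
    size_t len;
    char needle;
    size_t hits;    /* written by the worker, read after join */
};

static void *scan_chunk(void *arg)
{
    struct chunk *c = arg;
    c->hits = 0;
    for (size_t i = 0; i < c->len; i++)
        if (c->data[i] == c->needle)
            c->hits++;
    return NULL;
}

/* Count occurrences of needle in buf[0..len) using nthreads threads.
 * The last thread takes any remainder so every byte is scanned once. */
static size_t parallel_count(const char *buf, size_t len,
                             char needle, int nthreads)
{
    pthread_t tid[nthreads];
    struct chunk ch[nthreads];
    size_t per = len / nthreads, total = 0;

    for (int i = 0; i < nthreads; i++) {
        ch[i].data = buf + (size_t)i * per;
        ch[i].len = (i == nthreads - 1) ? len - (size_t)i * per : per;
        ch[i].needle = needle;
        pthread_create(&tid[i], NULL, scan_chunk, &ch[i]);
    }
    for (int i = 0; i < nthreads; i++) {
        pthread_join(tid[i], NULL);
        total += ch[i].hits;    /* no locking needed after join */
    }
    return total;
}
```

Because each thread owns a disjoint slice and results are only read after `pthread_join`, no mutex is needed; that "embarrassingly parallel" shape is what makes searching and compression easy to thread.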

Not useful to you, but there are people who sometimes need to edit large files in a text editor. I had to do that a few jobs back because it was the best way.
Multithreading would actually have made a difference there.

Professional and enterprise applications utilize many cores effectively. Normie apps aren't worth the extra effort, and video game programmers are usually the stupidest bunch, so no wonder multithreading is bad there.

I really don't see the point in compression beyond DEFLATE (zip/gzip), at least for transferring files over the internet. If you go too complex with compression, it takes longer to decompress than it would have taken to transmit a zipped file. If you're storing files long-term on a drive or in a storage facility, then compression like LZMA/LZMA2 makes sense, because you're not going to access the information frequently, so you won't be inconvenienced by the few times you have to decompress the files.
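That trade-off is just arithmetic: total delivery time is transfer time of the compressed size plus decompression time. A back-of-envelope sketch, where every number is a made-up illustration rather than a real benchmark:

```c
/* Total seconds to deliver a file: send the compressed bytes over the
 * link, then decompress them on the receiver.
 *   orig_mb     - uncompressed size in MB
 *   ratio       - compression ratio (2.0 means half size on the wire)
 *   link_mb_s   - link throughput in MB/s
 *   decomp_mb_s - decompressor output throughput in MB/s
 */
static double delivery_seconds(double orig_mb, double ratio,
                               double link_mb_s, double decomp_mb_s)
{
    double compressed_mb = orig_mb / ratio;
    return compressed_mb / link_mb_s + orig_mb / decomp_mb_s;
}
```

Plugging in numbers shows the point: a stronger codec only wins when the link is slow enough that the smaller payload repays the slower decompression; on a fast link, gzip-class compression often comes out ahead.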

Any good MPI tutorials?

google.co.uk/search?ei=qq_BV4nIFOjSgAbq5rzoDw&q=mpi tutorial&oq=mpi tutor&gs_l=mobile-gws-serp.1.0.0i67k1l2j0l3.6257.7237.0.11422.7.7.0.0.0.0.1590.6720.2-1j7-4j1.6.0....0...1.1j4.64.mobile-gws-serp..5.2.1260...0i20k1.Cv5nkqRE7Eo

A few reasons, but mainly because the motherboard was failing.
SATA transfers were slow as fuck. I need two Ethernet ports, one of which was a PCI card, and the PCI slots stopped working, so I couldn't boot with a card in. All the USB smoothing caps blew, resulting in increasingly slow write times, down to kB/s on USB 2.0. The HDMI port went out so I had to use VGA, and a couple of other things I can't remember.

I do a lot of intensive stuff, like running lots of virtual machines and >300 tabs of Firefox. On the old computer I tried to re-encode a video to work on the Xbox because I wanted to watch it in HD on the TV and it was in a format the Xbox couldn't read; the re-encoding took over 6 hours. I was just tired of not having the processing power that I should have (read: deserve), so I decided it was about time to call it quits and upgrade. The new rig is a Z170 board with a 6700K @ 4.2 GHz, but a magic switch on the motherboard will take it to 4.6 without me having to do anything in the BIOS. The memory is low, but at 8 GB @ 3.0 GHz it's a LOT more than I was working with, so I'm happy. I'll probably get more later.