My adviser is a complete cheapskate and he's unwilling to provide me with the computational power I need to wrap this PhD thesis up, even though he has the funds to do so.

I have about $1000 of my own money that I managed to scrape together to purchase a PC to run the simulations. I have a case, a power supply, and a hard drive, and I could probably scavenge an old GPU from somewhere.

What I do need is the best bang-for-the-buck CPU and motherboard combo that will come in under $1000. I was going to go with the 1950X, but I missed out on the $700 sales, so now I'd go over budget because the motherboards are too expensive. Some of the scripts and software I run are multi-threaded, but not all of them, so clock speed is also important; I can't sacrifice too much of it for more threads, or vice versa.

Don't blow your own money on a computer that will be used for college.
Unless it's a machine you'd want for yourself anyway, don't do it.
Use the cash they have and buy some ancient dual-LGA2011 board from eBay, but do not build a high-end machine for them.

Used Xeon
Don't connect it to the Internet
Don't apply the Meltdown patch

your university doesn't have computers?

Spend it on AWS compute.

If you don't want to go second hand:
Ryzen 1700
b350m motherboard
8-16GB of whatever ram
That's probably around $500-600 depending on how much RAM you need. I would go for a cheapo 60GB SSD for this kind of build as well.

If I don't buy one it will take me longer to graduate. I already have a job lined up and just want to put this mistake of an "education" behind me. He has the funds available to buy the 1950X right now, and I offered to buy all the parts and put the computer together so we don't have to rely on the IT guy, but at the end of the day he's a cheapskate. I guess he's hoping to pocket the leftover money from the grant.

I looked at Xeon, but then I run into the same situation as with the 1950X, i.e. the motherboards are expensive.

All the simulations are multi-hour. The most recent ones take over 20 hours to complete. I also use some commercial software that is only installed on our lab computers.

We are an experimental lab, so we were never properly equipped when it comes to computational power. I can make do with what I have (access to a single computer), but even one additional computer with an equivalent CPU (i5-7500) would cut the time needed to finish the simulations in half.

>If I don't buy one it will take me longer to graduate
No it won't.
You just want a super powerful PC in college to run your shit. I used shitty computers at the lab, and all I had to do was use a smaller sample size; proving your algorithm works is way more important than how powerful the PC running it is.
If you can prove it scales well, you don't need to give Cray a call so you can run it on 36,000 Xeons during your thesis presentation.

Do whatever you want, but you'll be giving more money to your college than you should just because you want to use a nice PC, one you could just buy for yourself and use at home instead.

I need commercial software installed that is so niche there are no easily available cracked versions, and I can't afford to pay for the license. It's a multi-thousand dollar license per year. If I claim that my computer is a lab computer the IT guy will install the software for me. Then I just need to connect via VPN to the school network, or leave the computer in the lab, so that it can connect to the license server.

Time is a problem here and I'm afraid that if I go second hand I'll buy something broken and then I'll be both out of time and money. I can hit the $1000 limit if it's worth it since I do have a job lined up, so once I start working I should make the money back in no time, and of course I get to keep the computer so it's not like it's money completely wasted.

I use commercial software as a solver for input files that I feed it. The commercial software is multi-threaded, but the script that modifies/prepares the input files, feeds them to the software, and then saves the output isn't.

The software takes 14-16 seconds to finish a simulation, and I do a few thousand simulations. I can run two of these scripts simultaneously on my PC without melting it down. If I had more PCs and/or a PC with higher clock speeds/more cores I could run more of these scripts simultaneously.
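
If the prep/feed script is the only single-threaded part, you can get most of the speedup on whatever CPU you end up with by having one driver launch several solver processes at once instead of starting the script twice by hand. Rough sketch below; the solver name, the .inp extension, and the paths are placeholders since I don't know your exact setup, only the pool pattern matters:

```python
# Minimal sketch: run many pre-generated solver cases in parallel with a process pool.
# "chemkin_solver", the .inp extension, and the directories are hypothetical placeholders.
import subprocess
from concurrent.futures import ProcessPoolExecutor
from pathlib import Path

SOLVER = "chemkin_solver"      # placeholder for the real solver executable
INPUT_DIR = Path("inputs")     # perturbed input files prepared beforehand
OUTPUT_DIR = Path("outputs")

def run_case(input_file: Path) -> Path:
    """Feed one input file to the solver and save its output."""
    out_file = OUTPUT_DIR / (input_file.stem + ".out")
    with out_file.open("w") as out:
        # Each call blocks ~15 s; the pool keeps several cases in flight at once.
        subprocess.run([SOLVER, str(input_file)], stdout=out, check=True)
    return out_file

if __name__ == "__main__":
    OUTPUT_DIR.mkdir(exist_ok=True)
    cases = sorted(INPUT_DIR.glob("*.inp"))
    # max_workers is the knob: 2 on the current i5, more on a bigger CPU.
    with ProcessPoolExecutor(max_workers=2) as pool:
        for finished in pool.map(run_case, cases):
            print("done:", finished)
```

The point is that the driver itself barely uses any CPU while the solver runs, so the number of workers is limited by cores and the license, not by the script being single-threaded.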

I also do additional post processing on the output and I need to use the whole output file which is usually a few gigabytes in size. That clocks my CPU usage at 100% for about 30 minutes.

And I do get to keep the PC. It's my money so there isn't a chance in hell I'd leave it after I graduate.

If you can't deal with having to run 20 hour simulations you shouldn't be researching. That was expected even in my Master's.

You can try buying Amazon server time; if it's good enough to brute-force WPA, it will be good enough for your sim.

What kind of simulations, and for what purpose? I'm wondering why you would need so many.

For how long will you be needing it? If for only a short period of time, consider renting a server.

>phd
u sound like a dumbass

What kind of simulations? Are you talking about fluid simulations? Either way, just think about your own personal needs and don't buy something just for a school project. Think about what you are going to need in the future and act accordingly.
You can rent CPU time (I think even Google does that, not sure) if you need a farm, and that's way better than building a single box for simulation in any case. So...

I'd still recommend AM4 with an 8-core. With the X399 and X299 platforms you run too close to $1000 without substantial performance gains over AM4 8-cores (X299 8-core or 12-core), and that's before you've bought RAM. The 1700 is where the bang for the buck is, especially since it overclocks to 3.8GHz with ease, or just get the 1800X for $150ish more; note you need a cooler as well. Do you know how much RAM you need? It's $100 per 8GB these days.

This suggestion is coming from an Idea I had earlier this year where my work might have needed some more horsepower for some CPU 3d rendering and ryzen7 looked like the best candidate for size, power and cost (I did want it in mITX though).

So in this case if you need a computer for home then buy something nice.
If you need computational power for a project and require a farm, do not even think about buying something with your own money, because it's not going to be enough anyway; you'll end up with a half-assed setup that you can't even use for your own personal needs after you finish the project.
Just buy a nice computer for yourself and ask money for renting a farm. Then go on with your life.

I did years of experimental work. I'm not in CS or even CE. This is just a frivolity to appease my adviser so he lets me graduate. It wasn't a requirement before, but the previous student did it, so now he expects me to do it too.

The simulations deal with emissions, and it's a sensitivity analysis. I perturb various factors to determine which have the greatest impact on the results, so I can then modify them to better match the experimental data.

A few weeks to months, as I might be asked to do more simulations down the line.

SSH into your University's HPC server and run your simulations like a normal person.

>computational power
>not using GPU acceleration
wtf kind of third world country are you getting your PhD from?

Literally all colleges have a room full of Dell workstations with top of the line Nvidia Maxwell or Pascal GPUs. You just have to ask to use it.

>taking 20 hours to complete
You aren't really simulating until it takes a week to complete.

Yeah, I can't believe I forgot about the ram. My lab computer has 32GB so I can use that for memory intensive tasks. The scripts that need a lot of computation power fortunately don't need a lot of memory, so I could even get by on as little as 4GB.

Also, I should have mentioned that the case I have is ATX, so I'm not really restricted as far as motherboards go.

I'm in the same boat as OP. I need a computer to numerically solve the complete Navier-Stokes equations over a 747 for my PhD. I only have $1.5k to spend, please recommend something.

Why don't you just buy server capacity and time from some provider?

Which field do you work on?

Because I need commercial software that I can't install myself as I don't possess the license.

Chemical engineering.

What could you possibly be doing that you don't have enough power in a fucking university lab but a single consumer grade chip will suffice?

So I assume you have experimental data, and need to determine either an equation, physical properties or a combination of both. Do you really need to run thousands of simulations for that? Are you using Chemkin?

You're assuming he's from a non-shithole country.

I am using Chemkin, and I'm perturbing reaction rate constants. This allows me to obtain correlation coefficients for the individual reactions and a visual representation of the variation of the model with respect to the perturbations.

I could do fewer simulations, but this was started by the previous student, so I would have to justify why, especially since my model has more reactions.
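
Not your code obviously, but once you have the perturbation factors and the corresponding outputs collected per run, the correlation step is only a few lines. A sketch with made-up array names and stand-in random data:

```python
# Sketch of the correlation step for a Monte Carlo sensitivity study.
# The arrays below hold stand-in random data; in practice they would contain, per run,
# the log of the perturbation multiplier applied to each reaction and the model output.
import numpy as np

n_runs, n_reactions = 2000, 150
rng = np.random.default_rng(0)
log_perturbations = rng.normal(0.0, 0.1, size=(n_runs, n_reactions))
outputs = log_perturbations @ rng.normal(size=n_reactions) + rng.normal(0, 0.05, n_runs)

# Pearson correlation of each reaction's perturbation with the output:
# reactions with |r| near 1 dominate the sensitivity.
corr = np.array([np.corrcoef(log_perturbations[:, j], outputs)[0, 1]
                 for j in range(n_reactions)])
ranking = np.argsort(-np.abs(corr))
print("most influential reactions:", ranking[:10])
```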

>could do fewer simulations
Then do. One thing I can always spot is when people try to pad their work with a ton of examples so that whatever shoddy work they've done goes unseen.

If whatever you're doing is correct it will stay correct with a thousand or ten thousand simulations.

>but this was started by the previous student so I would have to justify why especially since my model has more reactions.
Then explain to your teacher why you would use a smaller sample.

I'm not really from the chemistry field, but I have worked with some of you guys more than I'd like while doing seismic data processing.

t. user who has no patience for undergrad students trying to filibuster their sloppy grad work in hopes we just let them move on.

The reason the number also matters is that it's a Monte Carlo analysis, so I can't drop the count too much, but I will try it with fewer simulations.
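
For what it's worth, the Monte Carlo error only shrinks like 1/sqrt(N), so cutting the run count mostly widens the error bars rather than breaking the analysis. Quick arithmetic (the 4000-run baseline is just a guess, plug in your own number):

```python
# Standard error of a Monte Carlo estimate scales like sigma / sqrt(N), so relative
# to an assumed 4000-run baseline, fewer runs inflate the error bars by this factor:
import math

BASELINE = 4000  # assumed number of runs, not OP's actual count
for n in (4000, 2000, 1000):
    print(n, "runs -> error bars wider by a factor of", round(math.sqrt(BASELINE / n), 2))
# 4000 -> 1.0, 2000 -> 1.41, 1000 -> 2.0
```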

I'm literally IT support for a really, really large STEM university and I feel for you user.

If you're really willing to spend the money, you should be able to throw together a system with a 1700 and 16-32GB of ECC RAM. Are there any other lab systems to scavenge a graphics card or a scratch drive or something from?

When do you plan on graduating, PhD user?

How often do you deal with angry undergrad and grad students when things break down? Especially during finals week when all the projects are due?

There is a cheap low profile graphics card I have access to.

I was hoping for this summer but whether it happens is another thing.

pcpartpicker.com/b/qHxG3C

Something like this maybe? A deep learning computer for around $550?

I feel for you man, I wasn't sure if I would graduate on time until hours before my oral defense

I don't know if this would help, but at my uni we have different compute resources you can SSH into, and they all share the same network drive. I was able to run my multi-threaded application on four different Xeon servers, plus two auxiliary machines where I ran VirtualBox VMs I could SSH into, and there I could run the scripts without any ulimit.
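
If OP gets access to anything like that, the farming-out part is easy to script since the machines see the same files. A sketch with made-up hostnames and a hypothetical run_case.sh wrapper sitting on the shared drive:

```python
# Sketch: one thread per SSH host, each pulling the next pending case off a shared queue.
# Hostnames, the shared path, and run_case.sh are placeholders for illustration.
import queue
import subprocess
import threading
from pathlib import Path

HOSTS = ["xeon1", "xeon2", "xeon3", "xeon4"]   # hypothetical server names
cases = queue.Queue()
for f in sorted(Path("/shared/project/inputs").glob("*.inp")):
    cases.put(f)

def worker(host):
    while True:
        try:
            case = cases.get_nowait()
        except queue.Empty:
            return
        # The shared network drive means the remote side sees the same paths we do.
        cmd = f"cd /shared/project && ./run_case.sh {case.name}"
        subprocess.run(["ssh", host, cmd], check=True)
        print(f"{case.name} finished on {host}")

threads = [threading.Thread(target=worker, args=(h,)) for h in HOSTS]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

One thread per host means a slow box just takes fewer cases instead of stalling the others.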

Tfw Electrical & Computer Engineering Undergrad Student Mustard Race

Dual Xeon workstations are plentiful used. Building doesn't always save money and is highly overrated, though it's fun. Don't patch, only connect to the network when required, do your sims, graduate.

ITfags, what dual CPU servers or workstations would give OP best bang for his money?

How long would it take on your current system and why can't you wait for that?
I know it sucks when it takes a week to run and you have to hand in the paper the day after, but the only way out of this is to plot while you are calculating, so you can see the trend as it happens and, if a crash occurs, you don't lose everything.
If a desktop is enough, just buy one yourself; it's much faster than going through the university.

Grad students in general are basically indentured servants and are either pretty chill or so beaten down from interactions with everyone with a microscopically better CV that they never give us shit.

The one thing that makes most professors lose their shit completely is email service outages. Profs live and die by their zillions of exchange calendars and double-digit gigs of email.

Profs are generally pretty okay here, but many have a habit of sitting on their asses until they need something and then OH SHIT I NEED ALL THIS STUFF YESTERDAY, and we have to politely shut them down because neither we nor anyone we're contracted with can conjure entire labs' worth of systems from our asses in 1-2 days.

We don't run a trash shop here, so stuff doesn't tend to go down during finals week. We don't manage the LMS, so that does go down sometimes, and it makes students angry motherfuckers, because profs can be dickheads and won't give extensions to turn shit in if, say, Blackboard goes down on the last day an assignment or project is due. I don't tend to blame that team though, because all learning management systems are complete fucking trash.

You can tell who has money and who doesn't. New profs or profs who are flush with cash are buying ridiculous shit just because, while those who aren't have grad students using PCs handed down from the fucking office staff.

We don't do purchasing like a lot of university IT shops. We have contracts with Dell/HP, and we consult/price out builds when the various schools come to us needing stuff. They then provide the funding based on the quotes we produce and we deploy/support the systems. We don't get some pile of money to buy stuff to support labs.

Because of this, however, we specifically bumped up our base machine (and DO NOT allow buying anything less), because some offices/schools/profs are so fucking tightfisted, so that when the systems inevitably filter down to grad students they aren't garbage.

just rent google/azure/amazon butt computing

I have a 7500 in the computer I'm using right now, so this would cut things in half time wise.

Things are really poorly run here. The license can only be used on department computers, but the department has no computer cluster. The IT guy is also overly protective of the department-owned computers, so he won't allow the computers in undergrad labs to be used for anything other than what's done in those labs.

Yes, this school is shit, and I'm fully to blame for falling for the PhD trap with a shit adviser in a shit school.

You could maybe find some old Core 2 PCs your uni throws out, then stick $5 quad-core Xeons from China in them and run your shit in parallel.

This is actually a good idea, OP should look into AWS or Azure, I know the AWS Educate pro/g/ram is free for students (not sure about their compute resources though), the only thing to consider is the learning curve but you're going to spend time building out your system anyways so might as well learn something that'll pimp out your C.V.

I can wait, but the longer I wait the longer until I graduate. This is a matter of me trying to get out of a shit situation sooner.

that's not a bad idea but only if he's not handling export controlled or proprietary data.

The only acceptable answer for unbelievable tier is mechanical engineer. Computer engineer is bitch tier and aerospace engineer is fake news tier.

Mech Engineer is god tier. I got a job without even applying crushing spines.

>computational power I need to wrap this PhD thesis up
A toaster from 1985?
You're writing a thesis; stop whining and write it on the laptop or desktop you're clearly typing this thread on.

because you got referred. anyone that gets referred gets a job easily.

>1700 is where the bang for buck is, especially if over clocked to 3.8ghz with ease
1700X is only ten bucks more senpai

if you can run your shit on some cloud bs, go rent some time on amazon ec2 or similar

if not, ryzen 1700x is fantastic value after the recent price cut, you could build a nice rig around one for that money if you didn't care about gpu (throw in some cheap shit rx550 or gt1030 or the like, put the rest into ram/storage as needed - can get a better gpu later if you want to gayme or run gpu-based shit)

If he isn't doing CUDA/OpenCL anything he can just rip whatever piece of shit is available out of a spare PC from his lab. It just needs to be able to throw to a display and that's all.

No cooler, aho

>whaaa they won't give me what I want, my adviser must be out to get me
>I'm a victim, help me, I'm a victim of my mean ol' adviser

Get told to use Xeon
>I'm too clueless to know about the LGA771-to-775 mod

You shouldn't be in such a hurry to get out of college. With this kind of attitude and resourcefulness you aren't going to amount to shit in the real world. College is life on easy mode.

Half the kids in my CNC machining classes were mechanical engineering grads trying to get CAD certs because they couldn't find any work in their field. It should probably be somewhere between Shit and Suicide tier.

Does your university not have a cluster somewhere you can use? Fuck, just send out a mass e-mail to the grad school and ask if anyone has a spare workstation sitting idle. A lot of states maintain supercomputer authorities you can use if you are attending school. Lastly there are the national supercomputers you can use for X hours until you have to get approved.

Also, why no GPU acceleration?

source: PhD that does simulations and built his own dual Xeon/GeForce CUDA workstation

just rent a virtual machine with a lot of RAM from Google, like $0.40 an hour

t. butthurt adviser

What kind of third rate uni doesn't have a cluster of very high end machines for exactly this purpose?

>$1000 budget
>suggesting 10+ year old quad core processors
why are the biggest idiots always the ones with a superiority complex? a modern Celeron or R3 processor would be a better suggestion, you fucking idiot.

Rent a VPS (Google Cloud) or get a Ryzen 7 1700/1700X for about $300, depending on the pricing in your region. Motherboards should be $150-200, and 8GB of DDR4 RAM above 2666MHz is about another $200.

Use cloud meme:
- paperspace.com
- shadow.tech

I am not even kidding.
> tons of cores
> plenty of RAM
> Quadro P5000-P6000 or GTX1080

Much easier than just building a shitbox for your thesis, user. If you have any further questions, ask away. I use both.

If you need a dedicated machine (and no GPU), you can even use Hetzner and just pay for one month, as there is no setup fee whatsoever. Though the biggest option is around 64GB of RAM, IIRC.

Or, just as others said, use Amazon or Google Cloud; if your simulation finishes in a few days, it's still just a few dollars instead of a thousand. You can even rent GPU machines from Amazon; they even have the new Volta V100 in stock, though that's $1.40 or so per hour.
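
Back-of-envelope with the hourly rates quoted in this thread (treat them as ballpark, not current pricing; the number of remaining batches is a pure guess):

```python
# Rough cloud cost check using the rates mentioned above; the rates and the
# batch count are assumptions from this thread, not quotes from any provider.
cpu_rate, gpu_rate = 0.40, 1.40   # $/hour, as quoted in the posts above
hours_per_batch = 20              # one of OP's longer runs
batches = 10                      # guessed number of remaining batches

print("CPU VM:", round(cpu_rate * hours_per_batch * batches), "USD")    # ~80
print("V100 box:", round(gpu_rate * hours_per_batch * batches), "USD")  # ~280
```

Either way it's a small fraction of the $1000 hardware budget, the catch being the license server issue OP mentioned.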

btw why do you guys recommend Google Cloud?
I'm just legit curious; I used Amazon all the time, as Azure proved to be a pain in the butt.

> what dual CPU
well, you can buy used Xeons on eBay, but to be frank, a simple modern CPU runs circles around those. So as others said, it's much cheaper to just use a cloud service instead of building your own. If you must build a PC and do heavy computation that runs for long hours/days/weeks, just go with AMD Threadripper and ECC RAM.
But as RAM prices went up lately, fuck that, user. I wanted to replace my older PC, but I will keep using the cloud maymay, because RAM is just so fucking overpriced.