Previous thread: Whatcha workin' on, Sup Forums?
>putting /dpt/ in the subject edition
First for: what language is best for serious AI development? I want to make my robot qt and name her Eva.
Doing some actual fun CRUD with some live databases and APIs.
Fuck the haters, I love working with data.
First for C++ is shit
First for guile
>haters
Are you a 12 year old that frequents youtube comments or something?
Haters gonna hate hate hate
thank you for not posting a fag thread
I'm just gonna shake, shake, shake, shake, shake
Python seems to be the most commonly used language for machine learning; most of the Python ML libraries are themselves written in C++.
And fakers gonna fake fake fake
god that text rendering is awful
Jeg ville poste en norsk tråd. ("I wanted to post a Norwegian thread.")
Wait how does
>Python seems to be the most commonly used library for machine learning
relate to
>most of the Python ML libraries are written in C++.
To me that seems more of a "learn C++" than learn python.
First for rectangle with words on it
Probably D, Go or Rust.
_good_ AI is performance intensive. And those are modern speed languages.
Could of course go C++ but then you have a bunch of morons making pull requests if you're ever open sourcing it.
Why are there zero debug utilities for determining where WPF controls' data sources are bound?
It's because ML people can't program and need a babby language
I can't wait for 30 posts saying 'roll' and literally no one actually posting their results.
>Jeg
I
>ville
wanted to
>poste
post
>en
a
>norsk
Norwegian
> tråd.
thread
Thanks, vikings, for continuously raping my ancestors. Now I too can understand the language of the norsemen.
tfw you search tutorials on youtube and the only videos are made by indians
>Could of course go C++ but then you have a bunch of morons making pull requests if you're ever open sourcing it.
>open sourcing my waifu
>shiggy diggy
I'm not investing potentially thousands of hours just to make her available to the entire world. My waifu isn't a slut thank you very much.
Stop programming in C, then.
it's not that python is good, it's just that python is all that these "academics" and hobbyists know
Is it just me, or is the section for "I" missing a particular entry?
>it's not that python is good
True. I learned that recently when I listened to a podcast about how EVE Online was supposedly written in Python, only for the dev to say "oh, most of it was actually written in C++ or Java."
yeah, iceland
roll
>tfw there's literally no better way to do what you're doing and the best way is tedious
You mean Iraq?
it only includes programmers who are noteworthy
Clearly he meant Iroquese.
What language and problem?
There is a suspicious lack of Indonesians on that list.
That's not correct, I'm afraid.
looks like shit, you're probably doing it wrong
Rust is based, prove me wrong.
Don't be such an antisemite
It looks like JavaScript?
Real talk, Java is a great language and anyone who disagrees is an edgy kid who just started learning to program
it's more like "i was going to post a norwegian thread"
What's the best resource for stuff that K&R doesn't cover, like threading and sockets?
this, java and C++ are literally the best programming languages
Still butthurt that you don't understand Haskell, huh?
> capitalized method names
> Visual Studio colors
It's C# you dunce.
haskell isn't hard to understand, it's just fucking shit, get your head out of your ass smug tard
.Map is capital, so not likely. C#?
Moving data from a view from one database, to a table in another database.
The destination table is not exactly the same as the source, and so the data needs to be individually mapped to new column names, and functions need to be performed on some of the data to obfuscate it for public viewing.
It's basically a custom ETL package, and I would bet my penis(female) that there's not much of a better way.
It's only good for raking in money doing codemonkey work.
Tie between Python and Java. See TensorFlow and Deeplearning4j.
>dynamic typing
I bet you couldn't even grok applicatives, I'll be enjoying my delicious fast development and type safety
Programming babby here; let's say that I have nested dictionaries in C#; three of them, to be exact. Looping through one is easy enough, but how do I access the other two underlying dictionaries and print their contents based on input from the user? This was easy enough to understand in Python but I've run into some limitations with the end goal of my project by using Python.
Using LINQ with lambda expressions.
I'll write up an example in a sec.
You shouldn't have learned Python, you have broken your brain and now can't think outside Python. The only solution is a cerebral enema.
I half-agree with you.
Modern Java (Java 8 and on) is great because you have lambdas and streams and default methods on interfaces and all sorts of good stuff like that, plus the JVM is actually fast.
Legacy Java (Java 6 or 7, or, god forbid, pre-generics 1.4) is fucking shit, because you don't have enough sugar to work around the class system, so you have to write Design Pattern Soup.
Actually, could you give me your dictionary declarations so I can show you based off of what you're doing?
>grok
kill yourself
>fast development
stay delusional
var ppcDatabase = new Dictionary<string, Dictionary<string, Dictionary<string, string>>>
{
    {
        "AL",
        new Dictionary<string, Dictionary<string, string>>
        {
            {
                "Autauga",
                new Dictionary<string, string>
                {
                    { "Autaugaville", "6" },
                    { "Billingsley", "9" },
                }
            }
        }
    }
};
Don't talk to me or my wife's programming language ever again.
Not to mention the benefits of pure functions when it comes to testing.
Face it, you couldn't code your way out of a paper bag if your dildo collection depended on it.
>being retarded
>hurr, let me write my own distributed, GPU-enabled machine learning library
Also Java is statically typed, you autist.
I was talking about Python being shit, faggot.
Just make a mapForMe function. Pass in the thing you're mapping to and an array of what you're mapping. Have it map every element in array. Learn how to automate your life code monkey
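The mapForMe idea can be sketched roughly like this. Everything here (`Row`, `ColumnMap`, `mapRow`, the transform field) is a hypothetical in-memory version, not the anon's actual ETL package, which presumably talks to live databases:

```cpp
#include <functional>
#include <map>
#include <string>
#include <vector>

// One database row, column name -> value.
using Row = std::map<std::string, std::string>;

// One source-to-destination column mapping, with an optional transform
// (e.g. the obfuscation for public viewing). All names are hypothetical.
struct ColumnMap {
    std::string src;
    std::string dst;
    std::function<std::string(const std::string&)> transform;  // may be empty
};

// "mapForMe": apply every column mapping to one source row.
// Row::at throws if a mapped source column is missing, which surfaces
// config errors early instead of silently writing empty cells.
Row mapRow(const Row& source, const std::vector<ColumnMap>& mappings) {
    Row out;
    for (const auto& m : mappings) {
        const std::string& v = source.at(m.src);
        out[m.dst] = m.transform ? m.transform(v) : v;
    }
    return out;
}
```

Running `mapRow` over every row of the source view produces the destination table, and the per-column transform lambdas carry the obfuscation.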
No, it's still shit. Lambdas and streams and exceptions interact horribly and the workaround is a complete mess. Generics are still the most basic kind (pun intended, I'm not talking about C# being better here).
pretty sure you have extraneous brackets there.
Also there's a possibility that nested dictionaries are not the optimal data structure for whatever you are doing.
ppcDatabase["AL"]["Autauga"]["Autaugaville"]
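For comparison, the same shape and full traversal translate directly to C++ with nested std::map. This is only a sketch of the iteration pattern the anon asked about; the `Db` alias and `dumpAll` name are mine, not from the thread:

```cpp
#include <map>
#include <string>

// Same shape as the C# declaration above: state -> county -> town -> value.
using Db = std::map<std::string, std::map<std::string, std::map<std::string, std::string>>>;

// Full traversal: one range-for per nesting level, flattened into
// "state/county/town = value" lines.
std::string dumpAll(const Db& db) {
    std::string out;
    for (const auto& [state, counties] : db)
        for (const auto& [county, towns] : counties)
            for (const auto& [town, value] : towns)
                out += state + "/" + county + "/" + town + " = " + value + "\n";
    return out;
}
```

Direct lookup is the same chained-subscript expression as above (`db["AL"]["Autauga"]["Autaugaville"]`); the triple loop is what "looping through the other two underlying dictionaries" comes down to in any language.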
Making a basic Pong game in Rust + SDL2
have you even made a single program worth mentioning in haskell lmfao
>Face it, you couldn't code your way out of a paper bag if your dildo collection depended on it.
nice projection
I wrote a Python compiler in it.
I still say D or Go though.
I'm not him but why does literally everybody who uses Haskell use it for writing compilers or interpreters or other programming language theory shit?
It's some powerful trend.
give me a proof
I can't, my employer owns the code.
haskell is useless for writing real software
>the C subset of C++ has no restrict keyword, no designated initializers for structs/unions, and no flexible array members
Why did C++11 even bother updating its standard library's C reference to C99 if they aren't willing to put in the most useful things? They're planning to do it again in C++17 to go with C11, but not putting in the effort to make the two languages' atomics and threading compatible with one another makes it pointless yet again.
They should seriously just drop C compatibility if they're keeping it in what is basically name only.
I've been given conflicting advice between using nested dicts or hashtable. I don't have a whole lot of knowledge or understanding outside of how to use them in their simplest forms for simple textbook practice applications; this project was partly so that I wouldn't be operating in my comfort zone but I think I might be too far away from it. I'm not sure which would provide the best results.
Good question.
I think it's a combination of immutability, concise code, and good quality libraries available.
>my employer
nice lying on the internet lmfao, as if your employer uses haskell and needed you to write a python compiler
Proof that Haskell is a meme: You increment things by typing succ (Enum)
it give u the succ
actually OCaml is better for compilers due to polymorphic variants (which can't be done cleanly in Haskell)
So you're probably looking up stuff people pay to know.
>video tutorials
RTFM
>they should drop C compatibility
Yeah. Or you should move to a language not made by a retarded Norwegian.
succ is short for successor
i.e. the successor of a value
5 -> 6
'a' -> 'b'
>yfw Rust has a calculus of constructions
>what are compiler extensions
class Enum a where
    ...
    succ = toEnum . (+ 1) . fromEnum
    pred = toEnum . (subtract 1) . fromEnum
    ...
>grok
wew lad
>not being pedantic and sticking to the standard unless doing platform-specific work
>youtube has/had buffering issues since switching to ssl
>switch to deezer
>rediscovered some good tracks
>but the loud filtered shit on deezer is fatiguing to listen to
KILL ME
>>youtube has/had buffering issues since switching to ssl
you kidding?
in the last couple of weeks, since switching to the encrypted HSTS shit, some videos take a long time to load, and it's not just me; there are plenty of similar complaints online from the last couple of weeks
Idiot.
Idiot.
Yeah it's happening for me too in all browsers except Internet Explorer. Bravo Google.
I'm having trouble implementing a deciphering tool.
Basically, the decryption formula is d(x) = a^-1 (x - b) mod m (it's the Affine Cipher: rumkin.com)
And the first couple of letters work okay, but I can't get the correct answer after that (pic related). Here's my code:
int main(int argc, const char * argv[]) {
std::cout
Vmhhq and VMHHQ are very different things
I know, but I'm trying to do it with capital letters first to sort of understand the process. Don't want to add unnecessary errors.
>1./multiplier
Why the dot? You are limiting yourself to 7 significant digits.
1. is a double literal
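The posted code is cut off above, but the `1./multiplier` exchange suggests a^-1 was computed as a floating-point reciprocal, which is not the modular inverse the formula d(x) = a^-1 (x - b) mod m calls for. A minimal sketch for uppercase letters with m = 26 (the function names and the brute-force inverse are mine, not the poster's code):

```cpp
#include <string>

// Smallest positive t with (a * t) % m == 1, or -1 if a and m share a
// factor. Brute force is plenty for m = 26; extended Euclid would scale.
int modInverse(int a, int m) {
    a %= m;
    for (int t = 1; t < m; ++t)
        if (a * t % m == 1) return t;
    return -1;
}

// Affine decryption d(x) = a^-1 * (x - b) mod m over 'A'..'Z'.
std::string affineDecrypt(const std::string& cipher, int a, int b) {
    const int m = 26;
    const int aInv = modInverse(a, m);
    std::string plain;
    for (char c : cipher) {
        int x = c - 'A';
        // (x - b) can be negative, and C++'s % keeps the dividend's sign,
        // so add m before the final reduction to stay in [0, m).
        int d = aInv * ((x - b) % m + m) % m;
        plain += char('A' + d);
    }
    return plain;
}
```

For the classic a = 5, b = 8 example, affineDecrypt("IHHWVC", 5, 8) yields "AFFINE". The floating-point route fails after the first few letters precisely because 1/a rounded to a double is not congruent to a^-1 mod 26.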
Hello?