Dude, why are you learning python?

>dude, why are you learning python?

I use python because matplotlib and scitools are nicer to work with than R.

Why do you create threads that don't offer anything of value?

Still better than the apple shill threads spammed here every fucking minute.

This post was brought to you by Apple™, please buy our Products™.

i thought about learning Python but then remembered it uses whitespace instead of braces

sometimes i want to write a robust and multi-functional program but i only have it in me to write 17 lines

Questionable statement; the further I am into my math & stats degree, the more I like R.

There's a reason why maths graduates tend to be horrible programmers.

Butthurt csfag

because it's versatile.

So I can write a script for the Blender API.

True

I've tried to interpret enough proof of concept computer vision programs written by mathematicians to understand that you are full of shit. They clearly do not understand how computers work nor how to design programs in a manner that is easily maintainable and extendable for others. You can see it in virtually any applied mathematics paper that includes example code, they literally name their functions and variables "x" and "g" and "sigma" and "mu" because they merely replicate formulas in a naive fashion.

But that's the correct way to do it. When the target audience for your code is pretty much mathematicians reading your algorithm implementation, matching their terminology is playing to that audience.

>Mathematics journal uses mathematical variable notation, which is both standardized and familiar to everyone in the field, i.e. the people who read Mathematics journals
Zounds! What a discovery!

Except computers do not work that way. This is why computer vision is never going to be completely overtaken by AI in a matter of years, because math guys continue to write subpar programs that perform like shit and are unmaintainable piles of garbage.

>they literally name their functions and variables "x" and "g" and "sigma" and "mu"
So what?

It's not only the papers, their source code is like that too. This is how you end up with shit like Haskell.

There's a fundamental difference, then, between a math major's program and a cs major's program. I'm a cs major and I would never name a variable x if I planned to use it for more than a one-liner (i.e. more than a lambda expression)

i end up with method names like

validateClassroomForClassEventWithActiveOrganizations()

or something along those lines

and any variable i define is extremely detailed as well. The point is that the programs used by cs majors tend to be worked on and added to for years and years, whereas math major programs are usually more along the lines of scripts used for utility purposes, sometimes only used a couple times

Whatever makes you feel superior in your mediocrity. "I may have no idea what actual real-world AI is and how it works, but if I can throw up enough huff about how jargon looks weird to outsiders then I can ignore it."

>t. has never had a job as a programmer

and, more often than not, cs major programs are shared between other developers that never meet each other, so it's important that a dev is able to look at a program and read it like a book

Your post makes absolutely no sense, what are you trying to imply here?

People who write in mathematical journals are career mathematicians. They aren't software developers. Who is this "they" you're talking about? Obviously, by sheer statistics, as the language is only used seriously in certain niche fields that aren't exactly paragons of open collaboration, most users of Haskell are not career developers with sophisticated senses of code aesthetics and maintainability. What does this have to do with any of your grand, sweeping claims?

nah, the real reason AI can't take over is because javascript is too fucking wonky for a computer to write itself. Throw a framework or two on there and the AI just kills itself

I'm implying exactly what I said. You see that quote? Those are captions for your thought process. I thought that was obvious, but then again I'm talking to you, so my mistake.

to be fair, that post doesn't make a lot of sense, although i don't necessarily agree with the person that it was responding to

How then should I name a variable for a data point, acceleration of free fall, standard deviation and mean if not "x, g, sigma and mu"?

>People who write in mathematical journals are career mathematicians. They aren't software developers
My point exactly. Mathematicians can't program for shit.

>You see that quote? Those are captions for your thought process.
Except that doesn't make any sense at all. Are you so desperate to win an internet argument that you just attribute complete random sentiments to random people?

var accelerationOfFreefall;
var standardDeviation;
var mean;
var dataPoint = {
    x: null,
    y: null
};

although really you are writing it for those who will be reading it later on, including yourself. If it's only intended for mathematicians, i think it makes sense that you would use variable names from the domain of your field. When it comes to cs majors, the correct thing to do would probably be to use built-in functionality for whatever calculations are necessary (i.e. don't write a function to determine the min/max/mean of two values when you have a static Math object available with those methods already in place). However, if a cs major had to write a similar function, the variable names i mentioned would probably be better, as they would communicate the purpose and intent of each line of code to an audience that isn't strictly mathematicians
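To put a concrete sketch on that (Python, since it's the thread topic; the sample data and names are invented for illustration): lean on the standard library's statistics module instead of hand-rolling the formulas, and use descriptive names for a mixed audience.

```python
# Sketch of the point above: use the built-in statistics module rather
# than reimplementing mean/stdev, with descriptive names for readers
# who aren't strictly mathematicians. The sample data is made up.
import statistics

reaction_times_ms = [210, 250, 230, 270, 240]

mean_reaction_time = statistics.mean(reaction_times_ms)    # arithmetic mean
reaction_time_stdev = statistics.stdev(reaction_times_ms)  # sample stdev
fastest, slowest = min(reaction_times_ms), max(reaction_times_ms)
```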

>nah, the real reason AI can't take over is because javascript is too fucking wonky for a computer to write itself. Throw a framework or two on there and the AI just kills itself
Well, CUDA programs written by mathematicians aren't that much better desu.

Do you seriously not understand the difference between a "Math grad" and a career mathematician? However you gotta parse the world, buddy. Just don't expect to ever be put in charge of talent acquisition.

Whats wrong with them? Just interested.
> tfw I am a math major and think that I can program in CUDA

Both learn the same kind of shitty practices, user. Math and physics grads are truly horrible programmers until they get out in the real world and learn proper software engineering techniques.

Fine fine, I'll spell it out slooooow for the little brainlet.

First I say "I (you) have no idea what actual real-world AI is and how it works."

I say this because you say "This (mathematicians being terrible programmers) is why computer vision is never going to be completely overtaken by AI in a matter of years"

This is because modern AI isn't even written by mathematicians and has very little to do with mathematics. Heuristics are like the complete opposite of mathematics. You're most likely going off a perception of AI from back when it was just an academic's play toy with no real-world applications, but that is a horribly outdated notion.

Then I say "but if I can throw up enough huff about how jargon looks weird to outsiders then I can ignore it."

That's because your argument about mathematicians being bad programmers is built entirely on their choice of variable names. And as I have explained already, that notation is used by every mathematician since Newton and they all understand it. It might be "unmaintainable garbage" to you, but to them it's just a standard formula.

Everyone is a truly horrible programmer until they get out in the real world and learn proper software engineering techniques. Especially CS majors. The difference is that when encountering a novel optimization problem or linear transform that is different from the standard boilerplate, we have a solid foundation in theory to fall back on, whereas you're probably just going to look at Stack Overflow and then give up when it isn't there.

>Whats wrong with them?
Well, this is obviously anecdotal, but there are two things I in particular see over and over again.

The first is bounds checking. Many programs are written in a lazy manner where they don't care that some threads are working on data out of bounds as long as it doesn't crash for the specific input the program is given (usually these programs are made only with a few input cases in mind and then presented as a "proof of concept"). However, when given non-trivial input they tend to make all sorts of illegal memory accesses and crash horribly.
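For illustration, here is the standard CUDA bounds-guard idiom simulated in plain Python (the saxpy example, names, and sizes are all invented): the grid is rounded up to whole blocks, so the trailing threads of the last block fall past the end of the data and must bail out instead of writing.

```python
# Plain-Python simulation of a 1D CUDA grid computing out = a*x + y.
# The "if i >= n" guard is exactly what the lazy proof-of-concept
# programs omit; without it the trailing threads of the last block
# would make out-of-bounds accesses on non-trivial input.
def saxpy_grid(n, a, x, y, block_size=256):
    num_blocks = (n + block_size - 1) // block_size  # round up
    out = [0.0] * n
    for block in range(num_blocks):
        for thread in range(block_size):
            i = block * block_size + thread  # global thread index
            if i >= n:                       # bounds guard
                continue
            out[i] = a * x[i] + y[i]
    return out
```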

The second thing I frequently see is a lack of understanding of kernel startup cost. I've seen programs where kernels are launched in tight loops, because the development GPU is some simple GeForce card with a relatively low launch cost. But then, running the same program on a Tesla, it slows down by a factor of ten.
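A back-of-envelope model of that effect (both constants are invented, purely illustrative, not measurements of any real GPU): with a fixed cost per launch, splitting the work into many tiny launches is dominated by overhead, while one big launch amortises it.

```python
# Toy cost model for kernel launch overhead. The constants are
# made-up illustrative numbers, not measurements of real hardware.
LAUNCH_OVERHEAD_US = 10.0    # assumed fixed cost per kernel launch
PER_ELEMENT_US = 0.001       # assumed cost per element of actual work

def total_time_us(n_elements, n_launches):
    work = n_elements * PER_ELEMENT_US
    return n_launches * LAUNCH_OVERHEAD_US + work

# One launch over a million elements vs. a million one-element launches:
one_big = total_time_us(1_000_000, 1)
many_small = total_time_us(1_000_000, 1_000_000)
```

On a card where the per-launch cost is small the difference is easy to miss during development, which is exactly how the tight-loop version survives until someone runs it on different hardware.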

Again, this is obviously anecdotal, but my impression is that they simply don't care as long as their proof of concept works and they can publish a paper about it.

It's true; even after 4 years of college I was a terrible developer until about a year and a half of work in the industry

My point was NEVER that mathematicians implement AI, you brainlet. My point was that modern computer vision (i.e. 3D reconstruction, see SIFT and variations) can run circles around existing neural net approaches when done correctly, but that when implementations are made hastily and sloppily with no regard for the real-time requirements of actual real-life applications, it won't be long until other AI approaches overtake the field entirely.

I don't know what the hell you are rambling about at all, you just made up some completely random sentiment I never expressed and then started arguing against it. You're basically having a discussion with yourself.

you sound like you have an online verified 140 IQ

A "solid foundation in theory" doesn't help shit when you don't understand how a computer even works. That's how you end up with garbage collectors having to sweep up literal gigabytes per second (see Haskell meme).

Because I'm a year 1 IT student and it's part of my syllabus.

Isn't that OK for proof-of-concept code?
thanks

> IT student
What does it even mean?

Proper CS degrees require a wealth of coursework in mathematics, proofs, and stats. I have no clue what you're ranting about fag, but yes, math students' code sucks dick because they don't take a wealth of it nor stretch their minds into the world of software.
Correct
AI was hijacked by mathematicians and statisticians in the previous AI Winter. While their current stat based algos have garnered much attention, they are not AI.
Wtf is this trash?

>validateClassroomForClassEventWithActiveOrganizations()
i'm glad i'll never have to touch anything you write

> they are not AI.
> AlphaZero, written by mathematicians, beats Stockfish, written by cs majors
> still not true AI

AlphaZero beat crippled stockfish. That whole ordeal was intellectually dishonest.
Also no, optimization problems are not AI.

Optimization and applied statistics aren't AI.
Both formally require tons of data/training/compute power because you're literally brute-forcing your way through a state space. Fancy convergence algos (britannica.com/topic/convergence-mathematics) aren't AI either, no matter how much flair you put on it. The game of Go/dickfish are all exploitable through Markov trees/chains/decision processes... nothing more than applied mathematics

> mfw you end up responding to someone deeply versed in mathematics and CS and don't have a leg to stand on. They're not AI; they're applied mathematics/statistics, which is why the frameworks/solutions are such convoluted black boxes

> optimization problems are not AI
> AI means acting rationally
> acting rationally means acting with a minimum loss/maximum gain
> finding min/max is optimisation

Oh and the framework eras almost always produce brainlet devs.
> mfw I tried to create a simple dynamic data display in python, which took all fucking day because each library or base library function had some convoluted exception/lack of functionality/dependency
> mfw you search the web to see if anyone solved this problem and come across 50 walkthroughs that still all have exceptions

Python is an overextended scripting language and nothing more. Something you use to stumble your way through a ghetto-ass prototype which should be dropped soon after.

>if I call it a "scripting language" it will somehow delegitimise it as a programming language
Explain this meme to me.

Your dad's sperm must have been developed using python then.

> mfw black and white view of the world
> mfw mathematicians theory of intelligence
> mfw you see why AI is currently dominated by such a dumbass and limited approach to intelligence
> Logic and min/max
> Woops desu, i settled on a local maximum.. Anneal dat shit dawg
> "Annealed insight convolution Turing muh dick spaghetti adversarial RELU Neural Network" to the rescue

Lost in a bowl of convoluted bullshit....
> tfw you let mathematicians and even worse : statisticians take over something as complicated as an attempt to solve intelligence

It was an example. If i was writing a method for validating a classroom for an event, i would probably provide it an enumerable collection of organizations as a parameter, valid in the context of the place it was called from, as opposed to creating two methods for active vs all. Also, the method name implies that the classroom is being created for the class event, which would be unlikely in any scenario, so it could probably be boiled down to validateClassEvent(ClassEvent classEvent, Classroom classroom, IEnumerable validOrganizations)

but again, it was just an example meant to illustrate that long, descriptive method names are one of many practices that are good to follow when working in a business environment where many people are touching the same codebase

It depends on what you want to do desu. I just love Octave, but I know when to drop it to use R. Even python has some great stuff. I like them all desu.
t. 2 years in applied maths masters

>tfw not anime girl

where are all the good languages?

> meme
Yeah, so basically when you get a proper CS degree and do fundamental software development you learn the pros/cons and uses of various languages. Scripting languages are convoluted and tool'd in the sense that their structuring doesn't scale, is messy, and runs into severe limitations when you try to do complex things; they're more suited to prototyping/glue code... This fits Python exactly, as well as many other convoluted framework languages. They get shit done very quickly, messily, with little to no effort. They give you the ability to make a quick tool to get a task/job done and move the fuck on. They aren't programming languages. They're tool/script languages.

The butthurt of a framework kiddie...
> Everything is done in framework languages desu
> Look what I can do in less than 1 min.... can you do this in your programming language?
No wonder computer security is fucking ass and people are getting hacked 50 ways to Sunday:
> I thought the 80 libraries I used were secure.

Then why are pretty much all the performant web frameworks written in Python, if it is so bad at scaling?

>mathematics, proofs, and stats
AHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAH

Because they are written in Go?

kek why is there even beef between CS and Maths?

Because literal Sup Forums memes.

Name a single web framework in Go and I will tell you how it's not even remotely used compared to flask, django, pylons, web2py, wheezy, Sanic etc.

Because web programmers are some of the most shit-tier software developers, who like everything easy, librarie'd, and lego-blocked.
> "Performant"
Aka, able to churn out some template/standard block in less than a week. Web shit doesn't scale btw. The real scaling is done in the back-end tooling, which is all in lower-level languages. Python is just the glue/entry/exit interface to the more serious code/apps. As I stated: a tooling/script/glue-code language

The response of a man who does not know what he's talking about.

all this fighting aside, can we talk about how shitty the documentation is for flask? somebody needs to pay for that

Salty sepplesfag detected

As someone who's used flask, I agree

>trust me, it's just bad, it just is
So meme it is then. Thanks for clearing that up for me.

It's awful, right? I had to do a project in school where we set up a flask server and it was probably the hardest thing i had to do in my 5 years there. That's including all the discrete structures and algorithm analysis shit. I probably spent 40 hours a week on homework in that class alone, trying to figure out how the fuck flask worked. Now i'm working in mvc/aspnet/entity framework and the whole process is so much easier and better defined

Rephrased: mathematics has yet to progress beyond its convoluted structuring, purposely designed to make widespread adoption as difficult as possible. It was designed this way so that low-intelligence but hard-working dweebs can brag: we're an exclusive group of academics.

> mfw Feynman reduced a grueling and convoluted 7-year PhD in Quantum Electrodynamics down to a lecture series and a handbook of diagrams on the fly
> mfw dweebs forced niglets to go through 7 years of convoluted hell because : muh shit tier math
> mfw someone with true intelligence came by and made diagrams and simplified it all
> mfw rage mode because : muh convoluted mathematics are now accessible to the lay

Anyone w/ integrity who is deeply versed in it will tell you that it is unnecessarily convoluted bullshit...

Because I can write simple programs in seconds.
Because it's a great way to interpret abstract data into a form that better languages can read.

Well, perhaps you can suggest other scripting language for me?

A well versed truly 'full stack' engineer speaking coherently about the hierarchy of software development present throughout the OSI layers.

haha

Savage