TFW fell for the "Python is shit" meme

>TFW fell for the "Python is shit" meme

>TFW only now realized how great of a language it actually is
Why does Sup Forums always want to unnecessarily complicate things?

Is python a good language to learn after C?

Python isn't shit. If you're working with machine learning stuff, you're probably going to be using some Python, due to the sheer volume of Python libraries dedicated to the task.

That said, it has its flaws. It is roughly as slow as Ruby for the same reasons, and its syntax can get gross for a few things, particularly lambdas and private methods (which don't actually exist in Python).
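For the record, Python's "private" is a naming convention plus name mangling, nothing enforced - a sketch (the Account class here is made up):

```python
class Account:
    def __init__(self, balance):
        self._balance = balance   # single underscore: "internal" by convention only

    def __audit(self):            # double underscore: name-mangled, not actually hidden
        return self._balance >= 0

    def check(self):
        return self.__audit()

acct = Account(10)
print(acct.check())               # True
print(acct._Account__audit())     # the mangled name is still callable from outside
```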

It's so versatile and useful though, compared to doing everything in fucking FASM and C++

this is a bad mentality to follow. just because something has a lot of libraries pertaining to your area of interest doesn't make it a good language. sometimes working from scratch, at a fast pace you're comfortable with, is better than having a ton of tools but despising the language they communicate in.

>It's so versatile and useful though
The same can be said of a lot of languages. Between Python, Ruby, and a few other scripting languages, the decision typically comes down to either libraries or personal taste.

>compared to doing everything in fucking FASM and C++
C++ is actually surprisingly flexible. It's just that it's also very unforgiving. FASM is an inferior assembler to NASM.

I just never realized how useful scripting languages were, maybe that's why I was always so angry and kicked my dog all the time.

I have an intense hatred of its whitespace shit. I'll never work in Python, though I can still shit out simple workable programs in it with my eyes closed if I have a reference.

There's a reason perl took over the world: it got shit done without much studying. Awk has always been the god-king of CSV.

I don't get the hangup with white space. What exactly is the issue with enforcing good standards?

It's not "good" it's just design, whitespace delimited is just dumb compared to unambiguous braces. Sure, they're ugly and some people can't type for shit but they're much nicer all around.

You can enforce good standards through tooling (see: Go). The problem is that being whitespace sensitive means that it is more difficult to determine if a given program is syntactically correct.

Python is a really comfy scripting language and 3.6 brought some significant performance boosts.
I find it readable and fun, it's rather quick to produce results in it, and it supports both OOP and FP, so you can come at a problem in entirely different directions.
Also, for scientific computing, its libraries are overtaking MATLAB (numpy, scipy, pandas).

I don't get the issue people take with python's formatting. Yes, it's weird, but I stopped having indentation errors pretty much in the first month of studying it.
Configure your favourite editor and indentation Just Werks (TM)

Scripting languages have their purpose, as do compiled languages. A lot of people want to shoehorn the same language into every project, when in reality you should probably have a couple under your belt.

You should use your primary scripting language for tasks that are mostly I/O bound, won't be scaling up to run on a cluster of computers, and need to focus more on shorter development time than shorter runtime. These tasks may often run in only a couple of seconds or less. Compiled languages should be used for tasks that are more focused on performance, or are at the very least long running, and need to use less energy.

Sometimes, tasks will blur the lines between these roles. Writing a website backend in C is probably silly. You might be using Apache or Nginx (which are both written in C) as the server, but a scripting language to handle the site logic. Or the site might not need to scale much, even though it is long running, and so a simple web framework like Flask (Python) or Sinatra (Ruby) may be sufficient.
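The "simple web framework" case can be sketched with nothing but the stdlib: Flask and Sinatra layer routing and templating over this same request-in, response-out shape (the handler below is a made-up example):

```python
# Minimal WSGI app: the bare request -> response shape that Flask wraps.
def app(environ, start_response):
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"hello from a tiny backend"]

# Serve it locally with the stdlib:
#   from wsgiref.simple_server import make_server
#   make_server("localhost", 8000, app).serve_forever()
```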

>and 3.6 brought some significant performance boosts.

Elaborate. Is it worth learning Python now?

thank you im going to become good at python now

Not him but the dicts are now ordered, way smaller and faster

Also they finalised the Asyncio module.

Me now. I'd elaborate and say that since everything in Python is implemented around dictionaries (objects and namespaces are also just fancy wrappers around dictionaries), you get the improvement in everything, not just dicts.
Is it worth learning python now?
I don't know, you'll have to see how it compares to other languages in terms of performance.
It IS worth it to switch from python2 to python3.
I'm using it because it's fun and fast, and I need to sometimes process large amounts of data or automate something at work. I'm more of a hobbyist and an end user, not a dev, and as one, I like it.
Remember that python's performance is capped by its VM. Even if you write your code in cython (which will speed it up significantly if you need performance), it won't beat C (unless we're talking about ~100s GB of data and then it catches up asymptotically).
If you don't have a scripting language under your belt, definitely pick this one up.
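The dict change is easy to see for yourself (insertion order was an implementation detail in 3.6 and became guaranteed in 3.7):

```python
d = {}
d["zebra"] = 1
d["apple"] = 2
d["mango"] = 3
print(list(d))   # ['zebra', 'apple', 'mango']: insertion order, not alphabetical
```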

Okay I'm joining python master race now

Python's syntax is shit.
It just tries to be a hipster and do everything differently (and no, I'm not just referencing the whitespace)

Rather work in C(++), Java, or JavaScript anyday

I can't really read python after using C derived languages for so long. It's also very meh when you want to do anything larger than fizzbuzz.

I would rather do things in javascript than python. And I fucking hate javascript.

What a moron...

Anons, please explain your hate for python's syntax.
Is it just the whitespace meme, or something else?
For full disclosure - I kind of like it.

should we use true or TRUE?
>what's something different? True
should we use null or NULL. We could make it different and make it Null.
>And change it to something else: Nil.
how should we let the programmer define the entry point? I don't know how to fuck it up.
>if __name__ == "__main__"'
yeah, that should be good. How do we name an object's constructor? Method named after the class, like C++/Java? How about just "Constructor"?
>__init__ method
Okay, so are we going to have a "this" pointer/context?
>yep, but rename it self. Oh and it's not implicit. You have to put it as the first parameter of every method
ternary operator?
>fuck it up: change (cond ? true : false) to (true if cond else false)
multiline comment?
>Nah, just let them use a multiline string inline, and we'll ignore it.
bitwise operators? logical operators?
>sure, normal symbols for the former. The latter have to be spelled out (and, or, not).
major compatibility breaks between version 2 and 3?
>hell yeah. Make print a global function rather than a statement, so now every one has to be wrapped in parentheses. Other changes for no reason, too.
And this is just the tip of the iceberg. Literally every part of the syntax had one goal: not to be usable or intelligent, but to be different.
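For anyone comparing, here's the ternary point side by side (values are arbitrary):

```python
x = 5
# C/Java would write:  label = x > 0 ? "pos" : "nonpos";
label = "pos" if x > 0 else "nonpos"   # Python puts the condition in the middle
print(label)   # pos
```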

Well, those are valid complaints, and I can see how someone with experience in C-derived languages would be miffed by them. But I only did C/C++ at uni and Python and Tcl (God help me) at work, so they don't bug me at all, except for the dunder __name__ thing, though I heard they're thinking of changing that to solve the ugliness

Most Sup Forumsaylords here only know how to format HTML documents with some PHP sprinkled in. Of course they are going to trash talk anything else.

Sup Forums is Sup Forumsreat if you do the opposite of what these fucks tell you to do. If it's not a meme language and/or compiles native binaries it's shit here.

hey user, idk if you were the one asking for programming direction the other day and some of us suggested some python books.

if so, keep on truckin' user:)

"private" methods don't exist in any language since you can call any function through machine code. Python just makes it easier to do and assumes that if you're going to touch private stuff you know what you're doing.

>It's different therefore it's shit

> housecucks not living in glorious functional cave

>what's something different? True
They could've pissed you off by going the Objective-C way. They didn't.
>And change it to something else: Nil.
There are plenty of other languages using nil.
>if __name__ == "__main__"'
This is essentially only used to give libraries direct "executability". Other than that, this is not necessary.
>yeah, that should be good. How to an object's constructor? Method name of class like C++/Java? How about just a "Constructor"?
Why should it follow their customs? By that logic, every language not doing it like your favorite must be shit just for doing something different. And why the fuck would it be named "Constructor" when it isn't in the languages you named either?
>__init__ method
Makes perfect sense in the context of python's special method naming scheme.
>fuck it up: change (cond ? true : false) to (true if cond else false)
"I always whipped my niggers on the field, that's how we always did it!"
>Nah, just let them use a multiline string inline, and we'll ignore it.
By far not the only language doing that. It's not python's fault that you used only different ones so far.

Your whole post makes it very clear that you are some edge-lord attending some CS class in high school, feeling way too great while "enlightening" all the stupid people on Sup Forums. I hate your kind so much.

>And this is just the tip the of iceberg. Literally every part of the syntax had one goal: not to be usable or intelligent, but to be different.
Seems like all of your "knowledge" about programming languages stems from C/C++/Java/C# which are intentionally very, very similar.
However, there are far more languages used today and many of them greatly differ to the ones you named.
Next you'll call out Lisp for using lists for function calls, that goddamn stupid language, right?

>A language's community does not contribute to whether it is good or bad

are you retarded

>TFW fell for the "Python is good" meme

>TFW only now realized how retarded its semantics and implementations are
Why does Sup Forums always go for the technically mediocre?

G never memed this though

Eh... it has its qualities and uses, but for generic scripting I like perl more. It's halfway between shell and C, which is ideal for that task. Python might be better at some tasks that are closer to programming than scripting but don't require speed, though.

Personally I prefer Ruby (or even Perl), but scripting languages definitely have their place.

On the other hand, they are a gateway for n00bs. Nothing wrong with being a beginner, we all started somewhere. But if you can ONLY play with things like Python or Go, you shouldn't call yourself a programmer.

I also made up the part where I fell for it, since I spot shit-tier semantics instantly when I see them.

Python is pretty good for simple scripts and prototyping. If you need to do some serious (team) development or need to wrangle a lot of data (for tasks that don't have a library with a C backend), then there are a lot of better options.

Right tool, right job and all that.
Also, never listen to Sup Forums

Perl brofist!
I'm not alone!

Python is shit

fuck off tranny cuck

>If you're working with machine learning stuff, you're probably going to be using some Python
Wrong.
quickdocs.org/clml/

Are there other clisp libraries? Cause the argument there was just that a bunch of people have already done fundamental shit in python, not just one.

>Not using php
Kys degenerate faggot

Yes, there are other more general mathematics libraries. There are also bindings to GSL if needed.

I would legitimately prefer to start fresh with a completely different style of programming than have to use python.

Well, Lisp is fairly different, but not so much that you'll be left confused for a long time. You should be able to get up and running fairly quickly; however, I would recommend some utilities to make your life easier.
Quick and easy route:
Roswell can handle most of the steps below for you. (github.com/roswell/roswell)

More involved route:
First, there are many compilers out there. I personally use SBCL, although CMUCL, ECL, and CCL are good free choices. I think LispWorks has a free (gratis) version.
Second, get Quicklisp (quicklisp.org/beta/). The setup is fairly pain-free, and I believe ECL comes with its own copy.
Third, get SLIME or SLIMV. This basically turns emacs or vim into a CL development environment.
Fourth, get Paredit. This handles having to manage all the parentheses for you.

Literature:
Practical Common Lisp
On Lisp
Common Lisp the Language 2
Common Lisp Hyperspec

Feel free to. But you won't do something meaningful anyway, so it doesn't matter. It's other people creating stuff and apparently they don't have that much of a problem with Python.

I don't have a problem with python that extends to me wanting other people to stop. It rubs me the wrong way. I don't do a lot of stuff with the VM langs anyway but some thought experiment/learning is always good.

I also do a lot of really hacky shell one-liner style things (from the perl tradition), and python has never been good at that either.

>great language
>fatal error because you used spaces instead of tabs or vice versa

Python makes large codebases hard to maintain and hard to refactor because of dynamic typing. It baffles my mind that there are so many libraries written for it. Anything that'd be caught by static types is a runtime error hidden somewhere in python.

Dynamic typing isn't nearly as bad if you don't have automatic type casts (aka weak typing)

>because of dynamic typing.
How so?

Basically I hadn't seen good arguments here against python except
> It's different from what I'm used to / what I like
and
> hurrr whitespace I don't know how to configure a text editor or write code without 2^^6 levels of indentation
The first is an okay reason but doesn't say anything about the language, only about the user's flexibility and taste. The second... is weird for me. I stopped having indentation errors and scope mistakes in less than a month, which makes me think some people have disgusting programming practices.
Which brings me to Perl.
And I want to apologize in advance to all the people who love Perl. I'm sure it's useful and powerful and whatnot, but I find it disgusting. It looks like it was designed by a lobotomized turtle.
> perl is designed to be a natural language
YES I KNOW thank you Wikipedia.
I read about it and the design decisions. They seem brilliant on paper but I find the result abhorrent. I can read Perl code but I don't enjoy it, and I'm sorry for using you as an example
> I also do a lot of really hacky shell one-liner style things (from the perl tradition)
But this seems masturbatory and self congratulatory.
Maybe it's because I work in an enterprise environment and don't only write for myself, but while I'll feel very clever writing complex one liners (which I did, and shared them around work as well), and mind-bending list-comprehensions, I have to ask myself what I'm trying to achieve? Do I want to feel clever, or do I want to produce results?
When the clever solution is the most efficient, both in performance and production time, I say go for it, but otherwise, I'll take clean, clear, beautiful code every day.

Also
> lisp
you mean Job Security?

TL;DR
Python is swell as a scripting language. It's great for scientific work, data processing, machine learning and general purpose. It feels natural and moves away from the text processing ugliness of other scripting languages to a more data oriented approach

What is the recommended material to learn python?

as a perl programmer (put down the pitchforks guys. I'm in bioinformatics) I hate the white space meme

However I adapted to python for jerbs

Just dick around with it for a while.

hello fellow perl monk

I want to learn pyqt and shit

Onelinerman here, getting shit done in a one-off way without having to rely on support or anything is just a fact of life in some places. It isn't JUST perl, sometimes you just need to awk something to shit out specific results real quick. It's not what I would consider 'workflow' just one-off get shit done time-efficiently. As far as I understand it... once you've written perl once you're not supposed to re-read it anyway. It's for stuff simple enough to rewrite it completely the next time.

That was not a comment on python as a project-oriented language though, you're right that certain scripting things are absolutely incompatible with working in teams. Python was one of the first to merge the two domains to some extent. Ruby, too but the more-than-one-way approach is just asking for problems. I just wish Python didn't use Python's way personally.

>[ObjC]
>not best bool naming
Ill poz u senpai.

Literally no serious ML is being done in Python. It's all C++.

- The Global Interpreter Lock (GIL) is a significant barrier to concurrency. Due to signaling with a CPU-bound thread, it can cause a slowdown even on a single processor. The reason for employing the GIL in Python is to ease the integration of C/C++ libraries. Additionally, the CPython interpreter code is not thread-safe, so the only way other threads can do useful work is if they are in some C/C++ routine, which must be thread-safe.
- Python (like most other scripting languages) does not require variables to be declared, as (let (x 123) ...) in Lisp or int x = 123 in C/C++. This means that Python can't even detect a trivial typo - it will produce a program which will run for hours until it reaches the typo, THEN go boom, and you lose all unsaved data. Local and global scopes are unintuitive. Having variables leak after a for-loop can definitely be confusing. Worse, the binding of loop indices can be very confusing; e.g. "for a in list: result.append(lambda: fcn(a))" probably won't do what you think it would. Why the nonlocal/global/auto-local scope nonsense?
- Python indulges messy horizontal code (> 80 chars per line), where in Lisp one would use "let" to break computation into manageable pieces. Get used to things like self.convertId([(name, uidutil.getId(obj)) for name, obj in container.items() if IContainer.isInstance(obj)])
- Crippled support for functional programming. Python's lambda is limited to a single expression (no statements). Python makes a distinction between expressions and statements, and does not automatically return the last expression, crippling lambdas even more. Assignments are not expressions. Higher-order staples like reduce were moved out of the builtins into functools in Python 3. No continuations or even tail call optimization: "I don't like reading code that was written by someone trying to use tail recursion." --Guido
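That loop/lambda line is a real gotcha worth sketching - fcn and the values here are stand-ins:

```python
def fcn(a):          # stand-in for whatever you'd call per element
    return a * 10

# Late binding: each lambda closes over the variable a, not its current value.
broken = []
for a in [1, 2, 3]:
    broken.append(lambda: fcn(a))
print([f() for f in broken])   # [30, 30, 30]: they all see the final a

# Common fix: freeze the current value with a default argument.
fixed = []
for a in [1, 2, 3]:
    fixed.append(lambda a=a: fcn(a))
print([f() for f in fixed])    # [10, 20, 30]
```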

>Go on sentdex youtube channel
>Half his fucking channel is about machine learning in python

- Python has a faulty module system. Type time.sleep=4 instead of time.sleep(4) and you just clobbered the process-wide sleep function with a trivial typo. Now consider accidentally assigning some method to time.sleep, and you won't even get a runtime error - just very hard-to-trace behavior. And sleep is only one example; it's just as easy to override ANYTHING.
- Python's syntax, based on the SETL language and mathematical set theory, is non-uniform and hard to understand and parse, compared to simpler languages like Lisp, Smalltalk, Nial and Factor. Instead of the usual "fold" and "map" functions, Python uses "set comprehension" syntax, which has an overwhelmingly large collection of underlying linguistic and notational conventions, each with its own variable binding semantics. Automatically generating Python code, or using it from a CLI, is hard due to the so-called "off-side" indentation rule (aka Forced Indentation of Code), also found in the math-heavy Haskell. This, in effect, makes Python look like an overengineered toy for math geeks. Good luck discerning [f(z) for y in x for z in gen(y) if pred(z)] from [f(z) if pred(z) for z in gen(y) for y in x]
- Python hides logical connectives in a pile of other symbols: try seeing "and" in "if y > 0 or new_width > width and new_height > height or x < 0".
- Quite quirky: triple-quoted strings seem like a syntax-decision from a David Lynch movie, and double-underscores, like __init__, seem appropriate in C, but not in a language that provides list comprehensions. There are better ways to mark certain features as internal or special than just calling it __feature__. self everywhere can make you feel like OO was bolted on, even though it wasn't.
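The time.sleep=4 typo is demonstrable in a few lines - the assignment is accepted silently and the error only surfaces at the next call site:

```python
import time

real_sleep = time.sleep
time.sleep = 4            # meant time.sleep(4); Python accepts the assignment silently
try:
    time.sleep(0.01)      # the TypeError shows up here, far from the typo
except TypeError as err:
    print("boom:", err)   # 'int' object is not callable
time.sleep = real_sleep   # undo the clobbering for the rest of the process
```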

- Python has too many confusing non-orthogonal features: references can't be used as hash keys; expressions in default arguments are calculated when the function is defined, not when it’s called. Why have both dictionaries and objects? Why have both types and duck-typing? Why is there ":" in the syntax if it almost always has a newline after it? The Python language reference devotes a whole sub-chapter to "Emulating container types", "Emulating callable Objects", "Emulating numeric types", "Emulating sequences" etc. -- only because arrays, sequences etc. are "special" in Python.
- Python's main GC strategy is naive reference counting, which is slow and doesn't handle circular references on its own (a supplementary cycle detector has to sweep for them), meaning you have to expect subtle memory leaks and can't easily use arbitrary graphs as your data. In effect Python complicates even simple tasks, like keeping a directory tree with symlinks.
- Patterns and anti-patterns are signs of deficiencies inherent in the language. In Python, concatenating strings in a loop is considered an anti-pattern merely because the popular implementation is incapable of producing good code in such a case. The intractability or impossibility of static analysis in Python makes such optimizations difficult or impossible.
- Problems with arithmetic: no Numerical Tower as in Scheme (rationals and complex numbers are bolted on rather than integrated), and in Python 2, 1/2 produces 0 instead of 0.5, leading to subtle and dangerous errors.
- Poor UTF support and unicode string handling is somewhat awkward.
- No outstanding feature that makes the language, like the brevity of APL or the macros of Lisp. Python doesn't really give us anything that wasn't there long ago in Lisp and Smalltalk.
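The default-argument behavior called out above ("calculated when the function is defined") is the classic gotcha, sketched here (function names are made up):

```python
def append_bad(item, bucket=[]):    # one list, created once at def time
    bucket.append(item)
    return bucket

print(append_bad(1))   # [1]
print(append_bad(2))   # [1, 2]: the "fresh" default remembers the last call

def append_good(item, bucket=None): # the idiomatic workaround
    if bucket is None:
        bucket = []                 # a genuinely new list per call
    bucket.append(item)
    return bucket

print(append_good(1))  # [1]
print(append_good(2))  # [2]
```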

It's a tribal thing.

Python is for people who need to program and aren't programmers.

The aristocrats of programming look down on them and their tools for this reason as casuals.

OTOH, most Python users have better things to do than worry about than the tools they have to use.

Depends how you see it all really.

stop trusting Sup Forums

Does ANY widespread GC handle circular references?

C#'s garbage collector can handle circular references.

Java can handle circular references too.
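CPython, for its part, pairs its refcounting with a cycle detector (the gc module), so plain reference cycles do get reclaimed - a quick check (Node is a made-up class):

```python
import gc

class Node:
    def __init__(self):
        self.other = None

a, b = Node(), Node()
a.other, b.other = b, a      # reference cycle: refcounts alone never reach zero
del a, b
collected = gc.collect()     # the cycle detector finds and frees them
print(collected >= 2)        # True
```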

I've seen entire automation flows in Perl in enterprise environments.
Still can't get over the horror

>No outstanding feature
MEANINGFUL WHITESPACE

>1/2 would produce 0, instead of 0.5
Hey, you're dividing integer values. Why shouldn't the result be an integer?

Which of these is cleaner?
> 1 / 2
0.5

>1 // 2
0
> float(1) / 2
0.5

>1 / 2
0

Hm... you bring up some points I didn't think about.
> GIL
Didn't read enough about it and didn't mess around enough with threads to have an informed opinion, but I heard gripes about it. Think they'll ever change it?
> scopes and dynamic typing
I agree about dynamic typing, although I rarely have issues with it anymore. Regarding scope I don't recall any funny business. I'd question usage of append when a comprehension could be used.
If you really want typed python you can write in cython. It will also run faster.
> line breaks
I don't care how many lines the code takes, and python is indifferent to indentation inside brackets and parentheses, so I'd just line break after the parentheses in your example, and maybe put the for in a new line as well. Code should be readable, not a mess as in
>beautiful is better than ugly
Finally
> Tail recursion
Will probably be implemented the day after Guido steps down. I'd like them to implement it
And useful higher-order functions are just reasonably wrapped in functools; they aren't really deprecated.
But good points all in all, thanks.

Well, as the electrical engineer peddling python in this thread, I can say your description is rather accurate.
I AM learning scheme to expand my horizons though

>>fuck it up: change (cond ? true : false) to (true if cond else false)
>"I always whipped my niggers on the field, that's how we always did it!"
Not an argument. The python way is utter trash. Cond in the fucking middle wtf

That notation is shit, but you can just index into a Tuple instead.
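The tuple-indexing trick looks like this - note both branches are built eagerly, unlike a true conditional:

```python
n = 7
parity = ("even", "odd")[n % 2]      # index 0 or 1 selects the branch
print(parity)    # odd

# Equivalent conditional expression, which only evaluates the taken branch:
parity2 = "odd" if n % 2 else "even"
print(parity2)   # odd
```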

>>> 1.0 / 2.0
0.5


Not even a pythonfag anymore.

Is Python a good language for beginners, or what's the most readable language?

>most readable language?
BASIC

And what about if they're variables instead?

Someone just asked me if I thought BASIC would be good for an intro to programming and I think it would 100% be. It's practically asm.

It should never be used, of course, but a couple weeks of BASIC should cover you for the rest of your life.

Python is very readable but I wouldn't recommend it to a beginner. It's much better to start with a statically typed language in my opinion.

Either one of them is already a float, or you just cast it to a float at assignment.

>Is Python a good language for beginners
You can learn any language but Python tends to have minimal syntax, meaning you can learn the basics of programming and not have to worry about weird syntactical rules. When you're first starting off you need to learn how to break down tasks and express them in a way a computer can handle and Python can help you with that without hindering you with rules of the language itself.

>It's practically asm
I suppose everything is once it's interpreted

Python is the BASIC of our time. I wouldn't use BASIC for anything serious, would you?

Exactly.
So we're back here.

Have one operator that always does normal division, and one operator that always does integer division.
> x = 1
> y = 2
> x / y
0.5

> x // y
0


Have one operator that does normal division most of the time, unless both arguments are ints, in which case it just does integer division, unless you do a cast. Also have an operator that does integer division.
> x = 1
> y = 2
> float(x) / y
0.5

> x / y
0


The former is just more consistent and more clear. In a dynamically typed language like Python especially it's important to have operators behave consistently between types.

I mean BASIC is a generalized assembly language with some sugar. Jump/goto is something people should be able to visualize before going into procedural stuff and then NEVER EVER USE AGAIN. Python has a LOT of shit going on under the hood, I wouldn't recommend it for anything theoretical but for practical learning sure it's fine.

But python really shines when it's used as a glue between various high perf modules written in C or another fast language. As long as your program spends most of its time in those modules then python is worth using.
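That glue pattern in miniature: keep Python as the coordinator and push the hot loop into C-backed code (here the builtin sum; in real programs it'd be numpy or similar):

```python
# Pure-Python loop: every iteration runs in the interpreter.
def total_slow(xs):
    acc = 0
    for x in xs:
        acc += x
    return acc

# "Glue" version: the loop runs inside the C implementation of sum().
def total_fast(xs):
    return sum(xs)

xs = range(1_000_000)
assert total_slow(xs) == total_fast(xs)   # same answer, very different speed
```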

What about cBASIC??? (Cython)

Honest question: how did they first write BASIC? Seems like an impossible feat

but muh abstraction

> gizmodo.com/1570573636

>gizmodo.com/1570573636
>Wozniak
>First BASIC
riiiight... his BASIC couldn't even do floating point

When you're writing in machine code there's a first for every chip, so yeah, for the 6502 it probably was. It was a pretty simplistic question, since BASIC is pretty abstracted as it is.