What do you do when you want to learn a new programming language?

Read documentation? Search YouTube for tutorials? Read a book?

And what do you practice it with? Do you have any go-to projects to make in it?

Assemble pic related.

I write a program with it.

Pick a language-independent programming exercise, split the task into simple pieces, then for each one: "ok google how to do X in Y"
Repeat until satisfied

>Read the docs
>Search for a good and detailed book about the language

I don't make any specific projects when I learn a new language. It's difficult to think of something worthwhile to build when you have no ideas.

Normally I do some exercises from a book or try to redo some of my older projects from another language.

Type some examples until I get the hang of it. After that I try to build simple things then use all the simple things I've built as a reference to build greater things.

I look at its implementation's source code. Virtual machine, standard library.

Programming languages are all the same. You'll realize this once you try to make one. The important thing is what they actually do for you. The implementation.

what if it's written in itself tho

Doesn't really matter. Syntax is more than easy enough to get used to in like 10 minutes.

Most implementations, though, use at least a partial virtual machine written in C or something of the sort.

Take Javascript for example. It's easy enough to read MDN and learn pretty much everything on a need-to-know basis. You can read the ECMA specification if you want to decode any language feature, all the stupid ass cringeworthy JS features are described in detail there. If you read it, you'll suddenly stop caring when noobs complain about ===.

The real power, though, comes from studying how your implementation does things. That's how you really understand what the fuck these web devs are talking about when they rave about muh evented I/O.

Turns out Node is just V8 + libuv, and the JS event loop is actually a form of cooperative multithreading: blocking operations like I/O implicitly yield the processor (Go works the same way), and returning from a function is basically an explicit yield. Every iteration of the event loop, events are collected from sources such as timers, I/O and the UI, and JS functions are scheduled to run as a result. They execute one at a time, serially, in a single thread. So if you're writing Node.js code, you're implicitly trusting that your retarded web dev peers can write reliable real-time code. What you actually get is functions that either (1) block the living shit out of the event loop and drive latency to retarded levels, or (2) are butchered up into a million pieces in what amounts to a form of continuation passing style.
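None of this is hidden magic either; you can watch the cooperative scheduling happen in a few lines of plain Node (the names `blockingHandler` and `timerDelay` are mine, purely for illustration):

```javascript
// A 0 ms timer is scheduled, then a synchronous function hogs the
// single JS thread. The callback cannot run until the script yields
// control by returning to the event loop.
const start = Date.now();
let timerDelay = -1;
setTimeout(() => { timerDelay = Date.now() - start; }, 0);

function blockingHandler(ms) {
  const end = Date.now() + ms;
  while (Date.now() < end) { /* hog the thread */ }
}

blockingHandler(50);
const blockedFor = Date.now() - start;

// Even though the timer "expired" ~50 ms ago, its callback still
// hasn't run: timerDelay is untouched at this point.
console.log(`held the thread for ${blockedFor} ms, timerDelay = ${timerDelay}`);
```

Run it and the callback ends up firing tens of milliseconds late, which is exactly the case (1) latency problem above.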

None of that shit is obvious by learning the "programming language". You can learn JS in a day. Realizing the above requires going beyond that and into the implementation.

Do you work in development, or is it just a hobby?

When you study implementations, you gain respect for Chrom{e,ium}'s V8 and realize that Mozilla dropped the ball hard because it started making a stupid phone OS nobody actually cared about.

You also realize that Node is literally standing on the shoulders of giants. They built upon code written by top notch developers backed by Google, which employs people whose experience with dynamic virtual machines dates all the way back to Smalltalk and includes such gems as the Self VM and the HotSpot JVM. People whose entire professional lives were dedicated to writing virtual machines. This fact expresses itself in how the browser-side V8 gains a ridiculous amount of good features at an absurd rate while the Node devs do fuck all; the Google maintainers have to go over there and tell them to pick up the pace.

Libuv itself has a lot more capability and value than what's actually exposed by Node. One of its major contributors, one of the few who are actually competent, was symbolically fired and publicly crucified by all the lamers in the "web dev" community because he refused to change a pronoun in the documentation. I'd be surprised if he ever comes back.

Another interesting case is the "clever" Go implementation of segmented stacks. Your routines get stacks that are effectively unbounded, allowing really fast, effective and predictable stack memory allocation. Stack and static allocation are how you do real-time code; you can't have motherfucking malloc fucking up your latency.

Unfortunately for Go, though, it was also designed with "lightweight threads" in mind. The idea is to enable you to write multithreaded web servers without all the asynchronous programming that comes along with the task. Hell, asynchronous programming is a thing because with a 1 thread/task model you exhaust your machine's memory, since every thread normally allocates like 1MB of memory for its stack (a million connections at 1MB each would be a terabyte). Go allocates a lot less memory, allowing you to have millions of "goroutines".

These features smash right into each other. When a routine runs out of stack, which will happen often since they have small stacks, it allocates another stack and switches to it. This is no different from a malloc and may involve a system call. In practice this happens at the worst possible time, like in tight loops, and you're basically fucked in that case. It will keep allocating and deallocating stacks right in the middle of the loop like a tard.

You'd think these people would invent a mechanism to do something like "hey, a loop is about to start, so let's preallocate a bigger-than-normal stack and switch to it in advance to guarantee performance", but no. They decided to scrap segmented stacks entirely. Now when you run out of stack space, Go allocates a bigger stack, copies the old one over, and resumes from there. No different from a 4 year old's dynamic resizable array library. I've never seen developers backpedal this hard.

Really makes you think when you see Go diehards hyping their language like it's a "perfect C" or something. As if being designed by that Pike guy made it good. Sounds a lot like Apple when I think about it.

Interesting stuff, got any more of that? Especially about the javascript mess.

Python and Ruby are interesting cases.

They're both languages with one "canonical" implementation written in C and helmed by a "benevolent dictator", essentially the "man of taste" who decides what bullshit goes into the programming language itself.

They both have countless satellite implementations that mirror the features of that implementation and come with all sorts of properties, from being faster to being integrated with the JVM and .NET CLR.

They both have virtual machines that implement a bytecode instruction set, and are able to compile the code to this portable bytecode, not unlike Java.

The languages themselves have more or less the same feature set and were created at around the same time. They have more or less the same projects written in them, such as web frameworks.

Were it not for the actual syntax, I'd say they're mostly the same shit.

About the asynchronous programming model that Javascript adopts... it's an unholy mess. This involves everyone from the Node and libuv people to the Linux kernel, and none of them fully thought the problem through.

The main feature of asynchronous programming is letting you wait until something happens without wasting memory and processor time. By "wasting" I mean keeping an entire thread of execution around blocked on something like a read system call. If you're reading from even one socket, it can take a potentially infinite amount of time, and your code's got better things to do. So the traditional Unixen and Linux all invented system calls of varying quality (select, poll, epoll, kqueue) that let you basically say "hey kernel, I want you to watch these sockets. I'll ask you in the future about them, and you tell me if there's any data available, ok?". This interface lets you read data from sockets without blocking indefinitely. So you make a loop that (1) asks the kernel if data's available, (2) reads the data if it is, (3) does other shit if it's not. The kernel is good at noticing when stuff happens, since hardware is naturally asynchronous and often works by literally interrupting the processor and forcing the kernel to handle shit. You bet your ass the kernel is gonna notice when data starts flowing into the network card.
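That (1)-(2)-(3) readiness loop can be sketched with toy in-memory "sockets" — nothing below is a real kernel API, `makeSocket`, `ready()` and `read()` are made-up names for the sketch:

```javascript
// A "socket" that may or may not have data pending.
function makeSocket(chunks) {
  return {
    pending: chunks.slice(),
    ready() { return this.pending.length > 0; },  // step (1): poll
    read() { return this.pending.shift(); },      // step (2): read
  };
}

// The loop: ask each source if data is available, read it if so,
// otherwise move on (step (3) would be "do other shit").
function runLoop(sockets, onData) {
  let progress = true;
  while (progress) {
    progress = false;
    for (const s of sockets) {
      if (s.ready()) {
        onData(s.read());
        progress = true;
      }
    }
  }
}

const received = [];
runLoop([makeSocket(['a', 'b']), makeSocket(['c'])],
        (chunk) => received.push(chunk));
console.log(received); // reads interleave across sources: [ 'a', 'c', 'b' ]
```

The point of the shape: the loop never blocks on any single source, it only touches whichever ones report themselves ready.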

Notice how I keep talking about sockets as if they were the only type of I/O in the universe. They're not. Kernels really love their "universal I/O" file descriptors, but they totally screw this up: the whole readiness model breaks down with file I/O. You ask the kernel to watch a regular file and it says "OK, no problem", but the thing is a regular file is always "ready", so readiness tells you nothing, and the actual read still blocks on the disk. That's not what you signed up for.

What you really wanted was to be able to say "hey kernel, I want you to read data coming in from the hardware, be it the network card, disk or whatever. I want you to copy that data to this location. I'll ask if you're done from time to time. Tell me when my data is available and in my address space, OK? Meanwhile I'll go do other shit".

Wow who would have guessed? This just happens to work with everything! You don't have to care if data is "available" anymore; let the kernel worry about that shit. All you need to know is "has the kernel transferred the block of data I requested to my address space?" It doesn't matter if it's not there because nobody is actually requesting your shitty web pages or because the disk is slow as shit. The data's not there, so we go do other shit.
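The shape of that completion-style interface, sketched in JS with a plain function standing in for the kernel (`completionRead` is a made-up name, and the copy is synchronous here only to keep the sketch short):

```javascript
// Completion model: the caller hands over a destination buffer and a
// callback; "the kernel" copies the data and signals when the bytes
// are already sitting in the caller's memory.
function completionRead(source, buffer, done) {
  const n = Math.min(source.length, buffer.length);
  for (let i = 0; i < n; i++) buffer[i] = source[i];
  done(n); // completion event: n bytes transferred
}

const buf = Buffer.alloc(4);
let bytes = 0;
completionRead(Buffer.from('hello'), buf, (n) => { bytes = n; });

// No "is data available?" question anywhere: by the time the callback
// fires, the answer is already "yes, and it's in your address space".
console.log(bytes, buf.toString()); // 4 hell
```

Note there's no readiness question at all; the only event is "your requested bytes have landed", which works the same for sockets, disks, or anything else.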

These kernels totally dropped the ball when it comes to implementing this critical interface. But libuv totally wants that functionality, since it must present a nice "unified" I/O event model to the consumers of its API, consumers who literally don't give half a shit about any of this. So what does it do? It spawns threads, of course. Threads that read the data into a buffer and report back to the event loop when they're done.

Why? Why can't the fucking Linux people get their shit straight so that userspace doesn't have to bend over backwards to have a sane interface? Guess what OS does this right? Fucking Windows, with its I/O Completion Ports! Holy fucking shit Windows beats Linux in this case!

And what does Linux Overlord Linus Torvalds have to say about this bullshit? "Man this entire issue is such bullshit. We really need to make some kind of small thread that consumes little memory, so that userspace will be able to just spawn millions of them to wait on system calls."

Ok

I said V8 gains features faster than Node. Want an example? Web workers.

As I explained, Javascript is single threaded. This means any shitty function by any shitty developer can literally kill your web page UI responsiveness or Node.js application latency.

However, web technology standards bodies created Web Workers, a way to run Javascript code in a separate thread. These workers can generate events and pass values back to other functions, making for a great communications mechanism. The Chrome people immediately stepped up and implemented the feature, developing previously non-existent mechanisms to serialize Javascript values so that they can be passed around between threads. It already works.

What does Node have? Numerous half-assed C++ extension modules on npm. Numerous attempts were made to add this feature to upstream Node, resulting in code involving goddamn zeromq and objects serialized as JSON. OK, but was any of it merged? Nah, it sat around on GitHub until the code lost compatibility with the master branch, and then people just gave up. "We gotta attend to the problem of value serialization first" they say. "Chrome is coming up with something" they say.

Want parallel computing in Node? Spawn an entire new process, asshole! Complete with the stupid fork dynamics! Want worker threads? You're fucked. Guess how Node passes values in-between processes. That's right, JSON. What are the chances they just use Google's code to improve their multiprocess implementation? I'd say pretty high. After all, memory-sharing threads are "way too hard to develop with".
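You can see what riding on JSON costs compared to a real structured clone in one round-trip (the object below is my own example):

```javascript
// Crossing a process boundary via JSON.stringify/parse: functions
// and undefined silently vanish, and a Date flattens into a string.
const original = {
  n: 42,
  when: new Date(0),
  greet: () => 'hi',
  missing: undefined,
};

const overTheWire = JSON.parse(JSON.stringify(original));
console.log(overTheWire);
// { n: 42, when: '1970-01-01T00:00:00.000Z' }
```

Anything that isn't plain JSON-shaped data gets mangled or dropped on the way through, which is exactly why value serialization kept blocking a proper workers story.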

Probably just a hobby since he's got time to redo projects.

I just start using it and google away any problems

What?

First I read some info about the language: how to run it, why it exists, how it's intended to be used, and so on.
Then I read the documentation while I use it.

It is just as easy/difficult to include a new library as it is to start a project in a new language.

I read the language specification, preferably an EBNF grammar. If it's shit like Java, I just look at the guides instead, because its specification is bigger than most novels I've read.

>node.js

That would be accurate if the woman was fat or had a dick.

well played. I actually hate js more than java, just posted a pic because whynot.gif

port one of my scripts over

I don't even hate JS all that much though. I guess I got used to the bullshit.

What I hate is the web dev brogrammer cult that formed around Node. These are the faggots who pull things like this:

github.com/joyent/libuv/pull/1015

To be fair I also hate java, but just not as much as javascript. The web dev culture is painful at best, I agree.

pls post more cute images

ok

so what do you do when you learn a new language?

What anime is this from senpai?

what do you do user

Yup, just a hobby.

My work doesn't require programming knowledge (but it's a plus, especially scripting), so I have all the time I need to learn a new language and do anything with it.

I read EBNFs

Where do you come from?
I would really like to know what kind of work lets you have free time to enjoy such hobbies :-(

What is proper time management?

Not *that* user, but the secret to plenty-of-time is no wife, no kids, no debts. Bonus points if you can sustain yourself, I'm guessing, with part-time, freelance, or govt (aka effectively-also-part-time) gigs ..

>no wife

Life without pussy is too cruel.

Well (unless you're living in a Sharia country) a wife isn't the only path to pussy, and likely not the most "time efficient" all-in (keeps wanting to shop, travel, reproduce; needs attention; tries to busy you with useless tasks, etc.)

All the fucking women I've been with demanded way too much of my time though...

Completely, utterly wrong. A language is not its implementation; it's an abstract concept, a set of rules. If you compare the two languages you obviously have to compare their syntax, as it's a core part of their design. The use cases of Python and Ruby might be the same, but the ideas behind them are vastly different.

I get the 'official' book on it. I read the intro, which usually covers most of what I need to know.

Then I skim the chapters looking at shit that looks completely different or is specific to that language like say go routines for golang.

Then I write stuff using that language, with its standard library documentation open on another screen, referring to it and to the official book/text or unofficial texts whenever needed.

I also join all their mailing lists and skim through them, reading whatever seems relevant to whatever it is I'm doing

Yawn. I don't care. Just because the grammar, idioms and "philosophy" are different doesn't mean they're all special snowflakes. It's all a façade for the underlying virtual machines. That's what really matters.

Ruby and Python are mostly bread and butter dynamic programming languages. There's literally nothing special about them. Have you seen that polyglot website? You can literally look at the languages side by side and see how it's all the same shit. Same overall features. Same overall structure. Same bullshit. You gotta look past the superficial syntax and into the substance that really matters. Once you do that, you start to understand some really "advanced" stuff like Ruby's object model and metaprogramming capabilities, all properties of its many implementations.

Languages are utterly irrelevant until their implementations mature to the point of supporting real applications. Why do you think so many of them target the JVM, CLR and LLVM? They get high quality implementations for free.

Why do you think C is relevant to this day? Because C compilers are top notch and ubiquitous, and they produce code that maps directly to the instruction set of the most important machine of all: the processor.

Get the "In a Nutshell" book on the language for a crash course on syntax and language features without the fluff of traditional books.

This. A book can help you learn the best practices, or you can use a tool like perlcritic.

There is always something one can come up with: a scraper, a simple database interface, a photo archive (that checks JPEG health), a backup manager.

You know you are a worthless piece of shit when you search for a video.

Upotte

New languages are memes. All the good ones were already made.

I like this approach, can you give any good examples? Would appreciate it, user.

disgusting tit-bags, udders

read the manual then start fucking around