/dpt/ - Daily Programming Thread

youtube.com/watch?v=Pi5yTAJiJoY

Previous thread: What are you working on, /g/?

Attached: 1502761353275.jpg (227x255, 8K)

dumb frogposter
go back to reddi.t
what an abominable thread
it is completely justified for someone to start a real new thread

I'm doing web programming in sepples :^)

Attached: Screen Shot 2018-03-27 at 2.15.25 PM.png (1078x1478, 271K)

looks neat, are you using a framework or making your own?

>skipping the bookkeeping associated with a VLA, loop unrolling, modulo power of two, etc
Let's see what would happen without VLAs:
1. You allocate a constant-length array on the stack and maybe only use part of it. Now you're using even more stack memory than the VLA would have, and loop unrolling leads to wasted iterations.
2. You allocate a variable-length array on the heap (maybe through a custom allocator). You pay in allocation time, and none of the optimizations you mention apply there either.
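
For reference, a minimal C sketch of the three variants being compared here; the function names and the MAX_N bound are made up for illustration:

#include <stdlib.h>

#define MAX_N 1024

void use_fixed(size_t n)          /* case 1: constant-length stack array */
{
    int buf[MAX_N];               /* wastes stack space whenever n < MAX_N */
    for (size_t i = 0; i < n; i++)
        buf[i] = (int)i;
}

void use_vla(size_t n)            /* the VLA being argued about */
{
    int buf[n];                   /* exactly n ints on the stack */
    for (size_t i = 0; i < n; i++)
        buf[i] = (int)i;
}

void use_heap(size_t n)           /* case 2: dynamic allocation */
{
    int *buf = malloc(n * sizeof *buf);   /* pays for the allocation (and later the free) */
    if (!buf)
        return;
    for (size_t i = 0; i < n; i++)
        buf[i] = (int)i;
    free(buf);
}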

I'm not really using a framework right now, except my simple webserver library which is pretty low-level. Once I have a working API I'll start splitting out the generic stuff into a higher-level API framework.

Also, there's no bookkeeping involved in a VLA, and using & instead of % isn't inhibited by a dynamic length (which could always be a power of 2, just not the same one).

I'd release the binaries of my software, but the bundled libraries have my paths and username baked into them. I'm editing the binaries because I don't want to recompile all this shit.
I bet the CIA niggers did this on purpose.

>Also, there's no bookkeeping involved in a VLA
You need a size variable, or a distinct stack/frame pointer. Either way it's nothing more than you'd be storing anyway.

I find it hard to believe that a compiler would be able to optimize modulo operations for a dynamic divisor into bitwise operations though.

Electron or NWJS?

over-allocating on the stack costs nothing; you only pay for what you actually touch

If you never call another procedure perhaps.

>I find it hard to believe that a compiler would be able to optimize modulo operations for a dynamic divisor into bitwise operations though.
The compiler might not be able to, but a programmer easily could. I guess it's still technically an inhibited "compiler optimization" in that case, but the opportunity is not lost.

GNU Emacs

Emacs for cross-platform applications with a GUI?

>Cbabbies can't recompile their macro ridden fuckfests in a timely fashion

Actually, I'm sure a compiler would be able to do it in this sort of case:
int size = 1
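
For instance, a hedged sketch of that sort of case (names here are made up): the length is dynamic, but guaranteed to be a power of two, so the modulo can be reduced to a mask:

#include <stddef.h>

/* Assumes size was computed as a power of two, e.g. size = (size_t)1 << n,
 * and that buf has at least size elements. */
void touch_every_slot(unsigned *buf, size_t size, size_t count)
{
    size_t mask = size - 1;            /* valid only because size is a power of two */
    for (size_t i = 0; i < count; i++)
        buf[i & mask] += 1;            /* same result as buf[i % size] for unsigned i */
}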

I'm a noob, quick question (I think it's up to the coder, but I'm wondering if there's a universally accepted method):

If I'm opening a serial port, do I call open on it every time, or should I have a function dedicated to opening the port and call it from there? Thanks!!
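
The usual convention is the latter: open it once in a dedicated helper and reuse the returned descriptor everywhere else. A rough POSIX-flavoured sketch (the function name is made up, and cfmakeraw is a common glibc/BSD extension rather than strict POSIX):

#include <fcntl.h>
#include <termios.h>
#include <unistd.h>

int open_serial_port(const char *path, speed_t baud)
{
    int fd = open(path, O_RDWR | O_NOCTTY);   /* e.g. "/dev/ttyUSB0" */
    if (fd < 0)
        return -1;

    struct termios tio;
    if (tcgetattr(fd, &tio) != 0) {
        close(fd);
        return -1;
    }
    cfmakeraw(&tio);                  /* raw mode, 8 data bits, no parity */
    cfsetispeed(&tio, baud);
    cfsetospeed(&tio, baud);
    if (tcsetattr(fd, TCSANOW, &tio) != 0) {
        close(fd);
        return -1;
    }
    return fd;    /* caller keeps this fd and passes it to read()/write() */
}

Then the rest of the program just calls open_serial_port() once at startup and holds onto the descriptor.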

I don't understand what the difference is between a statically linked library and an object file.

There is not a difference.

Is this correct

software dev -> software engineer -> ...

Attached: cv.png (961x259, 14K)

that's where you're wrong kiddo

godbolt.org/g/i2sxF4

Why the fuck didn't I start using git sooner? This is the shit

git gud

That is quite silly.

VLAs are quite silly indeed

Why do they have different file extensions then?

Except that the VLA is irrelevant; it's an issue with case 1

yeah I would have done that if I wasn't on a 7-year-old shit computer, and building mingw-w64-qt wasn't a 5-hour operation on it.

.a is just an archive of many .o, if that's what you mean.

>VLA is irrelevant
>change the size to a constant and cut the asm by 33-50%

What is the best way to learn more advanced Java?

So far I've been learning from W3Resource and it's just fantastic, but what should I use next?

that's what your mom said

change it to [i & (size - 1)], it's still trash

Will learning Assembly make me think more logically?

No, it's basically pointless to write assembly unless you're programming for some microcontroller.

I've never programmed before, but I've always had an interest. Starting out is very daunting... should I try learning Python, C#, or C++ first? I've read that Python is the easiest but is also fairly useless.

>Will learning Assembly make me think more logically?
Yes. It has been scientifically demonstrated that coding in assembly can transform an effeminate brain into a more logical masculine brain.

You'd have the same issue with a dynamically allocated array. And like I said, using a too-big static array will lead to wasted memory as well as wasted iterations with loop unrolling.

Can you not just accept that regular arrays, VLAs, and dynamically allocated arrays all have tradeoffs? What is it about VLAs in particular that is so bad?

C++, it's worth it

stackoverflow.com/questions/388242/the-definitive-c-book-guide-and-list

Switch to Kotlin and practice converting code from Java to Kotlin. Taught me a lot about the internals of the JVM.

C++ is literally one of the worst languages to ever exist, and will rot your brain.
C# is Microcuck garbage.

>C++, it's worth it
"Modern" C++ is a primitive and extremely weak shitlang that produces severe mental malformations in beginners. You should kill yourself.

C# is a great language if you want to be able to get a job and make good money while not using the horror that is Java

Check out Oracle's docs:
docs.oracle.com/javase/tutorial/tutorialLearningPaths.html

Python is literally one of the worst languages to ever exist, and will rot your brain.
C# is Microcuck garbage.

"Modern" Python is a primitive and extremely weak shitlang that produces severe mental malformations in beginners. You should kill yourself.

"can i become a programmer? xDDD"

yes you can build programming logic and go to fucking uni you useless sacks of shit

"another for loop video and im set!!! xDDD"

stop embarrassing yourself

Python is a shitty dynamic language though. And, no, duck typing is not a good solution.

>going to uni

Are you incapable of learning things by yourself?

C
C++ is not a beginner friendly language

Attached: C programmer.png (211x239, 9K)

Python is also shit

Attached: 1521824303844.jpg (469x354, 20K)

Is C "redpilled" or something?

>Python is the easiest but is also fairly useless.
That's wrong. It's versatile, mature, very well-documented, and has a thriving community.

I don't think any of the languages you listed are "harder" to learn than the others. My first language was C++, which many say is a "hard" language; my second was Python, and I had to take the time to learn the syntax, just as I had to with C++.

However, of the three, Python is probably the easiest to begin working with.

This is a good book to get exposed to computer programming: greenteapress.com/wp/think-python-2e/

What?

C++ is more beginner friendly than C, mostly due to the standard library. You don't have to know all of C++ to be able to make good use of it.

Then again, Java is more beginner friendly than C++.

Cherno Project makes it pretty beginner-friendly

It's just a good language to learn, and it helps you better understand what programming is about.
C++ is also probably one of the most beginner-unfriendly languages.

Attached: tweet-terry-davis.png (669x235, 27K)

>learn C
>most C books are from the 70s
help

Python is a mediocre turd, but at least it won't mentally damage that user or make him into a quasi-human plusmonkey.

Yeah but what does a merchant have to do with someone who doesn't like C?

That would be HolyC, not C.

Python is easy to begin with and has a huge community, especially of learners, so it's probably the best place to start. That said, it abstracts away some things that a lot of more "powerful" languages don't, which you'll have to learn when you try another language, and that might be a bit of a headache. You might also have to unlearn some stuff. I still think it's a worthy trade-off, unless you want to go for the "start with the hardest and learn the hard way" route a lot of anons here suggest.
(I heard someone here recommending Go as a simple language that doesn't have some of the cruft of Python, but I haven't used it, and Go is shit, so ymmv)

>learn a language stuck in the 70s
cue lispfags

Depends on your needs.
Embedded systems : C
AI / "bleeding edge" stuff / something that just works, where how it works and the speed are not so important : Python
General Well rounded language : C++
C++, but with less weird stuff : C# (caution: you need to sell your soul to Microsoft for this)
C# without selling your soul : Java (performance is much worse than Python)

Something in web : PHP / JavaScript
Android : Java / Kotlin
For the meme, but then you realize how useful and good a meme lang can be : Lisp

>That would be HolyC, not C.

What is the difference between HolyC and standard C?
Is there any good open source HolyC compiler toolchain?

>General Well rounded language : C++
AHAHAHAHAHAHAHHAHAHHAHAHHAHAHHAHHA

>haskell shills
What makes Haskell better than Lisp? They're both very scientific, very mathematical, very academic languages.

Fuck off you nuschoolers

>Java (performance is much worse than Python)
Since when?

Attached: confused.jpg (605x539, 50K)

Since never. Only if you write Pajeet tier Java.

So should I just drop my Lisp shit and pick up a copy of "Learn You a Haskell for Great Good"?

My concern is that Lisp has been around longer, so there are more resources for whatever you want to do, but Haskell may lack those resources as soon as I want to make something non-scientific?
Like a browser or a web page or an operating system.

I've done computer architecture and operating systems, but it burns me that I can't apply this knowledge myself.
I thought Lisp would be good for me because it's also academic, and at this point I feel way more comfortable writing proofs; a lot of my expertise, I suppose, is in theoretical computer science and abstract algebra.

Attached: 1512227377435.jpg (690x863, 111K)

Haskell is to Lisp as intuitionistic logic is to classical logic.

Sorry, I still don't quite get static linking.

What's the difference between these commands? Why is one preferable over the other?
ld -r green.o blue.o -o red.o
ar -q libred.a green.o blue.o

>haskell
>intuition

If you don't find this fun you shouldn't be programming

Attached: 547858.jpg (1680x1050, 192K)

>Low level programming is the only real programming

Low level programming is the only real programming.

Low-level parallel programming with limited space for instructions.
I once unironically mentioned TIS-100 in a job application for an embedded systems developer position. They never wrote back.

>took me 2 hours to make a heapsort in C
on a scale from "pajeet" to "Better start flipping burgers", how good am I?

How would one do genericity in C? I guess I could use macros, but apart from the basic stuff I haven't really mastered them. Any resources/advice for learning them?

Such a fun game, I wanna get his new Alchemy game too, that looks good

Lol that's because you're autistic

Attached: Catloaf-23.jpg (640x640, 86K)

Type system

>genericity in C
Don't do it, it's a bad idea. There are techniques, but they are messy...

Is there a 6c (plan9 amd64 compiler) port for Linux?

void*. Maybe char[] and alignas if you're feeling ballsy.
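
To make the two usual routes concrete, a rough sketch of both (all the names here are illustrative): the qsort-style void* approach, and a macro that stamps out one definition per type.

#include <stddef.h>

/* void* route: one function handles any element type via byte copies */
void swap_any(void *a, void *b, size_t size)
{
    unsigned char *pa = a, *pb = b;
    for (size_t i = 0; i < size; i++) {
        unsigned char t = pa[i];
        pa[i] = pb[i];
        pb[i] = t;
    }
}

/* macro route: the preprocessor generates a type-specific function per use */
#define DEFINE_MAX(T) \
    static T max_##T(T a, T b) { return a > b ? a : b; }

DEFINE_MAX(int)       /* defines max_int    */
DEFINE_MAX(double)    /* defines max_double */

Usage would look like swap_any(&x, &y, sizeof x) and max_int(3, 7); C11 _Generic can then dispatch to the right max_* based on the argument type.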

Why would you want it?

C++. You can start with Java too, for simple stuff and getting used to writing code.

I guess C really is shit.

Attached: 1437171031329.jpg (256x256, 22K)

>help i don't know how to use the internet

You can use C or C++ code in Python to speed things up.

Isn't that mostly for scientific libraries and such, though? Probably outside the realm of the guy who was asking the question, who said he'd never programmed before. Anyway, you can use C or C++ code in Java, or pretty much any decent language.

>Lol that's because you're autistic
That's usually advantageous in this industry.

Attached: Rowan Atkinson in 1916.jpg (450x450, 21K)

No, you bought into the memes.
You don't want to be autistic in this industry, you want to be dedicated and detail-oriented. Writing about videogames on your resume is pure autism.

There's a reason so many people here are poor NEETs

>Isn't that mostly for scientific libraries and such, though

Yeah, but someone new to programming will most of the time use tons of libraries without any care about what happens inside them. Also, most libraries in Python are already implemented in C or C++, so as long as "it just werks with this lib" is what matters most, the overhead is not too noticeable, since Python only calls those compiled libraries, similar to how bash calls programs.
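
As a hedged illustration of that split, the C side of such a speed-up could be as small as this (file, symbol, and build command here are only an example); compiled with something like cc -shared -fPIC -o libsum.so sum.c, it can be loaded from Python with ctypes:

#include <stddef.h>

/* The hot loop runs as compiled code; Python just hands over a pointer. */
double sum_array(const double *data, size_t n)
{
    double total = 0.0;
    for (size_t i = 0; i < n; i++)
        total += data[i];
    return total;
}

/* Python side, for illustration:
 *   import ctypes
 *   lib = ctypes.CDLL("./libsum.so")
 *   lib.sum_array.argtypes = [ctypes.POINTER(ctypes.c_double), ctypes.c_size_t]
 *   lib.sum_array.restype = ctypes.c_double
 */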

How do you flash a PIC chip?

Attached: medium-PIC16F630-PDIP-14.png (312x217, 25K)

use flashlight

and for those who like circuits

Attached: 57648.jpg (1280x720, 218K)

It's just old, use a newer language

daily reminder
lain uses lisp not haskell

.
. . lisp > haskell

haskell is for posers
lisp is for people who want to really know their machine
and for people who love lain

lain would never use haskell