/dpt/ - Daily Programming Thread

What are you working on, Sup Forums?
Old thread:

First for C.

what Haskell users claim vs What actually happens

Next time use an anime image

>tidying up coworker's code
>see a load of expressions of type void
>delete them
>he gets mad because his code is now broken
>I tell him to make his code referentially transparent in future

Make an UltraSPARC II anime girl.

How do you meaningfully decide on a language that's right for you when you only know 2 or 3? Can't most of them do anything you want anyway? What are differences that matter, besides large libraries of pre-written functions?

But old CPUs are cool, user.

Learn Haskell and Java and tell me they're basically the same.

I'm sure they feel very different, but how do you decide ahead of time which one is right for you early on?

Well, it should be obvious that scripting languages, compiled languages, and managed languages like Java or C# are each tailored towards certain problems. If you know 3 compiled languages but know no scripting languages then you're doing it wrong.

Can I get some project ideas for actual things? I just created a web extension that replaces "baka desu senpai" with the original words, and I feel bad that it's really the first actual thing I've made with my programming knowledge instead of just comp sci exercises. I really only know Java and a bit of JS, and I can refresh myself on Ruby. Also, is looking up code on Stack Overflow bad when you're trying to teach yourself, even if you make sure you understand it before using it?

Pls help. I know I've been posting in these threads too often, but making that web extension showed me how little useful programming knowledge I have, and I'm desperate to learn more and make useful shit. Again, please just rec me some easy-to-intermediate project ideas.

Say something mean about C

It spawned C#, C--, and Java.

Why do people try to learn programming if they don't have enough interest to even think of something to make?

Create a video player

You often have to write inferior versions of things readily available in C++ like vectors.

Thank you.

Sometimes it's hard to even imagine what a single human can make without any programming knowledge, because what we typically interact with are programs perfected by teams of highly-skilled career programmers.

Macros are a poor means of metaprogramming.

...

rol

k

2simple

llor

Because it's easy to pick a project that's too complicated for a beginner?

roll

>Enigma simulator
Is this actually hard to program, or is the difficulty that you need to understand how the Enigma machine works?

Wasn't there a thread for this shit before?
Go post your useless rolling posts there.

>learning something
>afraid to make mistakes
Not how it works.

#define is pretty high up there on the list of archaic and shit design practices.

Enigma just encoded text, so I would think it's just a matter of understanding the machine.

Thanks lad needed this.

I miss some of those old designs. Gawd, some of their designs make our Intel/AMD overlords look like the fucking morons they are, and that was 15+ years ago.

>minesweeper harder than sierpinski's triangle

wtf

What does coldet&texture mapping mean for the Wolfenstein 3D one

>If you know 3 compiled languages but know no scripting languages then you're doing it wrong.
That's just the industry rolling over the current crop of programmers to shove something up their collective asses again - most likely smoke. 20 years ago that was inverted, compiled was expected and "scripting" was looked upon as "programming lite". How times change.

but still it's better to know at least one compiled language + at least one scripting language than more of one kind

IIRC it's simply a set of layered substitution ciphers, plus a moving key schedule. You could probably emulate the entire thing with a set of arrays, one array for each rotor in the Enigma machine, each array holding one letter per element. Bonus points for using modular integer arithmetic to simply "wrap" the ends of the array together in a "circle"... but I'm just tossing ideas out.
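That rotor-array idea can be sketched like this: a minimal one-rotor toy in Python (no plugboard, no reflector, and only one rotor, so it's nowhere near a full Enigma), using the commonly published rotor I wiring and modular arithmetic for the wrap-around:

```python
import string

ALPHABET = string.ascii_uppercase
WIRING = "EKMFLGDQVZNTOWYHXUSPAIBRCJ"  # published Enigma rotor I wiring

def encode(text, offset=0):
    """Substitute each letter through the rotor, stepping it per keypress."""
    out = []
    for ch in text:
        i = (ALPHABET.index(ch) + offset) % 26  # rotate into the rotor
        out.append(WIRING[i])
        offset += 1                             # rotor steps every letter
    return "".join(out)

def decode(cipher, offset=0):
    """Invert the substitution by looking up the wiring in reverse."""
    out = []
    for ch in cipher:
        i = WIRING.index(ch)
        out.append(ALPHABET[(i - offset) % 26])
        offset += 1
    return "".join(out)

print(decode(encode("ATTACKATDAWN")))  # ATTACKATDAWN
```

The stepping offset is what makes the same plaintext letter encode differently each time, which is the core trick of the machine.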

Oftentimes your time is more important than any other issue.

Yeah it is. You kidding?

Also a function of the age we are in. Machines go down in cost, programmer time goes up in cost. So yes, you're right, although it still highlights how the industry has "flipped over" certain assumptions.

There are no scripting languages worth learning that are not compiled languages.

>implying scripting languages are faster to develop in
They are not.

>know at least one compiled language + at least one scripting language
I don't disagree. In fact, you should probably know more than one programming paradigm. So many people can't look beyond OOP because it's the dominant paradigm of the time, but you'll learn so much more by just peeking over that walled garden.

OOP is nice, but it's not all it's cracked up to be. Personally, it seems bloated to me: you accumulate ever-increasing numbers of objects, and the overhead of structuring out each object adds up. That's probably why procedural programming is mostly useful for high-speed computation but sucky at abstraction; OOP makes abstraction a breeze at a minor computational cost.

>scripting languages are faster to develop in
Not really trying to say that; rather, I'm trying to highlight how the current generation of programmers (and the industry in general) has a certain mindset.

>Probably why procedural programming is mostly useful for high-speed computation but sucky at abstraction; OOP makes abstraction a breeze at a minor cost in computational terms.
Too bad there isn't anything that surpasses both OOP at abstraction and PP at optimization.

The industry in general is idiotic.

>OOP makes abstraction a breeze
This is false.

This thread is pathetic. It is not about programming. It's just a bunch of computer illiterates and first-semesters memeing about whatever they just learned.

A quick search shows that most arguments used here are not original and have been copied from some trendy tech blog or other shit site.

The only real programming questions I have seen so far are from beginners (it's fine to be a beginner, btw).

This entire site is shit and I don't know why anyone would regularly come here.

Let's give him the benefit of the doubt and assume he's comparing to C.

Same

Well, 40 years ago compiled languages weren't really a thing on home computers, and it was just assembly or an interpreted language like BASIC.

what language should i use to program a music visualizer?

I've done it in C before.

I would love to move over /sci/ but I'm not smart enough.

>a scripting language like basic
BASIC has a funny history. Many of the "Microsoft baby BASICs" were "compiled" down into a set of bytecode for an interpreter. Prior to that, yes, some BASICs stored the actual words in memory and interpreted them at runtime, although most eventually moved to the "this byte means this keyword" format.

What makes it impossible to modify hardware?

Console, Smartphone, Smart TV, Cable Modem, etc.

It's all just binary at the end of the day, right? What prevents you from removing Microsoft's operating system and installing a Linux distribution? Companies can update firmware over the internet. So, clearly it's not built in. And even if it has security, what's stopping you from copying the firmware update, extracting the information you need to make it run, and forcing it to install your own thing?

>What makes it impossible to modify hardware?
How difficult it is to do.

Python 3 with PyCharm Edu

name = "John"
age = 17

print(name == "John" or not age > 17)

print(name == "John" or not age > 17)

# Use == to compare strings; `is` tests object identity, not equality.
print(name == "Ellis" or not (name == "John" and age == 17))


# Why and how is it that they are executed at the same time?

# How can I know which operator I should be using?

# I'm asking because this exercise is making me understand that
# "and", "not" and "or" are different and not interchangeable;
# however, the answer shines light on the fact that you can
# combine them...

# What would the rules for this be?
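For the snippet above, the rule is Python's operator precedence: comparisons bind before `not`, `not` binds before `and`, and `and` binds before `or`. A small sketch showing how the unparenthesized line is grouped:

```python
name = "John"
age = 17

# Without parentheses, Python groups this as:
#   (name == "John") or (not (age > 17))
print(name == "John" or not age > 17)                          # True

# The same expression with the grouping made explicit:
print((name == "John") or (not (age > 17)))                    # True

# Parentheses can override the default grouping:
print(name == "Ellis" or not (name == "John" and age == 17))   # False
```

So the operators are combinable precisely because each has a fixed binding strength, and parentheses let you override it when the default isn't what you mean.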

so what's stopping you?

>I would love to move over /sci/ but I'm not smart enough.
You're not going to improve by hanging out here.
Move!

ROM, I imagine?

i'd love to talk about math on Sup Forums but visiting /sci/ only makes me appreciate other boards more
it only attracts idiots

Learn what code tags are
so you can write stuff that looks like this

>70 Reverse a number mathematically

Rate my babby's first Scheme solution.
>pls no negative numbers
(define (get-digits x)
  (if (< x 10)
      (cons x '())
      (cons (modulo x 10)
            (get-digits (/ (- x (modulo x 10)) 10)))))

(define (reverse x)
  (if (null? x)
      '()
      (append (reverse (cdr x)) (list (car x)))))

(define (list-to-num x)
  (if (> (length x) 1)
      (+ (car x) (* 10 (list-to-num (cdr x))))
      (car x)))

(define (rev-num x)
  (list-to-num (reverse (get-digits x))))


So my code works as far as I've tested it, but it's far from optimal and doesn't handle every case. Despite this, how can I improve at thinking in a functional way? This problem seems far easier to solve in C++, for example, but that may just be because I'm more familiar with it.
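For comparison, the same digit-peeling idea can be written with an accumulator and no intermediate list at all; here's a recursive Python sketch (assuming non-negative input, like the Scheme version):

```python
def rev_num(n, acc=0):
    """Reverse a non-negative integer mathematically (no strings, no lists)."""
    # Peel the last digit with % 10, push it onto acc, recurse on the rest.
    return acc if n == 0 else rev_num(n // 10, acc * 10 + n % 10)

print(rev_num(1234))  # 4321
print(rev_num(70))    # 7
```

The accumulator pattern is the standard functional trick for avoiding the append-to-the-end work that the list-based version does.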

I like math books that use informal language

>What are you working on, Sup Forums?
Still working on my anime program.

It's currently 'defeated' by groups like HorribleSubs who (only sometimes) name the second season as the first, but then just continue the numbering scheme.
For example, for the Gundam one, the first episode of season 2 was released as "Mobile Suit Gundam - Iron-Blooded Orphans - 36" (my program can pick up on the English title).
Not that I actually want to watch boring Sunrise mecha garbage, but I don't want this to happen with other series I actually want to watch.

I can't really think of a way to fix this without potentially introducing a bunch of false-positives, or requiring me to do shit manually.

How do you get good at programming in general?

I know this is a broad question asked often, but I've taken a course or two on programming with experience in C++, Java, and some other basics, but I feel like I can't do anything "big." I don't necessarily have anything specific in mind, but when I see someone who knows how to navigate a computer/solve a random problem quickly and easily, I feel like I'm nowhere near that. How do I get there? I just feel like I know a bunch of tricks but no solid foundation.

Keep it simple and stupid.

How long have you been lurking here? Because your question has been answered several times. Don't wanna be rude, btw, but it just takes practice: build a couple of personal projects, start using libraries (even if you feel like you're cheating).
>but no solid foundation.
What foundations do you think you're lacking?

Sorry, I just mean I feel like I don't understand fundamentally how computers are set up to work. I get lost pretty quickly in jargon and feel a huge knowledge gap when it comes to terminology and how each piece of the computer contributes to whatever task it performs. The same goes for how networks work.

To clarify, I have a little knowledge about simple data structures like stacks, linked lists, etc. and some idea about memory (as an abstract concept; not so interested in the physics of how these are implemented in hardware), but I'm clueless when it comes to higher level things and common libraries.

Why is so much CSV data just fixed-width data with commas? Isn't the whole benefit of CSV that columns can be variable-width?

Programming has a lot of fields; you can't tackle them all. For example, networking is a field on its own, so you probably don't want to dive too deep into it, just have a general knowledge. Same with databases. I think it really depends on what you want to program: desktop apps, mobile, websites? That way you can narrow down the things that should interest you.
Libraries are basically just sets of classes and functions.

>I get lost pretty quickly in jargon
This is fair. There is nothing to understand about the infinite layers of abstraction. Focus on mathematics/physics and learn C. It's a really small and simple language that does absolutely everything you need.

>simple data structures
That's all you need to know. "Higher level" and "common libraries" are just tumors grown out of control.

like watching paint dry

>In the context of this 1970's computing style, K&RC is actually correct. As long as only trusted people run complete cohesive programs that exit and clean up all their resources then their code is fine.
>Where K&RC runs into problems is when the functions or code snippets are taken out of the book and used in other programs. Once you take many of these code snippets and try use them in some other program they fall apart. They then have blatant buffer overflows, bugs, and problems that a beginner will trip over.
>The best way to summarize the problem of K&RC "correctness" is with an example from English. Imagine if you have the pair of sentences, "Jack and Jill went up the hill. He fell down." Well, from context clues you know that "He" means Jack. However, if you have that sentence on its own it's not clear who "He" is. Now, if you put that sentence at the end of another sentence you can get an unclear pronoun reference: "Jack and Frank went up the hill. He fell down." Which "He" are we talking about in that sentence?
>-- Zed A. Shaw, "Deconstructing K&R C"
"K&R C is stupid because you can't just copy and paste from it like a total Pajeet and you have to actually do some work to reuse their code"

I'm not sure what exactly I'm looking for, but I want to have enough knowledge to feel "comfortable" digging around on my computer. I see some people who can customize their computers so well, with all kinds of macros, that it's truly theirs. Occasionally I'll run into a problem and think, "I wish I knew how to program better to fix this," but nothing specific comes to mind.

I'm actually in math, not CS. That's another motivation for learning programming to me, as sometimes I need specialized computations to understand things/research.

Zed Shaw was a big deal for a while. He seems to have been bullied into silence these days.

What's going on here? I'm using the OpenGL SuperBible, and I added the OpenGL framework so I can #include "gl.h", but despite the "gl.h" file clearly being in the OpenGL framework, the IDE doesn't find it...

Any ideas?

Should probably mention this, but neither form of the #include works.

Is his "Learn C/Python/Ruby the Hard Way" series any good? I recommended it to a friend, but I'm trying to figure out the best resource for teaching myself C and I don't like what I'm reading about him. His "Rails is a Ghetto" rant was pretty funny, though.

my background is like that, but it's hard for me to tell what you're after. there are textbooks for things like operating systems and networking; some of them are good

he might just be content with book money

>but I'm trying to figure out the best resource for teaching myself C

what IDE is that
muh dik

Xcode. Linking libraries has been a pain in my rear end.

1. Where in your filesystem is the gl folder, and why is it not in /usr/include?
2. What flags are being passed to the compiler to let it know where to find gl/gl.h?
3. Are you sure it's not GL/gl.h rather than gl/gl.h?

Thanks my famalamadingdong

One day I'd like to learn more about Operating Systems, but I guess I'm also unsure of what exactly I want. I feel like I have to first learn what the possibilities for programming are before I can dive into any one path. Do you have recommendations for that?

What can I learn from studying C if I already know C++ moderately well? What about C#?

If you know C++ moderately well, you probably already know C moderately well too, or at least well enough to read C codebases. The advantage of learning it properly would be the ability to modify C codebases.

C# is a completely different ballgame, and would be something to learn if you plan on getting into the .NET ecosystem.

>Linking libraries has been a pain in my rear end.
Maybe you should stop using an IDE and do the following:

1. Learn where the hell your compiler searches for header files and library files by default
2. Learn how to expand include and library search paths
3. Learn how to tell your compiler to link to a library in the first place

Really, it doesn't take a long time to learn these things, but it can make dealing with third party libraries a breeze if you actually know what the hell you're doing. And if it seems like a pain to be writing out long GCC and Clang commands, that's what Make and CMake are for.

>make some progress
>get stuck on some stupid trivial shit

Are there any C++ libraries that handle cross-platform GUI without using HTML? I know that they exist for C#.

my IDE's compiler assumes all headers are in /usr/include and all libs are in /usr/lib.
I use Code::Blocks, and at the top of my build log is the actual command it generates and runs:
"gcc -Wall -lSDL2 -lGL -lGLEW -lGLU -ljalb -ljalbFT2Atlas -lfreetype -g -I/usr/include/SDL2 -I/usr/include/GL -I/usr/include/freetype2 -I/usr/include/jalb -c /home/jokes/workspace/cSpace/edit/jalbViewer/main.c -o obj/Debug/main.o"
I can then go back into my IDE's compiler settings and see how it transforms what I give it into an actual command.
This helped me understand what is going on behind the scenes. Go read some compiler tutorials... do first, then ask.

In Fortran 90, how would I print a two-dimensional array row by row?

fixed it

thank fuck

There are MANY. C++ doesn't tend to do GUI stuff with HTML; you'd be thinking of Electron, which is used with JavaScript. If you're looking for a big framework that can do a bit more than just GUI stuff, can be scripted, and works on lots of platforms, use Qt. If you want a light GUI toolkit, try FLTK or wxWidgets. In between the two would be GTK+.

>make some progress
>get stuck on stuff that should be trivial but the language sucks