/dpt/ - Daily Programming Thread

Old thread: What are you working on Sup Forums?

Other urls found in this thread:

better-dpt-roll.github.io/
sukritkalra94.wordpress.com/2014/05/26/sicp-exercise-1-8-newtons-method-for-cube-roots/
github.com/meganz/mingw-std-threads
waifu2x.udp.jp/
buildyourownlisp.com/
ahefner.livejournal.com/20528.html

C a shit

Second for D

You have been visited by the crossdressing C programmer of Shimoshina Academy!

Good performance, triple indirection, and tail call optimization will come to you, but only if you post "Keep overflowing the stack, Hime!" in this thread.

Hi. Where the heck do I get the Sup Forums recommended programming reading picture?!? I want to learn!!!

Don't pollute a fresh thread with your trash.
Also, triple indirection isn't a good thing.

What if you need to pass an array of char arrays by reference?

7th for Lua

You only need double indirection for that.
Beyond double indirection, you should seriously start questioning the design of your program.

Angular 2 based web app for business productivity software startup.

I'm an idiot for trying to use a beta version of a JS framework to build an app on. It's been a fucking nightmare the past couple months. They keep changing up core parts of the framework, and I have to rewrite shit. It just demotivated me and I'm not even working on the frontend right now.

Selenium tool that tests new builds of ASP.NET applications with different browsers and chains together complicated tests

Why do developers tend to avoid comments in functions and in their source code in general? I've seen that even in semi-large projects that are good and usable.
I wonder how they manage to remember what their functions/classes do.

Wiki

WHAT THE FUCK DO I PROGRAM!! I HAVE NO IDEAS!!! I'M CREATIVELY DEAD!!!

better-dpt-roll.github.io/

Fucking google "programming project ideas," you soulless cunt.

You're a carpenter wandering around in a goddamn lumberyard, bitching for something to do. Disgusting.

threadly reminder that you have to program every day or you'll lose your powers.

Hey Sup Forums, about a week ago I started to program for the first time in my life (I picked Haskell), and I was wondering: when is the optimal moment to start learning about data structures/algorithms and all that stuff?

job security

From the beginning.

Do you have any material recommendations?

Working on an automated manga redrawer using machine learning. Spent a while slogging through research papers and scraping data, but I hope to have a working application before 2017.

Pic related: some data I collected.

Nice, when did you start on this project?

w-what?
Looks cute, but I don't see the purpose.

Any programming book worth its salt will talk about data structures as well. If you're just beginning, then don't worry about getting a book specifically about data structures until you finish the book about your chosen language.

Early June.
Scanlating manga involves erasing the original Japanese text, which can destroy a lot of the artwork. The application's purpose is to automatically draw in the erased areas so everything looks right.

The same tech has applications elsewhere. [spoiler] Like uncensoring hentai[/spoiler]

Best way to check if a certain key combination is pressed in Python? Like if I want to terminate the program when Ctrl+S is pressed, what would be the best way?

>Scanlating
lol just learn Japanese already.

I work on my school project.

I need to make a truck manager program with Windows Forms. The teacher sent us some basic UML diagrams to help us get started, but we have to "contribute" to the diagrams or we get fucked. I am done with that though, I just need to make the code nicer and correct some mistakes.

I am having fun with this project.

I will post a webm when it is done.

If that could help to get my rustle doujins uncensored faster, then it's a great application.

Why not WPF?

The next project will be made in WPF. I want to "impress" the normies with some fancy animations in the class next time.

That actually sounds pretty useful assuming it can work consistently.

>lol just learn Japanese already.
I keep meaning to do that and end up starting/restarting because I usually can't remember a lot of kana, let alone more than a handful of kanji or actual words.

That's a pretty handy tool you're working on there.

Don't have time for that when you're a starving grad student senpai.
Support for RGB pictures will come much later once I get it to work for monochrome pics

Can anyone explain IDE wars? I don't get it.

Waiting for feedback from my last chance at a job. Literally don't know what I'd do if I don't get it.

If nobody but them knows what their code does, only they can maintain it, making them a valuable asset to the company.

I'm a new programmer currently working on a Chess program in Java. Please tell me how retarded this implementation is.

Space class. Children of this class are the pieces. The only difference between the children and this class is that they have a different default state. Squares on the board are changed by setting the string variable "state" to the name and color of the piece.

public class Space {
    private String state;
    private String color;

    public Space() {
        state = "blank";
        color = null;
    }

    // getters that RuleChecker calls
    public String getState() { return state; }
    public String getColor() { return color; }
}


Then we have this object, which takes in the game board as its only parameter. It then has many methods to check whether a move is valid, which I have not created yet.

public class RuleChecker {
    public Space[][] board = new Space[8][8];

    public RuleChecker(Space[][] rboard) {
        board = rboard;
    }

    public boolean isMoveValid(int x_start, int y_start, int x_end, int y_end) {
        String start = board[x_start][y_start].getState();
        String end = board[x_end][y_end].getState(); // was reading the start square twice
        return true; // TODO: actual rule checks
    }
}

Is this a shit idea for implementing a chess board, or is it good enough?

Now I get it, seems like a dirty trick but necessary. ty

Are there any offline interpreters, compilers, and/or IDEs for non-shit languages (acceptable languages include Scheme, C, C++, C#, Python, and Clojure) I can download on a jailbroken iOS device or unrooted Android device to program on the bus?
>inb4 use a laptop
Laptop has shit-tier battery life and I'm too poor to afford a new one at the moment.

>Programming in public
>Programming on a mobile device
>Programming without a keyboard
Are you serious? Do you really think you will be able to accomplish anything like that?

Edgy faggot

>C still doesn't have a standard half float implementation

Who the fuck thought this was a good idea?

There are talks of adding "short float" to the next standard.

Why is SQL so comfy?

>C programmers
Why do people use C if they're not writing a systems program?

I'll be able to learn a bit, fuck around in scheme and read some SICP. I also have an old Motorola droid phone I could use.

It's probably the worst possible programming environment you could impose on yourself.
Seriously, don't even bother.

Why wouldn't I use C?
Who says that I'm not writing a "systems program"?

Any programs I should download for programming in C besides Vim and GCC?

sicp 1.8

sukritkalra94.wordpress.com/2014/05/26/sicp-exercise-1-8-newtons-method-for-cube-roots/

is this pajeet totally stupid or am i?

>Why do people use C if they're not writing a systems program?

Because they are 1337

GNU Make
Valgrind

Thanks pal.

What's wrong? They used two accumulators?

+gdb

It's you.

why the frickin frick does this not work?

someStruct.member1 = &someStruct.member2;


member1 is a pointer to the type of member2

show complete code

>two accumulators
enlighten me

>whats wrong?
i was reading it in a public setting :/
in reality what i got was nearly identical to his:
(define (good-enough? guess x)
  (< (abs (- (improve guess x)
             guess))
     (epsilon x)))

(define (epsilon x)
  (* x .0001))
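For anyone following along without a Scheme handy, here's the same relative-epsilon trick in Python. Function names mirror the snippet; `improve` is the standard Newton cube-root step from the exercise, and I added `abs()` to the tolerance so negative inputs work:

```python
def improve(guess, x):
    # One Newton's-method step for the cube root of x: (x/y^2 + 2y) / 3
    return (x / (guess * guess) + 2 * guess) / 3

def epsilon(x):
    # Relative tolerance, as in the snippet above; abs() keeps it
    # positive when x is negative
    return 0.0001 * abs(x)

def good_enough(guess, x):
    return abs(improve(guess, x) - guess) < epsilon(x)

def cube_root(x, guess=1.0):
    while not good_enough(guess, x):
        guess = improve(guess, x)
    return improve(guess, x)
```

Scaling the tolerance by x is the whole point of the exercise: a fixed epsilon is too coarse for tiny inputs and unreachable for huge ones.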


it tends to be

I'm working on a threaded SDL2 application; it works on Linux, but I would also like to compile it on Windows.

Should I use std::thread or SDL's threads? Couldn't find an answer to this by Googling, but I heard thread support on MinGW wasn't very good. If possible, I would like to be able to compile the same code on both systems without too many system specific macros and what not.

>5000 lines of Python code written over 3 months
>need to convert to C++ in less than a week
Fucking kill me now. I don't even feel like I'm writing C++. It's more like I'm writing a compiler for Python.

std::thread, no question
thread support will be just as bad on MinGW no matter which one you choose, but you'll want to compile with VC++ on windows for any "serious" program anyway

Is that some racket?

GCC supports threading from 5.1.0 as far as I remember; if you can't afford that, then just go for SDL, it'll work out of the box on almost every major platform.

Doesn't MS's compiler now inject some sort of call-home stuff into all binaries? It's not a huge program, so I would prefer not to install VC++ just for it, and I've already used MinGW.

Maybe this library would fix the issue? github.com/meganz/mingw-std-threads

I can use any version as long as it's available, so I suppose I'll go for that. Thanks!

>Doesn't MS's compiler now inject some sort of call-home stuff into all binaries?

Think they just got rid of the telemetry stuff in the last update of Visual Studio.

more or less
im running drracket using
#lang planet neil/sicp

that said i imagine that snippet would run in any number of parentheses-laden languages

>half floats
Who the fuck thought that was a good idea?

/dpt/ used to have an IRC channel with tripfags and shit. Some guy was making an OS.

Is it still out there?

Debuggers are a meme.

What is a manga redrawer?

i think he basically means one of these:
waifu2x.udp.jp/

How do I make my own programming language? All the ones I learned suck. I want to make a good one.

buildyourownlisp.com/

Designing and implementing a programming language is actually quite difficult, you know.
Anyway, in essence, all you need to do is to write a compiler or interpreter for your language, but there are many steps to that.
Read the dragon book.

Forgot to mention: no memes.

Not the guy you're responding to, but does the book get better?
I never got past tokens; I just felt like he was spewing a whole lot of needlessly complicated bullshit for something that is actually quite simple.

If you want to write a compiler for something and have it not be total trash, you probably want to read the literature.
Computer scientists a lot better than you have been developing those compiler techniques.

I don't need to start from scratch. I want to start from Scheme. It's the almost perfect programming language.

I'm just not sure what to do or where to start. I've read all these fucking papers, had some deep realizations but I still feel like I haven't learned shit. These fuckers keep talking about abstract shit but they never quite tell me how they did it, much less how to do it.

I just want to have a fucking system bootstrapped.

Lexical analysis really is a simple concept. Want me to explain it to you?

Papers like pic related? Yeah right.

What the fuck are these magic runes.

>Want me to explain it to you?
Yes please sensei.

You split a meaningless input string into meaningful chunks (lexemes) so that you can parse them more easily.
The end.

I already understood that.
If anything I need help with code generation and (I think) ASTs.

why is python so weird about referencing variables in functions

like I try to read a variable without passing it into my function and that fails, so I made a function to set a global and that fails too because 'it hasn't been set yet' (even though it has). and booleans don't work at all for that
stupid

>so I set a function to make a global
Like how? (Pseudo) define the variable globally and then change it in the function?

You have a stream of stuff.

In 95% of cases, the stuff is just characters. In 4% of cases, the stuff is raw bytes. The 1% is there because really you can tokenize anything.

In the stuff stream, chaos reigns. As far as anybody can tell, it's a bunch of random bits, in dire need of structure. So, what do we do with them? We lexically analyse them.

One by one, we take stuff from the stream. Let's call them things.

We compare them against what we expect will be in the stream. These expectations are higher abstractions that you define. For example, an identifier.

If it's successful, we have a partial match against one specific kind of abstraction. Once we eat up enough things to fully match that abstraction, we have a result. A token. A lexeme. The basic unit of parsing.

Lexical analysis reduces the complexity of the input stream. It digests a lot of characters and returns one token with the string inside. If the lexer expects an identifier matching [a-zA-Z]+, when given asdfg it will match the a, consume the sdf, finish at the g, and return a single token. It drastically reduces the size of the input to the parser proper.

Notice that it maps a stream of characters into a stream of tokens. The structure is still linear, regular. It stops being linear when the parser runs. The context free grammar naturally produces output of higher complexity.

Code generation maps this high complexity parser output back into a low complexity, linear form. Byte code, assembly code... What is it if not a linear stream of bytes?
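A toy version of the above in Python, using regexes as the "expectations"; the token classes here are invented for the example:

```python
import re

# Each pair is (token kind, pattern the lexer expects in the stream)
TOKEN_SPEC = [
    ("NUM",   r"\d+"),
    ("IDENT", r"[a-zA-Z]+"),
    ("OP",    r"[+\-*/=]"),
    ("SKIP",  r"\s+"),
]
MASTER = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def lex(source):
    # Map a linear stream of characters to a linear stream of tokens
    tokens = []
    for match in MASTER.finditer(source):
        if match.lastgroup != "SKIP":  # whitespace carries no meaning here
            tokens.append((match.lastgroup, match.group()))
    return tokens
```

lex("x = asdfg + 42") digests asdfg into the single IDENT token described above, and hands the parser five tokens instead of fourteen characters.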

It's been days and I still can't even begin to decipher this shit.

Can you go into more detail about code generation and specifically register allocation?

Uh, wouldn't it be better to have the pieces calculate the squares to which they are allowed to move than to have a singleton class for checking rules? Like a piece.getMoves() method?

Movement restrictions are not just determined by the piece; they are determined by the state of the board. That is my logic.

Read the dragon book.
It goes into that.

You could pass information about the board state to the method, or you could get the possible moves from the piece and then have the board class determine if these moves are illegal (due to blocking, moving off the board, whatever). A huge global controller class is best avoided when designing a program, I've heard.
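Rough Python sketch of that split, with the piece proposing squares and the board filtering them (the board representation and names are made up for illustration; capture handling omitted):

```python
EMPTY = "."

def rook_rays(x, y):
    # The piece's own knowledge: squares reachable on an empty 8x8 board,
    # grouped into the four rays a rook slides along
    rays = []
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        ray, nx, ny = [], x + dx, y + dy
        while 0 <= nx < 8 and 0 <= ny < 8:
            ray.append((nx, ny))
            nx, ny = nx + dx, ny + dy
        rays.append(ray)
    return rays

def legal_rook_moves(board, x, y):
    # The board's knowledge: trim each ray at the first occupied square
    moves = []
    for ray in rook_rays(x, y):
        for nx, ny in ray:
            if board[nx][ny] != EMPTY:
                break  # blocked; capture logic omitted
            moves.append((nx, ny))
    return moves
```

Neither half is a global controller: the piece knows only its movement pattern, the board knows only occupancy.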

yes
works in other languages, you know

you can have entire static classes for this sort of thing

I've just been using enumerate() as a very hackneyed way around it for now

One of the most based examples of code generation is this assembler for the NES CPU:

ahefner.livejournal.com/20528.html

It's a nice read. Abstraction by abstraction, he synthesizes higher level constructs on top of the assembly code. It's as if he was working his way bottom-up from the assembly to the abstract code of the AST in his head.

Code generation does the opposite. You have a complex AST. You want to compile it down to a linear output vector.

The trick is that all those higher level AST nodes compile down to some lower level node in some defined way. You take some nodes and generate some output. You take a C function node and a __fastcall annotation node and you decide you're going to use fast call ABI calling convention. You proceed to recursively compile the rest of the function, but you've gotten rid of that fastcall node. Now it's not two __fastcall + function nodes anymore. It's just one fast-call-function node. You compile that node and you find a lot more that become simplified into lower level constructs until it's just one single whole bunch of bits.

And just like that, a procedure eventually gets mapped into some architecture's assembly code according to some calling convention.

Register allocation really matters when you want to optimize the shit. We say there's pressure on the registers when there are more variables than registers. They get written onto the stack to make room for new variables and get restored later, as if a function had been called without a jump. Slower but hey it works.

So now we have the problem of deciding which variables get mapped to which registers. I don't know much about this. Apparently it reduces to graph coloring, which is NP-complete: doable for static compilation, but it sucks for JIT because it's slow.
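The "tree in, linear stream out" part is easy to see with a toy example: compile an expression AST down to stack-machine instructions, then run the flat instruction list. Node shapes and opcodes here are invented for illustration:

```python
def compile_expr(node, out):
    # Post-order walk: children first, then the operator, so the
    # linear program leaves both operands on the stack when the op runs
    if isinstance(node, (int, float)):
        out.append(("PUSH", node))
    else:
        op, left, right = node  # e.g. ("+", 1, ("*", 2, 3))
        compile_expr(left, out)
        compile_expr(right, out)
        out.append((op, None))
    return out

def run(program):
    # The "machine": consumes the linear output vector
    stack = []
    ops = {"+": lambda a, b: a + b, "*": lambda a, b: a * b}
    for opcode, arg in program:
        if opcode == "PUSH":
            stack.append(arg)
        else:
            b, a = stack.pop(), stack.pop()
            stack.append(ops[opcode](a, b))
    return stack[0]
```

The nested ("+", 1, ("*", 2, 3)) node vanishes into five flat instructions; that flattening is all code generation is, before you start worrying about registers.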

What is wrong with this insertion sort? Code is not working.

public static int[] inSort(int[] array) {
    for (int j = 1; j < array.length; j++) { // was array.length-1, which skipped the last element
        int key = array[j];
        int i = j - 1;
        while (i >= 0 && array[i] > key) { // was i > 0, which never compared against index 0
            array[i + 1] = array[i]; // shift the larger element right
            i = i - 1;
        }
        array[i + 1] = key; // insert key once, after the shifting loop
    }
    return array;
}

>C++

Jesus christ, how horrifying.

...

Why is Prelude so shit?

Keep overflowing the stack, Hime!

this kills the programmer