/dpt/ - Daily Programming Thread

Old thread. What meme/project are you working on, Sup Forums?

Other urls found in this thread:

youtube.com/watch?v=wLHl8zJxeUQ
labs.ig.com/static-typing-promise
a.pomf.cat/vbrtpq.c
github.com/JamesJDillon/CHIP-8-Emulator
xoroshiro.di.unimi.it/
software.intel.com/en-us/articles/fast-random-number-generator-on-the-intel-pentiumr-4-processor
twitter.com/NSFWRedditImage

Rewriting the Linux kernel in Haskell.

Rewriting the Haskell kernel in Linux

Rewriting the C standard in Haskell.

>[Deleted]
Sweet irony. Didn't know Sup Forums had mods, nice job.

This one is 13:37, though.

legit thread

youtube.com/watch?v=wLHl8zJxeUQ

The broken promise of static typing

labs.ig.com/static-typing-promise

Where is OCaml?

pls rate my dumb music program~
pipe the output to your sound card!

unsigned char b[], r1[], r2[];/* run: 'gcc -o pcm pcm.c && ./pcm | aplay' */
main(p,c,m){p=0,c=1;while(1){for(m=0;m2){p=0;c=!c;}}}/* a.pomf.cat/vbrtpq.c */

Rewriting yet another CHIP-8 emulator in C++. I've posted about it here before, but here's the source if anyone wants to see it.

github.com/JamesJDillon/CHIP-8-Emulator

Clojure has few bugs because nobody wrote anything more complex than hello world with it, just look at the commit count.

I already felt this was true from personal professional experience.

>experienced more issues with Java despite it being designed to be safe
>fucking Python is smooth as fuck to dev in and we end up more productive too

I don't know why /dpt/ pushes this 'dynamic is unsafe' meme

It's not lisp's dynamic typing as much as it is the sweet sweet homoiconicity that makes it easy to write bug free code.

That being said, static typing is obnoxious. Most of my issues are implementation issues, not type issues.

It's in the F# section.

What's the most rewarding algorithm you implemented with chip-8 opcodes?

F# and OCaml are clearly not the same thing.

I know. One has freedom and the other doesn't. Otherwise they're pretty fucking similar. It's like Scheme vs Racket.

Rewriting the assembly standard in python

I've not created any chip-8 roms. I don't think it would be particularly difficult, just very time consuming.

>Rewriting the assembly standard in python
>the assembly standard
>assembly standard
>the
Wat.

>That being said static typing is obnoxious.
I find it to be the complete opposite: I like static typing for the productivity gains; I don't necessarily care whether it helps reduce bugs or not.
With static types (or I guess type annotations are what I actually care about) you get immense help when refactoring, navigating code, and documenting the code.

Friendly reminder that there's nothing wrong with VLAs.

it's a variant of the "rewriting the linux kernel in haskell" meme

Shit's fun tho

There's everything wrong with them, they are 100% useless.

It's a useless meme.

If you're working on a large project, you'll be explicit about static/dynamic allocation, and not fussing about with unnecessary VLA magic

He's so wrong that he's not even wrong.

int main()
{
int n=_____
srand(n);
}


Well /dpt/. What do you do?

I saw you post your emulator the other day when I had a weird problem with writing my emulator. I didn't know whether it was a graphical glitch due to some SFML fuck-up, or an actual opcode error.

Turns out I had a brainfart, and was accidentally accessing some value of the memory array when I should've been getting it from the register array. Stupid mistake.

int n=*0
At least I won't be responsible for anyone using fucking srand() in 2016.

>stdlib rand
>any year

Seed it with the nanosecond value of a CLOCK_MONOTONIC read. If you need more than 1 billion possibilities you probably shouldn't be using rand().

Please never do anything that involves cryptography or security in any single way, ever.
This is the most terrible advice in the history of broken shit.

even for non-crypto applications, there are much nicer, more modern methods like xoroshiro128+

You're not using rand() for cryptography you idiot. If you are then you deserve death.

As such my method is perfectly fine.

Make it a constant, because I generally want a consistent series for reproducibility.

You can write your own faster and statistically normal rand() in 10 seconds, literally anything is better than the bloated shit that's part of glibc.

rand() is implementation-dependent; it could literally return 0 all the time and that would be a valid implementation.
Even for reasonably distributed random-looking numbers, you can't rely on rand().

Oh really? How?

xoroshiro.di.unimi.it/

Why doesn't glibc use xoroshiro or something then? It's a free software project; can't someone just push a change to use a better randomizer?

software.intel.com/en-us/articles/fast-random-number-generator-on-the-intel-pentiumr-4-processor

char r;
read(open("/dev/urandom", O_RDONLY), &r, 1);

I'm actually serious.

2 shifts, an add and a bit rotate is what I use for most of my demo projects.

It's probably higher quality randomness, but you're wasting thousands of processor cycles on opening a file descriptor instead of just scrambling numbers with an LCG.

int rand() {
return 4;
}

>using /dev/random unironically
>in the year 2016
I shiggedy.

int n = 0, PRIME = 961753369, genSize = 1234;

n = (n + PRIME) % genSize;
....

Will give you a perfect permutation of 0..genSize-1 as long as PRIME and genSize are relatively prime (which they are as long as genSize < PRIME, since PRIME is prime).

Well since the C standard doesn't impose any requirements about distribution or period, you couldn't rely on it anyway.

>Haven't programmed in 2 days

I don't think that's a good excuse to have shit. And you could rely on it if your platform uses exclusively glibc.

>Spent 2 days programming an application to forge my job search history so I can do even less work

get to it

Are there any benefits to typedeffing structs besides not having to write "struct" all the time?

here's some advice
on school laptops if they're using chrome dumb students will login and save their passwords

if you go to settings > manage passwords you can actually print their login info out for literally all of their accounts.

After you graduate this will be helpful if you need to do further research or frame someone for something

No, that's literally the only benefit.

If you're writing a library and that struct is integral to your library, like a context, yes, it should be a typedef.

typedefan is the same as #define'an except the compilar can use its black maagick and optimize u're code

Send a patch and see if it gets accepted :^)

C++ compatibility

glibc is dead to me since my boi Ulrich left

Here's some more advice.
On school networks where you need to enter your name/password to connect to the internet, Ethereal + Wireshark will happily give you a list of everyone's passwords.

You can also use Ethereal for replacing all images on the fly with !beecock.jpg when people browse the Internet. I had fun in boarding school.

s/Ethereal/Ettercap/

Thanks, I kind of like not typedeffing my structs, so I can immediately tell whether a struct belongs to a library or I defined it myself.
Just wanted to check if I missed a hidden benefit.

Web application framework written in an intermediate language which compiles to ZendEngine C.

const float vertices[] = {
-0.25f, -0.50f, 1.0f, 0.0f, 0.0f,
0.50f, -0.50f, 0.0f, 1.0f, 0.0f,
0.50f, 0.50f, 0.0f, 0.0f, 1.0f,
};


#version 410 core

layout(location = 0) in vec2 position;
layout(location = 1) in vec3 Color;

out vec3 color;

void main()
{
gl_Position = vec4(position, 0.0, 1.0);
color = Color;
}


#version 410 core

layout(location = 1) in vec3 color;
out vec4 Color;

void main()
{
Color = vec4(color, 1.0);
}


Why does it produce a rainbow?

OpenGL averages color by default so when you have red in 1 vertex, blue in another, and green in the last, you get a motherfuckin' rainbow.

Can somebody explain this xor rotate shift rotate thing to me?

my school has very up-to-date network security sadly so I wouldn't risk it

oops, average is the incorrect term. bilinear interpolation is a better term

OpenGL is ran by sjw jewish multinationals who want to control the whitey by pushing homosexuality, abortion and higher taxes

VOTE TRUMP
MAKE AMERICA GREAT AGAIN

Because if you want 3 solid bars you need 3 solid bars.

If you change your MAC, there's literally no way to catch you ARP-poisoning the local router.
Easy way to have some fun watching people reload madly while their facebook feed fills with beecocks.

>not forcing HTTPS

Oh yeah Facebook probably 301's to HTTPS nowadays, but it used to be that you could just SSLStrip it.

varying/out variables get linearly interpolated between vertices

I finished my pastebin editor, my thread scraper, and my intermediate framework

Now all I need to do is pipe the inputs between each module properly and sanitize everything. Made a lot of progress actually.

I wonder if I could make money undercutting Sup Forums passes by selling a browser addon that uses commercial Captcha-solving websites.

don't be a jerk, Sup Forums needs money

But sigourney literally abandoned us and ran to mootxico, what's the point?
I'm sure hiro can pay for servers.

Someone gave me some snippets for an array system in C a while ago, and I've been working on something that lets me queue SDL_Textures into an array.
//array.h
void *array_add ( Array *arr );
#define array_add_val(type, arr) (*((type *) array_add(arr)))
//array.c
void *array_add ( Array *arr )
{
if ( ++arr->length > arr->capacity )
arr->buf = realloc( arr->buf,
arr->element_size * (arr->capacity = arr->capacity * 2) );
return (char *) arr->buf + arr->element_size * (arr->length - 1);
}
//texture.h
typedef struct
{
unsigned int ID;
SDL_Texture *txr;
unsigned int w;
unsigned int h;
} TxrArr;

Array *G_txr;
//texture.c
void texture_add ( char *filename )
{
SDL_Texture *t = NULL;
t = textureLoad( filename );

unsigned int ID = texture_generate_id();

unsigned int w = getTextureW(t);
unsigned int h = getTextureH(t);

array_add_val(TxrArr, G_txr) = (TxrArr) { ID, t, w, h };
}
I didn't post everything but I will post more if you need it.

my getTexture functions just run SDL_QueryTexture and return the value; they work. BUT they don't store into the texture array entry that was just made. A workaround is to add this to the bottom of the texture_add function:
TxrArr *p = array_get ( G_txr, ID );

p->w = w;
p->h = h;

p = NULL;
It seems unnecessary, but it's the only way I can actually force those values to be set lol.
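
For context, the unposted parts presumably look roughly like this; the Array layout and array_get here are my guess at the missing code, not the poster's actual definitions:

```c
#include <stdlib.h>

/* guessed layout matching the array_add snippet above */
typedef struct {
    void  *buf;          /* backing storage */
    size_t element_size; /* bytes per element */
    size_t length;       /* elements in use */
    size_t capacity;     /* elements allocated */
} Array;

/* hypothetical lookup: index straight into the buffer */
void *array_get ( Array *arr, size_t i )
{
    return (char *)arr->buf + arr->element_size * i;
}
```

If texture IDs aren't the same thing as array indices, array_get would also need to search by ID instead of indexing directly.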

we solve ridiculously many captchas, i don't think you could undercut it

Hey what about you stop reposting the same crap every thread and go annoy people on StackOverflow?

Hey gee, are pointers signed or unsigned in C?

this is only repost #1, if nobody is interested in seeing after a thread or so then i'll leave it be

Unsigned.

I prefer the term 'autographed'

malloc
a
l
l
o
c

your mom

get a load of this webshit!

Pointers are just a bunch of bits. Asking whether they're signed is nonsense.

Pointers are pointers. They are neither signed nor unsigned. They are not numbers; they point to addresses in memory. You can interpret them as signed or unsigned numbers, but it just doesn't matter.

>integers are just a bunch of bits
>asking whether they're signed is like nonsense
user...

Well, a captcha solve goes for around $0.001; assuming you make 5,000 posts per year, which is about 96 posts per week, it'll cost 5 bucks a year. A Sup Forums pass is $20 a year.

If you don't post much more than a hundred times a week, which is probably the majority of people, I can cut the price in 4 AND you don't lose it if you get b&.

For fun, I'm finally getting through the python koans and working through a haskell book.

everything is bits you moron

what the hell are you on about?
A pointer is an address in memory.
It does not make sense to cast it to a signed integer, but you can do whatever with the value.

How the fuck does this shit even work?
uint64_t s[2];

static inline uint64_t rotl(const uint64_t x, int k) {
return (x << k) | (x >> (64 - k));
}

uint64_t next(void) {
const uint64_t s0 = s[0];
uint64_t s1 = s[1];
const uint64_t result = s0 + s1;

s1 ^= s0;
s[0] = rotl(s0, 55) ^ s1 ^ (s1 << 14);
s[1] = rotl(s1, 36);

return result;
}

>integers are in any way equivalent to pointers

baka, user.

they are both abstractions of bits, so your answer is unsatisfactory