/dpt/ - Daily Programming Thread

What are you working on, Sup Forums?

Previous thread:

Attached: 1519852336101.jpg (900x900, 59K)

Other urls found in this thread:

github.com/Valkryst/Schillsaver
sqlitebrowser.org/
stackoverflow.com/questions/22256124/cannot-create-a-database-table-named-user-in-postgresql

First for D

does it have non-nullable references yet

How hard is it to remove data from a database? Like a lot of it, and not manually.
In this course I'm supposed to design a database of my choosing, create it in some DBMS, and add a few sample entries manually.
Thinking of just grabbing a sample database online and trimming it down to ~50 entries instead of thousands (if I leave thousands, the professor will know I didn't do it myself), since the structure is pretty much the same.
How would I do this in, say, PostgreSQL? Thinking of using that movie rental sample DB on postgresqltutorial.

Attached: 1521436110940.jpg (409x409, 89K)

fourth for Lua and OpenResty

LISP LISP LISP LISP

There was talk, but it went nowhere

Pls help.

I want to make an AI which controls a helicopter. Is there a decent 3D game which can run on Linux or Wine in windowed mode? Ravenfield was my choice, but it crashes when windowed.

Julia is better.

say
>Is there a decent 3D game which can run on Linux or Wine in windowed mode? Ravenfield was my choice, but it crashes when windowed.
>have Ravenfield.pgn
>get results

Sup Forums is not the place for this question.

>postgresql
If it's designed properly, the effort it would take to figure out how to trim the data is about what it would take to write it on your own.

Go make something in SQLite then look at the file in a hex editor. It should be easy enough for you to figure out what you need to do.

Reviewing the code for Schillsaver for bugs and reading up on OpenGL

github.com/Valkryst/Schillsaver

why can't constructors promise objects

why do I have to resort to fuckery such as

Promise.all(promises).then(() => this);

Honestly, you could do your entire assignment in an hour tops... if you've paid attention in class

Spring data drop-create mode

SQLite + sqlitebrowser.org/, my friend.

related: to anyone who may know

why is postgres "in" when you can't even call a table "user"?

most ridiculous shit ever.

Just design the damn database.
You could have finished by the time I responded.
Designing a database that will do the job is the easy part. A monkey could do it.
The hard part is everything that comes after.

create table "user" (id int not null);
??

>using singular nouns for your table names

>Scheme can't do this, definitions local to a let can't be used outside.
You're right. Scheme can't do that and for good reason. But it can do way, way better. From SICP:
(define (make-account balance)
  (define (withdraw amount)
    (if (>= balance amount)
        (begin (set! balance (- balance amount))
               balance)
        "Insufficient funds"))
  (define (deposit amount)
    (set! balance (+ balance amount))
    balance)
  (define (dispatch m)
    (cond ((eq? m 'withdraw) withdraw)
          ((eq? m 'deposit) deposit)
          (else (error "Unknown request -- MAKE-ACCOUNT" m))))
  dispatch)


If you wanted exact replication of the behavior in your hacky C++ommon Lisp code, it's a little more work, but:
(define count #f) (define increase-counter #f) (define decrease-counter #f)
(let ((counter 0))
  (set! count (lambda () counter))
  (set! increase-counter (lambda amt (set! counter (+ counter (if (pair? amt) (car amt) 1)))))
  (set! decrease-counter (lambda amt (set! counter (- counter (if (pair? amt) (car amt) 1))))))

>complaining about not being able to CAR a null list
Yuck. In most situations that's a bad thing anyway, and it would be better for your program to crash on the spot instead of a few scopes later.

I'd rather my program die from trying to cadr the return of a member call than a billion calls later when that null is passed to some FFI'd C function. And when it's useful for CAR to return null, I'd rather use a special CAR for the job than for that to be default behavior.

Math is just signal processing. "Computation", and the machinery thereof, is just orderly signal processing via the underlying logic that drives the universe.

You can call it math, you can call it machinery, you can call it signals. None of these things are a complete and direct description of the absolute basis of nature.

Put in other terms, there are plenty of programs and problem domains that will never involve math beyond numbering addresses and offset calculation, which again occur via chains of logic gates that are simply signal manipulators. Bluntly, I've found more often than not that people who think computers are "math machines" make poor programmers. They are incredibly averse to accepting the system for what it actually is, and fail to use it efficiently. Whether it's an x86 computer, vacuum tubes, or a bunch of tubs that divert flowing water around (as existed in ancient China), the system must be viewed natively, not as a means to do whatever your present conception of "math" is, with results narrowly interpreted accordingly.

thank you.

Working on babby's first webgl game. Hoping that I'll be a billionaire like notch in a month or two.

forums.tigsource.com/index.php?topic=6273
always a fave

>Math is just signal processing
>Math culminates in calculus, which I only used in my signal processing class
Tell me the signal in the twin primes conjecture.

Simpler times

Attached: Screenshot_2018-03-19_20-46-35.png (865x591, 262K)

who gives a fuck jesus

fucking nerds

stackoverflow.com/questions/22256124/cannot-create-a-database-table-named-user-in-postgresql

I didn't make the table, I just had to identify the fuckup because the original dev was crying in a corner.

cute

p(n) - p(n-1) = 2

are you retarded?

Attached: nerd.png (500x521, 116K)

>an equation is a fucking signal
You won't be able to impress girls with that when you talk about your EE job user.

well, not quite what i meant, but having a mathematics wank-off on a french inflation fetish bbs is a bit silly. is all.

Point out where "math" has been observed, apart from signalling. Signals appear to be made of math, but there is never any math without signals.

Doesn't matter whether it's a bunch of transistors and memory cells, a collection of biological cells moving ions around and generating solitons, or a bag of marbles bouncing around. All state is created via signals. There is no change that is not signals. There is nothing without signals.

though i do i like this gimmick

>Signals appear to be made of math
A beautiful pasta. I'm gonna save this one.

Rob pike is that you?

how many brilliant creative writers has the world lost to programming. how many Something Awful front page writers never even got heard due to their ill-chosen profession. woe is us

literature has no value

How many shitty bloggers dump their mental diarrhea on the internet as if it was a public bath.

dumb philistine

what personal project / work project did you work on where you took that huge leap in your skills? I'm a junior developer and I still feel as if I'm not much better than I was when I was just starting out. Is it a slow development? Or do I need to force myself to do hard shit?

found the autist

sicp

Breadth is as important as depth
Do new shit
Well, they're not that different in practice

never experienced any huge leaps myself. its a grind but i havent exactly struggled either so its fine.

>Or do I need to force myself to do hard shit?
Different shit. More shit. Same shit from other angles. Try to find better approaches to the same shit, at a number of scales.

I learned to program in batch. Bad habits were formed rapidly, but it was also pretty helpful. You're forced to invent the basic facilities used in other languages to do anything nontrivial, e.g. loops, a standardized calling convention, ring buffers, and other aspects of structural organization to respond to the demands of increasing scale.

There are times when you don't actually see what you're doing, why you're doing, how you're doing, for what it rightly is. That's the problem with programming for a living, anything used constantly will be prone to write itself into a corner and then stagnate. Only in the absence of constant demand does the risk of change diminish to manageable levels.

The project has to be something you care about personally. I have a png optimizing program, and a lossless compression algorithm I work on occasionally. It can also be useful to slowly gut an existing project and refine it to your own needs.

damn fine post

>C does not allow you to declare a variable anywhere. Gnu C does. But normally, you declare variables at the beginning of a compound statement.
what did he mean by this? you can't do
int i;
i = 0;

?

most programmers feel like they suck at it, it's completely normal. the programmers that actually think they're good at it are often among the worst. computer science is a very broad field and it overlaps many other fields of study. realizing there is more to know than you could ever pack into your head in a lifetime is the first step towards becoming a competent developer. it takes thousands of hours to become fluent in even one of the simpler spoken languages, and they don't involve math, data structures, scheduling, etc. don't expect to get good in a short year or two. it's more important that you enjoy it early on because if you don't you won't be able to stick with it long enough to get good. do what's fun for you and don't worry about your power level according to spergs on Sup Forums. it's definitely a slow development.

it seems the only imageboard with any modicum of activity is Sup Forums.
Why is this?

In C89, you can't do this:
printf("Russia is a wonderful country with beautiful angry women\n");
int i = 0;

inertia

yep. and it sucks because one should always declare variables close to where they are first used.

so you're saying the board has died in activity?

It means you can't do...
int var = (int i = 0);
...even though assignment returns the value assigned, a declaration on the right-hand side isn't allowed.
GNU C allows for things like...
for (int i = 0; i < max_i; i++)
...although that may have been added in C99 or something. I wouldn't know, I still only use ANSI C.

Very few reasons to use a different one when Sup Forums exists. The few other major imageboards are generally language-specific (e.g. 2ch, Krautchan, Ylilauta) or otherwise of special interest (420chan for drugs, the chan that shall not be named for custom boards and self-moderation).

i've only been to the chan that shall not be named, how and why would anyone in their right mind do 420chan?
Also 69chan when?

being literally, and not even figuratively, retarded.

Attached: sognol.png (365x251, 12K)

This is good and kind advice

>69chan
whats this?

Attached: OwO.png (593x89, 3K)

intelliJ or Eclipse?

>I wouldn't know, I still only use ANSI C.
"ANSI C" can refer to either C89 or C99, as well as any other C dialect that's been published as an ANSI standard

Drug talk. It's something that's surprisingly hard to do on Sup Forums, because teetotalers, junkies, and schizos (all shitposters) vastly outnumber well-adjusted psychonauts who actually have original and reasonable thoughts to contribute.
>inb4 fedora

>huge leap in your skills
I dunno about skills but when I started using git and reviewing my old code constantly my style, workflow, and project structure became worlds better.

FUCKING FEDORA HOW DARE YOU ACT LIKE UR BETTER THAN ME OR ANYONE WEE'RE ALL THE SAME THE OPNLY ADJUSTED YOU ARE IS ADJUSTED TO BEEING A PRETENTIOUS PSEUDOINTELLECTUAL JUNKY DEGENERATE.

OUT OUT OUT!!!! FEDORA OUT!!!

What's the final consensus on UI libraries (not frameworks) in sepples?

Attached: 9248939829348923.jpg (343x543, 16K)

>"hey, I should learn shaders to make an intro"

What a mistake. OpenGL is fucking disgusting, and everything shader-related is all hacks and Frankenstein's code. Unholy to say the least.

Attached: 0b3.jpg (600x449, 26K)

I'm sorry you're a brainlet

Just use pure win32 you queer

I've been getting minecraft recommendations lately on youtube. Made me wanna work on the clone. Partial implementation of block rotation.

Nah shaders are sweet. It's a little verbose to set up vertex attributes but you do it once and it's smooth sailing.

Attached: rotate.webm (800x608, 2.3M)

static void fizzbuzz_helper(int n) {
    int p = !!(n % 3) | (!!(n % 5) << 1); /* bit 0: n not divisible by 3; bit 1: n not divisible by 5 */
    p == 3 ? printf("%d\n", n) : printf("%s\n", (const char *[]){ "FizzBuzz", "Buzz", "Fizz" }[p]);
}

what the fuck is this

fib suzz

Anyone here worked on games before? Not that I'm going to, but I find some things intriguing, so maybe someone could clarify this for me. In some multiplayer games there are weapons that constantly deal damage every fraction of a second while they are being fired. How is the networking handled on these things?
I can't believe they reliably send packets with damage information over the network every 15 ms or so. Some of these things even work without central dedicated servers and can be played on player-hosted servers.

Is there any trick to these things? Perhaps only sending data every few ticks, with the clients simulating that the damage is constant?
I know that MMOs send attacks like those in ticks, and you may even have just one tick per second or so. But in shooters the damage feels continuous

the fact that shaders are now used for animation is the reason i gave up on opengl. i hated the idea of passing VBOs through shaders to transform/translate objects to begin with, but forcing me to animate skeletons in the same manner is beyond regular object-oriented fuckery, it's heresy. opengl is a frankensteined nightmare of mistrust of developers and disregard for the process it claims to abstract away.

Client side prediction. The server and client both run the simulation, but the server's version is authoritative (the client only sends input) and the client corrects itself with data from the server.

>Perhaps only sending data every few ticks but the clients simulating that the damage is constant?

how about the revolutionary idea that the whole thing is just an animation

the client only needs to know that the effect has been applied successfully; it doesn't need updates on something it already knows. try pulling the ethernet cord while in game and see if the DoTs still keep coming. my bet is that in a lot of games, they will continue for at least half a second.

you also underestimate how much information a modern internet connection can handle.

Right, that sounds believable. I knew there was something to it

make an in game rubik's cube

So that's it then, the clients simulate the effects and correct themselves if there's any discrepancy from the server. It makes a lot of sense but it must be a pain to program

Theoretically? Trying to write a shitty compiler for a Pascal like language targeting C. I'm actually just configuring vim though.

Actually, you do spam the server with packets because of packet loss. But as said, client-side prediction. The client performs a guesstimate of what the other players are doing and updates with new information from the server. You can notice this in WoW if you lose connection, because everyone who was moving starts running in place.

Don't assume a flamethrower actually updates damage even every 0.1 seconds. You can give the illusion of continuous damage with interpolation on the health bar, and you can hide the choppiness with wind-up/wind-down animations.

>!!
what the fuck

What the fuck are you on about? Nothing is forcing you to do animation & physics on the GPU. I know it might be difficult for brainlets to understand, but hardware specifically designed for doing lots of parallel number-crunching is good at doing lots of parallel number-crunching.

I keep feeling overwhelmed when I try to write something and get a mental block. I know I should just work bit by bit, but the block feels too big and makes me lazy

Best resource for learning opengl?

>you will never go back

Attached: 904138412348123.jpg (588x526, 38K)

>animate client side
>write new matrices to uniforms per limb call or whatever
How's that so hard?

opengl superbible

f = () => setTimeout(() => { incrementDOT(); if (helloserverfromtheotherside.arewestillinsync) f(); }, 250)

start with learnopengl.com, continue with superbible/red book

thanks

>overwhelmed
>mental block
>insurmountable
>feel lazy

Wireless devices. Your brain activity is being goofed all up, along with other low level things concerning metabolic activity, transductive coupling of hormones and endogenous fields, and increased free radical generation.

Everything in programming is a series of tiny steps. Even the most complex problems, at their core, are just 3-4 statements in sequence.
Start with the very first thing. To do X, what is the very first thing you need?
If it's a big thing, break it down. What's the part of that thing you need? Prototype it, just insert mock static data for now and flesh it out.
If it's a big thing that you need all of, go back up the chain and make that big thing.
If that big thing is really really big, start small. What's the absolute most basic part? Make that first.

I had to build an RPC identity server with TLS encryption and multi-threading for a class. Took me about 6 hours because I just started multiple mini-projects for each component.
How do I get TLS? Like this.
How do I use RPC? Like this.
How do I thread it? Like this.
Combine. Voilà.

Is D the ultimate language?
Seems pretty cool.

>making up this much bullshit
wew lad

Why do you think that?

It's almost like you don't want to brain good.