/dpt/ - Daily Programming Thread

What are you working on, Sup Forums?

Previous thread:

Attached: computer-girls11.png (676x956, 1.34M)

Other urls found in this thread:

docwiki.embarcadero.com/RADStudio/Tokyo/en/String_Types_(Delphi)
youtube.com/watch?v=Dp6cAi7lEOM
ziglang.org/
github.com/yegor256/eo
youtube.com/watch?v=l-Mn6H72hgI
open.gl/
learnopengl.com/
openglsuperbible.com/
transfer.sh/L5Z2B/anton_gl4.epub
twitter.com/SFWRedditVideos

first for coqs and CoCs

>manager asked me if I had any suggestions to make programming easier/better/more comfy
Should I talk to him about programming socks, or?

>working on a game with a team
>artist is a diaperdev
>brings up his "uh oh dirty nappy" every time we have a scheduled chat about progress and direction
>have to sit around for a half hour while his "daddy" changes him
>grunts and groans from his end on skype while me and music/sound bro try to pretend we're just looking at our browsers
this is not what I signed up for

Attached: The-Charlie-Brown-and-Snoopy-Show-charlie-brown-37878841-500-375.png (500x375, 1.01M)

>that pic
Kinda funny how women used to be fit for programming back when the scope of a program was tiny, the algorithms were supplied by male mathematicians and academics, and the work was mostly tedious and mechanical; a natural continuation of the tradition of women being hired to manually perform tedious calculations for scientists and mathematicians while the men did all the creative thinking. Now that the focus is on large-scale design and creative problem-solving, they've been pushed out of the profession.

Attached: f522fe9a454f0bb8d08bd917057265d5.jpg (225x225, 7K)

How would you improve C, in as few changes as possible? You don't have to worry about backward compatibility or existing idioms at all.

RAII and templates

>How would you improve C, in as few changes as possible? You don't have to worry about backward compatibility or existing idioms at all.
Compile-time code execution, compile-time type reflection, meta-programming.

stop oppressing me

Killing it.

Also, please provide a brief description of syntax or implementation.

I'd add a Code of Conduct.

Probably something like better generics, non-null-terminated strings, basic data structures in the standard, lambdas.

This can't be real

What would you use instead of null-terminated strings?
Storing a length field? But then you can't really represent strings as simply an array of chars, or can you?
Sorry, I'm a brainlet.

It's still an array of characters, you just need to fetch the size to determine how long the array is. Since C99 there has been a convenient syntax for this kind of structure.
struct nustring
{
    size_t size;
    char data[];  /* C99 flexible array member */
};
This is how it's done in Pascal, incidentally.

docwiki.embarcadero.com/RADStudio/Tokyo/en/String_Types_(Delphi)

Reposting from last thread — I implemented a tasking system and converted my texture loading to that, so now I can actually load textures/models concurrently and not on the main thread only. Each texture uses two tasks, one for loading/preprocessing that can execute on any thread, and another that uploads the data to the GPU that has to be single-threaded.
youtube.com/watch?v=Dp6cAi7lEOM

Attached: Screen Shot 2018-03-17 at 11.58.37 AM.png (2784x1708, 1.4M)

>represent strings as simply an array of chars
Why would you want to do that, other than maybe file or message formats that need to be compact? An array of bytes doesn't contain any information that it's supposed to be a human-readable string, it could be anything.

-betterC

>Also, please provide a brief description of syntax or implementation.
>Compile-time code execution
compiletime int foo() {
    // code
}

const x = foo();

If you extend Clang, you can employ the ability of LLVM to execute bytecode or dynamically compile C for this.

>compile-time type reflection
typedef struct {
    int x, y, z;
} Foo;

compiletime void tests() {
    type t = Foo.x;
    printf("%s", typename(t));

    for(int i = 0; i < fieldcount(Foo); ++i) {
        field *f = getfield(Foo, i);
        printf("%s %s", f->name, typename(f->type));
    }

    // etc.
}

Types are first class values at compile time. Basically an API around internal compiler structures to be used from metaprograms and compile-time functions.

>meta-programming
Compile-time functions that manipulate the AST, with quoting and splicing.
macro astnode *magic(astnode *body) {
    return quote {
        for(int i = 0; i < 100; ++i) {
            ${body}
        }
    };
}

magic {
    printf("fugg");
}

Thanks user.

Stronger typing so different enum types aren't compatible. I'd also like to see composite enums.
Less lax implicit conversions (compiler warnings and -Werror mostly deal with this, though).
Standard library needs a GC, map and other containers.

But really, I don't want to carry the C luggage. There's no sensible reason to. What you should do is reevaluate what's supposed to be UB or not, look into what people like about the syntax, and then write a good language with modern ideas. There are just so few options that actually improve on C, in a clean way, for the problems C is used to solve.
Nobody does the obvious compile-time evaluation, for instance. Why should systems programming languages have all these restrictions on compile-time evaluation? If a programmer wants to read stdin at compile time, it's their own fault if that's a bad idea. Most languages don't even let you do the obviously useful things like reading files.
Nobody gives you a nice type system like FP languages have.
Nobody lets you actually design the intent of memory layout into the language with good tools.
I can't pack/organize the fields of a struct in two different ways depending on context in any language I know. I'm not even allowed to tell the compiler to choose how it wants the struct laid out in memory.
Nobody even does SoA vs. AoS features. CPU vendors have been encouraging this for ages now.
In most cases there's no point in making that explicit beyond a type attribute.
It's fiddling we do with memory for no reason. What you want is almost always obvious, yet there's no way to express that obviousness. It would only require the bare minimum of struct introspection.

Templates and compile time code execution.

constexpr and templates make C++ not only cleaner, but faster than C in many cases.

I'm not a C++ programmer, but one of the greatest examples I saw in a CppCon talk on moving C programmers to C++ was how to make memcpy safe with a single line of template code.

in a pseudo-C, it would be something akin to:
template <size_t k>
void* memcpy (void destination[k], const void source[k]);


This then requires that the arrays be of equal length, ensuring there is no buffer overflow. Requiring fewer parameters also makes the call cheaper, at the cost of a larger executable. You can still keep the original function around if you want to move, say, only 5 bytes at a 3-byte offset into an array of size 20, but any other call would be COMPILE TIME PROVEN SAFE!

And this is just a simple example. There's a lot of ways of making C better with templates.

>Nobody does the fucking obvious compile time evaluation for instance. Why should systems programming languages have all these restrictions on compile time evaluation? If a programmer wants to read stdin at compile time it's their own fault if that's a bad idea. Most languages don't even let you do the obviously useful things like reading files.
>Nobody even does SOA vs AOS features. CPU vendors have been encouraging this for ages now.
Check out Jai.

>Now that the focus is on large-scale design and creative problem-solving, they've been pushed out of the profession.
I mean, the demographic cliff happened in the 80s in a bunch of different professions at the same time, and almost exclusively in the US, so if you take the blinders off and realize there's other kinds of work besides sitting in a chair all day and touching a computer, you'd immediately conclude there's something different going on
But also you're implying that women are wild animals incapable of any kind of foresight or abstract thought and that's 100% correct, unassailable scientific fact

ziglang.org/

Attached: zig-logo.png (386x91, 4K)

>using a DeaD language

Last I checked D was unnecessarily attached to the GC in many ways.

I would really like to see them allow you to pass in your own allocator to all the standard stuff.
Ideally at multiple granularity. Like setting the allocator for a specific scope or global scope.

>haha nerd sitting on the computer all day
>muh only in america
>i bet you just think all women are literally animals
All those solid counter-arguments... My position is in shambles.

Why did they fuck up the syntax so much? Wasn't it more like C/D before?

Classless eo-style object-orientation:
type Foo
{
    int Sum();
}

// this creates an object
Foo bar = create Foo(17, 19)
{
    int x, y;
    Foo(int a, int b) : x(a), y(b) {}

    int Sum()
    {
        return x + y;
    }
}

// creating another object
Foo baz = copy bar(23, 29);

I assume it fills those two points? I don't think it's enough.
But I'm happy to see there's at least some people acknowledging this.

>object-orientation
>object methods are incapable of replacing self with an object that responds to different methods
You're ten years too early to defeat me

Lambdas
A feature where anonymous members are valid outside of composite types and can be non composite types themselves, thus giving us tuples for free (e.g. struct {int; char;})
Type inference
Optional first class types and type variables, which, if used, work by generating an enum metatype and a static lookup table at compile time so that each type used as a value can be used to lookup its size and memory layout on demand, allowing things such as for example:
static type const a = int;
a *alloc_a(void) {
    return malloc(sizeof((a){}));
}
Generics using first class types

You're thinking of Smalltalk-style message passing, which is pretty cool, but how do you reconcile that with C's syntax? It's gonna be horrible, just look at Objective-C.

>how to improve C?
>MUH POO
>MUH LAMB DUHS
The ultimate state of mindless paradigm shills.

Rude.
But yes, that was programming back then. Now it's more about systems design.
I don't think your assessment is accurate if you attribute the lack of women working in the field directly to the change in task. It's definitely more about the change in attitude towards the profession: before, it was a service job; now it's a job with control. Regardless of women's actual ability (speaking broadly), we all know how this story goes.

Why F# and Ocaml

Attached: Capturar.png (911x535, 19K)

Or with type inference:
static let const a = int;
let *alloc_a(void) {
    return malloc(sizeof((a){}));
}

Not sure I follow this.
I'm not that well versed in OOP but is this just inheriting methods? What is the point?

How would this be efficiently implemented?

Polymorphism and encapsulation.
github.com/yegor256/eo

You could do better, you could use type inference and also generics.
static let const a = int;
let *alloc_a(void) {
    return talloc(a);
}
// where talloc(type t) => t *

>You can always have the original function available if you want to move, say only 5 bytes, 3 bytes offset into an array of size 20 or something like that
You can do that with the template version though
>but any other calls would be COMPILE TIME PROVEN SAFE!
… but what you can't do is specify at runtime how long the buffers are; most of the time you won't know this at compile-time.

>static type const
>static let const
WTF

Where's the Pharo-learning user?

Attached: 1517523126799.png (1628x1020, 338K)

All these lists are just scrambled.
>why these two languages
Probably because they're used more in the west. Just a guess. That's how these lists go.

>finally get the hang of CMake
>Next step: Convincing ownself that doxygen generated docs are meant to be read, especially in the current millennium.
How am I going, Sup Forumsirls?

Attached: 1517919983256.png (205x269, 29K)

is 22 too old to lrn2program?

>I don't think you're accurate in your assessment if you attribute the lack of women working due to the change in task directly.
So you think the simple lack of skills and predisposition for what modern day programming entails has nothing to do with the scarcity of female programmers, and would rather attribute it to some vague forces and untestable mechanisms that you can't coherently explain, because it makes you feel better. Okay, then.

Or, better yet:
static let const a = int;
let alloc_a = curry(talloc, a);
// assume we also have overloading
// as well as more traditional style generics
// the type of talloc is t *(*)(type t)
// so the type of curry(talloc, a) should be:
// a *(*)(void)

>the sheer level of autism required to think that this is a good idea that will improve C

>vague
We've seen it all over the place. Male chefs. Something women have been expected to do for centuries and any status position within the craft is taken by women. Oh no wait a second, men.
Not appropriate discussion for a programming thread though.

>
why is it not though

>We've seen it all over the place. Male chefs. Something women have been expected to do for centuries
Exactly. We've seen it all over the place: if it's mediocre cookie-cutter work, women can do it just as well (and are often expected to do it); anything that requires an exceptionally high degree of skills and creativity is dominated by men.

C++

Because that's the point of using C.
I always thought people liked it because of how barebones it is.
You rarely see a byte or int8 type; it's almost always char in these contexts.

And the kicker is that now that programming is rapidly degenerating into cookie-cutter third party library plumbing work, tech companies are shilling for diversity at full force, because they know they can tap a greater supply of cheap "programmers" while looking so progressive.

it's literally called a char, for character, as in an element of a string

kys
The allocator library has been getting some love lately, check that out. They're also making progress on decoupling Phobos from the GC

Not her, but even granting the supposition that it's due to an innate difference in skill level or intellectual capacity (a matter on which I choose not to opine one way or the other), if you allow such a statistic to color your automatic judgment of arbitrary individual women in programming, then you are committing a fallacy of "usually this, therefore always this." Worse yet: if most men are suited to some task X, and fewer women, but still most women, are also suited to task X, and you therefore automatically assume that any given woman employed in task X is incompetent, then not only is there a good chance you'll be wrong, you'll actually *usually* be wrong, even if less often than if you assumed the same of men.
Basically, reserve statistics for groups of people and don't use them to assess individuals. Use the qualities of individuals to assess individuals.

Doing a livestream of C++/Common lisp programming on my Content-Delivery-network WIP.

youtube.com/watch?v=l-Mn6H72hgI

Attached: tinycdn_6.png (1280x720, 1.42M)

>being this much of a fipfag
pic related

Attached: int.png (800x3852, 2.38M)

>you are committing a fallacy of "usually this, therefore always this."
Statistics don't lie, user. If a rule is true 51% of the time, there's no risk in acting like it's true 100% of the time.

>even granting the supposition that it's due to an innate difference in skill level or intellectual capacity
Which you'd be forced to do, because it's been demonstrated countless times.

>you are committing a fallacy of "usually this, therefore always this
No one was discussing any particular woman, implying "always this", or committing any such fallacy.

>reserve statistics for groups of people and don't use them to assess individuals
Why should I? If I have more candidates than I can screen, it's perfectly rational to filter them based on broad statistics to increase the likelihood of screening a suitable candidate.

no, it's too young

>when losing the argument, she has to resort to blatant schizophrenia-tier strawmanning
Go back.

>Statistics don't lie, user. If a rule is true 51% of the time, there's no risk in acting like it's true 100% of the time.
Wrong. If a rule is true 51% of the time, there's a 49% risk in acting like it's true 100% of the time.
If you prefer a 0% risk, you should just take the time to learn what the specific person in question is like and what he or she is capable of, and then you can be 90-100% certain of whatever assumptions you'd like to make about them after the fact.

Homotopy type theory brain:
data Nat : Type where
  zero : Nat
  succ : Nat -> Nat

data Int : Type where
  pos : Nat -> Int
  neg : Nat -> Int
  unsigned : pos zero = neg zero

Why are you responding to your own samefagging strawman posts?

>If I have more candidates than I can screen,
>candidates
Well, yeah, if you're talking about candidates then that's different, especially if you have more than you can screen. However, I do submit that having more candidates than you can screen is a sad comment on the current state of capitalism. This, however, would of course not be your fault as an employer.
But I'm not talking about selecting candidates. I'm talking about discussing specific people in a colloquial context. For example, bitching on /dpt/ about assumed diversity hires.

Can anybody direct me to an OpenGL tutorial that:
A) Doesn't start off by writing 600 different fucking sepples aids class extensions, basic rendering is NOT that complicated
B) Actually binds the GPU buffers, and doesn't redo every single draw CPU-side over and over with glBegin/glEnd?

I'm not samefagging, but of course that doesn't mean the person you're accusing of strawmanning wasn't strawmanning. It may well have been the user you were debating with before I stepped in, strawmanning on my behalf.

Attached: Screenshot from 2018-03-17 09-58-54.png (2160x1440, 396K)

open.gl/
Read the first paragraph

open.gl/
learnopengl.com/
arcsynthesis

>joining an argument you weren't originally part of
That's essentially the same thing as strawmanning though
By house rules, you lose this round, and the game

I've tried that one, his shit straight up doesn't render on half the systems we have here.

>>joining an argument you weren't originally part of
>That's essentially the same thing as strawmanning though
It's not the same thing at all.
Strawmanning is misrepresenting your opponent's position -- by way of false flagging, for example, in the case here alleged -- in such a way as to render it easier to refute.
Joining an argument you weren't originally a part of does not in any way imply any such misrepresentation.
Furthermore, there are no "house rules" to speak of, except for these: Sup Forums.org/rules
And, finally, see:

Do they support OpenGL 3.2+? Because that's what the tutorial is for. Before 3 the de-facto way of writing OpenGL was in immediate mode, everything else was a mess of semi-standardized extensions.

>using computers that don't work with modern GL
Appleserf detected

>if you're talking about candidates then that's different, especially if you have more than you can screen.
We're in agreement, then.

>I do submit that having more candidates than you can screen is a sad comment on the current state of capitalism.
Go back. Now.

>I'm talking about discussing specific people in a colloquial context.
In a company with "diversity hiring" practices, you no longer get to claim that the women are probably as competent as the men because they've undergone the same selection process with the same criteria. In such an environment, unless I somehow knew better (or there was no risk in finding out), I would still prefer men on my team, unless I thought the standards for male hires are also uselessly low.

Looks like just 3.0
Which is retarded because these machines are barely 2 years old
What the fuck intel

Not even close

Apple computers since 2010 have supported 4.1

I learned a lot of stuff off of openglsuperbible book. It's pretty good openglsuperbible.com/

how do I acquire a hotmail password

The reason Macs fell behind for so long was because Apple insisted on writing their own OpenGL bindings. While in my opinion this was kind of dumb they did it for a reason — they include all the supported extensions in the header files and have/had a fallback custom software renderer with OpenGL version parity. So with a few exceptions if your OpenGL code compiles on OS X it'll at least run. They're stuck on 4.1 now which is far better than the situation 10 years ago, but I think they're moving to Metal/Vulkan instead of spending resources on an incremental OpenGL version increase. Not what I do but I'm not in charge.

the best
transfer.sh/L5Z2B/anton_gl4.epub

>I think they're moving to Metal/Vulkan
MoltenVK came out recently, it's an open source Vulkan implementation on top of Metal. But I don't think Apple ever considered supporting Vulkan directly - not really that surprising given their tendencies to avoid standards and make up their own things.

I had heard recently that they were planning on supporting it, which no one expected them to because of Metal. I'll look into it again.

>Go back. Now.
No, fuck you, C is better.
>In a company with "diversity hiring" practices, you no longer get to claim that the women are probably as competent as the men because they've undergone the same selection process with the same criteria. In such an environment, unless I somehow knew better (or there was no risk in finding out), I would still prefer men on my team, unless I thought the standards for male hires are also uselessly low.
>unless I somehow knew better (or there was no risk in finding out)
So long as you wouldn't still prefer men on your team simply out of spite even if you DID know better, we're in agreement on this point. But regardless, this point is irrelevant, since, again, I'm talking about discussing specific people in a colloquial context, e.g. assuming that specific women, specific black people, specific mexicans etc. are diversity hires simply on the basis of the fact that they are women, black people, mexicans, etc., and then bitching on /dpt/ about them. It would be a great deal more okay if you were bitching about A) a known-for-a-fact diversity hire or B) a picture of a statistic instead of a picture of a specific person.

Ah okay, it's Khronos that's doing it, interesting.

>So long as you wouldn't still prefer men on your team simply out of spite even if you DID know better
What I'm getting at here, Rustfriend, is that there are perfectly good reasons to engage in what you would surely decry as "prejudice" and attribute to the "dey just hate all wumin" strawman your sort usually clings to.

>I'm talking about discussing specific people in a colloquial context, e.g. assuming that specific women, specific black people, specific mexicans etc. are diversity hires simply on the basis of the fact that they are women, black people, mexicans, etc
Show me who's doing it.
>inb4 Klossy is a competent dev
>inb4 Briana Wu is a woman

Well obviously you should. He needs to know someone better could be doing your job

>What I'm getting at here, Rustfriend,
No, fuck you, C is better.
>is that there are perfectly good reasons to engage in what you would surely decry as "prejudice" and attribute to the "dey just hate all wumin" strawman your sort usually clings to.
No there aren't. There are no perfectly good reasons to engage in:
>>assuming that specific women, specific black people, specific mexicans etc. are diversity hires simply on the basis of the fact that they are women, black people, mexicans, etc., and then bitching on /dpt/ about them.

>Show me who's doing it.
>>that pic
>Kinda funny how women

>>inb4 Klossy is a competent dev
Hoo is Glossy?
>>inb4 Briana Wu is a woman
Hoo the fuck is Banana Woo?

Nice programming discussion.

Use slices (structures with a pointer, length, and capacity) anywhere arrays are used now. Use full words for function names instead of the idiotic removal of vowels.

Make type declaration syntax postfix to remove the stupid "spiral rule" for reading them. Make the type system stronger so there are fewer implicit casts. Make division return floating point always. All primitive types should specify a size and just use s or u instead of having the unsigned keyword. Make variables constant by default.

Assignment and increment should not be expressions. Comma operator should not be a thing. All blocks should require braces.

Force people to space between operators so we have a richer character set for identifiers and code is easier to read.

Introduce a real module system and don't have a global function namespace.

Tip: Nobody's going to read this shit. We already know it's you, because you're the ONLY person in these threads who actually gives a shit about having politics in technology. The rest of us are only concerned with the code. Sometimes we talk about the fact that women and black people write awful code, but that's not a political statement. It's just a true observation about reality. The topic of conversation is still the code itself.
By the way, this is the whole reason your pet fad language can't be taken seriously. It doesn't stand for itself on any kind of technical merits. The only reason it's being shilled is to push politics in the software industry. Again, we're not interested in politics. This thread is for talking about the code.

Attached: pills.jpg (626x456, 59K)

>There are no perfectly good reasons to engage in [list of things I never implied]
That's some weak stuff, Rustfriend. But go ahead and tell me that your ideological butt-buddies wouldn't explode in anger over the things I actually did say, which don't fall under your list.

>>that pic
>>Kinda funny how women
How does that constitute "assuming that a specific woman is blah on the basis of the fact that she's a blah"?

Also:
>your sort usually clings to.
This has no place in a debate of any kind. You should never address assumed positions of your opponent unless your opponent actually proves to profess them, whether by word or by action.