/dpt/ - Daily Programming Thread

old thread: What are you working on, Sup Forums?

Attached: 1517844476612.png (500x700, 85K)

Other urls found in this thread:

nim-lang.org/0.15.0/backends.html
nim-lang.org/0.18.0/backends.html
twitter.com/AnonBabble

reimplementing C++ in Shen

nibba

Are enums just for named constants?

In C and most derived languages, yes.

Thanks for not using an anime image.

Daily Blog Update:

I'm finally done with chapter 2 of my book. I can finally switch APIs so my simulation doesn't take half a second to render itself.

Thank you for your attention.

Dog bless frogposters.

Is there a better way to test if you have free memory available?

Attached: 1454889130452.png (1073x669, 120K)

what your book mane?

It's actually an MS tutorial from their site.

thunks very much

Is the resident Nim fag here?
How feasible would it be to use Nim for extending a CMake/C++ project? I'm under the impression that Nim does not like C++ very much. But I prefer C++ to C myself.
Thoughts?

building a physics engine in [spoiler] js [/spoiler]

I have very beginner-level C knowledge. I have K&R, should I do every exercise or read it skipping exercises and do my own stuff?

hey reddit

Do the exercises unless it's not going to teach you anything

The code doesn't make any sense

Do it regardless. If you don't code what you've learned you forget things.

>Is the resident Nim fag here?
There are at least two of them, unless I was talking to myself in the previous thread.

>How feasible would it be to use Nim for extending a CMake/C++ project?
You mean calling compiled Nim code from C++? Pretty easy. It compiles down to C anyway, and you just annotate your stuff with an "exportc" pragma to prevent name mangling.

>I'm under the impression that Nim does not like C++ very much.
Nothing likes C++ very much, but both using C++ from Nim and Nim from C++ is supported, and the latter is particularly easy.

Reinventing the strcat function in c for an assignment.

>latter is particularly easy.
nim-lang.org/0.15.0/backends.html
I just followed the C-Nim FFI example and tried to use it with C++. Didn't seem to work.
/tmp/cciNky0d.o: In function `nimUnloadLibrary(void*)':
stdlib_system.cpp:(.text+0x12ee1): undefined reference to `dlclose'
/tmp/cciNky0d.o: In function `nimLoadLibrary(NimStringDesc*)':
stdlib_system.cpp:(.text+0x12f0d): undefined reference to `dlopen'
/tmp/cciNky0d.o: In function `nimGetProcAddr(void*, char*)':
stdlib_system.cpp:(.text+0x12f42): undefined reference to `dlsym'
collect2: error: ld returned 1 exit status

Just tested the C example, it doesn't work either.

not nim-user, but you should probably post the actual code, lad.

Attached: Screenshot from 2018-03-15 19-27-48.png (3840x2160, 324K)

NVM I fixed it
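For the archive, since that exact error bites everyone: undefined references to dlopen/dlsym/dlclose almost always mean libdl isn't being linked. If you're invoking g++ on the Nim-generated sources yourself, append -ldl (the file names below are placeholders, not from this thread):

```sh
# placeholder file names; the important part is -ldl coming after the objects
g++ main.cpp stdlib_system.cpp -o main -ldl
```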

Depends on the language. Some languages do typechecking to make sure you can't give invalid values and others even allow methods for enums.

You're looking at some pretty old documentation.
nim-lang.org/0.18.0/backends.html

Only in shitty languages.

If I allow the end user to use and edit/modify the program and its source code to his/her needs, but deny them the right to redistribute unless the modification and its goal adhere to certain conditions (to stop useless modifications that suck cock), is the program free as in freedom?

Attached: craptcha.png (377x500, 423K)

In the real world: yes
In Stallman's terms: no

It depends which kind of """"freedom"""" you're aiming for.

Let's say I invented the GNU system and wanted to stop further useless GNU/distros that just differ by using another package manager and a shit init system.
Take Ubuntu, for example: if I tell them 'fuck off, non-free faggots that bundle spyware and backdoors', is this free as in freedom?

No, because you're impeding their freedom to make shit software.
But seriously, as long as you just stipulate they must share all the source, stallman probably won't care if you have the "quality" clause.

Though I doubt your quality clause will do much.

>in

oh pajeet my son

also your idea really isn't FOSS friendly in the first place, so better to just not bother.

>is the program free as in freedom?
What a bizarre question... it's freer than not being allowed to modify it at all, but less free than being able to distribute any modifications in any manner you wish. It should be trivial to see that there is a spectrum in this case.

Attached: pine.jpg (206x245, 7K)

essentially, yes. the details vary by language. some are strongly-typed/scoped, some are not, and some languages provide extended functionality (C# has an "enum flags" attribute which adds some helper methods for enums representing bit flags, for example). also worth noting that strongly-typed enums like "enum classes" in C++ can also be used to create low-level integral types which are distinct from primitive integral types, allowing for disambiguation at overload resolution and the potential definition of alternate semantics (through operator overloading and such). the C++ standard library does this with std::byte. its definition is

enum class byte : unsigned char { };


semantically, it's not always perfectly clear in a given scenario whether "unsigned char" means "unsigned 8(usually)-bit integer" or "byte". a type like std::byte eliminates this ambiguity and expresses intent. if you consistently use each where relevant, it's clear when you mean "unsigned 8-bit integer" in the numeric/arithmetic sense, and when you mean "byte" in the semantic sense (like for accessing object representations in memory as raw bytes). the standard library defines stricter semantics for std::byte than for unsigned char; it prevents potentially unintended implicit conversions (if you don't use brace-initialization to prevent narrowing conversions, any arithmetic type will implicitly convert to unsigned char, which is ridiculously lax), and it overloads bitwise operators for std::byte, but not arithmetic operators, since applying arithmetic operations to single bytes of multi-byte types generally doesn't make sense

Piss easy.
void strcat(char *dest, char *src)
{
    while (*dest)
        ++dest;
    while (*src)
        *(dest++) = *(src++);
}

He does care. Fucking JSON license is non-free because of the "The Software shall be used for Good, not Evil" clause.

>semantically, it's not always perfectly clear in a given scenario whether "unsigned char" means "unsigned 8(usually)-bit integer" or "byte".
I don't remember all the rules of C++ anymore, but that's a stupid usage for "enum classes". uint8_t (as opposed to simply char) already strongly implies "byte", but even then, you could simply `typedef uint8_t byte`.

Why are you impeding my freedoms to do evil things with your software?

I feel uncomfortable just looking at it, zero-terminated strings were truly a mistake.

>Why are you impeding my freedoms to do evil things with your software?
Not him, but I guess the obvious objection would be that you're not the judge of universal good and evil.

I did forget the null terminator on dest, now that you mention it.

You can make an argument that anything is.

I dunno. Strings are supposed to contain the escape character.

Some functions depend on it.

>zero-terminated strings were truly a mistake.
Zero-terminated strings were a brilliant invention. Fuck off back to your soyboy safety lang.

Pascal strings > C strings.

My point is that freedom should be allowed for everyone (even non-free fags) but not those who 'unfree' free software. Freedom should guarantee Freedom while prohibiting abuse of Freedom to do things that are the opposite of Freedom.
Heh.. I guess that is not freedom after all. What is it then?

Attached: gldt1104.png (2020x7474, 2.22M)

Don't make me check how they are implemented and how C does it better.

...

They're size-prefixed, not null-terminated. No more O(n) strlen retardation.

ZTS must be responsible for at least half of all the security vulnerabilities.

No. Programmers like you are responsible for all of them.

std::string/Rust's String >> Pascal string > C strings
Pascal strings lack the capacity field, so you have to reallocate them all the time.

How viable would it be for me to find and pay Terry Davis 200 dollars per day to teach me C? Is he outdated? Is he actually a hack? Do any of you believe you would be a better teacher than him? My offer is legit, I'm sick of these shitty normie tutors charging me 40 dollars an hour and teaching me fuck all.

use a language with some sort of exceptions system. This code is basically that for malloc

>No more O(n) strlen retardation.
Memes aside, O(n) strlen retardation is not a consequence of zero-terminated strings. It's just a consequence of you being too retarded to figure out the proper way to handle strings in your application.

>find and pay Terry Davis 200 dollars per day to teach me C
this is your brain on memetism

>proper way to handle array pointers in your application.
ftfy

pretty sure you could've just put the whole line of the malloc statement into a condition check rather than the whole method with all that other crap

>just reinvent proper meta/data grouping, this is the most appropriate way to develop non-critical applications!
turing tarpit poster pls go

>she relies on """teachers""" to """teach""" her C
Learning C is a solitary experience. Read a book, nigger. You should be able to figure out most of it on your own, and when you get truly stuck, a good teacher will throw you a small hint and let you figure it out from there before going back to his Unix kernel hacking.

Storing the capacity in your custom type is wasteful when your allocator already tracks that information. malloc_usable_size should be standard.

So it's just a string with an attached integer. I doubt your program magically knows how long it is unless you preprocess it.

>using exceptions
>ever

>having access to a guide with proper knowledge about the learning process and experience in the target domain has absolutely no benefits
You actually believe that?

>every piece of memory out there is malloc-ated

I guess the "logic" here is that a linked list can only ever be "full" when you literally run out of memory...? If you're going to have such a stupid function in the first place, at least do it right:
pt = malloc(sizeof(Node));
free(pt);
return pt;

>malloc_usable_size
Ooh, learned something new/useful today. Thanks!
>your allocator [sic] already tracks that
This is so obvious in retrospect, but I never thought to search for a reporting interface for that data.

You gotta be somewhat independent if you work at the lower level. I ask questions when I have problems.

>she completely fails at reading comprehension
No wonder you can't read a book.

If you're using a std::string, the allocator used is a template parameter. It should be possible to statically resolve whether your allocator exposes the capacity of a chunk with a function like that.

When you're at the stage that you're using the acquired knowledge, sure.
But learning the main body of knowledge e.g. 5x slower and 30% wrong, while missing 20%+ of the gotchas, just due to some kind of misplaced machismo, is frankly retarded.
As in most things, there's a time for DIY, and there's a time for making use of external resources.

>she
This guy...

>learning the main body of knowledge e.g. 5x slower and 30% wrong, while missing 20%+ of the gotchas, just due to some kind of misplaced machismo, is frankly retarded.
I guess this is what you get with public "education". ^

Attached: 552334255.png (251x201, 7K)

Posts like
and
remind me that I'm retarded if I think dpt's good for anything but shitposting.

That's actually quite clever.

dumb wojakposter

uint8_t is already just an alias of unsigned char. setting aside that its name certainly implies "unsigned 8-bit integer" much more so than "byte", uint8_t really only implies "byte" in practice because it essentially had to be used in byte scenarios because it was formerly the only real option for accessing object representations (a mistake which std::byte was devised specifically to fix); technically char is also an option, but that's even worse because char implies "character" and, worse yet, is technically distinct from both signed char and unsigned char. and with your solution, byte, uint8_t, and unsigned char would all be the same type. when they're not distinct, you can't do things like defining alternate semantics through operator overloading (or the intentional absence of invalid/nonsensical operators), or writing overloaded functions which behave differently for 8-bit integers and raw bytes (think about things like type-safe printing/formatting strings/output, de/serialization, etc)

Except the shitposter is you. Other people are just trying to explain to you that you're in a field where picking things up on your own and relying minimally on external help is the standard, and for a good reason, because this is what you're going to be doing your entire career.

Theoretical question:

Is it a good idea to write a daemon that automatically compiles my program every time I make changes to the source files? I want it to monitor my project directories and I'm mostly working on small code bases anyway.

Make ctrl+s do f5 instead of making a fucking fs watcher.

Or literally just hit compile which does that too.

only if you make it auto-run/quit as well.
The only decent thing web-dev has is a hot-loading obsession.

>uint8_t really only implies "byte"
er, i meant unsigned char here, not uint8_t. also, technically, uint8_t isn't even guaranteed to exist; on a platform with non-8-bit bytes, it could simply not be defined, or if it somehow were, it would still be an 8-bit type (in which case aliasing "byte" as uint8_t would result in "byte" having the wrong size for a byte, so you'd wanna go with unsigned char instead at the very least)

If you're working with cowboy code monkeys, sure. Witnessed more than enough to see the most egregious failure modes of "imma do it myself".
There's a balance to both modes of operation. The balance changes between different fields, projects, skill levels, etc.
But if you think that the average well-skilled developer wouldn't be disadvantaged by leaving solution approaches off the table, I've got news for you.

>uint8_t is already just an alias of unsigned char
That may or may not be the case depending on the platform, but it certainly doesn't stop you from making `byte` an alias of `uint8_t`.

>its name certainly implies "unsigned 8-bit integer" much more so than "byte"
You can't even write coherently.

>uint8_t really only implies "byte" in practice because...
Let me stop you right there: it implies "byte" in practice, so case closed. Pretty much the only case where you care about "bytes" is when you're working with raw binary data, in which case `uint8_t` is much more specific and useful than "byte". One way or another, the point is that your usage of enum classes is fucking retarded, because even if it did add useful information over `uint8_t` instead of redacting it, it would still be more sensible to just use a typedef.

You mean watch mode? That's done in plenty of the build tools for various languages.
If you just need to get shit done, see if there isn't a pre-existing solution for your tech stack that's good enough.
If you want to learn how to do it, go for it. Don't forget inotify.

>Witnessed more than enough to see the most egregious failure modes of "imma do it myself"
You've witnessed nothing in your life so far, because you're a LARPing kid. Now go drink your milk. If you're even remotely suitable for work as a programmer, you should be able to learn C on your own without a problem, and then move on to do more reading on software design and read through some small-ish open-source code bases, and try to participate and interact with other developers to gain some actual experience.

...

see my recent post

>it implies "byte" in practice, so case closed
as i said, its use in practice emerged from an admitted mistake

>Pretty much the only case where you care about "bytes" is when you're working with raw binary data, in which case `uint8_t` is much more specific and useful than "byte"
this is completely nonsensical. uint8_t is not remotely more specific than byte. it's technically not even related to bytes. if it's defined at all, it's an unsigned 8-bit integer, regardless of what the size of a byte is on the platform. and even on an 8-bit platform, it's still not as specific, because uint8_t has integral/arithmetic semantics (a very broad/generic set of semantics) and literally any arithmetic type is implicitly convertible to it. with std::byte you can have a function that takes *only* a byte (or can tell the difference between a byte and an integer). how is that "less specific" than a function that will accept essentially any integral type, floating point type, or bool?

>the point is that your usage of enum classes if fucking retarded
this is not "my" usage of enum classes. it's in the standard library (for a reason)

>it would still be more sensible to just use a typedef
you don't seem to understand that the primary benefit here is the types being *distinct*

Your weak attempt at posturing doesn't deserve a response.
Onto the actual content:
>If you're even remotely suitable for work as a programmer, you should be able to learn C on your own without a problem, and then move on to do more reading on software design and read through some small-ish open-source code bases, and try to participate and interact with other developers to gain some actual experience.
No argument there.
However, there is a very strong argument to be made towards working smarter, not harder.
Make use of whatever resources you need to acquire a *well-rounded* skillset, and do that relatively efficiently.

Attached: Screenshot from 2018-03-15 21-10-03.png (2104x1784, 187K)

>median

oblig. shitpost
>> (Dead)
Don't tell me you're actually typing out the post number digits.
No wonder someone who recommends inefficiency and refusing experienced advice would also post in the most inefficient, error-prone way possible.
Stuff like this doesn't strengthen your arguments, you know :^)

>C

Attached: 3EcvwuIxz_Jt8SMt-Sn-T7VwvFNALry3erLWSkb5mPE.png (320x625, 104K)

What's up with Clojure?

it's an ideal java alternative for companies willing to take the risk.

>a very strong argument to be made towards working smarter, not harder.
I'm sorry, but you're a fucking retard. I don't know why it's so difficult for you to understand that in this case, working harder also happens to be the smarter thing to do, because it's not about learning C at all; it's about developing the skill and habit of learning and solving problems independently, with only minimal external help, because out in the real world:
1. If you're not constantly learning new things and solving new problems, your programming career has reached a dead end
2. There are no teachers. Only busy programmers who may be willing to throw you a bone and point you in the right direction
As a victim of public "education", you are used to being spoon-fed, to receiving immediate answers to your idiotic questions, and to general learned helplessness, and I hate to tell you this, but it makes you unemployable. Just don't mislead other beginners with your milk-sucking bullshit.

>deleting your own post
>samefagging and lying to discredit the opponent when you can't address the argument
Pretty much what I would expect from a LARPing kiddie willing to pay $40 an hour for someone to teach him C of all things.