DPT /dpt/

Blue board edition.

Previous thread:

Other urls found in this thread:

github.com/maxpert/raspchat/releases/tag/v1.0.0-alpha
greenlab.di.uminho.pt/wp-content/uploads/2017/09/paperSLE.pdf
doc.rust-lang.org/std/primitive.char.html
doc.rust-lang.org/std/primitive.u8.html
strawpoll.me/14467509
twitter.com/AnonBabble

I want to punch this hot anime mommy in the jaw

I want this hot anime mommy to punch me in the jaw

I'm back to D, Sup Forumsoys.
static struct S
{
    int i;
    this(int i) { this.i = i; }
}

Unique!S produce()
{
    // Construct a unique instance of S on the heap
    Unique!S ut = new S(5);
    // Implicit transfer of ownership
    return ut;
}

// Borrow a unique resource by ref
void increment(ref Unique!S ur)
{
    ur.i++;
}

void consume(Unique!S u2)
{
    writeln(u2.i); // 6
    // Resource automatically deleted here
}

Unique!S u1;
assert(u1.isEmpty);
u1 = produce();
increment(u1);
writeln(u1.i); // 6
//consume(u1); // Error: u1 is not copyable
// Transfer ownership of the resource
consume(u1.release);
assert(u1.isEmpty);

Daily reminder that even Go shills realize that their language somehow manages to be even worse than node.js for its intended application, servers.

github.com/maxpert/raspchat/releases/tag/v1.0.0-alpha

Huh, the non-blue version of this thread was a lot more chatty somehow.

how do you do syntax highlighting on here?

Read the sticky, newfag.

>Construct a unique instance of S on the heap
don't do that

>Ditching Go for Node.js
the absolute state

It has the better image, after all.

What kind of algorithm is this?

>people see you as an aloof uppity NY cunt
"Wut if I eat a beef on stage xD"
No wonder she lost.

nodejs/npm is an unironically amazing thing

it's fake lad

>dynamically checked manual borrowing
...

Daily reminder that Rust is more energy efficient than C++.

...

how is this calculated

>rubi shills actually believe that

greenlab.di.uminho.pt/wp-content/uploads/2017/09/paperSLE.pdf

I want to try out Rust but I just don't like the assumptions it makes about memory.

Maybe I should get over my hate for unsafe blocks

How do you open a new window from within a callback function? I'm using the FLTK toolkit from C++ for this.
I already have my classes and callback set up. I've managed to close one window already using hide(), but I don't know what to do to open another window; show() doesn't work for me.
struct window_one : Graph_lib::Window
// code is typed here
struct window_two : Graph_lib::Window
// code is typed here

void window_one::cb_next_window(Address, Address pw)
{
    reference_to(pw).next_window();
    reference_to(pw).show();
}

void splash_screen::next_window()
{
    hide();
}

#include "GUI.h"

int main()
try
{
    window_one win(...);
    return gui_main(); // an “infinite loop”
    window_two win2(...);
    return gui_main();
}


catch (exception& e) {
cerr

In C#, how do I count the number of iterations in a foreach loop? For example, I can get it to print 12345, but I only want it to print 5.

Just use a for loop.

int iterator = 1;
//++iterator in the foreach

Here solution Sir...please give me the thanks for the stackoverflow point

Figured it would come to that. How would I translate the following code to a for loop, then? I'm just trying to count the number of words in a string by the number of spaces.
int wordcount = 0;

Console.Write("Enter a sentence: ");
string sentence = Console.ReadLine();
string[] words = sentence.Split(' ');

foreach (string word in words)
{
    wordcount++;
    Console.Write(wordcount);
}
Console.ReadLine();

wordcount is literally the number of iterations in this loop you fucking retard.

the absolute state of Go. Even fucking JavaScript is better

Well if you just want to print 5, do you mean the fifth array index?

print wordcount after the loop

God dammit that actually worked

In C#, arrays also have a Length property:
Console.Write(words.Length);

Tried that earlier and it didn't work, but I'm relatively stupid as fuck right now. Anyway, thanks

wtf why don't programmers write everything in C if this is true

is it because you're all brainlets or what?

Wait no, I tried wordcount.Length, not words.Length

You don't have to be intelligent to write decent software in C, just autistic.

>posting rigged benchmarks

garbage collection ruins everything

not really

delete foo; //bar * foo = new bar();

does this call the bar destructor

Yes.

>any benchmarks I don't agree with are rigged

A functional language based on sequent calculus and linear type theory rather than lambda calculus and intuitionistic type theory.

an ai based programming language

i vocally dictate what i want my program to do to a general ai and that ai then writes the theoretically best machine code imaginable instantaneously

Yes, amazing as in "how the fuck does my toy server already have 600 dependencies" and "oh joy someone found another DoS vulnerability in express"

(*struct).memb;
or
struct->memb;

You have to be intelligent, autistic, paranoid and, most importantly, dumb enough to waste your time with C when you could be more productive and happier writing something else.

>intelligent, autistic, paranoid
Nothing wrong with any of this.

>tfw no language will EVER be faster than C

Second.

Terry is schizophrenic, not autistic.

Assembly is a language.

struct->memb;
C is retarded for not using postfix dereference operators like Pascal.

C++ is though.

Oh wow, it's exactly like the std::unique_ptr that C++ has had for 6 years now, only without move semantics, so you have to manually call release.

>qsort vs std::sort

>Can't even inline predicates

>Doesn't even have generics, uses void pointers instead
>Fast
Pick one and only one
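
For anyone who hasn't looked at it, this is roughly what the qsort side looks like; a minimal sketch of my own, not taken from any benchmark. qsort only gets an element size and a comparator through void pointers, so every comparison is an indirect call the compiler generally can't inline, whereas std::sort gets the comparator as a template parameter.

#include <stdio.h>
#include <stdlib.h>

/* qsort knows nothing about the element type: it gets a size in bytes
 * and a comparator taking const void*, so each comparison goes through
 * a function pointer. */
static int cmp_int(const void *a, const void *b)
{
    int x = *(const int *)a;
    int y = *(const int *)b;
    return (x > y) - (x < y);
}

int main(void)
{
    int v[] = { 3, 1, 4, 1, 5, 9, 2, 6 };
    size_t n = sizeof v / sizeof v[0];

    qsort(v, n, sizeof v[0], cmp_int);

    for (size_t i = 0; i < n; i++)
        printf("%d ", v[i]);
    putchar('\n'); /* prints: 1 1 2 3 4 5 6 9 */
}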

#include <stdio.h>

static void foo_int(int x) {
    puts("foo_int");
}

static void foo_char(char x) {
    puts("foo_char");
}

#define foo(x) _Generic((x), int: foo_int, char: foo_char)(x)

int main() {
    foo('x');
}

$ ./a.out
foo_int


Friendly reminder that C's type system is complete and utter trash and _Generic is completely useless because of it.
It's also misnamed: it should be called _Overload, because it isn't even real generics.

Why does it call `foo_int` if there's the `char` case? I guess it can implicitly cast char to int, but shouldn't it do it after considering all the cases? What happens if you put the char case first? So many questions.

Character literals are of type int in C.
That's why it selects the int overload: you passed in an int.
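
Easy to check for yourself; a minimal C11 sketch (the exact sizes printed depend on the platform):

#include <stdio.h>

int main(void)
{
    /* In C, a character literal has type int, not char. */
    printf("sizeof 'x'   = %zu\n", sizeof 'x');   /* same as sizeof(int), typically 4 */
    printf("sizeof(char) = %zu\n", sizeof(char)); /* always 1 */
    printf("sizeof(int)  = %zu\n", sizeof(int));

    /* _Generic agrees about the type of the literal itself. */
    puts(_Generic('x', int: "'x' is an int", char: "'x' is a char", default: "?"));
}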

Now that's fucked up. Can't wait to see how Ctards are going to defend it.

If you don't think C's type system is trash you're basically a moron.

What happens if you rewrite main to say this?

int main() {
    char x = 'x';
    foo(x);
}

M-muh multi-character literals.

Also, C is portable.

Then the int gets implicitly converted to char by the initialization, you pass a char to the _Generic macro, and it selects the char overload.

>C is portable.
Has absolutely nothing to do with character literals being of type int.

>Also, C is portable.
Then "char" should be the character type of the platform while there's another type for the smallest addressable unit.

>I guess it can implicitly cast char to int
No. Character literals were always ints to begin with.
What's happening there is completely expected behaviour.

So not only does C have a totally retarded situation where char, unsigned char and signed char are three different types, it doesn't even use char for character literals. How retarded do you have to be to end up with a situation like this?

This.
The char abuse in C and C++ is fucking retarded.
It makes absolutely no sense.
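
For anyone who doesn't believe it, plain char really is a third type, distinct from both signed char and unsigned char, even though it must have the same representation as one of them. A minimal C11 sketch:

#include <stdio.h>

#define kind(x) _Generic((x),       \
    char:          "char",          \
    signed char:   "signed char",   \
    unsigned char: "unsigned char", \
    int:           "int",           \
    default:       "something else")

int main(void)
{
    char c = 'x';
    signed char sc = 'x';
    unsigned char uc = 'x';

    puts(kind(c));   /* char */
    puts(kind(sc));  /* signed char */
    puts(kind(uc));  /* unsigned char */
    puts(kind('x')); /* int -- the literal isn't any of the char types */
}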

I told you, because of multi-character literals. Just another thing to remind Cniles of when they jerk off about how portable it is.
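
They do exist, for what it's worth: a multi-character literal is valid C, has type int, and its value is implementation-defined (the packed-bytes value in the comment is just what gcc/clang happen to produce).

#include <stdio.h>

int main(void)
{
    int v = 'ab';                  /* implementation-defined value, most compilers warn */
    printf("%d 0x%x\n", v, v);     /* e.g. 24930 0x6162 with gcc/clang */
    printf("%zu\n", sizeof 'ab');  /* sizeof(int), like any other character literal */
}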

C++ fixes this actually.

shen ?

>I told you, because of multi-character literals.
Just make multi-char literals be of a larger type then, it isn't fucking hard.
Single-char literals should always be of type char.

No it doesn't.
If you're talking about std::byte, it's completely useless because it only has bitwise operations defined on it.

No, there's no cast: a char literal is treated as a hardcoded number, and the compiler considers it an int from the start.

Daily reminder that this is what the char type in a proper systems language should look like: doc.rust-lang.org/std/primitive.char.html
>The char type represents a single character. More specifically, since 'character' isn't a well-defined concept in Unicode, char is a 'Unicode scalar value', which is similar to, but not the same as, a 'Unicode code point'.
>char is always four bytes in size.
And this is what the byte type should look like: doc.rust-lang.org/std/primitive.u8.html

I thought we were talking about char literals being typed as int. C++ types them as char. Were you talking about something else?

i.e. char literals are of type int.
What was the point of this post?

>Unicode
No.

Does Rust have a 'smallest addressable unit' type or are we supposed to assume it's u8?

>C++ types them as char
I know.

>Were you talking about something else?
Yes, how char gets abused as the byte type in C and C++.

Ah. We're in agreement then.

If you really want, the only time you ever have to use Unicode in Rust is to handle char/&str literals. You can convert (possibly at compile time) to whatever format you like internally, but you'll obviously not be able to use libraries that use Unicode unless you convert back as well.

Currently, yes, you assume it's u8.

long long ago;

heheheehehe

>Unicode
>No.
Wew, m8.
>Does Rust have a 'smallest addressable unit' type or are we supposed to assume it's u8?
No, Rust doesn't need such a type explicitly; u8 is just the smallest unsigned type, but the documentation uses `byte` extensively for stuff like `mem::size_of`. So yeah, Rust assumes 8-bit bytes; C can have all the other platforms.

Nigga it literally says she shot a man in front of a crowd.
It's satire, or “fake news”, if you will.

>dereference uninitialized memory
>it not only works but also doesn't segfault

it's not a neuronet you dumb MIT nigger

>2017
>summoning nasal demons
Dumb frogposter.

duh C poster

REMINDER! REMINDER! REMINDER!
Vote for your most-hated programming language here:
strawpoll.me/14467509

>captcha

>no go

To other C programmers (mainly): what's your preferred way of doing metaprogramming/code generation? I have a system where I scan for a specific #include path, with parameters given in the preceding #if 0 block; a tool then generates the code and the compiler pulls all the generated files in through that include.

I'm wondering if there are other ways people generally do this. My main reason for doing it this way is that most of my code generation produces static results, so I'd like the compiler to be made aware of those results cleanly. Since it's just textual code insertion it's obviously flexible too. But I'm still curious.
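
Not the same thing, but the closest purely in-preprocessor relative of that is the parameterized-include idiom (a close cousin of X-macros), where the "template" header expands macros the including file defines first. A minimal sketch with made-up file names:

/* vec_template.h -- deliberately no include guard.
 * The including file must define VEC_TYPE and VEC_NAME first. */
typedef struct {
    VEC_TYPE *data;
    int len, cap;
} VEC_NAME;

#undef VEC_TYPE
#undef VEC_NAME

/* main.c -- instantiate the "template" twice */
#define VEC_TYPE int
#define VEC_NAME int_vec
#include "vec_template.h"

#define VEC_TYPE double
#define VEC_NAME double_vec
#include "vec_template.h"

int main(void)
{
    int_vec iv = { 0 };
    double_vec dv = { 0 };
    (void)iv;
    (void)dv;
}

The tradeoff versus an external generator like yours is that there's no extra build step, but you also never get to inspect the generated results as real files.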

>What's your preferred way of doing metaprogramming/code generation?
Using Terra.

>strawpoll
>not the superior service: poal.me
Strawpoll should be abandoned. It has let its voting features stagnate and is limiting users through its own complacency.

C clang: 9.971914301 seconds time elapsed

Racket: 3.618877176 seconds time elapsed
Haskell GHC: 0.454930841 seconds time elapsed
Java OpenJDK: 1.189619693 seconds time elapsed
Dart: 2.585097637 seconds time elapsed
Common Lisp SBCL: 3.015000773 seconds time elapsed
Crystal: 3.034363672 seconds time elapsed
OCaml: 0.747779583 seconds time elapsed
Scheme Bigloo: 3.604533288 seconds time elapsed
Scheme Chicken: 7.966139806 seconds time elapsed
Scheme Gambit: 1.932229635 seconds time elapsed
Nim: 6.083644424 seconds time elapsed
D LDC: 6.347934615 seconds time elapsed
C++ clang++: 5.518077517 seconds time elapsed
C++ G++: 4.659448453 seconds time elapsed
F# Mono: 4.429195035 seconds time elapsed
Rust rustc: 6.583698221 seconds time elapsed

Just use the Canadian Aboriginal Syllabics trick.

...

Why is Rust slower than Sepples?