/dpt/ - Daily Programming Thread

Pajeet gets triggered edition

Old:

Other urls found in this thread:

youtube.com/watch?v=TH9VCN6UkyQ
aix1.uottawa.ca/~jkhoury/app.htm
wiki.haskell.org/Applications_and_libraries/Theorem_provers

TRIGGERED

>haskell
>real world haskell
good joke m8

First for C

Rust is the only good language.

POO

I'm making a game in MonoGame. I haven't written any C# before, and I only know a little Java (maybe two courses, so I know the OOP basics but not really any language details).

Anything to be aware of?

Also I could optionally switch to F# now. But that doesn't seem like a priority since I'm likely only gonna do C# for this and perhaps small things in the future.

reminder

I don't think I've ever seen any serious discussion about Perl

It all happened before you were born.
Perl is quite literally the Python of the 1990s.

Probably because all of the programmers moved on and everyone realized that it's a write-only language.

But hey, Perl6 is out now :^)

>Anything to be aware of?

It can be broken in subtle ways. I'd use love2d if I were you.

Except Python isn't nearly as shit

Does anyone use Perl 6?

not as shit as perl but still really, really, REALLY shit

Why?

>inb4 whitespace meme

Any good learning material for C#?

I wish it was easier to deploy new media formats and whatnot on the web.

FLIF, for instance, is an amazing image format: FOSS, supports lossy and lossless encoding, transparency, animation. Encodes all types of images about as well as or better than specialized formats like PNG and JPEG. Highly advanced progressive rendering: you can use prefixes of the full-resolution image instead of generating thumbnails, and it loads entire animations from low to high quality, rather than from start to finish, so a half-loaded throbber / progress indicator will play the entire loop in reduced quality, rather than pausing at the first unloaded frame.

But getting a format standardized + supported in all browsers is a huge pain, and is nearly impossible if you aren't Google, Apple, or Microsoft.

Why isn't there a principled way of doing, essentially, polyfills? Something like the server sending a hash of the decoder, a hash of the decoded output, and a record of the number of steps and peak memory needed to decode in some standard virtual machine. The client could then verify the decoder, verify the decoded output, cut off decoding if it runs longer or uses more memory than claimed, and refuse to decode anything that requests absurd amounts of CPU time or memory.

It's used.

You just have to be competent and aware of its type system.

You should be aware of your chosen language's type system anyway.

You can decode FLIF in (ASM.)JS for now

I agree, it's cool

>You just have to be competent and aware of its type system.

I'm still waiting for people to care about apng.

FLIF essentially obsoletes APNG, but I'd still take APNG if that's all we can get.

Hi there!

You seem to have made a bit of a mistake in your post. Luckily, the users of Sup Forums are always willing to help you clear this problem right up! You appear to have used a tripcode when posting, but your identity has nothing at all to do with the conversation! Whoops! You should always remember to stop using your tripcode when the thread it was used for is gone, unless another one is started! Posting with a tripcode when it isn't necessary is poor form. You should always try to post anonymously, unless your identity is absolutely vital to the post that you're making!

Now, there's no need to thank me - I'm just doing my bit to help you get used to the anonymous image-board culture!

How computation intensive is it?

If we wait a few hundred years, then maybe it will have eclipsed GIF.

Thanks.

Hi there!

You seem to have made a bit of a mistake in your post. Luckily, the users of Sup Forums are always willing to help you clear this problem right up! You appear to have used a tripcode when posting, but your identity has nothing at all to do with the conversation! Whoops! You should always remember to stop using your tripcode when the thread it was used for is gone, unless another one is started! Posting with a tripcode when it isn't necessary is poor form. You should always try to post anonymously, unless your identity is absolutely vital to the post that you're making!

Now, there's no need to thank me - I'm just doing my bit to help you get used to the anonymous image-board culture!

tbqh I would use OCaml over F# if it had good parallel programming support

I'm right, you know.

competent people don't use haskell

Thanks.

Competent people use what is ideal for the task.

Some people use Haskell, but only a few. Quit being a farquad.

some people literally eat shit, but only a few.

just because a language exists doesn't mean you have to use it.

Of course you are right; you should be aware of the type system of any language you use to be good at it.

When people joke about Haskell being useless in the real world, it is generally in response to Haskell programmers who refuse to recognize that the language has drawbacks. It isn't the right tool for every job.

Sometimes you need to be fully aware of how much memory your program uses. Haskell makes reasoning about this difficult. Sometimes you have to mutate a byte array. Even if you don't need to, sometimes stateful mutation is the most intuitive way to write something. Haskell makes this ugly. Sometimes you need to reach below the abstractions you generally use to get all the performance you can out of a hot loop. Haskell makes this retarded (see: the Haskell entries on the shootout).

when you printf a bool as "%d" it can have values that are not 0 or 1

what are the caveats when using bools? in which situations does a true bool not get implicitly converted to 1?

>C#
>Java
>Cannot even explicitly destroy objects
HAHAHAHA how do you retards even consider these viable languages to program in?

I'm guessing your bool is actually just a byte. Values other than 0 or 1 are invalid, afaik.

Sounds like you want a char.

Does anyone have a PDF for Advanced Computer Architectures: A Design Space Approach by Dezső Sima, Terence J. Fountain, and Péter Kacsuk?

>C
>C++
>Can leak memory
HAHAHAHAHAHA how do you retards even consider these viable languages to program in?

> Kacsuk
> Cock Suck

Any sane person would use C for the examples you give, yes. Now, try computing a byte array across 16 cores.

Machines have 8GB of RAM nowadays, and four or more cores, so we can write concise code in a functional language that doesn't needlessly hog resources. It's the future.

Java and C# can also leak memory.

the bool i'm printing is part of a struct array. maybe %d is reading 4 bytes while the bool is only 1 byte so it includes 3 erroneous bytes in the value?

That name is unfortunate.

Make a copy of your code and remove things until you have the smallest example that shows the weird behavior you are noticing. Then post that.

Anyone know a book on monogame?
What should I be reading? Should I read a C# book if I'm sortof experienced in C++?

I wasn't defending java or C#.

The difference is that in these languages you can extremely easily free those resources explicitly, unless of course you're retarded.

(You)

You can soft-leak memory (read: still have pointers to it, but will not read it again for the life of the program) in any language. The bigger issue is that you can trivially dereference invalid pointers in those languages.

Never mind, I was printing uninitialized bools, so it was UB.

If you're worried, do
bool b = true;
printf("MyBool as int %d\n", b & 0xFF);

Use warnings.

Are you using stdbool.h? There's no printf format specifier specifically for bool; as a variadic argument it gets promoted to int, so %d is fine as long as the bool actually holds a valid value.

Just do printf("%s", bool_var ? "true" : "false") or something
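
For reference, a tiny self-contained version of what's being suggested above; just a sketch, not the only way:

#include <stdbool.h>
#include <stdio.h>

int main(void)
{
    bool b = true;  /* initialized, so the value is well-defined */

    /* As a variadic argument, bool is promoted to int, so %d is fine
       for a properly initialized bool. */
    printf("as int: %d\n", b);

    /* Printing it as text sidesteps the question entirely. */
    printf("as text: %s\n", b ? "true" : "false");

    return 0;
}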

Is it bad practice to have a reference to a parent object in a child object in C#?
Say that I'm working on a graph theory based application.
I want packets (objects traveling around the graph) to be able to get a node from an ID, so it would have to leap "upstream" twice: Once from the packet to the packet controller, and then from the packet controller to the graph controller class.

Is storing a reference to the graph logic in the packet controller, and then sending that reference to each individual packet bad practice?

I don't know why, but it feels like it is.

you ask for a banana, but get a monkey holding a banana and the whole jungle surrounding it

Everybody must be retarded, considering how often these issues happen.

Well yeah but it's just a reference, does it really matter? It's not like it's storing a copy or anything.

If you really can't think of another way then it's fine, but yeah, it could be considered less than optimal practice (not going to say "bad", because I'm not sure you have an alternative).

most people aren't particularly bright

Pretty sure he's talking about what you have to consider when you break encapsulation.

I hate OOP fags.

Python isn't shit, it's just not a language meant for heavy performance needs. It can be used to write almost anything, with fast development time.

Sure, the resulting program will be terrible compared to one written in other languages, but that's not the point of Python...

look for an alternative solution

Well I mean the alternative would be just storing a reference to the actual array of nodes but that hardly seems better
ho hum

But user, OOP is fun

>Everybody must be retarded
No, they're just taught bad ways of doing things. You should allocate a big block of memory up front and manage it with your own allocation scheme, not new/delete everywhere. If you don't make your own allocator, you will have far too many places where things can go wrong, and the errors you get are fairly unhelpful.

If you have a potentially massive memory footprint you can write your own dynamic allocator.

Memory arenas are the future.
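
A minimal sketch of what a memory arena can look like in C (names invented; alignment and error handling ignored for brevity):

#include <stdlib.h>

typedef struct {
    unsigned char *base;   /* one big block allocated up front */
    size_t used;
    size_t cap;
} Arena;

Arena arena_make(size_t cap)
{
    Arena a = { malloc(cap), 0, cap };
    return a;
}

/* Hand out pieces of the block; no per-object free. */
void *arena_alloc(Arena *a, size_t n)
{
    if (a->used + n > a->cap)
        return NULL;              /* this arena is full */
    void *p = a->base + a->used;
    a->used += n;
    return p;
}

void arena_reset(Arena *a)   { a->used = 0; }   /* "free" everything at once */
void arena_destroy(Arena *a) { free(a->base); a->base = NULL; a->used = a->cap = 0; }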

this

new/delete or malloc/free everywhere can also lead to memory fragmentation, especially in long-running applications

>I have limited knowledge of a vaguely similar OOP language on a completely different platform
>I could also do it in a functional language on that same platform I know nothing about
Yeah, I'm betting this one never gets past "New > Project"

Notice you didn't get a reply. That's your answer.

>python isn't shit
>Sure the actual program will be terrible

This.

I just want a language that lets me support them neatly. I don't want to do sizeof(type) all the time.
But they're so neat because if I have an issue I can just print all the allocations/deallocations to a file and pass __LINE__ and __FILE__ to the function. That gets me where the dereference happened, for instance.
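
Roughly the pattern being described: a logging wrapper around malloc/free, with macros filling in the call site (names made up, just a sketch):

#include <stdio.h>
#include <stdlib.h>

static void *debug_alloc(size_t n, const char *file, int line)
{
    void *p = malloc(n);
    fprintf(stderr, "alloc %zu bytes -> %p at %s:%d\n", n, p, file, line);
    return p;
}

static void debug_free(void *p, const char *file, int line)
{
    fprintf(stderr, "free %p at %s:%d\n", p, file, line);
    free(p);
}

/* Use the macros at call sites so __FILE__/__LINE__ refer to the caller. */
#define ALLOC(n) debug_alloc((n), __FILE__, __LINE__)
#define FREE(p)  debug_free((p), __FILE__, __LINE__)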

There's just so much you can do just because you've actually made it yours.
Funny. I'm just not involved with that end of it. And I want to explore MonoGame/XNA because people have said it's good. I really want to learn why people think what they think.

I'm a little interested in functional programming. That's why I suggested F# (because I saw it in the New > Project dialog, which I've already looked at, actually). But I'm not entirely sure if it's appropriate. I just have no visibility into this space, so I can't really tell. For instance, this Java vs. C# thing is news to me. I don't really see any differences, aside from Java seeming very verbose and C# being a little more terse (but still more verbose than how I write C or C++).

please

>I'm too dumb to appreciate provable design
Wait - if you're here, then who's installing unpatched WordPress plugins on DreamHost right now?

>I just want a language that lets me support them neatly.

Time to start writing your own lang.

Haskell isn't the only purely functional language.

What are you working on?

I'm making a terminal spreadsheet program.

>Fast development time
Accumulate technical debt at the speed of typing!

I'd say Python is good for "rapid prototyping", not fast development time. If you've ever had to work with a >10k line Python codebase that doesn't use type annotations, you'll know what I'm talking about.

I'm putting my hopes in this man:
youtube.com/watch?v=TH9VCN6UkyQ
His heart is in the right place and he knows more than me.

haskell isn't any more provable than any sensible design in any sensible programming language

Could any of you make me a script that'd ping Google and alert me when the ping gets through?
It's raining a lot where I live and I'm having trouble with my internet. The script would help me know when the internet is back, like now.

Would really, really appreciate this.
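
Not tested against your setup, but roughly the kind of thing you're asking for, assuming a Unix-like machine with the usual iputils ping flags; just a sketch:

#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>

int main(void)
{
    /* One ping with a 2-second timeout; ping exits with 0 on success. */
    while (system("ping -c 1 -W 2 google.com > /dev/null 2>&1") != 0)
        sleep(5);    /* still down, retry in 5 seconds */

    printf("\aInternet is back up!\n");    /* \a rings the terminal bell */
    return 0;
}

Leave it running in a terminal and it beeps as soon as a ping gets through.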

Why is linear algebra important for programming?

vidya gaymz

Even if you dev for the next Nintendo console which won't have any 3D graphics at all?

I don't get it: why would anyone willingly code in Java?

3D graphics is literally applied linear algebra.

v

yes

>I don't understand what "provable" means
I guess ITT isn't really heavy on the math, huh?

Because lots of linear algebra happens in programming. Games especially.
There's no math required in programming beyond preschool math, really. It's just that you mostly end up doing math stuff because all a computer does is math: it takes some input (keypresses or whatever), puts that through functions or a set of (sequential) operations, and outputs other numbers (pixel colors, for instance).

This is the best way to think of it imo.
2D has linear algebra. But really it's just vector math/trig. So it's probably stuff you can either figure out on the fly or stuff you already know.
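
To make that concrete, here's the kind of thing "vector math" means in practice: rotating a 2D point, which is just a 2x2 rotation matrix applied to a vector (sketch only, not from any particular engine; link with -lm):

#include <math.h>
#include <stdio.h>

typedef struct { float x, y; } Vec2;

/* [x']   [cos -sin] [x]
   [y'] = [sin  cos] [y]   -- rotate v by angle radians about the origin */
static Vec2 rotate(Vec2 v, float angle)
{
    Vec2 r = {
        v.x * cosf(angle) - v.y * sinf(angle),
        v.x * sinf(angle) + v.y * cosf(angle)
    };
    return r;
}

int main(void)
{
    Vec2 p = rotate((Vec2){ 1.0f, 0.0f }, 3.14159265f / 2.0f);
    printf("(%.2f, %.2f)\n", p.x, p.y);    /* roughly (0.00, 1.00) */
    return 0;
}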

When people first started shitposting about Jai, I was apprehensive, but it seems like Blowlang might actually be somewhat promising.

I would be lying if I said I wasn't at least a little excited to see how it turns out.

I/O is impure and should be used with great care and with explicit error handling

It's like a complex epic compared to a one-page short story.

explain yourself. are you talking about some hypothetical compiler proofs or what the fuck?

aix1.uottawa.ca/~jkhoury/app.htm
You don't need it to be a code monkey though

Pajeet, if I only had the time and you weren't so hopelessly in over your head.

If you're truly interested, here: wiki.haskell.org/Applications_and_libraries/Theorem_provers

>Tools for formal reasoning, written in Haskell.
they're implementations written in haskell, doesn't mean there's anything special about haskell as a language

Why is the main thing in Java to create an object for input?

Got a quick question about memory allocation. I'm used to languages that deal with the heap automatically, like C# and Python, but I'm diving a bit into C now.

Say I want a program that sums an arbitrarily long list of numbers from the user and stores them in an array.
This is trivial in GC languages because you just create an array of size n and the memory is automatically allocated and safe.

In C, do I have to allocate it myself every time? The first time I tried a program like this, where it read input from the user and stored n integers in an array, it compiled and ran perfectly well. Did I just get lucky that I didn't get a segmentation fault?

tl;dr - Do you always have to allocate memory when dealing with arrays of arbitrary length, or is it enough to just initialize them with length n?
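
For what it's worth: writing past a fixed-size array is undefined behavior and can appear to work, which is probably why the first attempt ran fine. The usual C pattern for an arbitrarily long list is to grow a heap buffer with realloc as input arrives. A minimal sketch (identifiers made up, error handling kept short):

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    size_t cap = 8, len = 0;
    int *nums = malloc(cap * sizeof *nums);
    if (!nums) return 1;

    int x;
    long long sum = 0;
    while (scanf("%d", &x) == 1) {
        if (len == cap) {                          /* out of room: double the buffer */
            cap *= 2;
            int *tmp = realloc(nums, cap * sizeof *nums);
            if (!tmp) { free(nums); return 1; }
            nums = tmp;
        }
        nums[len++] = x;
        sum += x;
    }

    printf("read %zu numbers, sum = %lld\n", len, sum);
    free(nums);
    return 0;
}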