What are you working on, Sup Forums?
Previous Thread:
real thread
>made later
>advertised later
What's real about that thread?
le java POO IN LOO
haha, object oriented programming is GOOD FOR NOTHING. lawl code monkeys PAJEET
le HASKELL. and EMACS
Add a "webcucks can't average two ints in C" and we have a complete dpt going on.
this one doesn't have anime in the op
Have you ever seen two pictures of him at different angles?
No.
Because he's 2d.
important poll, which language should i use going through "cracking the coding interview"? strawpoll.me
emacs is the king my man
>shit language
>language only used out of necessity
>horrible language
>barely a language
>bash
nice selection
> Hence, the type for our insertion function is going to be something like a -> Tree a -> Tree a. It takes an element and a tree and returns a new tree that has that element inside. This might seem like it's inefficient but laziness takes care of that problem.
Can someone give me an idea of HOW laziness works internally, to deal with the apparent inefficiency.
Just handwaving the problem with 'laziness' seems... lazy.
No, it's because the side effects induced by IO are not (internally) observable due to the sequencing. How it's implemented is completely irrelevant.
Talking about IO in the way you are falls apart when you introduce unsafePerformIO.
I have an idea.
Someone make a repo on github and we all pull request stupid shit.
enjoy your TOS violation
The old tree is never mutated.
So you can just keep pointers to it.
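>The old tree is never mutated.
>So you can just keep pointers to it.
Here's roughly what that looks like (textbook Tree, not any particular library). Only the path from the root down to the insertion point gets rebuilt; everything else is shared:

data Tree a = Leaf | Node (Tree a) a (Tree a)

insert :: Ord a => a -> Tree a -> Tree a
insert x Leaf = Node Leaf x Leaf
insert x t@(Node l v r)
  | x < v     = Node (insert x l) v r   -- new node on the path, r is shared untouched
  | x > v     = Node l v (insert x r)   -- likewise, l is shared untouched
  | otherwise = t                       -- element already there, reuse the whole tree

-- both trees exist at once; they share every subtree not on the insertion path
old :: Tree Int
old = insert 3 (insert 1 (insert 2 Leaf))

new :: Tree Int
new = insert 4 old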
>unsafePerformIO
non-standard
why
It comes from the observation that you don't always need to evaluate an argument to pass it as a parameter. Laziness allows you to pass unevaluated terms around. Technically, applying arguments to the insertion function doesn't actually insert anything until you evaluate it enough.
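>Laziness allows you to pass unevaluated terms around.
Toy example of mine to see it happen, using Debug.Trace (trace prints its message at the moment the wrapped value is actually forced):

import Debug.Trace (trace)

-- toy "insert" that shouts when it is actually evaluated
insert :: Int -> [Int] -> [Int]
insert x xs = trace ("inserting " ++ show x) (x : xs)

main :: IO ()
main = do
  let ys = insert 1 (insert 2 [])   -- no output yet: ys is just a thunk
  print (head ys)                   -- forces only the outer call -> one trace line
  print ys                          -- forces the rest -> the second trace line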
>performing mental gymnastics in order to justify IO monads
Are all Haskellfags this delusional?
So under the hood, every Haskell program is just a lot of pointer spaghetti?
misogyny
>mom how do binary trees work?
Haskell is de facto GHC Haskell.
???
Simple answer: Yes.
lmao
>Haskell is de facto GHC Haskell.
{-# LANGUAGE Safe #-}
That's Safe Haskell.
things aren't evaluated until they have to be
Are you saying normal Haskell is unsafe?
What about type safety?
AKA Real Haskell
is abstraction hated here?
Good to know. Pointer indirection can really slow a program down if done a lot.
Go to bed, Pajeet.
Go and Node JS have the best package importing
Depends on the abstraction.
Abstraction is welcomed, unnecessary abstraction is shunned.
what's dpt's favorite design pattern?
Why shouldn't I fall for the FRP hype train?
GHC provides non-standard unsafe functions
The OOP fans in this thread hate it because it reminds them how worthless their paradigm is
See how he refers to "unnecessary abstraction" - he's probably going to follow that up with "just use xyz oop design pattern x500 lines"
>someone actually did
holy shit
please don't restart the go thing
no one learns anything
[spoiler]Math[/spoiler]
nah, it's someone so pathetic that they wanted to create a new one just so it has a picture of some shit kids cartoon
So I take it you believe monads are unnecessary abstraction then.
Got it.
I like to wrap objects with dummy objects. It is possible to write wrappers four levels deep that do absolutely nothing, and almost no one will notice.
Yes, actually. There's no guarantee of termination, so every type has an extra inhabitant:
bad :: a
bad = bad
Add in unsafePerformIO, and every type a has its expected values + an infinite loop + IO a. The programmer has to be trusted to keep these under wraps.
It's not practical to have a general-purpose programming language that is also 100% safe without programmer responsibility.
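>The programmer has to be trusted to keep these under wraps.
Contrived example of what that trust means (this is a demo of the escape hatch, not how you'd actually use unsafePerformIO):

import System.IO.Unsafe (unsafePerformIO)

-- looks like a pure function Int -> Int, but smuggles IO out when it's forced
noisy :: Int -> Int
noisy x = unsafePerformIO $ do
  putStrLn ("evaluating noisy " ++ show x)
  return (x * 2)

main :: IO ()
main = do
  print (noisy 3)             -- the putStrLn escapes IO here
  print (map noisy [1, 2])    -- how often/when the messages print depends on
                              -- evaluation order and optimisations, which is the problem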
>import directly from github
>no versioning
>have to rename and copy everything into project repository
>best importing
No, not only are they necessary, they also aren't good enough on their own
Monad transformers and type classes are pretty good
Any second now some autist will bitch about linear type systems. Don't listen to him. Also don't listen to idrisfag, effect systems are a meme.
>Pointer indirection can really slow a program down if done a lot.
True, and heap management is one of the biggest factors slowing Haskell down, but this is why GHC has a lot of optimizations to avoid these scenarios when it can.
But obviously you don't want to use Haskell in an embedded system. If you're getting to the point where you have to use a lot of tricks to enhance performance, it had better be localized to a small part of your program, or you shouldn't be using Haskell.
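>this is why GHC has a lot of optimizations to avoid these scenarios when it can
And when it can't figure it out by itself, you tell it. Illustrative sketch (not from any real codebase): strict, UNPACKed fields and bang patterns are the usual way to avoid the indirection and the thunk build-up:

{-# LANGUAGE BangPatterns #-}

-- lazy version: each field is a pointer to a possibly-unevaluated Double thunk
data PointLazy = PointLazy Double Double

-- strict + unpacked: GHC stores the two Doubles inline in the constructor,
-- no extra indirection and no thunks building up
data Point = Point {-# UNPACK #-} !Double {-# UNPACK #-} !Double

-- bang pattern keeps the accumulator evaluated instead of growing a thunk chain
sumXs :: [Point] -> Double
sumXs = go 0
  where
    go !acc []               = acc
    go !acc (Point x _ : ps) = go (acc + x) ps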
>It's not practical to have a general-purpose programming language that is also 100% safe without programmer responsibility.
But that sure doesn't stop Haskell from trying tho!
Which idiomatic strrev implementation is better?
The correct answer will be posted momentarily.
char *strrev(char *dest, const char *src)
{
    unsigned len = strlen(src);
    int i, idx = 0;
    for (i = len - 1; i >= 0; i--)
        dest[idx++] = src[i];
    dest[idx] = '\0';
    return dest;
}
char *strrev(char *dest, const char *src)
{
    unsigned len = strlen(src);
    const char *begin = src;
    src = src + len;
    while ((*dest++ = *--src))
        if (src == begin)
            break;
    *dest = '\0';
    return dest - len;
}
this isn't generally a problem, because the program can never do anything after forcing bad; it just blocks evaluation indefinitely
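right, and laziness even lets you pass it around freely as long as nothing looks inside it. ghci session (note GHC usually detects the self-reference and reports <<loop>> instead of spinning forever):

ghci> let bad = bad :: Int
ghci> fst (42, bad)            -- fine, the second component is never forced
42
ghci> length [bad, bad, bad]   -- also fine, length never looks at the elements
3
ghci> bad                      -- never produces a value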
>abstractions are bad, oop is shit because abstractions
>but monads are good, but they could be even more abstract
>Any second now some autist will bitch about linear type systems. Don't listen to him. Also don't listen to idrisfag, effect systems are a meme.
Haskell is seriously falling behind when it comes to safety.
I suppose it depends if you consider non-termination to be safe or unsafe. It's safe in the sense that it's not a vulnerability per se, but it can lead to denial of service.
that's not what he said at all
I just like to be aware of the real world limitations of the tools i use.
A node in a binary tree consists of an element and two pointers to additional nodes
why does that book change lives?
how can it be so strong
It's the second best book
MENTAL GYMNASTICS
>tasty code
>mfw
I would prefer the former. As a side note, the latter has a random unnecessary pair of parentheses.
Makes sense.
we get it, you don't understand him
>i am angry
>angry about controlling side effects
WHY CAN'T I PERFORM SIDE EFFECTS IN THE MIDDLE OF STRING LITERALS?!
FUCKING MATHEMATICIANS
>A node in a binary tree consists of an element and two pointers to additional nodes
Don't need pointers, mang. A binary tree can be represented in a linear array where a node at index N will have its leftmost child at position 2*N and rightmost child at 2*N + 1
       root
      /    \
     A      B
    / \    / \
   C   D  E   F

|  0  |  1   | 2 | 3 | 4 | 5 | 6 | 7 |
| nil | root | A | B | C | D | E | F |
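>a node at index N will have its leftmost child at position 2*N and rightmost child at 2*N + 1
Same table in Haskell with Data.Array, if anyone wants to poke at the index math (slot 0 left unused, just like above):

import Data.Array (Array, listArray, (!))

-- the complete tree from the table above: slot 0 unused, root at 1
tree :: Array Int String
tree = listArray (0, 7) ["nil", "root", "A", "B", "C", "D", "E", "F"]

leftChild, rightChild, parent :: Int -> Int
leftChild  i = 2 * i
rightChild i = 2 * i + 1
parent     i = i `div` 2

-- tree ! leftChild 3 == "E", tree ! rightChild 3 == "F", tree ! parent 6 == "B"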
thanks for the heap user
Fuck
       root
      /    \
     A      B
    / \    / \
   C   D  E   F
--no-preserve-root
No, this setup can only represent complete binary trees.
>and two pointers
I was going to argue with you that this is not necessarily true, and there are other ways to implement a binary tree, but then i realized i'm wrong because all the other ways essentially are equivalent to pointers.
That doesn't mean we want our data structures fragmented all over the free store, causing cache misses on every operation, though.
Doesn't need to be a heap, you can represent a binary search tree or red black tree the same way. It just makes more sense for a binary heap, because of how you populate each tree depth.
False
Give me an array which would represent a non-complete binary tree.
See above: you can easily replace any of those indexes with NIL. I'll make an example, two secs.
And what is the computational difference between array indexing and pointer indirection?
Why do monads trigger codemonkeys so easily?
monads trigger codemonkeys because codemonkeys fear the unknown
Because we don't understand it, and for those of us who've studied a little philosophy there's also the additional trigger of Leibniz being bullshitty.
So if I add stuff down one branch for 100 levels I need to use 2^100 * node memory just to store one leaf and 99 nodes?
That's retarded. You would guarantee a Theta(2^h) memory cost for no reason, and nodes could be entered with NIL parents.
      root
     /    \
    A      B
   /      / \
  C      E   F
         / \
        G   H
| 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10| 11| 12| 13| ...
|NIL| r | A | B | C |NIL| E | F |NIL|NIL|NIL|NIL| G | H | ...
A memory lookup, which ***can*** lead to a cache miss in the worst case.
But that's really not an argument, because any large tree represented as an array with size > page size will have the same issue.
[1, nil, 2]
Well, yes. But then you can use linked array lists (where each array is a node in the linked list of a fixed size), in order to avoid that.
>nil
Not a number.
[root, A, B, C, D, E]
Sorry for being a total moran,
Could someone do me the favor of explaining what the 'new' keyword in Java actually does and when I should use it, or if I even have to use it?
In return a cute penguin.
>A memory lookup, which ***can*** lead to a cache miss in worse case.
So, exactly the same thing then.
never use new, use the factory design pattern instead
So you're making it an array of boxed types just to suit this retarded data structure?
In theory, I can represent a red black tree as a linked list too: a breadth-first traversal accompanied by an integer counting how many NILs were passed horizontally to reach the next element. But that's so retarded it's not even a factor of consideration when implementing a red black tree.
Oh fuck, he actually said it.
That doesn't even make sense. The first element is unused, not the root, since otherwise the parent math becomes more complicates.
It allocates and initialises an instance of the specified class.
class Person {
    String name;
}

class MyProgram {
    public static void main(String[] args) {
        Person john = new Person();
        john.name = "John";
        Person alice = new Person();
        alice.name = "Alice";
        System.out.println(john.name);
        System.out.println(alice.name);
    }
}
In Java, you use 'new' to create instances, aka objects, of your classes.
complicated*
>design pattern
Don't listen to this memester.
It's hard to explain "new" in a vacuum. Take a look at some guides like docs.oracle.com
I've been teaching myself TemplateHaskell for improved shitposting capabilities in inane programming "challenge" threads.
I wrote code to generate functions that will uncurry lists. It's totally useless and there is no sane reason I know of to uncurry a function so it'll take a list, but I had fun doing it.
genCurries :: Int -> Q [Dec]
genCurries n = mapM mkCFun [1..n]
  where
    mkCFun n = do
      fun
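Your paste got cut off, but for anyone curious: generating \f [x1, ..., xn] -> f x1 ... xn with TH looks roughly like this. This is my own sketch with made-up names, not the code from the post above:

{-# LANGUAGE TemplateHaskell #-}
import Language.Haskell.TH
import Control.Monad (forM, replicateM)

-- builds the expression \f [x1, ..., xn] -> f x1 ... xn
listCurryN :: Int -> Q Exp
listCurryN n = do
  f  <- newName "f"
  xs <- replicateM n (newName "x")
  return $ LamE [VarP f, ListP (map VarP xs)]
                (foldl AppE (VarE f) (map VarE xs))

-- generates top-level declarations listCurry1 .. listCurryN
genListCurries :: Int -> Q [Dec]
genListCurries n = forM [1..n] $ \i -> do
  body <- listCurryN i
  return $ FunD (mkName ("listCurry" ++ show i))
                [Clause [] (NormalB body) []]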
Do I have to use it to create a new instance/object, or is it just good practice?
Thanks I'll look at this.
>So, exactly the same thing then.
Depends on your access pattern, tree size, node size etc.
>So you're making it an array of boxed types just to suit this retarded data structure?
You can call it "retarded" if you want, but it's not far from how B trees are implemented.
It's not far from how B trees are implemented desu.
I am trying to learn C++, but this program does not loop correctly
How does everyone feel about Java becoming a compiled language?
>Do I have to use it to create a new instance/object
In Java, yes. Or at least implicitly (which means that it is used at some point somewhere).
Some object instances require a lot of setting up, where you might use a factory pattern or a builder pattern. But even so, 'new' is still used to instantiate the object itself. Patterns are way down the line anyway; you should not focus on them until you have a good grasp of object-oriented programming.
First of all, you don't necessarily need boxed types to implement B trees, and second of all, I can name random data structures too.
//repeat asking
std::cout