/dpt/ - Daily Programming Thread

old thread: What are you working on, Sup Forums?

Other urls found in this thread:

youtube.com/watch?v=676FMfkYxOk
cryptopals.com/
eudyptula-challenge.org/
learnpythonthehardway.org/book/nopython3.html
twitter.com/SFWRedditImages

thanks for not using a fag image

>C# C++ java python

Please post a (non-fag) anime image next time.

template <class D>
D& BTItr<D>::operator*() {
    return node->data;
}

why "invalid initialization of non-const reference of type 'double&' from an rvalue of type 'double'"?

returning a reference to a member var, shit makes no sense. Works if I add "return (D&)(node->data);"

>>>/fb/

>you will never get to sit in bjarne's lap while he lectures you on C++
why even live

im crashing this thread... with no survivors!

to what limit should i unit test my code?
say im writing a poker game and want to test the hand types (full house, royal flush, etc). should i automatically generate all (or tons of) possible full house hands to test its correctness?

Employed Haskell programmer reporting in

Unit testing is for fools.
Real programmers formally prove their programs to be correct.

this

but i don't want to formally prove my program every time i refactor or add some trivial lines.

Was getting employed part of your plan?

facebook uses haskell to detect spam/malicious links in messenger

you're a big guy

Hoare logic is modular, so you can simply add at least one more proposition to the proof.
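The modularity claim can be made concrete: Hoare logic's sequential composition rule lets you prove each statement's triple separately and then chain them. A standard statement of the rule:

```latex
\frac{\{P\}\; S_1 \;\{Q\} \qquad \{Q\}\; S_2 \;\{R\}}{\{P\}\; S_1;\, S_2 \;\{R\}}
```

So extending an already-proven program by one statement means discharging one new triple; the old proof is reused as-is.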

Can you comment on the alleged meme status of category theory?

>What are you working on, Sup Forums?

I am procrastinating on an assignment that uses Artificial Neural Networks with Back Propagation. I am also procrastinating on writing a fucking CV and Statement of Intent.

Should probably get to all of this after Turkey Day.

How do I avoid using the long double literal suffix if I want to use long double in c++?

if (op.image.isXKCD == false) {
    reply("Please post XKCD next time!");
}

This is a good idea.

agreed. it also has the micro-optimization benefit that you only shuffle a step when required.

give us more context, like what is node, data etc etc

unit testing is more for games. for any other kind of data modelling you're going to want to be rigorous and create relationships from the data to model that rigor.

science isnt about truth in any conventional sense. its about clarity. "clearly its true, scientifically proven so." we didnt make these systems to fight religion, we made them to hold off bad marketing and bad practice in general. if you can read the matter of the science, usually in the form of math, then you can tell when someone has given you a bad figure and thus see the truth in it. otherwise, people in the right positions can make any claim and you have to pay any amount for whatever reason regardless of any "ill conceived" prejudice you may have against the market or its peoples.

still, science is rooted in the senses, individual senses, and it's up to the individual to abstract the data in a way that's feasibly direct enough to be apportioned via service or counsel.

How do I read a Python source file, instead of executing it on click?

right click -> open with?

or right click -> edit

Sure, why not.
There are only C(52,5) = 2,598,960 five-card hands, so you can check every single one in less than a second (assuming your code isn't shit)

well, I'm writing it in assembly so it can't be that shitty.

if you're serious you should benchmark it against compiler-optimized C/C++

>Hoare logic is modular, so you can simply add at least one more proposition to the proof.
Almost nobody uses whore logic in practice.

>whore logic
Kek

If Haskell is the best language,
why do Computer Science PhDs exist who choose Java?

They fell for the silly memes.

youtube.com/watch?v=676FMfkYxOk

Has anyone here done
1. The Cryptopals Crypto Challenge - cryptopals.com/
2. The Eudyptula Challenge - eudyptula-challenge.org/

Zed Shaw is silly

learnpythonthehardway.org/book/nopython3.html

>Currently you cannot run Python 2 inside the Python 3 virtual machine. Since I cannot, that means Python 3 is not Turing Complete and should not be used by anyone.

>silly
This whole post is absolutely retarded.

How would I format the values of a vector in c++? So when I input some values into it it's "1.0f" instead of "1" automagically?

Is there a better alternative to Project Euler? I love it and don't really want to switch but I wonder if there is a better choice

vector of a different type of number

More progress on my Forth compiler.
CREATE (variables and data) working.
Now the tricky part: implement DOES> (which is one of the things that makes Forth, Forth). It's enough of a mind bender just using it in the language itself. Implementing it is gonna require generating code which compiles code that compiles code in the current definition; it's like three levels of nesting to keep track of.
I find myself needing some better notation. I thought x_FOO and c_FOO were enough for execution semantics and compilation semantics, but now there's also the execution semantics of the resulting compiled definitions; I'm almost getting lost in my own code.

coding isn't the same as programming

Programming is a proper subset of coding
Scripting is a proper subset of programming

>Is there a better alternative to Project Euler?
In what aspect? What is it you want to learn?

Passpoint/Hotspot 2.0 profile generation for Android devices

>note
>Yes, that is kind of funny way of saying that there's no reason why Python 2 and Python 3 can't coexist other than the Python project's incompetence and arrogance. Obviously it's theoretically possible to run Python 2 in Python 3, but until they do it then they have decided to say that Python 3 cannot run one other Turing complete language so logically Python 3 is not Turing complete. I should also mention that as stupid as that sounds, actual Python project developers have told me this, so it's their position that their own language is not Turing complete.
You're giving him exactly the attention he wanted by quoting the clickbait you just linked.

>claims to know C++
>can't solve this simple problem


// assign goo to bar then call bar
class Foo
{
public:
    void goo() {}
    static void (Foo::*bar)();
};

what's the difference between scripting and programming?

p.s. This is an entirely misguided issue because once you've started concerning yourself with these kinds of issues you've removed yourself from productive programming and are basically just getting some job security from obfuscating the development process.

Which you really don't need if you can manage to do this in the first place.

Basically I've heard about sites that make future employment easier, like a ranking of coders. Euler's learning value is incredible indeed. What I meant by writing 'a better alternative' is 'an alternative with broader perspective'.

Foo::bar = &Foo::goo;
Foo f;
(f.*Foo::bar)();
now fuck off

>claims to know C++
Alas I have made this mistake, nobody actually knows C++. Not even the committee. It's simply the nature of a large stupid language.

This just tests your ability to work with what's essentially C's function pointers though.
>(f.*Foo::bar)();
Can you not just do
*(Foo::bar());
As it's a static variable.

>(f.*Foo::bar)();
Why can't you just do (f.bar)() ?

Won't work.

It's a static variable but it points to a function that needs an object. That solution won't compile anyway.

tits and/or age

>tits
Too big but rapidly shrinking.
>age
23

Anyone?

Oh yeah you're right. Obviously an implicit this pointer. Man I love OOP. It's just so clear and easy to read.

From last night:
>me: How do i interpret a byte sequence as any arbitrary data type, where the data type is given by the user during runtime?
>user: Call a function?
>me: So i need to write a function for every data type that exists?
>user: Generate it at compile time using templates or a similar system.
>me: That still means i have to manually call the template once for each data type.
>user: Generate that too with metaprogramming. Generate the entire function with the multiple codepaths.
And then i fell asleep.
So to pick that up, how do i do this? How do i generate everything with metaprogramming?
Working in C++, btw.

I'm into the eudyptula challenge, but its pace is too slow. I've been waiting two weeks for the results of challenge 16.

>arbitrary data type
What do you mean by arbitrary? You obviously can't support any possible data type, and that would require infinite code.
What data types do you actually want to support?
What are you even doing with that data?

Zed Shaw is a close-to-borderline idiot.

Basically all his writing is about "stop liking what I don't like".

I just sent them an email saying I want to join. What should I expect from the challenge?

what the fuck is with that huge-ass binary file?

what?

the more challenging part is configuring your email client to behave (send as plain text and don't mess with the attachments). For the challenge itself you can do every chapter in a matter of minutes, or hours if there is a long compilation.

I just learned about binary trees, and there was an exercise to mirror one. I've always heard about that problem as a super hard interview question. But it's easy as shit. 8 lines in c.
Are graduates really this stupid?

>Programming is a proper subset of coding
>proper subset of coding
>proper
>coding
kek

How can tripfags be this retarded?

I hope I have done that part correctly.
Here's what is says in the email header of what I sent them
Content-Type: text/plain; charset=us-ascii
Content-Transfer-Encoding: 7bit

yes they are

>I've always heard about that problem as a super hard interview question.
Seriously?

>How do i interpret a byte sequence as any arbitrary data type, where the data type is given by the user during runtime?
Sounds like you actually want to just handle the byte sequence rather than a data type, as that's what they give you. If they're supposed to give you a name like "vec2" and you call the function for vec2, then yes, templates do seem like the easiest solution within C++. Unless you use your own metaprogramming.

If you're aiming to have the user input something like:
struct Foo {
    int a, b, c;
};

And then have the code respect the types and fields of that struct. That sounds very difficult. You'd have to write a basic C/C++ interpreter/compiler, allocate some executable memory and compile the code into that memory to then run it. So you're basically just doing templates at run time. I'm not sure what the goal would be here, how would you know the implementation without knowing the data at all? Do you presume certain things about the names of the types?
>me: That still means i have to manually call the template once for each data type.
That's not a big deal. Write your own pre-processor which extracts all the types in the program and inserts them into a header which you then include where appropriate. It's probably easier to do that way than with template metaprogramming.

If C++ had decent reflection and/or introspection it would be much easier.

Yeah I don't know how to do this.
Is this a case where C++ simply doesn't work or is there some trick?

Well, primarily every primitive data type.

But in the longer run it would be nice to let the user extend that to any data type the user desires, using scripting. But that would require me to learn how to embed scripting support in my program, which i don't know how to do either.

I'm thinking about how i would write a hex editor that supports data type reinterpretation. Kind of a program to help analyse binary files.

Kind of like
>i want these bytes to be interpreted as unsigned int
>these bytes as short
>these bytes as float
>these bytes as chars
>these bytes as long long
>these bytes as float64
>these bytes as a big endian float
>those bytes over there as MyStruct, whatever that may mean.

>I've always heard about that problem as a super hard interview question.
It's more like a basic, intro level question. If you can't answer it, there is no need to talk further when we talk about real positions.

>Are graduates really this stupid?
They are, plus there are a bunch of so-called "software engineers", "coders", "app developers", etc. who are actually shit autodidacts, far from the skills of a real autodidact or someone who attended university. They think they're actual programmers, developers on the same level as people (autodidact or university-educated alike) who are a lot more serious (and talented) about their shit and aren't afraid of babby-tier math. Most binary tree stuff is quite trivial.

So when these people get a basic question to test whether they can actually think in terms of basic computer theory, they flip and start spamming twitter/tumblr/blogs/etc. with their shit.

I don't want to generalize here, but most guys with a "cool" hipster-sounding job are usually shit. They're at about a technician level (or even lower) compared to an actual university graduate (or to people with equivalent knowledge).

Really good autodidacts are rare as fuck and most of them get a degree sooner or later, even if they only get it for the paper. These are the kind of people who can easily beat a university grad in skills and knowledge.

ghc does not optimise by default, use '-O2' and then strip it with 'strip'.

Woooo, calling native code from a shell!

rubyist@Overmind:~/Desktop$ powershell
PowerShell
Copyright (C) 2016 Microsoft Corporation. All rights reserved.

PS /home/rubyist/Desktop> $src = @"
>> using System;
>> using System.Runtime.InteropServices;
>>
>> public static class NativeMethods
>> {
>> [DllImport("./foo.so")]
>> public static extern Int32 add_numbers(Int32 a, Int32 b);
>> }
>> "@
PS /home/rubyist/Desktop> add-type -TypeDefinition $src
PS /home/rubyist/Desktop> [NativeMethods]::add_numbers(1,2)
3
PS /home/rubyist/Desktop> ls
foo.c foo.o foo.so
PS /home/rubyist/Desktop> cat foo.c
int add_numbers(int x, int y)
{
return x + y;
}

>so called "software engineers", "coders", "app developers", etc.
That's a strange way to spell 'Pajeets'

>Write your own pre-processor which extracts all the types in the program
I don't think i know how to do this either.

I wrote it like that because there are legit people holding those titles, although it gets more rare these days.

Scripting is a special case of programming wherein a scripting language is used.

A scripting language is defined as a programming language wherein the executable file and the source code file are one and the same.

Why yes, it is a proper subset. All programming is coding, therefore programming is a subset of coding. Not all coding is programming (markup languages are "code", but not traditional programming languages), therefore the sets are not equivalent. Since programming is a subset of coding and is not the same set as coding, it is a proper subset.

>the executable file and the source code file are one and the same.
what?

>They are, plus there are a bunch of so called "software engineers",
If I translate the name of my bsc word for word, it be software engineer in english

>executable file and the source code file are one and the same.
Nit-picking, but can't you write both a compiler and an interpreter for every language?
Also, what would you say about python? It generates .pyc files but you can execute it directly in the interpreter too.

>the executable file and the source code file are one and the same.
That's just a convoluted way of saying interpreted language

Meaning you do not compile it ahead of time to a bytecode or native code format.

Look up recursive descent parsing and parse a tokenized version of the code.
Figure out how you recognize a type definition (words like class, struct and union often come just before the type name).
Process the tokens and extract the type names into a set (no duplicates).
When you have that, you just fprintf() your template calls to a .h file.
Include that file where you want to call all the different versions. You can put in a volatile static variable and branch around all the calls if you choose, so you don't actually have to call them at initialization.

It's not that much when you know what to do but it's probably a bit confusing if you're new to non-template metaprogramming.

But it's not uncommon to have tools that process your source to construct other code or files. So it's worth learning.

So you don't agree that there are aspects of programming that don't involve writing code?

>Nit-picking, but can't you write both a compiler and an interpreter for every language?
Yes, and if you use an interpreter, you are scripting. If you use a compiler, you are not.

Just wanted to be more explicit.

that's retarded. first, source code is not an executable, it's either interpreted or compiled. second, with your definition, every programming language is also a scripting language.

Oh, i see.

>A scripting language is defined as a programming language wherein the executable file and the source code file are one and the same.
No it's about how the programming language is used/intended to be used.

haskell is not the best language

java is the best language

#!/usr/bin/env bash

echo "Hello World"

is an executable, and the operating system will see it as such.

Also, every programming language has the capacity to be a scripting language, but unless it is being used as such, it is not scripting.

>t. Pajeet

It's like a parody. That name.
But it's a good explanation of a recursive descent parser.

There is a trick.

Only the use of code as a verb triggers me.
>Writing code can either mean writing programs or other shit.
Your logic is sound though.


There is a reason I put them inside quotes. I was naming jobs and the titles some people call themselves (even without qualifications). My point was the lot of bullshit going on with job titles: despite the title, the work is far from what you would expect, and so are the people in terms of skill.


Translating can be a bit tricky, we have (had) the same kind of naming shit around here yet most of the courses are what a computer science degree would encompass rather than an SE degree. Course material matters more than the name of the degree.


As far as I know, a lot of SE degrees in the US are basically stripped-down CS degrees with a focus on enterprise software development, etc. instead of theoretical foundations (or with theory at a much weaker level than a CS degree).

Btw, we had Software Designer Informatician (one to one translation) as the name of our CS degree here. Back in the day it was called Software Designer Mathematician. For the newer term a fitting translation would be Software Engineer, yet the course material is ~80% theoretical CS stuff.