/dpt/ - Daily Programming Thread

Old thread: What are you working on Sup Forums?


first for C++

second for C++++

Trying to get into programming strictly as a hobby. What's a good language if you don't plan on doing collaborative work? I heard Java, for example, requires a lot of code compared to Python.

Also what's a good website for inspiration for software to develop?

Learn C. Every programmer should know C, regardless of anything.

nice meme

This -> After grasping the core concepts you might wanna move to a higher level language or just use libraries.

If I'm piping data into a C program, is there a measurable advantage to encoding it in binary instead of plaintext? Is that faster?
Does it even matter if I'm only sending 5-6 integer messages?

>He doesn't know C

why?

Continued from last thread. Capitalization cleaned up; let's make this more "Enterprise".

#include <iostream>
#include <cmath>

using std::cout;

int PennieConversion(float decimal_money)
{
    decimal_money *= 100;
    long b = (long)std::round(decimal_money);
    int n = (int)b;
    return n;
}

int HowManyInNum(int num, int num_to_count) {
    int count = 0;
    if (num < num_to_count) {
        return 0;
    }

    while (num >= num_to_count) {
        num -= num_to_count;
        ++count;
    }

    return count;
}

int Blub(int& total, int worth) {
    int amount = HowManyInNum(total, worth);
    total -= (amount * worth);
    return amount;
}

int main(void)
{
    int total = 1214;
    int amount_paid = 0;
    int quarters = 0;
    int dimes = 0;
    int nickels = 0;
    int pennies = 0;
    int difference = 0;

    amount_paid = PennieConversion(9.75f);
    difference = total - amount_paid;

    quarters = Blub(difference, 25);
    dimes = Blub(difference, 10);
    nickels = Blub(difference, 5);
    pennies = Blub(difference, 1);

    cout << quarters << " quarters, " << dimes << " dimes, "
         << nickels << " nickels, " << pennies << " pennies\n";

    return 0;
}

I get distracted if I don't have music going. Too many thoughts going on I have to block some out.

Lisp/Scheme

The language is basically geared towards creating your own little autistic playground where you have infinite power but nobody else knows what exactly you're trying to do.

What these anons said.

But anyway, what do you plan on doing with it? Anything in mind? (e.g. web programming, games, emulators, etc. )

I don't think it will matter. I forget the reasons to encode to binary, but one of them was to make it harder for the user to modify the data by hand.

you learn how to properly program

you learn how a language influences a computer

just do it

Is there an easy way to make gpxe shutdown when it fails to boot? I can wake my device with WoL, but if it doesn't boot correctly I have to walk over and manually power it down.

Binary encoding is faster, but it would only be noticeable for much larger sets of data.

there needs to be a newline between return type and function name

Can C read binary from stdin, or is the process different?

by "larger sets of data" I mean it takes literally hundreds of thousands of lines before it makes a difference

C is one of the worst languages for a beginner. It has fucked up semantics and irregular syntax.
If you want to learn how to "influence a computer" just learn assembly.

>What's a good language if you don't plan on doing collaborative work
functional languages

Try Haskell
github.com/bitemyapp/learnhaskell
start with
seas.upenn.edu/~cis194/spring13/lectures.html

What shop did you work at that ever did that in c / c++ code? I should probably be thankful I haven't seen that yet.

>Can C read binary from stdin
Yes. It's like any other file.

You're retarded.

I just decided to learn my first language like an hour ago too and started with python. Did I fuck up? Should I start over with C?

So I shouldn't even bother if I'm just sending a 6 byte, MIDI-like message?

Well, ASCII fits in 8 bits, so read one byte at a time and cast it to a char

it's really not a realistic option

it makes grepping for function definitions easier

If you don't know C, you basically aren't even human.

t. Cmen

Stick with Python. Once you've mastered it, move to C++ if you want to go lower.

If you mix races, you basically aren't even human.

C is good for learning how computers work.
But something like Scheme is better for learning how computer science works.

When you read SICP, it's not full of "well you could do it this way, or you could do it with *this* function, but don't use *that* function; it's considered deprecated" footnotes like C texts are.

It really depends on where your interests lie. If you want to treat your computer like a racecar where you tweak X to make Y happen, learn C. If you want to treat it like a magic tome where you just say X and Y happens, learn Scheme. They're both useful.

>but don't use *that* function; it's considered deprecated
List of deprecated C functions:
gets

Even then, it was actually removed, so you don't even have to worry about it anymore.

scheme got deprecated by Haskell honestly

Just stick with it. After you're comfortable with Python, move on to something lower (C, C++, Rust).

C is a language every programmer should eventually learn to at least read.

If you want a job, it's still mainly SQL, C#, Java, and C++. That is, if you actually count SQL as a programming language.

>not homoiconic

Haskell is pretty navel-gazing and bean-counting itself, and it's easy for newer programmers to write themselves into a corner.
It's not nearly as conducive to learning as a Lisp, which is much looser and freer.

You're right, nigger lover.

Haskell in general is built by and for academics, and the language consistently chooses academically interesting approaches over increasing programmer productivity.

I'm pretty sure about the only place it has seen successful commercial use is at a few financial institutions.

why is it better than other languages? what defines proper programming?

Probably websites that offer some built-in service. I want to build a website that helps video content creators match up (e.g. a video editor with a guy who gathers footage of some extreme sport or whatever). I have a couple of these ideas but no clue how to go about it. I took intro to Java in college last semester, but I don't see many people using that language for these kinds of projects.

youtube.com/watch?v=8G9QIIvSpzE

They're just recommending C because that's what they were exposed to first. They see anyone who doesn't go through that particular rite of passage as a lesser programmer, no matter what. There is no rationality behind it.

yea i figured. doubt these advocates even know how to program properly themselves.

>I'm pretty sure about the only place it has seen successful commercial use is at a few financial institutions.
and even then most of them just use OCaml.

Lisp and Haskell are both largely functional but they're night and day really. Haskell has none of the joyous spirit of exploration that Lisp provides.

I made that picture you fuccboi. :)

What you can do with a degree

Apply for jobs,
Do interviews,
Get rejected continuously because no one wants a fresh college student anymore.
Give in and finally start working retail / flipping burgers and continually try to get a job that isn't fucking you in terms of money.

The places that would want a college student can just get the work done cheaper overseas.

does OCaml have HKT and typeclasses yet

>C++ has no standard to_string function for objects
why

>HKT
meme
>typeclasses
meme

What would you need it for

You just skipped the steps of spending several years in unpaid internships, working on open source projects, and getting scammed on Upwork in order to build your reputation.

Not so much a meme as a sign of the transition from unsafe code with side effects to mostly safe code with well-regulated zones of side effects. This transition is probably a result of the growing demand for asynchronous multi-threaded applications.

Oh I don't know, for printing objects?
>inb4 stream operators
Whoever thought that was a good way to do it was fucking retarded.

Figuring out how I'd like to design and create a VM. I'm thinking of segmenting everything: the instructions, data, and stack. But I'm not quite sure of the logistics of that.


I really need some caffeine right now.

Because there is no "object" type that all objects inherit from.

How should the C++ language know what your object should look like as a string anyways if you don't define the method yourself?

just use functors

No you don't understand.
I can't define the member and have printing functions use it because print functions don't know about it.
Sure I can call it manually, but I don't want to.

They overload their memestreams for that.
It's horrible, I know.

Yeah I'm aware of that but I do not want to use it.
It's a terrible fucking hack. Fuck stream operators altogether; format strings are superior.

namespace std {
    string to_string(const your_obj& o)
    {
        // Fill in code here, you lazy twat.
    }
}

you can override cast operator for const char* m8

No. I was exposed to C only after shit scripting languages and C++ and I still recommend it first. It's a beautiful language.

Why is operator overloading acceptable?

Why should + mean "add two numbers" as well as "append two lists"

How is this not considered harmful as FUCK?

Templates + design patterns. If you really need a bunch of objects that can be used by a print function, then you should probably be programming to an interface of some sort. I'm pretty sure that's what most languages with a built-in to_string are doing.

Because freedom. + Is just a function.

nice coming of age narrative you have there

append two lists is actually ++ in good languages

operator overload is a thing so you can use + for both int and float (and other shit like vectors)

> freedom
Whenever you hear this word, you know it's going to be followed by a justification for something unjustifiable.

because it's controlled by the programmer to be safe. How would it be harmful?

No need. Freedom is its own justification.

Why are you bitter about C? Because you're too stupid to learn it properly?

> cin >> and cout

>you can use + for both int and float
Probably the only case where it makes some sense
And even then it's not too much trouble to have separate syntax for int and float like OCaml does

I was writing an iOS app the other week and ran into some ridiculous compiler errors caused entirely by operators being overloaded up the wazoo.

I know, I know
>Swift

but there's a deeper, theoretical flaw going on here

One function should do one thing.

Come to Russia, buddy. We all hate freedom, you'll feel at home. I assure you.

it also makes sense for vectors and matrices and everything else where you want addition

Yes. Get rekt, user. C++ is not for mere mortals such as yourself.

I'm sorry, but I have some semblance of taste.

luckily my own country is slowly becoming more nationalist and less liberal, but maybe not quickly enough

I'm not trying to be edgy but 'freedom' can seriously go to hell. It's always wielded as an excuse to defend a sub-optimal state of affairs. The average person has nothing to show for 'freedom'

Indeed. There is no safer way to objectify console in and console out.

Is there a quick way to overload functions where the only thing that changes is the parameter type?
I'm dealing with opening files and I need to open a char* or a TCHAR*. You can't easily convert a TCHAR to a char, to my knowledge.

It makes sense for lots of things: any string library, any bignum library, IO, matrix libraries.

And most people assume operator overloading is just math operators, but you also have [] and -> in C++, or casting operators, or assignment. These have lots of uses.

If you use operator overloading where it makes no sense, it's no different than bad function names. It's the programmer's responsibility to make good use of it.

Do you hold your pinky up when programming faggot?

>semblance
>taste

generic programming is a good thing, addition function is just adding 2 types together
it is the same with other functions which were made to be generic

you know every language started out as its creator's little autistic playground, right?


except they didn't want you to make your own, so they told you the right way to program.

REKT

What?
You don't?

strcpy
strcat
strdup
strtok
bcopy
bzero
...

Just like terrorism is always an excuse to infringe on privacy, due process, and the things that differentiate a citizen from a convict in prison.

Very true. It's a sad fact of life though that a lot of "programmers" abuse it for stupid nonsensical shit.

>strcpy
>strcat
>strdup
>strtok
None of those are deprecated.
>bcopy
>bzero
Those aren't even standard.

God bless America.

>there are people that do a.add(b.multiply(c.add(e), f.subtract(g))) because they can't afford operator overloading
this is a sad world

yea, java and Go kek

(add a (multiply b ( (add c e) (subtract f g))))


This makes no sense. Are you multiplying b by the sum of c and e and then by the difference of f and g, or what?

>his language does not have a comma operator

The comma operator functions differently depending on what b is in this case. It could be a class that performs a multiplication between c.add(e) and f.subtract(g), or it could be doing a multiplication using all three. It's far too ambiguous in this example.