/dpt/ - Daily Programming Thread

Old Thread: What are you working on Sup Forums?

Other urls found in this thread:

en.wikipedia.org/wiki/Chronological_snobbery
open-std.org/jtc1/sc22/WG14/www/docs/n1570.pdf
open.gl/
docs.oracle.com/javase/tutorial/

First for tensorflow

1st for alpha pureness

>gets 2nd
beta asf

>What are you working on Sup Forums?
Trying to do a very fucking complicated task in SQL.

Will probably do it manually for now so I can finish this project, but I'm trying to essentially generate a set of queries to retrieve all the underlying rows from all the tables joined within a view.

if const char *cuck[4] holds 4 pointers to strings (which aren't the same length), then how does the compiler determine (cuck + 1) if the storage units aren't the same?

6th for C-style syntax is ugly

>SQL
are you also using php

No.

9th for c++

Because the pointers are all the same size.

>to strings
To arrays of chars. Not all char arrays are necessarily strings.
>if the storage units aren't the same
The pointers themselves are the same size.
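
Rough sketch of what that means in practice (the array name and strings are just made-up examples; the sizes assume a typical 64-bit target where pointers are 8 bytes):

#include <stdio.h>

int main(void)
{
    /* Four pointers to string literals of different lengths. */
    const char *cuck[4] = { "a", "bb", "ccc", "dddd" };

    /* Every element is a pointer, so every element has the same size. */
    printf("sizeof(const char *) = %zu\n", sizeof(const char *));
    printf("sizeof cuck = %zu\n", sizeof cuck); /* 4 * sizeof(const char *) */

    /* cuck + 1 advances by one pointer, not by one string's length. */
    printf("cuck[1] = %s\n", *(cuck + 1)); /* prints "bb" */
    return 0;
}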

>mfw it's almost 2017 and all Sup Forums knows is C

trudeaux pls go

Sucks to have no face

C function pointer syntax and macros are ugly. Most of C style syntax is fine though.

Identifier : type = value
Vs
type identifier = value

Which do you prefer?

it's the same

en.wikipedia.org/wiki/Chronological_snobbery

it's the current year and haskell is still shit

no it's not this extremely minor aesthetic difference is VERY important to me and needs to be discussed on my peruvian llama herding forum

The latter is what I'm used to most of the time. I have no problem with either. I also have no problem with the Ada-style := syntax instead of using = for assignment. Unfortunately, Go has hijacked it, and Go is a shit language.


void f();
...
f(1, 2, 3);


This compiles fine in gcc but clang throws a warning.
What am I supposed to do?

What is the warning?

too many arguments in call to 'f'

Although some versions of C are supposed to allow this.

Wait, actually, that does work, but my actual case is the exact same thing with a different name and an actual definition of the function.

long f()
{
    return 0;
}

int main()
{
    f(1, 2, 3, 4, 5, 6);
    return 0;
}


This won't compile in clang without warnings.

are you compiling it as C++ or something? which version of C?

I'm using clang 3.8.1, the file has a .c extension, I'm not passing any flags.

See for a practical example, my original post was off.

The real question is: why the hell are you doing that?
I'm not sure if passing too many arguments to a non-variadic function is undefined behaviour or not, but it's a really stupid thing to do.

C11 marks empty function parameter declarations as obsolescent (i.e. they might be removed in future versions).
Disable that warning in clang if you don't want it complaining about it.

I just looked it up:
open-std.org/jtc1/sc22/WG14/www/docs/n1570.pdf
>6.5.2.2 Function calls
>If the number of arguments does not equal the number of parameters, the behavior is undefined.
Yup. Don't do that shit.

Okay, I see your problem.

Yes this is allowed in standard C.
No it isn't good practice.

This declaration:
void f();
Means that any number of arguments can be given to f.

Meanwhile, this declaration:
void f(void);
Means that f cannot take any arguments, and providing any is an error.

When writing standard C, you should prefer the latter.
When writing C++, you may prefer the former, as it is equivalent to the latter in C++.
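
A small sketch of the difference (the exact warning text will vary by compiler and version):

void f();     /* no prototype: argument count and types are not checked (obsolescent) */
void g(void); /* prototype: g takes no arguments */

void f() {}
void g(void) {}

int main(void)
{
    f(1, 2, 3); /* compiles, but doesn't match the definition: undefined behaviour */
    g();        /* fine */
    /* g(1);       error: too many arguments to a prototyped function */
    return 0;
}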

While I prefer id : type = val for single definitions, multiple definitions in one statement either just look stupid or have unnecessary repetition:
id1, id2 : type = val1, val2
id1 : type, id2 : type = val1, val2

first of all i dont even know C

Well are you going to learn it now, or are you going to pride yourself on not knowing it for a few years until you have to learn it, and then bitch about how much you hate it the whole way through?

How can I become more like Grand Magus Jon Skeet?

>until you have to learn it

Does anybody really NEED to learn it?

What are some good resources to learn OpenGL and/or Cocoa?

Not unless they're in school.

Just as I would if a company told me I have to mop the floors, I would put in my two weeks' notice if they informed me that I must use C in a project.

opengl bible

>write a SO bot
>write a few more sock bots to upvote/ask questions your bot can answer
>???
>profit

For online tutorials, the best one I've come across was open.gl/

it's extremely basic; it's good for getting started, but you need to do a lot more reading after it

Thanks!

Except that's literally not true

stupid frogposter

identifier := value // type is inferred; var identifier type is for declaration only

Can I see a double int or a long double in action in c++? Whose variables go up to 15 goddamn digits? Is it used primarily for random generation or what?

fresh oc

scientific computation and/or internal representation in numerical methods would be my guess, maybe something in fintech

so, for a normal programmer, all a double and long double are for is tracking fractional numbers (1.163211) for storage purposes, and otherwise it's for high-level shit?
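
If you just want to see them "in action", here's a quick sketch (plain C rather than C++, and the exact digit counts are platform-dependent; long double is 80-bit extended on typical x86 Linux but just another 64-bit double on MSVC):

#include <stdio.h>
#include <float.h>

int main(void)
{
    float       f  = 1.0f / 3.0f;
    double      d  = 1.0  / 3.0;
    long double ld = 1.0L / 3.0L;

    /* FLT_DIG / DBL_DIG / LDBL_DIG are the decimal digits of precision each
       type guarantees (typically 6, 15 and 18 respectively). */
    printf("float       (%d digits): %.20f\n",  FLT_DIG,  f);
    printf("double      (%d digits): %.20f\n",  DBL_DIG,  d);
    printf("long double (%d digits): %.20Lf\n", LDBL_DIG, ld);
    return 0;
}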

Isn't it possible to use only unsigned shorts as all your types? It seems much more efficient if you're not working with values that go out to the billions

>all your types?


By this I mean the ones which obviously aren't negative

>Isn't it possible to use only unsigned shorts as all your types?
not if you want to work with fractions a lot.

Is that a suggested way of doing things, though? Using unsigned types whenever you can?

no

Not really getting what you're saying. You use unsigned ints when it makes sense to. When it doesn't make sense, don't.

of course not, you need to pick the right resolution for your work

Isn't unsigned faster than signed, though? I'm only asking because the books I'm reading now imply they're easier on data allocation, so I'm mostly asking for myself

I have no idea myself. But if you have a reason to specifically make an int a signed int, then there's no reason to make it unsigned. Unless you're maybe making some utterly insane sort of optimisation that I've never heard of before...

for some things, like bit twiddling, you're supposed to use unsigned, but for many other things you should obviously use signed, and signed isn't (significantly) slower for most things
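
A minimal sketch of that distinction (made-up variable names, nothing authoritative): unsigned where you're treating the value as a bag of bits, plain signed int for ordinary arithmetic.

#include <stdio.h>

int main(void)
{
    /* Bit twiddling: unsigned avoids the undefined/implementation-defined
       corners of shifting and overflow that signed types have. */
    unsigned int flags = 0;
    flags |= 1u << 3;    /* set bit 3   -> 0x08 */
    flags |= 1u << 0;    /* set bit 0   -> 0x09 */
    flags &= ~(1u << 3); /* clear bit 3 -> 0x01 */

    /* Ordinary arithmetic: a plain int is fine and can go negative. */
    int balance = 100;
    balance -= 250; /* -150, perfectly meaningful */

    printf("flags = 0x%x, balance = %d\n", flags, balance);
    return 0;
}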

identifier = value : type

And, of course, the annotation can be omitted if the type can be inferred.

Does any language do this? Looks more like a type casting.

kill yourself

Some functional languages do, but they also allow you to separate the type annotation from the (often pattern-matching) definition.

datafag

literal fag

"faster" does not make sense in this context; the performance of floating point vs integer instructions depends mostly on the CPU architecture, but it has many factors, and I would wager the actual calculation is not the dominant cost anyway

Haskell with ScopedTypeVariables or PatternSignatures
(x :: a) = b
x = (a :: b)

*b :: a

So, pretty much a total newfag dipping my toes into OS shit so I can actually debug stuff in the real world

so I downloaded c:dda and started to build it

immediately hit with redefinition errors, supposedly something about time management is all fucked up.

Is this something that's safe to ignore? Isn't this bad practice?

>If they're not all the same size
They are the same size though, even if the variables they point to aren't

Even specifying C89 or GNU89 on the command line doesn't get rid of the warning.

>Means that any number of arguments can be given to f.

Yes, I know the difference; that's on purpose. I have inline asm in there that messes with the parameter registers, so I don't need named parameters.
The only other ways to have variable arguments are this and a named argument + ..., but then the prologue gets filled with irrelevant things I don't want.
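
For anyone following along, the standard-conforming alternative being referred to (a named argument plus ...) looks roughly like this; just a sketch, and it obviously doesn't solve the problem of the prologue touching the parameter registers:

#include <stdarg.h>
#include <stdio.h>

/* A conforming variadic function: one named parameter, then ... */
static long sum(int count, ...)
{
    va_list ap;
    long total = 0;

    va_start(ap, count);
    for (int i = 0; i < count; i++)
        total += va_arg(ap, int);
    va_end(ap);

    return total;
}

int main(void)
{
    printf("%ld\n", sum(3, 1, 2, 3)); /* prints 6 */
    return 0;
}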

clang is trash lol just use gcc

You can't imagine how much I wish I could.

My friend is starting Java at uni in a month. I told him it would be a good idea to get familiar with it before then; anyone got a good resource for learning Java?

docs.oracle.com/javase/tutorial/

You should generally use your default data type for everything, that is, Int, unless you need autistic performance fains

>My friend is starting java in uni in a month
May God have mercy on his soul

I know right

He actually said he might be doing hasklel later though
I will have a look at all these links, but it seems like these are different tutorials for different areas of the language. I was hoping for one which goes from basics like hello world to more complicated stuff like muh OOP, a start-to-finish tutorial if you will

*gains

>He actually said he might be doing hasklel later though
That was fast, God's already given him mercy

>need simple string SHA1 function in JS
>stumble on package with 2K monthly downloads
>it's just two functions that wrap Node's stdlib
var crypto = require('crypto')

function sha1 (buf, cb) {
  var hash = sha1sync(buf)
  process.nextTick(function () {
    cb(hash)
  })
}

function sha1sync (buf) {
  return crypto.createHash('sha1')
    .update(buf)
    .digest('hex')
}

module.exports = sha1
module.exports.sync = sha1sync

JS has already overtaken Python?

>I know right
nice meme
>He actually said he might be doing hasklel later though
why the fuck, even to a completely clueless person there is no compelling argument to learn haskell

>there is no compelling argument to learn haskell
mind expanding

JS is probably worse off than Python. Python may have the """enlightened""" pythonic crowd, but JS is swarming with retarded frontend web designers that couldn't program a fizzbuzz without importing a trendy hip library/framework.

what's the problem?

>where's the API?

it's shit, very few people use it and even fewer people use it commercially and for good reason, it's obscure and irrelevant

>JS is swarming with retarded frontend web designers that couldn't program a fizzbuzz without importing a trendy hip library/framework.
Just like Python

That there are 2K people every week that import 15 lines of code, because they can't read 2 paragraphs of documentation.
You're looking at it. The entire API is 2 functions.
Fair point.

Good concepts in it though, even if it is a shit language for real work.

>Good concepts in it though
nope it's just a bunch of shitty memes

Learned assembly and made an exokernel and unix like OS, did some hardware programming, decided I don't like this shit and now I'm doing webdev lel

envy

>That there are 2K people every week that import 15 lines of code, because they can't read 2 paragraphs of documentation.
Why bother when they don't have to? This is certainly something I'd use. I'm not even certain how the API it's wrapping works, and I don't care enough to bother learning. If I just want to hash a string and not spend long doing it, why not spend 5 seconds downloading a library that is tested by 2k people every day? Instead of learning an API I probably won't be using again anyway and then debugging my implementation and whatnot.

b8

looooooool k

>looooooool k
hello facebook

You sound like a nigger talking about how science class is racist and shit while he painfully remembers not being able to understand anything that all the smart kids understood with ease.

kill yourself and fuck off to

How code rectangle
xxxx
x  x
xxxx
pls
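
Assuming you mean printing a hollow rectangle of x's, a minimal sketch (width and height are arbitrary):

#include <stdio.h>

int main(void)
{
    int width = 4, height = 3;

    for (int row = 0; row < height; row++) {
        for (int col = 0; col < width; col++) {
            /* 'x' on the border, a space inside. */
            if (row == 0 || row == height - 1 || col == 0 || col == width - 1)
                putchar('x');
            else
                putchar(' ');
        }
        putchar('\n');
    }
    return 0;
}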