/dpt/ - Daily Programming Thread

What are you working on, Sup Forums?

Previous thread:

>C(ancer)""""++""""

Is Object Pascal, dare I say it, /ourlang/?

Branching into making GUIs since I realised people are too stupid to use CLI in 2017

bumping for help

Trash.

C is GOD

Reposting in new thread:

Why is C++ networking programming such a clusterfuck? cpp-netlib and asio rely on boost, and even if they don't it's a mess of namespaces and other junk. I really want to use C++ for a server project, but I'll have to stick with Java because its standard library actually has a network api...

Any reason you can't use socket.h or winsock?

>not an anime op
Thread officially ruined

Because C++ is shit.

that's even more of a clusterfuck if you're used to c++14

Cross platform-ness. If I understand correctly, socket.h is UNIX, winsock is Windows. Besides, to the best of my knowledge, those are both C.

If there's something missing, I'll gladly learn about it.

Hey boys~

I've recently been exploring two Scheme implementations; Guile and Chicken. Overall I'm favouring Chicken more, both for rational reasons (Eggs are far easier to work with) and irrational reasons (GNU and FSF are disgusting bureaucrats who ruin everything)

At present, I'm thinking about going nose to the grindstone and working on something significant in Chicken Scheme, however I'm a little apprehensive - should I just bite the bullet and go for it, or is this perhaps a fool's errand and I should just use Common Lisp like the rest of the world?

>irrational reasons (GNU and FSF are disgusting bureaucrats who ruin everything)
How the fuck is that an """"irrational"""" reason? If anything, your first reason is more "irrational" than this.

Working on a GUI here.
Is it better to keep the GUI in a class or within the main program itself?

nigga you even mvvm or mvc?

I don't know what you mean user, are you writing a GUI library like GTK to wrap win32/x11 based on target platform? Are you writing a GUI system in opengl/directx?

What is the easiest way to line these all up nicely? Or do I have to do the calculations manually?

>python
why don't you ask on reddit?

just use iomanip and column widths

Object Pascal is essentially Rust for non-hipsters.

This is for python

Why are you so angry?

Did I sound angry? I just told you to go back to where you belong, no hard feelings, "buddy"

Pad it using a printf control code. (lol if python doesn't support it)

Don't you have some string-formatting functions in your meme language?

Just calculate the max width per column and pad the rest to equal that width. Not that hard.

>meme language
go join that retard on reddit.
it isn't a """"meme language"""", it's just plain trash.

what is the best algorithms book?

>Object
Didn't even read any further but I already know it's garbage

tabstops you retard

fuck you faggot, object oriented design is the next big thing and its gonna dominate the rest of the 90s and 00s!!!

We will send objects over the internet instead of abstract data so that the data can operate on itself when it arrives!

Should I learn C#?
I mostly use Linux, but people say it's a good language for multiplatform applications...

kek

It's fucking garbage.

Just learn Java, its on more platforms.

It's basically Microsoft's version of Java. You won't see much use for it outside windows machines IMO, but it's pretty intuitive and looks good on a resume.

It's fucking garbage though.

how is c# multiplatform? only with xamarin can you dev ios and android apps. no mac or linux

Man, are you seriously so peer-pressured by Sup Forums that you'd use C# over Java for multi-platform? Sure it's a prettier language but it's worse in pretty much every conceivable way for the task.

Both are ugly as fuck and should be avoided at all costs.

which programming language has the highest chance of making me the most moneyyy

Have you tried sockets? Ever programmed a networked application in C? Learn something and don't use a library for once.

Also libevent.

Your own language.

useful tip of the day:
if gcc complains about missing template instantiations for a function that isn't even a template, it's actually because of an auto argument which creates a template behind the scenes.
just discovered this.

Oh boy. Man are you going to love this link.
docs.python.org/3.6/library/string.html

Especially the Format Specification Mini-Language portion

Do people not consult the documentation before running for help?

socket programming without exceptions, just shoot me now senpai.

Honestly COBOL or Fortran

most questions on Sup Forums can be answered with RTFM

I can understand some questions being asked because they're more abstract, and if you can't pinpoint the root problem you can't read the manual. But shit like string formatting? Holy shit. Just fucking read the documentation.

Dont think its supported

The fuck is that?

I do use string format, but I don't think there's any easy way to make the header (the hard-coded print statement up top) line up with every element I print out

>Dont think its supported
The string format thing should be able to handle it.

>I do use string format, but I don't think there's any easy way to make the header (the hard-coded print statement up top) line up with every element I print out

Align you mother fucker.
Do you even know what Align is?
Did you read
>If a valid align value is specified, it can be preceded by a fill character that can be any character and defaults to a space if omitted.

Please FFS read the fucking documentation. I literally already spoonfed you. What is your end game? You want me to write your code for you?

Here is your pseudo code

Fill with String, set alignment after the spacing you want. Fill with String. Repeat till you have proper alignment.

>What is your end game?
Would you perhaps be interested in knowing the next step of his master plan?

how does one get good at reading documentation and knowing what he needs for a specific problem?

Crashing /dpt/ with no survivors.

Python is the best language

just read the entire thing once, it's not that bad, and you can generally skim a lot of it because it will be repetitive or obviously things you won't need.

Learn how to use Google and control f. The rest comes with experience.

Identify problem
>Hmm I need to align my text
Look at current implementation and the tools being used
>Well I am using string to print out my text
Look at X documentation
>Since I am using string, lets look at its documentation and see if I can identify inbuilt methods to solve my problem

There

For some god damn reason I decided I wanted to learn C, assembly, and C++ all at the same time to build onto this OS.

vmwaros.blogspot.com

It's actually pretty cool, it just has shit drivers and no one wants to make it better 3: all the devs are worried about the SMB and 64 bit compatibility that they will never use.

just \t user

\t doesnt werk in a general case, if certain strings are too long it'll come up looking fucked

I'm working on making this site not blinding as fuck to myself.
Source, am retarded and going to sleep now.

>Forgot to bind scrot to printscreen

Is CSS programming?

just use 4chanx and oneechan, dumbass

Nah. It's scripting though.

just use tomorrow style gorilla

waste of time

I have a question for all you C programmers. What projects have you done in C? What kind of things do you personally use C for?

my poli sci minor

socket programming is bretty fun

- University assignments
- Firmware on microcontrollers (majority)
- Libraries that are to be used by a variety of different programming languages (mostly stuff that communicates with said microcontrollers)
The latter two professionally.

I haven't programmed very much lately, but in the past I've written a few programs (for myself) in C. Quite often, even before starting, I thought C probably wasn't the easiest language to do it in, but often it was just for the learning experience or to test something out, and I think C is a no-bullshit way of learning to do something.

I wrote:
An IRC bot that grabbed shit off of other IRC bots with XDCC.
An audio visualiser with OpenGL.
A JSON parser/implementation.
A shitload of smaller, usually single file programs doing all sorts of shit.
Some shitty X11 program which didn't actually do anything (I was just drawing shit).
The flimsy start to a Wayland compositor (Again, I just got to point of drawing shit).
Several university assignments where the choice of language was up to us (Most of them involving networking).
And another whole other bunch of half-baked project ideas.

Most of the stuff I actually wrote which are the most useful to me, and which I actually use regularly are just bash scripts.

Should I learn Ocaml or standard ML?

sml

reasons?

Actual standard
Multiple implementations
Saner syntax

I tried to learn rust, I tried...

But fucking hell. Why does it try so hard to be different about fucking everything? Not even remotely similar.

Neither, ml is fucking useless.

Why should it be similar? To accommodate retards such as yourself?

A controller for the drive by wire accelerator in my car.

Dijkstra for Dummies

It's akin to learning a deep African tribal language, instead of German.

I mean sure, the Africans can run fast, and yeah, the Germans have a bit of an overflow issue. But at least I'm able to converse with more than the local Shaman.

>Multiple implementations
Which one should I use, smlnj?

it would be useful to me so it's by definition useful.

>that extreme case of reddit spacing
Post disregarded. Didn't even read it.

>reddit spacing
What the hell are you even talking about? Is this fucking "meme arrows" for 2017?

Analyze your post, maybe even a retard like you will be able to understand. Hint: you didn't do it this time (which leads me to believe you actually know what it is)

These algorithms have something in common at their core. They use a while loop for traversing nodes, and a data structure which defines the order in which that loop evaluates them.

BFS uses a queue, DFS uses a stack, Dijkstra uses a priority queue.

I like Cormen's book on algorithms, it's more concise in comparison to Knuth's ones. I would also recommend MIT OCW 6.006, 6.046 and 6.851, they have online video lectures.

is it normal for gcc to ignore volatile at -O3?

No. You're probably doing some other stupid shit.
If you're modifying from a signal handler, it should only be a volatile sig_atomic_t.

What do you mean? Example?

volatile int count = 0;
int tmp = 0;
while (true)
{
    if (tmp != count) std::cout << (tmp = count) << '\n';
}

Is that indicative of the whole program? Is "count" really in the same scope as the rest of it?
GCC (rightfully so) will still skip volatile checks if it can reason that something can never change.

Post the whole, compilable bit of code. People can't tell what the hell is actually going on if you cherry-pick bits of your program and leave out what could be very important details.

>GCC (rightfully so) will still skip volatile checks if it can reason that something can never change.
But it shouldn't. Because it really can't reason that.
Sounds more like whoever fit GCC to a microcontroller fucked up (or generally has no clue).

> An object that has volatile-qualified type may be modified in ways unknown to the
> implementation or have other unknown side effects. Therefore any expression referring
> to such an object shall be evaluated strictly according to the rules of the abstract machine,
> as described in 5.1.2.3

int jwdpmi_main(std::deque<...> args)
{
    std::cout << ...

>But it shouldn't. Because it really can't reason that.
If it's an automatic (stack allocated) variable, and a pointer to it has never been taken and given to somewhere else (i.e. it could be register qualified), it can assume that it can NEVER change, as there is no way for it to.

Checked. I know that feel. Rust is fucking insane. I got used to D so quickly in comparison. Thinking the Rust way is like learning how to use your left hand to do daily tasks.

Unfortunately, this is not how computers and interrupts work. I guess it'll work in some ivory tower magic machine though.

Jesus christ, C++ is such a fucking shitshow. How do people defend this garbage?
I don't know the semantics of C++'s god-awful lambdas, and whether they take a reference to or a copy of a variable.