/dpt/ - Daily Programming Thread

What are you working on, Sup Forums?

Previous thread:


Lisp is the most powerful programming language.

That's its problem. It's too powerful

c has the footgun of too few abstractions that are difficult to create
lisp has the footgun of too many abstractions that are easy to create

>200 github stars in the first 10 days

professional developer here, get out of my way, i am better than you.

where do i collect my medal?

c++ is the greatest language of all time

employers don't even look at your github lmao

Yes they do. Why wouldn't employers want to see how you code? Moron lol.

yeah, tell that to all the jobs i'm being offered right now

i'm literally drowning in offers

>yfw my software manager threw away a resume based on racist commit messages

It's the language of The Wired, after all

Can you add all that type wizardry to LISP with macros?

I've evaluated candidates who have done well or poorly largely due to the code they have published

>Implying you would want to

I am going through the node chapter in code school. Learning how to use node and Express.

Anyone else here studying?

please please tell me you didn't pay cash money for this

cs.cmu.edu/Groups/AI/html/cltl/clm/node44.html

Nope. Someone else did, and gave me their login and username so that I can take the course. :)

who is that titty temptress

that's standard common lisp code
i'm actually surprised
element-type, initial-contents and shit is all in the common lisp standard

(defstruct world size current next)

This is now a Lisp thread.
What's the best lisp and why is it Common Lisp?

Then why does nobody use it?

It was oversold as the AI language, but the brainlet developers thought that that meant it would do all of the thinking for them

What is Lisp's fate?

Eventually all languages will converge to LISP as people realize its features are superior. The world will basically rediscover lisp's existence by accident.

To be an inspiration for other, more mainstream languages.

youtu.be/YA-nB2wjVcI?t=2570
Not sure I like this attitude but hyperloglog is a cool algorithm.

Are you saying that Lisp (CL) is perfect as is?
Don't you think that a more modern Lisp is needed?
What constitutes a lisp anyways.
Some people say Clojure isn't a real Lisp.

>LISP
Nobody who uses Lisp writes "Lisp" in all caps.
>its features are superior.
No static types in any of the good ones. CL's "types" aren't; SBCL doing some type checking at compile time is just an optimization. The closest you get is Typed Racket, but it is a chore to use and feels like a poor man's OCaml.

>What constitutes a lisp anyways.
en.wikipedia.org/wiki/Family_resemblance

Common Lisp is statically typed, as all objects are of type T

...

What usefulness does static typing have?

>t. Robert Harper
Not that I disagree, but it isn't a very interesting observation.

You can add static types if you like

Opinion discarded

And have you seen any projects that do?

>static typing
>lisp
pick one

Looking to get into AI programming (previously a 3D game dev), any advice/books to look into?

Learn Prolog.

Anything I can do in a language I already know? Looking more for theory/equations. I know java, python, c++, but I mainly use java

you could look at source code for WEKA

By default SBCL does runtime type checking (if you declare types). That feels like "static" typing (at least from the programmer's point of view).

>python
>c++
>java
>AI
choose one

It is useful for refactoring, tooling and keeping the API designer from going nuts with overloading the meaning of a function's arguments. The last one isn't a big problem in CL, but it is in other languages like JavaScript.

JavaScript is really bad when it comes to this. Popular libraries are full of functions that do different things depending on the types of the arguments. You see how complex they are if somebody has written the type definitions for TypeScript, which is a statically typed language backwards compatible with JavaScript code. If the developers of the libraries had used TypeScript themselves, they would have been forced to notice that their functions were a mess of overlapping concerns.

It's useful for safety.
Why would a "modern Lisp" be needed when you can just edit Lisp?

>theory
Look at github.com/owainlewis/awesome-artificial-intelligence#books. The Russell and Norvig book is a common recommendation.

for machine learning i've heard great things about some udemy courses. there is one called something like "Machine Learning A-Z" which i've been recommended by several people. it uses python and costs like 15 usd when on sale (right now)

How the hell do people write these massive makefiles? Are they written by hand or somehow machine generated?

GTK2 object/widget connection for CL (code below is an example of its usage):
(define-widget some-complex-object object
  ("name" (name object))
  ("age" (age object))
  ("code" (object-code object)))

Then you just:
(bind-complex-object gtk-builder object)
;; ...
(setq new-object (extract-complex-object gtk-builder))

It uses CLOS (OOP), dynamic typing and macros.

they hit f5 in their IDE

If it has a Makefile.am or CMakeLists.txt, that's the human-written one. If it's just a massive Makefile, then they just wrote it.

source?

What did you make?

>the hell do people write these massive makefiles?
How massive are we talking, 200 lines or 2000? The latter are often generated.

>linux subsystem for windows
>ConEmu
>vim

I don't think it can get more comfy than this

Yeah, that's what I mean, the multi-thousand line ones. I can't even imagine trying to manually maintain something like that, it'd be more work than the project itself.

pastebin.com/GHS7sH8Q
It uses alexandria and cl-gtk2-gtk

Usually en.wikipedia.org/wiki/GNU_Build_System is to blame.

I want to get on this level.
You wrote this?
>eval-when
I've seen this so much but still don't understand it.

If you don't add it, a defmacro in the same file won't know about those functions.
I think it tells the compiler that the form should be evaluated (yes, evaluate the defun) at compile time and at load time (when it's a compiled file being loaded). I added :execute because I don't know if it's needed.
Oh, yes. It's my code.

What's wrong with any of these languages for an AI program? Shouldn't you be able to accomplish similar feats with languages of the same level? I'm looking to do something basic and work my way up to, say, a bot that can converse based on previous data from real conversations. The most advanced AI I have done so far is things like chess AI or pathfinders

>python
>it's the pythonic way!
Just a redundant language. Probably the best idea.

>c++
It's just not a good idea for AI, unless hardcore performance is required.

>java
>java

>tfw you will never be this guy
quora.com/profile/Bruce-Robert-Fenn-Pocock

In a process with multiple threads running, if a thread makes a syscall will it suspend the other threads?

you were saying

Any Android pajeets know their way around Google drive Android api?

Not an expert but I believe some calls like I/O don't, and others like process exiting do

Genuinely curious, what would be wrong with making it in java? It's the language I know best, and I know people shit on it but the performance cost isn't a huge deal to me, as long as I accomplish my goal

Makes sense, thanks.

>java
Use scala

This sounds like a troll question, but I promise it's not.
Python
If I have a set with a single integer element, is the most efficient way to get that value just to pop it?

Trying to wrap my head around pointers and dereferencing operators.
I'm working on being good at C++

Actually just looked it up cuz it was bugging me. On Linux the kernel schedules each thread as a separate task, so no, they don't all get blocked when one makes a syscall.
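you can see it from userspace with a quick sketch (CPython here; note time.sleep releases the GIL while it blocks, so this demonstrates the scheduling point, not GIL behavior in general):

```python
import threading
import time

def blocker():
    # time.sleep parks this thread in a blocking wait;
    # only this thread is suspended
    time.sleep(0.5)

t = threading.Thread(target=blocker)
t.start()

# meanwhile the main thread keeps running freely
counter = 0
end = time.monotonic() + 0.25
while time.monotonic() < end:
    counter += 1
t.join()

print(counter)  # a large number: the main thread was never blocked
```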

You're replying to a memer.

>muh GOFAI

>the most efficient way
>Python
tell you what
why don't you run some tests and find out?

>just to pop it?
your_set.pop()
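though pop() destroys the set. if you want to keep it intact, next(iter(your_set)) reads the element without mutating. rough sketch (the set contents here are made up):

```python
s = {42}

# pop() removes and returns the element; the set is empty afterwards
value = s.pop()
print(value)    # 42
print(len(s))   # 0

# next(iter(s)) reads the element without touching the set
s = {42}
value = next(iter(s))
print(value)    # 42
print(len(s))   # 1
```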

How do I test an operation in that small a time?
I ran the profiler on my entire function with 9999 elements and it usually said

if an operation is too small to measure in one iteration, just run it a billion times

>tfw you will never be these ???

Trying to get a serious answer here. I know java has garbage collection, which isn't entirely efficient, but other than that wouldn't the cross-platform compatibility be useful and OOP the best way to mimic human brains? Or am I completely wrong (and if so, can someone explain why)?

Okay, I get what you're saying. Call pop some ridiculous number of times on the same set, do the same with anything else I can find, etc. Profile the total runs (not just individual runs)
Much appreciated.

This is exactly what the timeit module is for.
docs.python.org/3/library/timeit.html
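A rough sketch of how you'd compare the two approaches with timeit (numbers will vary by machine; each statement rebuilds the set so pop() always has something to remove, and the construction cost is identical on both sides):

```python
import timeit

# time a million runs of each; building {42} fresh each time keeps
# the comparison fair and keeps pop() from hitting an empty set
pop_time = timeit.timeit("s = {42}; s.pop()", number=1_000_000)
iter_time = timeit.timeit("s = {42}; next(iter(s))", number=1_000_000)

print(f"pop():      {pop_time:.3f}s per million calls")
print(f"next(iter): {iter_time:.3f}s per million calls")
```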

Well, at least I know how to fix this.
Looks like profile (what I was using) is for detailed based on multiple different function calls.
Timeit seems to be exactly what you want for overall time. Thanks again.

Why do sepplesfags always act as though newly added misfeatures suck because the committee was dealt a bad hand before them?
There are many features that are shitty with no precedent forcing them to be shitty, like move semantics and std::variant.

en.wikipedia.org/wiki/Stockholm_syndrome

You're asking a really general question, most of the time the answer depends on what you need the most and what you're doing, one language isn't the best for all things.

Can you point to a single example of sepplesfags acting like this?
Doesn't have to be from this forum.

This whole conversation for a start.

Scuse me.

does converting pass by value to pass by const reference, for a function that will not modify the variable, always improve performance?

If the value is small and simple like an int then pass by value will be faster. If the copy constructor does more than nothing then const ref will be faster.

I don't think that's true. The bad hand the committee keeps dealing themselves is a common issue, but it's not omnipresent. It couldn't be. Part of their failings must come from them, because otherwise they'd not have a language riddled with these problems. They'd have some foresight. But some of this is just political.
Like why do structured bindings respect std::tie as a feature instead of aiming to fully supersede it? Probably because the idea would offend someone.
The proposal says the author finds std::tie sufficient, but splitting the syntax between declaration and assignment just because std::tie exists is crazy. There's some pride or something else behind this.

>linux subsystem for windows

but y tho

>The first function we'll write will need to greet someone differently according to gender.
REEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEE

Not that simple.
References can (and usually do, unless you're at the edge of the call graph) degrade compiler optimizations, because eliminating memory accesses is hard, and if you've passed by value it's likely your compiler can just put everything in registers.
If you've got a pure input parameter or a pure output parameter you should most likely pass by value or return a value.
Combined input/output parameters are a special case.

But I say you should prioritize your API over this.

What's wrong with this? The kids are very interested in genders for some reason and it'd be good to introduce them to dynamic polymorphism early.
Maybe even gender generators.

...

To clarify, based on the previous responses I've gotten on here, it seems to me that the general consensus is that I cannot use java for my particular AI. Namely, my goal is to make an AI similar to a Twitter bot that can analyze my normal mannerisms and posts to formulate responses of its own that mimic human posts or conversation. While java may not be the best language per se, surely it's doable?