Calculus edition
Old:
Other urls found in this thread:
lua-users.org
learncpp.com
fms.uofk.edu
drive.google.com
seas.upenn.edu
scs.stanford.edu
twitter.com
First for programming traps
What are you working on, Sup Forums?
Reading through K&R but not doing many exercises because I am not that good with C yet.
>I am not that good with C yet.
Maybe that's because you've not been doing many of the exercises
How else do you want to learn?
I'm working on one of them timesink projects
I am not understanding them.
Well, I plan on reading the book, then looking at source code and understanding it from experience. The exercises make me feel dumb and weak when I don't understand them, so I feel that, being different, I have to power through the book, learn that way, and then apply my knowledge.
Sounds like a cuck but I can't help it.
Post some then
Like I feel that I'm not learning enough from the previous section to actually make a program. I attempted them and failed miserably. Don't think I don't try, user.
I just wrote fizzbuzz in Emacs lisp
(defun fizzbuzz ()
(interactive)
(let ((counter 1)
(fizz nil)
(buzz nil))
(dotimes (counter 101)
(when (not (= counter 0))
(insert (format "%s%s%s\n"
(if (setq fizz (= (mod counter 3) 0))
"Fizz" "")
(if (setq buzz (= (mod counter 5) 0))
"Buzz" "")
(if (not (or fizz buzz))
(number-to-string counter) "")))))))
It's probably shit, but my previous version was even worse.
who cares
I'm at work, reverse engineering a 1,200-line legacy PostgreSQL procedure responsible for managing documents in the client's office. Kill me already.
That died a couple months ago
Yep
Your "let counter" is superfluous: dotimes binds it anyway, it's also the reason it starts at 0. You could also get rid of this habit of binding all of your variables at the top, then you wouldn't need that setq. In short: just _let_ it be!
I just found out that based haasn (from mpv) actually knows haskell and used it frequently a few years ago
If I want to get into AI, what language should I use? Imperative languages preferred.
How are resources vs performance for each language?
I tried to start with Prolog, but once programs start getting bigger, adding just one parameter propagates too much and needs too many changes.
use ocaml, it's pretty imperative
python because of normies
there are a lot of libraries for it as well (scipy, numpy, etc.)
mine:
(dotimes (n 100)
(let* ((n (+ 1 n))
(d3 (= 0 (mod n 3)))
(d5 (= 0 (mod n 5))))
(message "%s" (concat
(if d3 "Fizz")
(if d5 "Buzz")
(if (not (or d3 d5)) (prin1-to-string n))))))
Praise Emacs!
Trying to make an exercise work.
I am almost finished with it and it shows me the correct time.
Currently trying to pack the condition for leap years into my code and I'm somehow being retarded.
So a leap year is divisible by 4, but years which are divisible by 100 yet not by 400 are no leap years.
something like this should suffice ,no?:
((year/4) || ((year/100)&&(year/400)){
leapyear
}
right?
sadly, java doesn't accept conditions like these, so I do not know how to express them :(
What to use for unit tests in Lua?
>pic unrelated
Who else uses based DDD?
No you're wrong
>2000/4 == 500
>2001/4 == 500
How do you expect n/4 to be an indicator for n being divisible by 4?
You need to test the remainder of the division. In Java, it's the percent sign that does it:
2000%4 == 0
2001%4 == 1
As for java not accepting your condition, yep, I think the error is pretty clear:
ERROR: bad operand types for binary operator '&&'
first type: int
second type: int
It wants booleans for && and ||. Luckily, (year % 4 == 0) does yield a boolean, so your problem is solved.
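Putting that advice together, the full condition might look like this (the class and method names are mine, not from the thread):

```java
public class LeapYear {
    static boolean isLeapYear(int year) {
        // Divisible by 4, except century years that aren't divisible by 400.
        return (year % 4 == 0) && (year % 100 != 0 || year % 400 == 0);
    }

    public static void main(String[] args) {
        System.out.println(isLeapYear(2000)); // true  (divisible by 400)
        System.out.println(isLeapYear(1900)); // false (century, not by 400)
        System.out.println(isLeapYear(2016)); // true  (divisible by 4, not a century)
    }
}
```

Note that each operand of && and || is now a boolean comparison, which is exactly what the compiler was complaining about.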
Did you already try the options listed here?
lua-users.org
It's the first page you get when you Google it but which of the ten should I pick? Looking for recommendations.
I'm writing something in C++ and it's working even though I've included just header files in main.cpp. I thought I had to include the respective cpp files which have the definitions for the functions? When I do I get some duplicate definition error which doesn't make sense to me. What am I doing wrong?
In your main.cpp you include Foo.h
and in your Foo.cpp you also include Foo.h
That's the problem: it depends on your use case and what you're comfortable with. There isn't a catch-all answer. In general you should pick the library that is simplest to use and won't be too much of a pain when the code you're testing changes.
Personally I found luaunit to be adequate for my small random projects.
If your tutorial doesn't explain it, just accept that you're not meant to understand it. Given that you probably compile with a magical compile button, it is not our place to explain it on an imageboard. Just try compiling something with 3-4 files manually and google it.
The linker will find the .cpp files for you
Cpp files are added to the finished executable by the linker. You don't need to include them in your main.cpp. As for the weird errors, it's probably because you didn't add include guards to the header files.
This is most likely a way oversimplified answer
Googling header files in general actually gave me the answer.
learncpp.com
Protip: Sometimes it's easier to solve a problem by making sure you understand the basics, than just googling the specific problem.
Good! I almost asked you the clue question: what would headers be for if you included the cpps? That would have been on point then.
>Protip: Sometimes it's easier to solve a problem by making sure you understand the basics, than just googling the specific problem.
110% agreed
Here is a random pic for your success
If assembly is known to have portability issues, how come RollerCoaster Tycoon works on any processor that meets or exceeds its requirements, despite being written in x86 assembly?
>linker will find
these webshits still dare to speak in 2016
Because it runs on x86 processors.
>If assembly is known to have portability issues
it doesn't
I've been told that not every x86 program will work on all processors because some interpret the instruction set differently.
No.
Before x86 became dominant, there were portability problems, that's one of the reasons C was made.
OK, so as long as I use x86 and none of the instructions I use are obsolete, it will work on any x86 processor that's equal or greater without issues?
t. dumb fuck
If you're such a genius, please point out what I'm saying is wrong.
What about SSE?
Recommend me some good Java books for beginners to learn from /dpt/
Thank you.
www.google.com/search?q=self+help+books
If I want to go into creating software for CGI and animation, what should I learn?
glsl? Why?
Do you even SIMD Sup Forums?
>C._Thomas_Wu_An_Introduction_to_Object-Oriented_BookFi.org.pdf
Stop being funny, this is serious
You could just post as well
www.google.com/search?q=404
On top of that, do you have any recommendations for online video tutorials? Especially for object-oriented design; the free YouTube courses are a bit vague and outdated.
Deitel then. It's good enough for Cambridge.
drive.google.com
Pic related explains OOP design perfectly
I use Mpir.NET (GMP fork for Windows)
Thank you for the book, gonna read through it.
also
That wasp logo of Deitel looks high af.
...
...
I don't get the Java one
Does anyone know where you can read about good procedural API design? Because I keep writing shit code (imo) and want to fix it.
>C++ vs java
>there are programmers making charts who don't understand that all C/C++ does is give you control over performance
You're wrong, assembly is the most un-portable language of all time. Your code will only work for that specific processor.
Is there a way to use 'getch()' in C, but have a time limit? As in, only accept input for 5 seconds and then move on?
threads (not portable)
Learning Haskell senpai
at' (x:[]) 0 = x
at' (_:x) k = at' x (k-1)
Code for accessing the k-th element in a list, seems alright to me but doesn't work, why?
For a start, it doesn't cover all cases
Empty list - at' [] _
0th item of a non-empty list - at' _ 0
So for the first pattern
at' (x:[]) 0 = x
You should generalise it - the 0th item of any list is going to be the head
at' (x:_) 0 = x
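Adding the remaining cases the reply lists, a complete version might look like this (keeping the at' name from the post; the error message wording is mine):

```haskell
at' :: [a] -> Int -> a
at' []     _ = error "at': index out of range" -- empty list: nothing to return
at' (x:_)  0 = x                               -- index 0 of any non-empty list is the head
at' (_:xs) k = at' xs (k - 1)                  -- otherwise drop the head and recurse
```

For example, at' "abc" 1 evaluates to 'b'. The original first pattern (x:[]) only matched one-element lists, so at' [1,2] 0 fell through to the recursive case and went wrong from there.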
I don't think any of them make sense.
Simplest solution would be to do a spinlock and read stdin.
#include <time.h>
clock_t startTime = clock();
clock_t currentTime = startTime;
while ((currentTime - startTime) / CLOCKS_PER_SEC < 5)
All but Java make sense to me
Right, I forgot to update currentTime inside the while.
So just do currentTime = clock(); once per loop, probably at the start of the loop. It doesn't matter to any significant extent.
It's quite possible that your tolerance level for making sense is very low.
It's just "everything but Java is underwhelming".
Also, there should be a step before the starting position which is 'goal' or something, because As(s)embly, C and C++ are obviously building something far greater than a shelf. And Haskell isn't doing anything that has to do with a shelf, yet somehow constructed a unicorn using nothing but a blackboard and chalk.
Also the 'import everything in python' meme is so damn old. It's just dumb. Practically every language has plenty of libraries to use to similar ends.
What's a project I can do in .NET to put on my resume?
Maybe learn enough about .net to figure out what you want on your resume?
Why is /dpt/ so dead?
Python/Tensorflow
Lua/Torch
Those were used to build AlphaGo.
But programming is the easiest part, it is harder to learn the math and read the research papers
I've fixed it
Why do you ask this every morning when most Americans are asleep?
>Americans
Burger = shit.
Working on a small DOS clone. Still writing the bootloader (second stage) that gathers all of the system info.
>Visual C++ 2005
LOL us Haskell users am I right???
kys
wow that's gotta be pretty useful
If your interpretation of that image is that Haskell is bad, you're a moron
I've been understanding Haskell, but I have some questions.
Why are arguments sometimes wrapped by () and sometimes by ()?
Also, what's the difference between [a] and a in a return value?
Yeah, it can output a really stripped PE executable, which imo is much nicer than getting an ELF. I've used GCC for this shit and it's a massive pain in the arse; this just works.
The first stage is all asm (kill me).
I'm not the femanon you're replying to, but the interpretation is "Haskell is so deep, and you have to be so smart to use it."
It's pretentious and wrong.
The image definitely paints Haskell in a bad light. All stages feature overly complex designs and the end result is nothing like what was requested.
Haskell pioneered the puzzle-oriented programming paradigm. It is a language that attracts the attention of autistic-but-not-bright users who happen to like solving puzzles while programming.
>Why arguments sometimes are wrapped by () and sometimes by ()?
what
It's just a bit of fun. You learn a heap of cool stuff on different platforms and how to interface with hardware. Plus you get
>muh freedom
Because there are no APIs to deal with haha.
Mostly for patterns, a pattern decomposes a single argument
[a] is a list of a
Sorry, [] and ()
Example:
len [a] = 1
len (x:xs) = 1+ len xs
len [x:xs] gives me error and so does 1+len [xs]
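That error is from mixing the two pattern forms: [x:xs] matches exactly a one-element list whose single element itself matches x:xs, so it forces the element type to be a list. To make the difference concrete, a complete len with a proper base case (a standard textbook version, not from the thread):

```haskell
-- [p] in a pattern matches exactly a one-element list whose element matches p;
-- (x:xs) matches any non-empty list, binding the head and the tail.
len :: [a] -> Int
len []     = 0           -- base case: the empty list
len (_:xs) = 1 + len xs  -- one more than the length of the tail
```

Likewise, 1 + len [xs] wraps the tail in a fresh singleton list instead of recursing on it, which is why that variant misbehaves too.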
IMHO creating your own simple ISA and a CPU that implements it (in verilog), and then creating a simple OS to run on it would be cooler.
I have done only the first half of this though.
But you are dealing with closed-source hardware, and your code is being executed by closed-source firmware. That's not much more freedom than the alternatives.
Has programming gone too far?
no not even that
the end result is a fairy tale, i.e. it doesn't exist and can't exist
The image is a wink wink nudge nudge "our language is sooooo hard haha. not easy being a genius who Fucking Loves Science :P"
in reality Haskell and all ML languages are the wrong way to approach functional programming, and their userbases are 100% pretentious redditors who have no idea what they're even trying to do
they're not even homoiconic
Dude, are you just going in blind? Read this
seas.upenn.edu
then this
scs.stanford.edu
(or Learn you a haskell I guess, or haskellbook)
We did it reddit! We proved functional programming is shit!
Fuck those elitist 1%IQ haskell elitists
Uploading a pic to celebrate