Tfw will never live in the glorious early days of bit twiddling and memory hacks

>tfw will never live in the glorious early days of bit twiddling and memory hacks
Why live?

Programming today is just code-monkeying and gluing APIs together. It's no longer a task for true men, but something random photo models teach to 13-yo girls.


MICROCONTROLLERS

Web "development" is
Software like os'es, libraries, engines etc still needs to be written

/thread

You're still free to twiddle your bits, just do it in the privacy of your home thank you very much.

Even an Arduino has beefy af specs

>I have not programmed anything besides FizzBuzz
>I have not worked on any medium-sized project
No wonder you retards like 80s programming so much. Things were much simpler then, even kids were able to do it. You probably feel threatened by the increase in complexity in hardware and software.

Even web development is going to change. With WebAssembly, it's going to become important to write your client-side code in a language that produces compact code that downloads and executes quickly on the client and doesn't manage your memory for you. The current "5 seconds till the JS reveals the page content" is unacceptable.

Also, stuff is going to be multithreaded. In fact, you can sort of do that right now with web workers (as if. We need to support IE 8, muh customers!!!)
And everything is going to have to be offline-first with service workers.

Web development is eventually going be like mobile development, except you also have to target the desktop form factor and screen readers and the tooling and APIs you have at your disposal are prehistoric.

The jQuery stackoverflow developer is going to die a violent death. I can't wait for it.

Is that age of empires?

Yes, the first one

WOLOLO

Someone who knows network programming, explain to me how the fuck you could play AoE on a 56k modem with like 500 units on screen?

No dude, the future is WebUSB.
>he doesn't want the information on every single USB device he owns stolen with JavaScript, without the attacker even needing to locally mount the devices
ayylmao

sending and receiving packets isn't a heavy task, user

Sure, but we see fucking lag even today in fighting games over broadband internet, so how the fuck did the AoE guys do it back in the '90s?

It is user. It is

By not hiring Pajeets

Anyway, can anyone find the actual presentation that answers this?

web.archive.org/web/20020207041318/http://www.ensemblestudios.com/openjournal4/story/five.shtml

Seems archive.org only archived the download page

gamasutra.com/view/feature/131503/1500_archers_on_a_288_network_.php

wow thanks!

wow, I don't give a fuck about games programming, but this was pure gold. Thanks, user

Old games are very interesting.

This might be what OP wants to say.

Well, games are very interesting regardless of their age; Borderlands 2's AI, for example, is really well designed.
As long as I don't have to touch the graphics/3D models/GUI/etc., I'd be interested in their programming.

Why has no one posted this gem yet?

>Sawyer wrote 99% of the code for RollerCoaster Tycoon in x86 assembly language, with one percent of the functions written in C for interaction with the Windows operating system and DirectX.

I can't even imagine how one could write something as complex as RCT in assembly.

People still use IE 8?

Chinese users still use IE6 because that's what came with XP

I'm no expert, but I bet Nintendo developed a lot of their games in assembly

I always remember that coaster crashing halfway through the scenario

Essentially all NES and SNES games were made in Assembly

You are such an idiot...
It's a glorious time to be alive!

Right now there's a big comeback of functional languages, and various interesting stuff is going on. Just lurk more...


Also, this glorification of "bit twiddling" (as you put it) is fucking stupid. Back then you needed to be an engineer/scientist to program; programs were ugly, errors were common, and software projects were prone to failure. Everyone was writing spaghetti code. Fuck that shit.

>muh design patterns

Not OP, but honestly "bit twiddling" is the only thing that is any fun to me when it comes to programming. I do use languages like python when I need to get shit done, but when I want to program for fun, I use C with a bit of asm here and there.

Whosyourdaddy

there are higher-level assemblers with more abstraction, like macros and function calls i think. it's not like he wrote it in notepad

You have no power here this is a blue board!

design patterns are bad, since they encourage code repetition, and thus something that should be automated.

That's why I'm learning the functional paradigm. Fuck Java & OOP spaghetti.

i don't think you know what design patterns are.

>Even an Arduino
>Even

SMALL INDUSTRIAL MICROCONTROLLERS

>stone age
Rather apt.

Oh, you can bet I do... and the repetition and regularity is in the meaning of the word itself. A year of Java code-monkeying to pay college tuition taught me that much.

You can make high-level abstractions in the form of function calls in all assembly that I've seen. All C does when you write a function is handle the register(s) used for args and the return value (and the program counter, et al.) according to whatever architecture you're running on.

It's just really fucking tedious.
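
To make that concrete, a minimal sketch (assuming the System V AMD64 convention used on Linux; other architectures shuffle different registers):

/* What a C call compiles down to, roughly, under the System V
 * AMD64 convention: the first integer args arrive in rdi/rsi,
 * the return value goes back in rax, and `call` pushes the
 * return address so `ret` can restore the program counter. */
long add(long a, long b)   /* a arrives in rdi, b in rsi */
{
    return a + b;          /* result is left in rax */
}

int main(void)
{
    /* The compiler emits roughly:
     *   mov rdi, 2    ; first argument
     *   mov rsi, 3    ; second argument
     *   call add      ; push return address, jump
     *   ; result now in rax
     */
    return (int)add(2, 3);
}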

>First programming module
>Assembly
>Learning how to do things by using a processor's ability to quickly and accurately iterate through basic operations
>Feels poetic

>2nd programming module
>Java
>public static void main(String [] args)
>Hello World Enterprise Edition
>Everything you want to do has a class that will do it for you, here's the documentation

Now I just fawn autistically over Verilog

"bit twiddling" is still possible. I love digging into internals of various image formats, for example.

You can always make some FLIF / APNG converter/optimizer, whatever. Add it into a browser / Qt /gdk-pixbuf / Photoshop. Digging into bitstream is so much fun.
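
For example, a bare-bones MSB-first bit reader, the kind of helper you end up writing for this stuff (a sketch, not tied to any real format):

#include <stdint.h>
#include <stddef.h>

/* Minimal MSB-first bit reader for digging through a bitstream. */
typedef struct {
    const uint8_t *data;
    size_t len;       /* buffer size in bytes */
    size_t bitpos;    /* absolute position in bits */
} BitReader;

/* Read n bits (n <= 32) and advance; returns them right-aligned. */
static uint32_t read_bits(BitReader *br, unsigned n)
{
    uint32_t out = 0;
    while (n--) {
        if (br->bitpos >= br->len * 8)       /* past the end */
            break;
        size_t byte = br->bitpos >> 3;
        unsigned shift = 7 - (br->bitpos & 7); /* MSB first */
        out = (out << 1) | ((br->data[byte] >> shift) & 1u);
        br->bitpos++;
    }
    return out;
}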

I do it in C++ but my programs still look like C.

This is what stops me from learning high-level languages like Python/Go/whatever... I just feel like it will be boring to do anything if I don't use C. Anyone else?

no, but i mean you have it all set up for you, plus an assembly IDE. if you read closer into what programs he actually used for this, it looks more doable. it's still tedious compared to today ofc

>and the repetition and regularity is in the meaning of the word itself

ye, so i understand you got confused into thinking it's about writing repetitive patterns. the meaning is actually NOT in the word, though.

there are many design patterns out there, so i'm pretty sure you can't make such a generalizing statement about design patterns being repetitious. how is, for example, the singleton pattern creating more repetition?
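
to make the point concrete, here's about all a singleton amounts to in C (a minimal, non-thread-safe sketch with a made-up Config type):

/* The singleton "pattern" in C is just a lazily initialized
 * static -- there's nothing repetitive here to automate away.
 * Not thread-safe; a sketch only. */
typedef struct { int log_level; } Config;

Config *config_instance(void)
{
    static Config instance;      /* the one and only */
    static int initialized = 0;
    if (!initialized) {
        instance.log_level = 1;  /* one-time setup */
        initialized = 1;
    }
    return &instance;
}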

Every pajeet can twiddle bits. It is not a creative task, nor is it hard. Designing your code in a clean and beautiful way that is meant to be maintained, however, is creative and hard.

He's just an average Sup Forums retard who faps to ugly C code and thinks that hacky piece of shit C is the best.

Tons of commercial 16-bit games were developed in asm: in many cases you needed to. I used the Devpac macro assembler at the time, if that helps you appreciate the general dev environment.

Higher-level languages were a comparative rarity - still, some were developed that way: I can point at Dungeon Master and Civilization as two that I've personally seen written in C (Civ were nice enough to leave the debug symbols in).

You can learn a lot about efficiency that way. But don't get hung up on optimisation too deeply. We only did it because we had to. Perhaps the limitations imposed helped define boundaries for creativity to smash; but programming of all shapes and sizes is way more accessible than it used to be. I was there, and I think it was worth the trade.

>2 KiB RAM
>32 KiB program ROM
>16 MHz 8 bit CPU
>"beefy af"

user...

Relative to other microcontrollers in the industry, it really does.

Ever seen a small PIC? The 8-pin package ones?

what can you even do with a tiny PIC if it has such anemic hardware?

norvig.com/design-patterns/ppframe.htm

Not everyone wastes memory like a retard

You certainly can't do much of anything; you have almost no stack space, so you can't do anything of value unless you put everything in main.

>99%

Sawyer's metal as fuck.

I noticed a lot of the prebuilt coasters start with age on them, or have designs that lend themselves to stressing a particular piece.

I used a weakling PIC for a uni project to control a motor (speed, torque, response, etc.), including a GUI on a small OLED display and a few buttons soldered on.
>1KiB ROM
It's got limitations, but efficient programming does the job well if you are a little creative, and it's not difficult to take advantage of, really.

>can't do anything of value unless everything is in main
Pretty much, but considering changing the torque produced requires around 10 lines, it's not an issue.
I'm actually surprised at the degree of control you can exert with so little code.
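
For flavor, roughly what those 10 lines look like: a sketch assuming a classic PIC16F-style part with a CCP module and the XC8 compiler. SFR names follow the PIC16F877A datasheet, so adjust for your variant.

#include <xc.h>

/* PWM-based motor (torque) control sketch. Register names per the
 * PIC16F877A datasheet; treat as pseudocode for other variants. */
void pwm_init(void)
{
    TRISCbits.TRISC2 = 0;   /* CCP1 pin as output */
    PR2 = 0xFF;             /* PWM period from Timer2 */
    CCP1CON = 0b00001100;   /* CCP1 in PWM mode */
    T2CON = 0b00000100;     /* Timer2 on, 1:1 prescale */
}

void set_duty(unsigned char duty)
{
    CCPR1L = duty;          /* upper 8 bits of the 10-bit duty cycle */
}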

How do I get into microcontroller programming?
I had an Arduino, but I broke it, and I wasn't too fond of the non-standard Arduino wrapper libraries for every little thing.

I didn't even feel like I was learning much of anything.

>How do i get into microcontroller programming?
Buy a microcontroller and download the documentation.
Why do you even need to ask?

>Why live?

You shouldn't. Fucking kill yourself, shit stain.

>I've never programmed outside of codecademy
>I am too stupid/young to realize most programmers back then were writing fucking Business Basic

Get a PIC with a display, download the datasheet, download an example starter program, and Ctrl-F the document to see what is going on.
Then write hello world, and from there just start a project. They're really flexible and use C, so you'll be fine for anything.

>tfw will never live in the glorious early days of bit twiddling and memory hacks
Hold out 6 more years, user. It'll be back (for high-end projects) post-Moore's Law.

>>tfw will never live in the glorious early days of bit twiddling and memory hacks

I do the same kind of hacking today when programming multithreaded code.

>latency is throughput
Can we please not?

Stop trying to fit in.

Wait 4 hours for hotpockets

>tfw I still play AOE 1 with my family sometimes
Can't be the only one, right?

Is that you, Klossy?

>It's no longer a task for true men.
TIL Grace Hopper had a bepis.

>The code on the left is bad! It is hacker code used to steal nude photos! Stay away!

w a t

Also, the code on the right follows neither Java standards nor C standards, as it does not indent. It's also polluted with unnecessary comments that distract from the functionality of the code.

fake pic mate

Gender isn't defined by any physical features

Being manly is a state of mind

That looks like clash of clans

Well, the one on the left, god knows what it does, and the one on the right at least explains what int means.

That's because we spent years saying that we just need to get it done, and we will get stronger computers.
It is still true in a sense.

Why oh fucking why do you think it is appropriate to /thread your own post?

I welcome the glorious future. But I'm a scientist, not a programmer (although most of my work is essentially coding).

No longer do I need to write 800 lines of FORTRAN to read or write some HDF5 or netCDF4 file, or to plot a hex-binned map in any given projection. I can slice through multidimensional arrays in an interactive shell and have a shitload of fast, vectorised commands at my fingertips - I can actually spend time thinking about the science instead of spending 98% of my time on something that a codemonkey can do better than I ever will.

Sure, there are probably a number of good reasons why Python is a shit language in the eyes of an OS designer or whatever, but it lets me get my work done quickly.

Gender isn't defined by genitals or chromosomes.

Being manly is having the ability to grow a beard.

If you can, you're a man. If you can't, you're a woman.

this. I hate retards who think writing menial boilerplate memory management is the hard part of software development

>tfw will never live in the glorious early days of bit twiddling and memory hacks

You can live in the glorious present of bit twiddling and memory hacks.

In non-embedded software you still find this in codecs, ciphers, high-performance networking, some types of compilers, renderers, and sometimes even in high-performance JavaScript (I know people who hand-write asm.js in tight loops).

In embedded software, you find it in e.g. storage devices and network appliances.

OP you can still fucking do these things.
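
For a taste, one of the classics that still shows up in hash tables and ring buffers everywhere, rounding up to the next power of two with no loops or branches:

#include <stdint.h>

/* Classic bit twiddle: round a 32-bit value up to the next power
 * of two. Returns 0 for v == 0 or on overflow past 2^31. */
static uint32_t next_pow2(uint32_t v)
{
    v--;            /* so exact powers of two stay put */
    v |= v >> 1;    /* smear the highest set bit downward... */
    v |= v >> 2;
    v |= v >> 4;
    v |= v >> 8;
    v |= v >> 16;   /* ...until every lower bit is set */
    return v + 1;
}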

This. It amazes me that I always see comments on Sup Forums about how shit Python is. Of course it would be shit for certain tasks, but for scientific computing there isn't much better. Why would a scientist spend years learning to write code, manage memory, etc., when the runtime difference between Python and the same program in C++ might be something like 5 minutes vs. 1 minute? That extra 4 minutes saved really doesn't mean anything to 99% of scientific computing applications. Maybe if you're the LHC you might need more optimized code; otherwise it's totally unnecessary.

Python and dynamic languages are nice in general for experimentation, or for things that are going to be done just once in your life. Which is how a lot of scientists see their code.

For code reuse, or when you need to implement your own complex stuff and go beyond the premade bindings for existing libraries, it's terrible.

I was forced out of dynamic languages against my own will, seriously. But I found functional programming, and I'm happy.

Also, if your programs take 5 minutes to run, forget about it, you're doing it right. My problem was choosing between days and years of runtime, so...

>Falling for an obvious troll pic
Your and idiot, kill you'reself.

Seriously, the vast majority of the anti-feminist pics floating around on Sup Forums are either outright lies or at best half-truths blown out of proportion. I'm not a feminazi and I don't believe in this patriarchy bullshit, but these are the same tactics they use. If we're to address this as a real issue, we need to be better than them, not worse.

You can get free samples from Microchip if you are a student. You can code in C or ASM. You will also need a programmer (you can build one easily or buy one).
The real fun is building your own hardware.

>Programming today is just code-monkeying and gluing APIs together. It's no longer a task for true men, but something random photo models teach to 13-yo girls.

I agree with you, in general. However, keep in mind that probably less than one in ten programmers "back in the day" did any interesting bit-twiddling. Honestly, the most interesting work was done by the early OS and programming language vendors, and the games programmers.

9/10 of programmers were writing dumb business applications. I'm not talking Word and Excel, I'm talking "take the salary number and tax rate from form A, multiply it and put it into form B".

The thing I find weirdest and most disturbing is that people these days don't even care what's under their software. When I was a kid first getting into programming, it always bugged me, "but what's actually making this happen?" Then I learned assembly language and thought, "but what's making the CPU perform these functions". Eventually, I took a low-level course where we covered logic gates and wrote a multiplication function using, what, OR and NOT, or something like that. THEN I was satisfied.
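
Roughly that exercise, sketched in C with only bitwise ops: addition built from XOR (sum) and AND (carry), then multiplication as shift-and-add on top of it.

#include <stdint.h>

/* Addition from logic gates: XOR gives the sum bits, AND gives
 * the carry bits, repeat until no carries remain. */
static uint32_t gate_add(uint32_t a, uint32_t b)
{
    while (b) {
        uint32_t carry = (a & b) << 1;  /* where both bits were set */
        a ^= b;                          /* sum without carry */
        b = carry;
    }
    return a;
}

/* Multiplication as shift-and-add, using only gate_add above. */
static uint32_t gate_mul(uint32_t a, uint32_t b)
{
    uint32_t acc = 0;
    while (b) {
        if (b & 1)
            acc = gate_add(acc, a);
        a <<= 1;
        b >>= 1;
    }
    return acc;
}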

Now people program like they drive cars. Just do what the interface tells you and ignore what's going on under the hood. You would think this would drive up the value of engineers with a really solid grasp of the full technology stack, but no, companies mostly want duct-tape engineers who can just assemble MVPs quickly.

Nobody ever said "menial boilerplate memory management" was hard, except for you. That's a bit telling, don't you think?

In my opinion, it was a lot more fun when you drew things by stuffing a value into a memory location, and when fiddling with optimization really paid off.

These days, optimization is more likely to be about putting an index on a database column, using some Javascript API in some more-correct way, or, at best, changing up the graph algorithm you're using for data lookup at Faceoogle.

Check out the TI MSP430. It's kind of like Arduino, except DIFFICULT AS FUCK.

Seriously, TI documentation is the most godawful shit I've ever read. It's at the exact opposite end of the difficulty spectrum from Arduino. It's marketed to real EEs, so they assume you know all sorts of terminology, and they don't bother to provide sample code for important functions.

For example, I blew a week trying to adapt some I2C library to my particular MSP430 variant so I could display output on a little LCD array. I got busy with other stuff and STILL haven't gotten it working.

But the LaunchPad itself is pretty cool. You can use it to program a MSP430 chip, pop it off, and put it into a live circuit for real deployment. They cost cents apiece. That's the whole reason I got into them: I prototyped something on Arduino, but there's no obvious method to take an Arduino prototype to a marketable product. With the MSP430, there is.

>early days of bit twiddling and memory hacks were glorious
>Programming today without bit twiddling and memory hacks is just code-monkeying
>Nobody ever said memory management was hard
literally in the OP. he implies that writing much more complex modern software is just easy code-monkeying compared to the old software, while it's really just less prone to careless mistakes and lets you shift your attention from boilerplate to more complicated logic

Instead of sending information about every unit, they sent only the commands players issued.

It's very efficient and most modern RTS games still do this.
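
A sketch of the idea in C (deterministic lockstep; all names here are hypothetical, not the actual AoE code): peers exchange only the commands issued each turn, and every machine runs the identical simulation on the same inputs.

#include <stdint.h>

/* Peers exchange only these per turn -- a dozen or so bytes per
 * order instead of the state of every unit on the map. */
typedef struct {
    uint32_t turn;        /* simulation turn this applies to */
    uint16_t player;
    uint16_t unit_group;
    uint8_t  action;      /* e.g. MOVE, ATTACK */
    int16_t  x, y;        /* target, in fixed point: floats can
                             desync machines with different FPUs */
} Command;

/* Hypothetical helpers a real game would provide. */
void apply_command(const Command *cmd);
void simulate_step(void);

void run_turn(const Command *cmds, int n)
{
    /* Apply everyone's commands in a fixed order, then advance the
     * deterministic simulation one step. 500 units or 5, the
     * bandwidth cost is the same. */
    for (int i = 0; i < n; i++)
        apply_command(&cmds[i]);
    simulate_step();
}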

for someone who knows as much as you do, it baffles me how you fail to see the answer to your own question.

>companies mostly want duct-tape engineers
All that stuff has been abstracted away, and a large percentage of software today (in the mainstream market) is the so-called apps. Everyone is riding the "computers/tablets/etc. and apps are cool! Let's make 2454325 more apps!" train. Those guys don't "need" to know the details, and this is good for business, because they can hire more stupid people than before.
So, to OP's question, the answers above apply.
Plus I want to add that you will find what I think you want to find in (low-level) systems programming and in embedded systems in general. You can add ASIC stuff too, and whatever special fields you can think of.
Embedded systems range from the well-known MCUs like AVR and PIC to more powerful chips with ARM cores or actual ARM CPUs (like your mobile phone or an RPi). That can also mean various real-time operating systems (not Linux!). Mechatronics is a thing too, as are device drivers, especially for legacy hardware. Low-level systems programming work is probably easiest to find where legacy systems survive.

For those who don't get OP: I don't think he is talking about spamming bitwise operators or writing boilerplate code. He's more likely talking about building a system from the ground up, the interesting aspects of being close to the hardware, doing something clever on a constrained system, and the creative problem solving that's at least occasionally required in an environment where there is no solid abstraction taking everything away. You have to "think" there. Plus, while some code might seem like boilerplate when you use something like Java, in an embedded environment it often requires a lot more thought.
Low-level understanding gives you a very good perspective on things too.

>not knowing the fast inverse square root algorithm

Someone explain what the manly code on the left is doing.

It's a coding gem: en.wikipedia.org/wiki/Fast_inverse_square_root
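
For reference, the Quake III version as shown on that page, famous comments and all (the pointer casts are the original's evil aliasing trick; modern C would use memcpy):

float Q_rsqrt(float number)
{
    long i;
    float x2, y;
    const float threehalfs = 1.5F;

    x2 = number * 0.5F;
    y  = number;
    i  = *(long *)&y;                     // evil floating point bit level hacking
    i  = 0x5f3759df - (i >> 1);           // what the fuck?
    y  = *(float *)&i;
    y  = y * (threehalfs - (x2 * y * y)); // 1st iteration
//  y  = y * (threehalfs - (x2 * y * y)); // 2nd iteration, this can be removed

    return y;
}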

Sarcasm

Poe's law

It should have been given away by the fact that I called out int in the second code sample.

Things still get hacky when you really need performance. VR is really starting to push things: now you've gotta render everything TWICE, and do it fast enough or people start to puke.

For big simulations, you gotta know computer science and physics.