ITT: Chicken or Egg Problems

In order to make a 3D printer with sub-micron precision, you first need a machine to calibrate it: one whose gears are cut precisely enough to handle sub-micron movement.

But then you need a machine to create that machine and so on.

At some point down the line a human had to calibrate that first machine by hand. But human hands aren't perfect.

How did we create more precise things using less precise tools? Won't the defects carry over into the machines' creations like a genetic disease? How do we create perfection without some sort of reference point to begin with?


You start with a gross approximation of what you want to achieve, then build ever smaller and more precise equipment to build other equipment.

Because the machine making the ultra-precise printer probably won't be a printer.

As long as one makes more valid copies than invalid ones, it'll be OK.

we can't, second law of thermodynamics

This is an important point. For example, the silicon manufacturing process introduces a certain number of defects per unit of surface area, which matter more and more as integrated circuits shrink. This does not stop manufacturers from achieving an acceptable yield, i.e. a low number of defective dies (which can simply be discarded) relative to the total units produced.
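As a rough illustration (the classic Poisson yield model, not any fab's actual numbers), here's why defects hurt big dies much more than small ones on the same process:

```python
import math

def poisson_yield(defect_density_per_cm2: float, die_area_cm2: float) -> float:
    """Classic Poisson yield model: Y = exp(-D * A).

    D: average defects per cm^2 introduced by the process.
    A: die area in cm^2. A larger die collects more defects,
    so the fraction of good dies falls off exponentially with area.
    """
    return math.exp(-defect_density_per_cm2 * die_area_cm2)

# Same process (D = 0.5 defects/cm^2), two hypothetical die sizes:
small = poisson_yield(0.5, 0.25)   # ~0.88: most small dies are good
large = poisson_yield(0.5, 4.0)    # ~0.14: most large dies are defective
```

Same imperfect process, wildly different yields, which is exactly why discarding (or binning) bad dies keeps manufacturing viable.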

Some day an autist will be born that can achieve this perfection.

Fun fact: Intel only makes i7 processors. i5 and i3 processors are just i7s that didn't work quite right.

...

You don't use printers to make printers just like you don't use scalpels to make scalpels.

//Thread

i3 are different silicon. Pentium/Celeron are failed i3's.
i5 are "failed" i7

They're all failed xeons

For the sake of argument, let's start with a CNC lathe, which is essentially the machine used to make all other machines and parts, including copies of itself. How would you go about calibrating the first CNC machine to sub-micron precision?

Are they? Neat.

DNA

You missed the purpose of the question. The OP was essentially asking how precision emerges from imprecise beginnings.

Yes it can. Take a look at human civilization: we went from stone tools to near-perfect spheres at the atomic scale. Life in general maintains internal order and builds complexity at the expense of increasing entropy elsewhere in the universe, but this does not violate the 2nd law.

No such thing as perfect spheres.

Exploit physical forces. By putting hand-held forces through filters you can make things more precise than a hand can through direct manipulation. For instance, swinging a weight traces a smoother circle than one drawn with the tip of a finger. So think in terms of forces: you want to damp, smooth, and average things. Once you get into the digital realm you can take repeated measurements and average them out using math.
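The "average things in the digital realm" part can be sketched numerically. This toy example (made-up sensor, made-up noise level) shows repeated readings averaging away random error:

```python
import random
import statistics

# Hypothetical noisy sensor: the true value plus Gaussian noise.
TRUE_VALUE = 10.0
NOISE_SD = 0.5

def noisy_measurement() -> float:
    return TRUE_VALUE + random.gauss(0.0, NOISE_SD)

def averaged_measurement(n: int) -> float:
    # Averaging n independent readings shrinks the random error
    # by roughly a factor of sqrt(n) (standard error of the mean).
    return statistics.fmean(noisy_measurement() for _ in range(n))

random.seed(42)
one_shot = noisy_measurement()        # error on the order of NOISE_SD
averaged = averaged_measurement(10_000)  # error on the order of NOISE_SD / 100
```

Systematic errors don't average away like this, which is why you still need a good reference standard underneath.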

Well, you can derive standards from physical laws, e.g. a second is so many oscillations of a rubidium atom, and we know exactly how far light travels in a second. Once we have a time standard, we can use it to calibrate a lidar and get distance.
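A minimal sketch of that time-to-distance idea, assuming nothing beyond the exact defined speed of light:

```python
# Lidar-style ranging: with a good time standard, distance falls out of
# the (exact, by definition of the metre) speed of light.
C = 299_792_458  # metres per second

def round_trip_distance_m(time_of_flight_s: float) -> float:
    # The pulse travels out and back, so halve the total path length.
    return C * time_of_flight_s / 2

# A 1-microsecond round trip corresponds to roughly 150 m of range.
d = round_trip_distance_m(1e-6)
```

So the precision of your distance measurement is bounded by the precision of your clock, which is why timekeeping standards underpin length metrology.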

And nigga you don't use gears for submicron movement, too much friction at that scale.

There are precision techniques that don't require advanced tools. For example, engineer's (or Prussian) blue.
en.m.wikipedia.org/wiki/Engineer's_blue
In the 1800s machinists would coat three surfaces with blue, rub them against each other in pairs, and work down the high spots where the blue transferred until the marks disappeared. The resulting surfaces were far flatter than anything previously achievable (so much so that even today surface plates can be touched up by rubbing them against each other). This led to an explosion in manufacturing and precision improvements, because a higher-quality reference had been achieved.

So, to answer OP's question: improvements in precision tend to evolve from using geometry or physical laws as a reference. Thus you will always have an ideal (which is impossible to achieve) and your actual reference (whose deviation from the ideal you can quantify). Those deviations become tolerances, which are then accounted for when engineering a product.

You do realize that humankind has had clocks and star-navigation equipment and shit for hundreds of years now?

Oftentimes extremely high precision can be achieved, but it's extremely labor- or material-intensive.
en.m.wikipedia.org/wiki/Shortt–Synchronome_clock
Pendulum clocks are relatively accurate (based on the idea of undamped oscillation), but the contact of the escapement and wind/temperature changes throw off their period and cause errors in timekeeping.

The Shortt–Synchronome clock got around this by using two separate pendulums: a slave pendulum, and a master pendulum kept in a vacuum chamber. The master pendulum was free of wind resistance and temperature variation. It updated the slave pendulum using electromagnets instead of an escapement (meaning it had no physical connection to the slave pendulum). Free of external forces, the master pendulum would only experience the minute forces of the electromagnet, friction in its bearing, and inertia due to Earth's rotation.

So it's often possible to create extremely accurate tools, but it's so cost/labor-intensive that it's rarely done (the earliest advances in gear cutting were driven by the need for more accurate clocks, in response to railroads needing better timekeeping).

Interesting. How do they do sub-micron movement? I assume they'd be gears coated with some sort of gel to smooth out the friction.

Not him, but gears are too inaccurate even at the micron scale. Piezo linear motors would work for micron movements. Not sure about sub-micron.

Actually nvm, they can do nm movements:
newport.com/f/xm-ultra-precision-linear-motor-stages

Just learned something new today. Thanks.

For really fucking fine movement, i.e. sub-angstrom resolution for microscopes and shit, you use piezoelectrics and special control algorithms with feedback from what you're imaging. Doesn't work over long distances though.
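The feedback idea can be sketched with a toy closed loop (idealized stage model and made-up gains, not a real piezo driver): measure the error, drive the actuator to shrink it, repeat.

```python
# Toy proportional-integral (PI) feedback loop: a sensor reports how far
# the stage is from the setpoint, and the controller nudges it closer.

def run_pi_loop(setpoint: float, steps: int = 200,
                kp: float = 0.4, ki: float = 0.05) -> float:
    position = 0.0   # stage position, arbitrary units
    integral = 0.0   # accumulated error, cancels steady-state offset
    for _ in range(steps):
        error = setpoint - position      # feedback: measured deviation
        integral += error
        drive = kp * error + ki * integral
        position += drive                # idealized actuator response
    return position

final = run_pi_loop(1.0)   # converges very close to the 1.0 setpoint
```

The point is that accuracy comes from the sensor and the loop, not from the open-loop precision of the actuator, which is exactly the trick that makes sub-angstrom positioning possible over short ranges.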

prove it

you should honestly look up the International Bureau of Weights and Measures. they love talking about this stuff.

The solutions people come up with are actually very clever. That's why we have things like engineering tolerances.

>Have a machine that can move individual molecules
>Use it to fax a memo about stealing office supplies

99% of the time we just use black fucking magic like photolithography or some bullshit chemical process that does all of the work for us
the other 1% of the time the process already exists outside of human influence and we just half-assedly hook into it

god did it, duh

>the nerve of this guy

i7 extreme edition are failed Xeons. Regular i7 are their own silicon.

what compiler was used to compile the first compiler???
what text editor was used to write the first text editor??????