# The Feynman path may look like shrinking modular components
The Feynman Path (as kicked off by [[There's plenty of room at the bottom]]) may entail a framework of modular components assembled via a scale-independent process.
We need to look at the whole thing through the lens of process improvement within a paradigm.[^1] To set the stage, let’s look at the current dominant paradigm for nanoscale manufacturing: photolithography. Photolithography improves by increasing the resolution of its inputs (lasers and masks), which yields smaller features (outputs), but the scale of *the process itself* stays the same. That is, the process is always a single non-hierarchical jump from macro-scale tooling to final-scale features.
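To make the contrast concrete, here’s a toy model (all numbers invented, purely illustrative) of the two improvement modes: in photolithography the outputs shrink while the process stays macro-scale; on the Feynman path the process scale shrinks generation by generation, dragging the output scale down with it.

```python
# Toy model (illustrative numbers only) of the two improvement modes.

def photolithography_generations(n, feature_nm=180.0, shrink=0.7):
    """Each generation shrinks the output features, but the process
    (masks, steppers, fab tooling) always runs at macro scale."""
    for _ in range(n):
        yield 1.0, feature_nm  # process stays ~1 m; features shrink
        feature_nm *= shrink   # better lasers + masks -> smaller outputs

def feynman_path_generations(n, process_m=1.0, shrink=0.1, output_ratio=0.01):
    """Each generation of machines builds the next, smaller generation:
    the *process itself* shrinks, and outputs shrink along with it."""
    for _ in range(n):
        yield process_m, process_m * output_ratio * 1e9  # output in nm
        process_m *= shrink

for (pl, fl), (pf, ff) in zip(photolithography_generations(5),
                              feynman_path_generations(5)):
    print(f"litho: {pl:g} m process -> {fl:7.1f} nm features | "
          f"feynman: {pf:g} m process -> {ff:g} nm outputs")
```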

By contrast, the ‘spirit’ of the Feynman path is that the process itself shrinks as the final product shrinks. He frames it as machines making smaller machines making smaller machines. That framing is a bit misleading because it’s stuck in an industrial-mechanical metaphor, but at its core is the idea that improving the process entails scaling down the process itself. [[Eric Drexler]] (and folks like [[J Storrs Hall]]) refined it: you want to keep the same system architecture and shrink it. However, they still hewed to the industrial-mechanical mindset, leading to the widely criticized “macro-scale machines, but made out of diamondoid parts.” This narrative muddled whether we’re talking about the process or the output, and perhaps jumped too quickly to the end state of full self-replication.
Most discussion of atomically precise manufacturing gets very confused between process and output. In a way this is excusable, because the field was seeded with the idea that nanoscale things can make themselves. Like the idea of 3D printers that can print themselves, it’s a cute goal and perhaps a useful measuring stick ([[A useful yardstick for atomically precise manufacturing might be how many components of the process can be made with the process]]), but it’s not really the goal. The goal is to get the useful things that happen when you are able to specify arbitrary arrangements of atoms.
The jump that’s needed is to become agnostic about whether the architecture of the *output* changes, and to focus instead on a scalable *process* architecture.
Abstractly, modular functional components are a great way to implement a scale-agnostic architecture. [[28 Feb 2023]] is going after that approach specifically for electronics but if you abstract one level up, it’s clear we’re talking about how [[You could think of nanoscale assembly as material voxels]].
This approach seems like it works well for “static” systems (electronics, plasmonics, materials, etc.) but might break for “dynamic” systems. Conceptually, the difference between “static” and “dynamic” at this scale is that static systems act on things smaller than themselves (circuits act on electrons, photonics act on photons, materials act on … themselves? Perhaps “materials” is just too broad to be useful), while dynamic systems act on things bigger than themselves (ribosomes, nanobots, flagella). [[At nanoscale static systems act on things smaller than themselves and dynamic systems act on things larger than themselves]]. It’s not yet clear to me whether this mismatch between modularity and dynamic systems is intractable or just a failure of imagination. For now, though, it seems fine to caveat that the Feynman path/modular approach works only for static systems. That nuance may be why it felt nonsensical until now.
More concretely, what would this look like? Like any modular system, the whole thing would revolve around interfaces. It seems like interfaces would have many different properties: informational, mechanical, electrical, chemical, etc. What level of abstraction would be preserved as you scale down? One could imagine an “electrical interface” between black-box components that remains as you shrink. But an electrical interface could carry energy/power or information, so “electrical” alone may be too coarse: the power/information distinction may itself need to be the scale-independent level of abstraction. My concern is that this much abstraction wouldn’t actually be useful.
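As a strawman, here’s what that might look like as types. A minimal sketch (all names and the compatibility rule are hypothetical): an interface carries a kind (power vs. signal vs. mechanical, etc.) and a spec, and whether two components can connect depends only on their interfaces, never on the physical scale behind them.

```python
# Minimal sketch (all names and the compatibility rule are hypothetical).
from dataclasses import dataclass

@dataclass(frozen=True)
class Interface:
    kind: str    # "power", "signal", "mechanical", "chemical", ...
    spec: float  # e.g. watts for a power port, bits/s for a signal port

@dataclass
class Component:
    """A black box: its internals may change completely as it shrinks;
    the interfaces it exposes are what must survive across scales."""
    name: str
    scale_m: float
    ports: tuple

def can_connect(a: Interface, b: Interface) -> bool:
    # Compatibility depends only on interface kind and spec,
    # never on the physical scale of the components behind them.
    return a.kind == b.kind and abs(a.spec - b.spec) <= 0.1 * max(a.spec, b.spec)

adder = Component("adder", 1e-6, (Interface("signal", 1e6),))
shrunk = Component("adder-v2", 1e-8, (Interface("signal", 1e6),))
assert can_connect(adder.ports[0], shrunk.ports[0])  # scale changed, interface didn't
```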
Another possibility is to look at different modules as functional blocks that each do some step of a process.
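In that framing, composition is just sequential handoff, so any block can be swapped for a smaller-scale implementation without its neighbors noticing. A sketch (step names invented):

```python
# Sketch of modules as functional blocks (step names invented): each
# block does one step of a process; composition is sequential handoff.
from functools import reduce

def deposit(state):  return state + ["layer"]
def pattern(state):  return state + ["pattern"]
def assemble(state): return state + ["component"]

def pipeline(*steps):
    """Compose blocks; any one can be swapped for a smaller-scale
    implementation without its neighbors changing."""
    return lambda x: reduce(lambda acc, step: step(acc), steps, x)

process = pipeline(deposit, pattern, assemble)
print(process([]))  # ['layer', 'pattern', 'component']
```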
### The downsides of modularization
Of course, [[All else held equal modularization makes systems less efficient]]. Modularity introduces inefficiencies because each component needs encapsulation and a ‘legible interface’ ([[Modularization requires legible interfaces]]). So the burden of proof is generally on the modular system to justify why it’s better than the monolithic one.
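A toy version of that argument in numbers (the 5% per-interface overhead is invented): if every legible interface taxes the system by a constant factor, the cost compounds with module count, and the modular system has to win that loss back elsewhere.

```python
# Toy arithmetic (overhead fraction invented) for the efficiency cost
# of modularity: each legible interface taxes the system multiplicatively.
def system_efficiency(n_modules, interface_overhead=0.05):
    """Monolithic system: n_modules = 1, no internal interfaces, efficiency 1.0.
    Modular system: n_modules - 1 internal interfaces, each costing 5%."""
    return (1 - interface_overhead) ** (n_modules - 1)

for n in (1, 2, 5, 10):
    print(f"{n:2d} modules -> relative efficiency {system_efficiency(n):.2f}")
# 1 -> 1.00, 2 -> 0.95, 5 -> 0.81, 10 -> 0.63
```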
### Related
* [[Designed proteins assemble antibodies into modular nanocages]]
* [[Modularization enables pieces to be used in many different technologies]]
[^1]: See [[fillerFundamentalManufacturingProcess2020]] for a good framework about how to think about process paradigms