# Optimizing individual components of a system is often at odds with the system itself
This is one of those (Talebian?) statements that are obvious on their face but explain many things and are often ignored in practice.
Argument by analogy is usually bullshit, but I think it may work here. In its simplest form the principle shows up when you optimize a single-variable function like f(x) = 1 - x^2: obviously, making x as big as possible does not make f(x) as big as possible. You can extend this to any number of variables. Then you can replace the variables with functions themselves. Then you can replace those functions with whatever you want: systems, unknown probability distributions, utility functions, etc. A small sketch of this is below.
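To make the single- and multi-variable versions concrete, here is a minimal Python sketch. The two-variable function g is my own toy example (not from the note); the point is just that cranking each input up individually does worse for the overall output than a balanced choice.

```python
# Single-variable case from the note: f(x) = 1 - x^2.
def f(x):
    """System output as a function of one component."""
    return 1 - x**2

print(f(0))    # 1.0   -> true maximum at x = 0
print(f(10))   # -99.0 -> component "maximized", system much worse off

# A toy two-component extension (my own example): the system rewards balance,
# not pushing each component to its individual extreme.
def g(x, y):
    return x + y - (x * y) ** 2

print(g(5, 5))  # -615.0 -> both components cranked up, output collapses
print(g(1, 1))  # 1.0    -> modest, balanced inputs do far better
```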
[[Goodhart's Law]] can be reframed in this light: if you maximize the variable you can measure, you end up working against the systemic output you actually care about.
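A hedged toy sketch of that Goodhart framing, with made-up functions standing in for the measurable proxy and the systemic output: pushing the proxy to its maximum drives the real output down.

```python
# Toy illustration (my own numbers, not from the note): the proxy rewards raw
# output, but the systemic output we care about falls off past a point.
def proxy_metric(effort):
    """What we can measure: scales linearly with effort."""
    return effort

def system_output(effort):
    """What we actually care about: declines once effort goes into gaming the metric."""
    return effort - 0.1 * effort**2

best_for_metric = max(range(21), key=proxy_metric)   # 20
best_for_system = max(range(21), key=system_output)  # 5

print(best_for_metric, system_output(best_for_metric))  # 20 -20.0
print(best_for_system, system_output(best_for_system))  # 5 2.5
```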
### Related
* [[Slack - concept]]
* [[alexanderStudiesSlack2020]]
* [[“The System” of AT&T]]
* [[Institutions are the second level of a group selection evolutionary system]]
* [[Efficiency is overrated]]
* [[You can think about institutions as systems]]
* [[Emergent behavior happens when a system has properties that none of its subsystems have]]
* [[Academia is not a good environment for systems engineering]]
[Web URL for this note](http://notes.benjaminreinhardt.com/Optimizing+individual+components+of+a+system+is+often+at+odds+with+the+system+itself)
[Comment on this note](http://via.hypothes.is/http://notes.benjaminreinhardt.com/Optimizing+individual+components+of+a+system+is+often+at+odds+with+the+system+itself)