The free-energy principle: a unified brain theory? (2010)
-
Biological systems aim to maintain homeostasis (i.e. they want to stay within a limited set of physiological and sensory states). Entropy is defined as the average surprise of outcomes sampled from a distribution; low entropy means the outcome of a sample is relatively predictable. A biological system maintaining homeostasis therefore occupies a low-entropy distribution over states: there are a few states it is in often and many states it is in only rarely.
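In symbols (standard information-theoretic definitions, not notation quoted from the paper), the surprise of an outcome is its negative log probability and entropy is the expected surprise:

$$ \text{surprise}(s) = -\ln p(s), \qquad H[p] = \mathbb{E}_{s \sim p}\big[-\ln p(s)\big]. $$

A distribution concentrated on a few homeostatic states has low H; a diffuse distribution over many states has high H.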
-
Biological agents aim to minimize the long-term average of surprise.
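This connects to the entropy note above: under the ergodicity assumption the paper makes, the long-term time average of surprise equals the entropy of sensory states, so minimizing one minimizes the other:

$$ H = \lim_{T \to \infty} \frac{1}{T} \int_0^T -\ln p\big(s(t)\big)\, dt. $$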
-
The long-term imperative of maintaining states within physiological bounds translates into minimizing short-term surprise (note: seems like a greedy approach).
-
Surprise is not just minimized in the state itself, but also in the movement between states. As a result, the system's states tend toward a global random attractor (i.e. a set of stable states that “self-correct” small random perturbations).
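A toy illustration of “self-correcting” dynamics (my example, not the paper's formalism): an Ornstein–Uhlenbeck-like process keeps relaxing back to its attracting state even after a large perturbation.

```python
import numpy as np

rng = np.random.default_rng(0)

mu, theta, sigma, dt = 0.0, 1.0, 0.1, 0.01  # attracting state, pull strength, noise, step size
x = np.zeros(3000)
for t in range(1, len(x)):
    # drift pulls the state back toward mu; noise perturbs it
    x[t] = x[t - 1] - theta * (x[t - 1] - mu) * dt + sigma * np.sqrt(dt) * rng.standard_normal()
    if t == 1500:
        x[t] += 2.0  # large external perturbation

print(f"just after perturbation: {x[1501]:.2f}")  # far from mu
print(f"long after perturbation: {x[-1]:.2f}")    # back near mu
```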
-
Free energy is an upper bound on surprise. While an agent can’t directly minimize surprise, it can minimize free energy, because free energy is a function of sensory states and a recognition density. The recognition density is a probabilistic representation of what causes a particular sensation.
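Why it is an upper bound (the standard variational identity; θ denotes the hidden causes and q the recognition density):

$$ F(s, q) = \mathbb{E}_{q(\vartheta)}\big[-\ln p(s, \vartheta)\big] - H[q(\vartheta)] = -\ln p(s) + D_{\mathrm{KL}}\big[q(\vartheta)\,\|\,p(\vartheta \mid s)\big] \;\ge\; -\ln p(s), $$

since the KL divergence is non-negative, with equality when the recognition density equals the true posterior over causes.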
-
Agents can suppress free energy by acting on the world (changing sensory input) and by changing their internal states (changing perception).
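The paper's two rearrangements of free energy make both routes explicit (paraphrased here):

$$ F = D_{\mathrm{KL}}\big[q(\vartheta)\,\|\,p(\vartheta \mid s)\big] - \ln p(s) $$
$$ F = D_{\mathrm{KL}}\big[q(\vartheta)\,\|\,p(\vartheta)\big] - \mathbb{E}_{q}\big[\ln p(s \mid \vartheta)\big] $$

Perception changes the recognition density q to shrink the divergence in the first line; action changes the sensations s to increase the accuracy term in the second line (free energy read as complexity minus accuracy).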
-
Free energy minimization requires agents to have a generative model of the world.
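A minimal numerical sketch of these pieces fitting together (my construction, not from the paper): a discrete generative model over one hidden cause and one sensation. Free energy, computed for several candidate recognition densities, is always at least the surprise, and touches it only when q equals the posterior over causes.

```python
import numpy as np

# Hypothetical generative model: p(cause) and p(sensation | cause), two causes, two sensations.
p_cause = np.array([0.7, 0.3])
p_sense_given_cause = np.array([[0.9, 0.1],   # p(s | cause=0)
                                [0.2, 0.8]])  # p(s | cause=1)

s = 1  # observed sensation
p_s = (p_cause * p_sense_given_cause[:, s]).sum()        # model evidence p(s)
surprise = -np.log(p_s)
posterior = p_cause * p_sense_given_cause[:, s] / p_s    # p(cause | s)

def free_energy(q):
    """F(s, q) = E_q[-ln p(s, cause)] - H[q]."""
    joint = p_cause * p_sense_given_cause[:, s]          # p(s, cause)
    return np.sum(q * (np.log(q) - np.log(joint)))

for q0 in [0.1, 0.5, posterior[0], 0.9]:
    q = np.array([q0, 1 - q0])
    print(f"q(cause=0)={q0:.2f}  F={free_energy(q):.3f}  surprise={surprise:.3f}")
# F >= surprise for every q, with equality at the posterior.
```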
-
Discussion of the Bayesian brain hypothesis: 1) hierarchy is important because it allows the establishment of priors, and 2) these priors are physically encoded in the brain (likely as sufficient statistics, e.g. the mean and stddev of a normal distribution).
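A sketch of what “encoding a density by its sufficient statistics” could look like (my illustration with an assumed Gaussian model, not the paper's scheme): prior θ ~ N(0, 1), likelihood s | θ ~ N(θ, 1), and a Gaussian recognition density whose mean and variance are the only quantities the agent stores and updates by descending free energy. They converge to the exact posterior statistics.

```python
import numpy as np

s = 1.6  # an observed sensation

# Recognition density q(theta) = N(mu, v), encoded only by its sufficient statistics.
mu, v = 0.0, 1.0

# For prior N(0,1) and likelihood N(theta,1), free energy up to an additive constant is
#   F(mu, v) = 0.5*mu**2 + 0.5*(s - mu)**2 + v - 0.5*ln(v)
for _ in range(2000):
    d_mu = mu - (s - mu)   # dF/dmu
    d_v = 1.0 - 0.5 / v    # dF/dv
    mu -= 0.01 * d_mu
    v -= 0.01 * d_v

print(f"learned mu={mu:.3f}, v={v:.3f}")          # approx s/2 and 0.5
print(f"exact posterior mean={s / 2:.3f}, variance=0.5")
```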
-
The Bayesian brain hypothesis ultimately views the brain as an inference engine that attempts to optimize probabilistic representations of what causes its sensory input. This view of brain function can be derived from the free-energy approach.
-
Stopped: principle of efficient coding section (p5)
-
Discussion: surprise is -log(p); entropy is the expected surprise. Modeling a process correctly minimizes the expected surprise (driving it down to the process’s entropy), but the surprise of individual outcomes is irreducible when the process is genuinely stochastic.
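A quick numerical check of this point (my example): for a biased coin, the expected surprise under any model (the cross-entropy) is minimized exactly when the model matches the true distribution, and even then it only drops to the true entropy, not to zero; a single unlikely outcome still carries large surprise.

```python
import numpy as np

p = 0.9  # true probability of heads for a biased coin

def expected_surprise(q):
    """Cross-entropy: average surprise when outcomes from p are scored by model q."""
    return -(p * np.log(q) + (1 - p) * np.log(1 - q))

for q in [0.5, 0.7, 0.9, 0.99]:
    print(f"model q={q:.2f}  expected surprise={expected_surprise(q):.3f}")

entropy = expected_surprise(p)  # minimum, attained when the model equals the truth
print(f"true entropy={entropy:.3f}")
print(f"surprise of a single tails outcome under the true model: {-np.log(1 - p):.3f}")
```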