Do Bayesian statistics rule the brain?

This week’s New Scientist has a fascinating article on a possible ‘grand theory’ of the brain suggesting that virtually all brain functions can be modelled with Bayesian statistics – an approach discovered by an 18th-century clergyman.

Bayesian statistics allow the degree of belief in a hypothesis to shift as new evidence is collected. This means the same piece of evidence can have a different influence on certainty, depending on how much other evidence has already been gathered.

In other words, it asks the question ‘what is the probability of the belief being true, given the data so far?’.
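To make that concrete, here’s a toy illustration (my own numbers, not from the article) of how the same piece of evidence nudges certainty by different amounts depending on how strong the belief was to begin with:

    def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
        """Return P(hypothesis | evidence) via Bayes' rule."""
        numerator = p_evidence_given_h * prior
        marginal = numerator + p_evidence_given_not_h * (1 - prior)
        return numerator / marginal

    # The same evidence (twice as likely if the hypothesis is true) applied
    # to three different starting levels of belief.
    for prior in (0.1, 0.5, 0.9):
        posterior = bayes_update(prior, 0.8, 0.4)
        print(f"prior {prior:.1f} -> posterior {posterior:.2f}")
    # prior 0.1 -> 0.18, prior 0.5 -> 0.67, prior 0.9 -> 0.95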

The NewSci article looks at the work of neuroscientist Karl Friston, who increasingly believes that, from the level of single neurons to the level of circuits, the brain operates as if it uses Bayesian statistics.

The essential idea is that the brain builds models upon which it bases predictions, and that these models and predictions are updated in a Bayesian-like way as new information becomes available.

Over the past decade, neuroscientists have found that real brains seem to work in this way. In perception and learning experiments, for example, people tend to make estimates – of the location or speed of a moving object, say – in a way that fits with Bayesian probability theory. There’s also evidence that the brain makes internal predictions and updates them in a Bayesian manner. When you listen to someone talking, for example, your brain isn’t simply receiving information, it also predicts what it expects to hear and constantly revises its predictions based on what information comes next. These predictions strongly influence what you actually hear, allowing you, for instance, to make sense of distorted or partially obscured speech.
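A minimal sketch of the kind of calculation those perception experiments test against: combining a prior expectation about an object’s location with a noisy sensory measurement, each weighted by its reliability (the numbers are invented for illustration):

    def combine_gaussians(mu_prior, var_prior, mu_obs, var_obs):
        """Bayes-optimal fusion of two Gaussian estimates: each source is
        weighted by its precision (1 / variance)."""
        precision = 1 / var_prior + 1 / var_obs
        mu_post = (mu_prior / var_prior + mu_obs / var_obs) / precision
        return mu_post, 1 / precision

    # Prior expectation: object near 0 degrees, fairly uncertain (variance 4).
    # Sensory measurement: 10 degrees, quite reliable (variance 1).
    mu, var = combine_gaussians(0.0, 4.0, 10.0, 1.0)
    print(mu, var)  # 8.0, 0.8: the estimate sits nearer the reliable cue

In cue-combination experiments, people’s estimates tend to fall close to this precision-weighted average, which is what makes the Bayesian account testable.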

In fact, making predictions and re-evaluating them seems to be a universal feature of the brain. At all times your brain is weighing its inputs and comparing them with internal predictions in order to make sense of the world. “It’s a general computational principle that can explain how the brain handles problems ranging from low-level perception to high-level cognition,” says Alex Pouget, a computational neuroscientist at the University of Rochester in New York.
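One way to picture that input-versus-prediction loop is a simple error-correction update, where the current prediction is nudged towards each new input by a fixed amount (a generic sketch of the idea, not Friston’s actual model):

    prediction = 0.0     # the brain's current guess about some quantity
    weight = 0.3         # how much a new input counts against the prediction

    for sensory_input in [2.0, 2.5, 1.8, 2.2, 2.1]:
        prediction_error = sensory_input - prediction
        prediction += weight * prediction_error   # revise towards the input
        print(f"input {sensory_input:.1f} -> revised prediction {prediction:.2f}")

In a fuller Bayesian treatment the weight itself would depend on how reliable the input is relative to the prediction, as in the cue-combination example above.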

Friston is renowned for having a solid grasp of both high-level neuroscience and statistics. In fact, he was the original creator of SPM, probably the most popular tool for the statistical analysis of brain scan data.

Needless to say, his ideas have been quite influential and ‘Bayesian fever’ has swept the research centre where he works.

I was interested to see that his colleague, neuroscientist Chris Frith, has applied the idea to psychopathology and will be arguing, in an upcoming lecture in London, that delusions and hallucinations can both be understood as a breakdown of Bayesian inference.

This edition of NewSci also has a great article on how cosmic rays affect the brains of astronauts, so it’s well worth a look.

Link to NewSci article ‘Is this a unified theory of the brain?’.
Link to article ‘Space particles play with the mind’.

6 thoughts on “Do Bayesian statistics rule the brain?”

  1. If we’re talking about Bayesian stats and the brain, I should mention Jeff Hawkins’ book ‘On Intelligence’. His HTM (hierarchical temporal memory) model is essentially Bayesian networks with two enhancements: 1) time (à la dynamic Bayesian networks) and 2) hierarchical models. It’s a good pop-sci introduction to the subject.

  2. I’ve never really understood Bayes fever. I can’t see that the framework gives us scope to make any more detailed general prediction than “there will be both top-down and bottom-up elements involved in perception”. For specific tasks it’s nice to have an optimality theory, since it gives you a standard for comparison, but the way the brain works in any specific domain is always going to be a matter for investigation.

  3. Anne K. Churchland (Patricia and Paul Churchland’s daughter; both of them are distinguished [neuro]philosophers) has published an article in “Nature Neuroscience” about how the neurobiology of decision making is consonant with a probabilistic view of brain function.
    It is a current theme within the neurobiology and cognitive neuroscience community to see the brain as a Bayesian machine.
    Link:
    http://www.nature.com/neuro/journal/v11/n6/abs/nn.2123.html

  4. Thanks for the link Anibal, an interesting paper. It doesn’t actually mention Bayes directly of course, which would kind of support my point…

  5. This seems like a very interesting theory. Certainly, when faced with little visual information, we instinctively perform a saccade (eye movement) towards the stimulus to gather more visual information, but I don’t think it follows that this can be extended to a grand theory of the brain.
    For example, consider what happens when we display heuristic biases or confirmation bias. We can be faced with a problem that causes cognitive dissonance by presenting us with information which conflicts with our prior knowledge of a given situation (perhaps cognitive dissonance is analogous to the prediction error in Friston’s theory).
    However, if we then use a heuristic to “solve” the problem, or display a confirmation bias, we are doing the exact OPPOSITE of what Friston’s theory would predict. In this case, we reject evidence which disagrees with our prior knowledge and predictions, which also serves to reduce the unpleasant cognitive dissonance.
    This means there are two ways to reduce cognitive dissonance or prediction errors:
    1) Accumulate new information and change our predictions or knowledge
    2) Ignore the new information and continue to use our old predictions and knowledge, producing stereotypes, irrational beliefs and stubbornness
    Although as scientists we try to use option 1 the majority of the time, as social scientists we are aware that many people use option 2 quite a lot of the time.
    I would suggest that if Friston’s theory held for “higher” cognitive functions such as Reasoning, then we would be far more rational human beings than is in fact the case.
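    To make the contrast between the two options concrete, here is a toy sketch (my own, not from the comment or the article) of an ideal Bayesian reviser versus an updater that discounts evidence conflicting with its existing belief:

        def bayes_update(prior, likelihood_ratio):
            """Option 1: posterior odds = prior odds * likelihood ratio."""
            odds = prior / (1 - prior) * likelihood_ratio
            return odds / (1 + odds)

        def biased_update(prior, likelihood_ratio, discount=0.9):
            """Option 2: evidence pointing against the current belief is mostly ignored."""
            if (likelihood_ratio < 1) == (prior > 0.5):   # evidence opposes the belief
                likelihood_ratio **= (1 - discount)       # shrink its impact
            odds = prior / (1 - prior) * likelihood_ratio
            return odds / (1 + odds)

        belief = 0.8  # strong prior belief; the evidence is 4x likelier if the belief is false
        print(bayes_update(belief, likelihood_ratio=0.25))   # ~0.50: belief revised down
        print(biased_update(belief, likelihood_ratio=0.25))  # ~0.78: belief barely moves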

  6. In late 2007 I patented a technology that uses standard binary gates, and uses feedback to change the input parameters in a similar manner to the way STDP functions in biological synapses. The synapses are dynamic and the output is integrated over time in a neuron. A single neuron with 20 synapses requires just over 4200 gates. Over the last couple of years I have been fine-tuning this system and have developed a hierarchy of such devices in FPGA. I have been looking at intensity as a second learning mechanism. Having read these papers I consider that STDP or BCM learning may be too limited – that feedback from higher regions in the hierarchy needs to be considered in the equation as well.
    BTW. ‘On Intelligence’ also introduces the idea of prediction, through advance triggering of columns before sensory perception arrives.
    On the subject of stubbornness: I believe that the brain forms ‘belief systems’ that feed back into high-level sensory perception and modify how that data is handled. Eyewitness accounts, which vary with experience and with a person’s belief systems, appear to support that statement.
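    For readers unfamiliar with STDP (spike-timing-dependent plasticity), the rule the comment refers to strengthens a synapse when the presynaptic spike arrives shortly before the postsynaptic spike and weakens it otherwise. A textbook pair-based version looks roughly like this (a generic sketch, not the commenter’s patented circuit):

        import math

        def stdp_weight_change(dt_ms, a_plus=0.01, a_minus=0.012, tau_ms=20.0):
            """Pair-based STDP. dt_ms = t_post - t_pre in milliseconds:
            pre-before-post (dt > 0) potentiates, post-before-pre depresses."""
            if dt_ms > 0:
                return a_plus * math.exp(-dt_ms / tau_ms)
            return -a_minus * math.exp(dt_ms / tau_ms)

        for dt in (5.0, 20.0, -5.0, -20.0):
            print(dt, round(stdp_weight_change(dt), 5))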
