Our brains construct mental maps of the environment from the experiences of our senses. This allows us to orient ourselves, remember where something happened, and plan where to go next. In our new publication in Nature Communications, we developed a new computer model that lets us track in fine detail how the brain orients itself in space and forms memories. We show that newly formed memories affect how we perceive the world around us: the more familiar our environment is, the less new information needs to be integrated. This is directly reflected in our brain activity, and can now be measured!
The remarkable ability of our brain to represent maps of the environment and to retrieve them when needed requires multiple complex neural computations. Many of them are implemented along a neural pathway spanning from early visual cortices up to the higher-level memory hubs in our brain. We believe that understanding how this pathway guides behavior requires studying the activity of each area in great detail while simultaneously taking the complex network structure of our brains into account. How do we solve this scientific challenge?
In other domains of neuroscience, a specific class of predictive models called ‘encoding models’ has revealed how sensory stimuli are represented in the human brain. These models typically use information about the stimulus (e.g. a picture) to predict brain activity.
Once trained, these models can predict brain activity even for previously unseen stimuli. In our new paper, we developed this approach further, not to study how stimuli are represented, but to link brain activity directly to the behavior of our participants. Specifically, we were interested in how the human brain maps the environment during a natural and hence very complex behavior: spatial navigation.
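To give a sense of the idea, here is a minimal sketch of an encoding model: a linear mapping fitted from stimulus features to the activity of many voxels, which can then predict responses to new stimuli. This is an illustration only, not the analysis pipeline of the paper; all data, dimensions, and variable names below are made up.

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Hypothetical data: 200 fMRI time points, 10 stimulus features, 50 voxels.
n_time, n_features, n_voxels = 200, 10, 50
X_train = rng.standard_normal((n_time, n_features))    # stimulus features
true_w = rng.standard_normal((n_features, n_voxels))   # unknown voxel tuning
Y_train = X_train @ true_w + 0.1 * rng.standard_normal((n_time, n_voxels))

# Fit one linear encoding model per voxel (Ridge supports multi-output).
model = Ridge(alpha=1.0).fit(X_train, Y_train)

# Once trained, the model predicts activity for previously unseen stimuli.
X_test = rng.standard_normal((20, n_features))
Y_pred = model.predict(X_test)
print(Y_pred.shape)  # predicted activity: (time points, voxels)
```

Comparing such out-of-sample predictions to the measured activity is what tells us whether, and how well, a brain region encodes the features in question.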
To obtain the desired broad perspective on the brain, we used 7-Tesla functional magnetic resonance imaging (7T-fMRI), a neuroimaging technique that can index whole-brain activity via blood flow. Because participants need to lie in an MR scanner, we used virtual reality to simulate navigation, similar to a computer game. Participants navigated a virtual arena using a keyboard and memorized the locations of hidden objects. To understand how our participants oriented themselves in this task, we then analyzed how their brain activity tracked the direction they faced in the virtual environment at every moment in time.
To do so, we built an encoding model of their facing direction to estimate the influence of direction on neural activity in each part of the brain. We then used this model to predict how activity unfolded in new data from the same participant performing the same task. Importantly, our encoding model simulated not one but many possible versions of how precisely a brain area might represent direction. By finding which version predicted the new data best, we were able to map the fine-grained details of this directional code across the brain.
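The logic of testing many versions of the directional code can be sketched as follows: build several sets of direction-tuned basis functions, each with a different tuning width, fit each version to training data, and keep the one that best predicts held-out data. This is a simplified illustration under assumptions of our own, not the authors' actual implementation; the basis shape (von Mises), the candidate widths, and the simulated voxel are all invented for the example.

```python
import numpy as np

def direction_design_matrix(angles_deg, kappa, n_basis=8):
    """Circular (von Mises) basis set over facing direction.

    Larger kappa means narrower basis functions, i.e. a finer
    hypothesized directional code.
    """
    centers = np.deg2rad(np.linspace(0, 360, n_basis, endpoint=False))
    theta = np.deg2rad(np.asarray(angles_deg))[:, None]
    return np.exp(kappa * (np.cos(theta - centers) - 1.0))

rng = np.random.default_rng(1)
train_dirs = rng.uniform(0, 360, 300)  # facing direction per training volume
test_dirs = rng.uniform(0, 360, 100)   # facing direction per held-out volume

# Simulate one voxel narrowly tuned to directions around 90 degrees.
def voxel_response(d):
    return np.exp(4.0 * (np.cos(np.deg2rad(d) - np.pi / 2) - 1.0))

y_train = voxel_response(train_dirs) + 0.05 * rng.standard_normal(300)
y_test = voxel_response(test_dirs) + 0.05 * rng.standard_normal(100)

# Fit one model per candidate tuning width; score each on held-out data.
scores = {}
for kappa in [0.5, 1.0, 2.0, 4.0, 8.0]:
    X_tr = direction_design_matrix(train_dirs, kappa)
    w, *_ = np.linalg.lstsq(X_tr, y_train, rcond=None)
    y_hat = direction_design_matrix(test_dirs, kappa) @ w
    scores[kappa] = np.corrcoef(y_hat, y_test)[0, 1]

# The best-predicting width characterizes the voxel's directional code.
best_kappa = max(scores, key=scores.get)
print(best_kappa)
```

Repeating this selection for every voxel yields a whole-brain map of how precisely each region represents direction, which is the spirit of the approach described above.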
This novel analysis method made it possible to measure the human perception of direction with unprecedented detail. We not only observed increased activity in regions that filter spatial information from visual input, but also found signals, particularly in higher-level memory regions, that indicated how well participants remembered the locations of the objects they were searching for.
The results suggest that the process of encoding the world and the objects in it, i.e. cognitive mapping, influences how the entire network of brain regions processes the information we are currently deriving from our environment. The interaction of visual and memory-forming brain areas is thus much more directly related to memory-guided behavior than was previously known. We typically think of memories as discrete events from our past that we can call to mind. But why do we have memories in the first place? They enable us to learn from our past and to adapt our behavior accordingly in the future. Here, we show that our memories have a direct influence on how we perceive the world around us, and that this interplay between perception and memory guides our behavior in everyday life.
We are making the newly developed method available to the scientific community as an open-access analysis tool so that it can also be applied to other data, such as electroencephalography (EEG), or even to animal models. In this way, the neuronal processes involved in memory formation and spatial navigation can be investigated even more comprehensively.
Read more about this in the published article:
Matthias Nau, Tobias Navarro Schröder, Markus Frey, Christian F. Doeller (2020). Nature Communications, https://doi.org/10.1038/s41467-020-17000-2
Sample code for the behavioral encoding model can be found at Open Science Framework: https://osf.io/j5q9u/