Projects

Research projects

Full-body signatures of somatosensory decisions

2026-2029
Funding: Fonds zur Förderung der wissenschaftlichen Forschung (FWF)
Principal investigator: Tobias Heed
Project members: Sabrina Nasri-Roudsari

Project description

Everyday actions often appear simple, but they involve complex and constantly changing decisions and behavior. Imagine gathering ingredients from all over your kitchen to bake a cake: which one do you reach for first, and how do you move about? Psychology and neuroscience have long studied how we plan and control movements, but most of this research happens under highly artificial laboratory conditions: sitting still, moving only a hand, or reacting to images on a screen. Real life is messier. It's fast-changing, often unpredictable, and involves the whole body.

This project investigates how the entire body expresses decision-making in motion. When we move before we fully know our goal, such as walking toward the kitchen shelf to pick up several ingredients, we must plan, adjust, and coordinate many potential actions at once. Our research asks: Do our bodies show traces of this decision process in our posture and in the way we move our arms and legs? Can we see decisions develop over time as we approach and interact with real objects?

Wireless motion-tracking and brain-recording technologies will allow research participants to move freely in space. We will design interactive, wirelessly controlled objects equipped with sensors and lights; these objects can prompt participants to interact with them, creating ambiguous contexts that can change from moment to moment. With this setup, we can record detailed full-body movements and brain activity while people perform natural actions such as walking, reaching, and grasping. To master the technical challenges and integrate scientific perspectives, the project brings together researchers from the University of Salzburg, the Mozarteum University Salzburg, and Simon Fraser University (Vancouver, Canada).

Understanding how the body expresses decisions helps reveal how we adapt so effortlessly to changing environments. We combine psychology, neuroscience, and technology to bridge the gap between controlled laboratory experiments and the rich complexity of everyday behavior. The results will help us understand how the brain integrates perception, decision, and movement in real-world situations. We hope these insights will inspire new approaches in rehabilitation, robotics, and brain-computer interfaces.

Assessment of parallel movement plans by probing spatial attention

2019-2024
Funding: German Research Foundation (DFG)
Principal investigators: Christian Seegelke, Tobias Heed
Project members: Carolin Schonard

Project description

Our environment constantly presents us with multiple opportunities and demands for action. At any given moment, we must select one of all possible actions and specify the corresponding movement parameters. Evidence suggests that the brain prepares multiple actions in parallel, and that action selection results from continuous competition embedded in bottom-up sensorimotor processing that is biased by top-down, decision-relevant information.
However, there is ongoing debate about which aspects of actions are actually represented in parallel. Reaching trajectories of human participants usually reflect an average of the direct trajectories to several currently relevant targets; this aspect of motor behavior has been interpreted as indicating that the executed movement results from averaging of individual movement plans. This interpretation thus assumes that all aspects of the different actions are represented in parallel. However, averaging behavior appears to sometimes depend on strategic considerations. For instance, trajectory averaging is abandoned when targets are far apart, or when movements have to be executed very fast; this strategic change in behavior improves participants' overall performance. The use of such strategies has been taken to indicate that movement trajectories are not averaged by default. Instead, only a single detailed movement plan would be computed, and this single plan could be derived as an average when doing so is strategically advantageous. Thus, this theoretical stance assumes parallel representation of final movement goals, but not of movement plans that specify aspects such as the movement's trajectory. The central challenge in resolving this debate is to find convincing measures that indicate whether several movement plans (as opposed to just goals) are concurrently active. We propose here a series of experiments that address this challenge.
To this end, we employ the well-established relationship between movement goals and attentional deployment. Attention shifts toward one or several sequential motor goals even before movement initiation, as reflected in enhanced perceptual discrimination performance at target locations compared with irrelevant ones. We extend this experimental approach to probe the parallel representation of multiple motor goals for hand reaches. Critically, we probe attentional deployment to trajectory-defining locations, such as regions near obstacles, along potentially relevant movement trajectories. As a second approach, we induce multiple potentially relevant trajectories through a motor adaptation paradigm. Finally, we address top-down aspects of motor plan selection to elucidate how such aspects affect the averaging of bottom-up sensory information. Together, the proposed experiments will provide substantive and cogent evidence about which levels of movement planning involve parallel representation.

Dynamic coding of tactile-to-motor transformation in human and macaque posterior parietal cortex

2019-2022
Funding: German Research Foundation (DFG) / French National Research Agency (ANR)
Principal investigators: Tobias Heed, Suliann Ben Hamed
Project members: Celia Foster

Project description

Posterior parietal cortex (PPC) is a central structure for sensorimotor transformation. Yet its contribution to planning movements toward one's own body remains unclear. This project investigates the implementation of goal-directed tactile-motor processing in human and non-human primate (macaque) PPC. It aims to (i) identify the parietal regions involved, (ii) elucidate the spatial codes they use, and (iii) characterize the dynamics within and between the regions that transform tactile information from skin to space, depending on the effector executing the motor response. The three key approaches are (a) to devise homologous, directly linkable experiments across the two species and across different methods (fMRI, behavior, neurophysiology); (b) to investigate tactile behavior across two effector systems (saccades, hand reaching) to identify common and specialized processing mechanisms; and (c) to complement these common experiments with human-specific research where directly comparable paradigms are not feasible. The project's overarching hypotheses are that (1) common principles underlie human and macaque tactually guided motor planning across all effector systems, (2) posterior regions currently associated with eye-centered motor planning, such as macaque LIP/MIP, more generally code all sensory information in an eye-centered code, (3) anterior regions associated with self-motion and body representation, such as macaque VIP and SPL, more generally code all sensory information in a skin- or body-centered code, and (4) all regions dynamically recode spatial information from a sensory code to a motor goal-related code. The project aims to extend current visuomotor control concepts into the tactile domain as a first step toward incorporating information about the body and self into sensorimotor control, and to offer new perspectives on the organizing principles underlying the functional and regional organization of PPC.