Step-Synced Vision: Walking Rhythm Alters Visual Perception

Summary: Researchers discovered a novel link between walking rhythms and changes in visual perception, demonstrating that our visual sensitivity fluctuates in sync with our steps. This study found that during the swing phase of a step, our perception sharpens and reaction times improve, whereas footfalls lead to a dulling of visual sensitivity.

Using virtual reality and motion tracking, this research not only challenges previous understandings of constant visual perception but also suggests our brains sample the world in a strobe-like rhythm that aligns with our walking pace.

This groundbreaking finding could have significant implications for diagnosing neurological disorders, optimizing athletic performance, and understanding the fundamental nature of human perception and movement.

Key Facts:

  1. Perception Aligned with Step Cycle: The study reveals that the brain’s sampling of visual information aligns with the walking cycle, showing improved visual sensitivity during the swing phase and reduced sensitivity during footfalls.
  2. Innovative Use of Technology: The use of virtual reality headsets and motion tracking enabled the continuous and precise sampling of visual perception during walking, a method that was not possible until now.
  3. Potential for Diagnostic and Performance Applications: This research opens avenues for developing diagnostic tools for neuro-muscular and psychiatric conditions and could inform strategies to enhance athletic decision-making and reaction times.

Source: University of Sydney

For the first time, neuroscientists have established a link between shifts in our visual perception and the cadence of our steps while walking.

The research, published in Nature Communications, shows that the brain processes vision in a rhythmic manner, rising and falling in sensitivity in a cycle that corresponds to the rhythm of our steps. When swinging from one step to the next, human perception is good and reactions fast.

During footfall, however, our vision is not as sharp and reactions are slowed.

Lead author Dr Matthew Davidson from the School of Psychology at the University of Sydney said: “This work reveals a previously unknown relationship between perception and movement. It bridges a gap between experimental psychology and our natural, everyday behaviour.”

The study also supports the view that the visual brain senses the environment in a strobe-like way: our perception takes regular samples of the world before stitching them together to create our seamless experience.

The new finding that these perceptual shifts are locked to our steps has important implications for understanding human behaviour, how we interact with our environment and how we make decisions.

The work was conducted by Dr Matthew Davidson with colleagues Professor David Alais and Professor Frans Verstraten in the School of Psychology, University of Sydney.

Dr Davidson said: “We are consciously aware of a seamless stream of vision but this is deceptive. I use the analogy of a duck swimming on a pond. Beneath the smooth motion on the surface there is a lot of cyclical activity beneath.”

This study extends earlier work from the same lab showing that perception of vision and sound is cyclic, with our brain taking around eight samples per second.

Professor Alais said: “The critical new finding in this study is that these oscillations in the brain’s sampling of the world slow down when walking to match the step cycle.

“Humans take about two steps per second when walking and generally keep to a consistent rhythm. The reported oscillations in visual sensitivity also occur at about two cycles per second and are locked to the step cycle. In some participants these rhythmic oscillations occur at four cycles per second but these were also locked to the step cycle.”
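The rhythm Professor Alais describes can be illustrated with a small simulation. The sketch below is purely hypothetical (it is not the study's analysis code): it generates a detection-accuracy time series that oscillates at 2 Hz, the rate of a walker taking two steps per second, adds trial-to-trial noise, and recovers the dominant frequency with a Fourier transform. The sampling rate, noise level and modulation depth are all assumptions chosen for the demonstration.

```python
import numpy as np

# Hypothetical illustration: simulate visual detection accuracy that
# rises and falls at ~2 Hz, matching a step rate of two steps per second,
# then recover that rhythm from the noisy series with an FFT.
fs = 60.0                      # assumed sampling rate (Hz)
t = np.arange(0, 30, 1 / fs)   # 30 seconds of simulated walking
step_rate = 2.0                # steps per second

# Accuracy swings between a "good" phase (swing) and a "bad" phase (footfall).
accuracy = 0.75 + 0.10 * np.sin(2 * np.pi * step_rate * t)
accuracy += np.random.default_rng(0).normal(0, 0.02, t.size)  # trial noise

# Find the dominant oscillation frequency in the demeaned series.
spectrum = np.abs(np.fft.rfft(accuracy - accuracy.mean()))
freqs = np.fft.rfftfreq(t.size, d=1 / fs)
peak_hz = freqs[np.argmax(spectrum)]
print(f"dominant modulation: {peak_hz:.1f} Hz")  # peaks at the 2 Hz step rate
```

Because the 30-second window holds a whole number of 2 Hz cycles, the spectral peak lands cleanly on the step rate despite the added noise.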

This work marks the first time that visual perception has been finely and continuously sampled during walking. Without virtual reality headsets and motion tracking it would not have been possible.

Dr Davidson said: “Thanks to VR technology we have discovered that our vision moves through a good and a bad phase on every step.”

It is unclear why our brain’s perceptual processes are so closely linked to walking.

Professor Alais said: “One possible explanation is that vision becomes secondary to motor control while your foot is grounded and the next step is planned. Once you are in the swing phase between footfalls, the brain switches back to prioritising perceptual sampling of the world, creating an ongoing perceptual rhythm that harmonises with your step rate.”

The findings open questions that the research team plans to pursue in further studies. For example, does perception of sound and touch also modulate as we walk? And what about neural activity?

Dr Davidson said: “An obvious question is whether these oscillations in perception are more pronounced in the elderly given difficulties with balance and coordination as we age.

“It also raises the exciting possibility that we could develop cheap and easy diagnostic tests using VR headsets, or use this information to develop tests for early onset of neuro-muscular disorders or some psychiatric illnesses, which can manifest in abnormal gaits.”

He said the work could also inform research in sports science, exploring whether the findings could be used to optimise decision-making and reaction times in athletes.

Underlying all this research remains a persistent mystery. If the world is sampled by our brain’s rhythmic pulses, why is our conscious perception so seamlessly smooth?

Professor Verstraten said: “This was once a question for philosophers, but with access to technology neuroscientists have been able to shed light on how the gaps get filled in. The current view is that the brain is a predictive machine that actively constructs perception and predicts what ought to be there and fills in the blanks. But clearly, we need more research to deepen our understanding.”

Methods

Researchers tracked 45 subjects walking back and forth along a 10-metre path in a virtual environment. During each walk (lasting about 9 seconds), subjects were required to respond to between zero and eight random visual stimuli. The same stimuli were also presented in stationary trials. Eye and head movement was tracked along with gait and walking information.
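The core of this design can be sketched as binning each stimulus by where it fell in the step cycle and comparing hit rates across phase bins. The code below is a minimal hypothetical sketch of that logic, not the authors' pipeline: the trial counts, phase convention (0 = footfall, 0.5 = mid-swing) and the assumed detection profile are all invented for illustration.

```python
import numpy as np

# Hypothetical sketch of phase-binned detection analysis: each visual
# stimulus is tagged with its phase in the step cycle (0 = footfall,
# 0.5 = mid-swing), and hit rates are aggregated per phase bin.
rng = np.random.default_rng(1)
n_trials = 5000
phase = rng.uniform(0, 1, n_trials)        # stimulus phase within the step

# Assumed ground truth: detection is best mid-swing, worst at footfall.
p_hit = 0.65 - 0.15 * np.cos(2 * np.pi * phase)
hits = rng.random(n_trials) < p_hit

# Aggregate hit rate in 8 phase bins across the step cycle.
edges = np.linspace(0, 1, 9)
idx = np.digitize(phase, edges) - 1
hit_rate = np.array([hits[idx == b].mean() for b in range(8)])
print("hit rate by step phase:", np.round(hit_rate, 2))
```

Under these assumptions, the middle bins (mid-swing) show markedly higher hit rates than the bins adjacent to footfall, mirroring the pattern the study reports.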

Of the 45 subjects, seven yielded insufficient data. Across the remaining 38 subjects, reduced perception at footfall was recorded 83 percent of the time.

The behavioural data generated in this study have been deposited in a public database at https://osf.io/8djtq/

About this visual perception and neuroscience research news

Author: Marcus Strom
Source: University of Sydney
Contact: Marcus Strom – University of Sydney
Image: The image is credited to Neuroscience News

Original Research: Open access.
“Walking modulates visual detection performance according to stride-cycle phase” by Matthew Davidson et al. Nature Communications


Abstract

Walking modulates visual detection performance according to stride-cycle phase

Walking is among our most frequent and natural of voluntary behaviours, yet the consequences of locomotion upon perceptual and cognitive function remain largely unknown.

Recent work has highlighted that although walking feels smooth and continuous, critical phases exist within each step for the successful coordination of perceptual and motor function.

Here, we test whether these phasic demands impact upon visual perception, by assessing performance in a visual detection task during natural unencumbered walking. We finely sample visual performance over the stride cycle as participants walk along a smooth linear path at a comfortable speed in a wireless virtual reality environment.

At the group-level, accuracy, reaction times, and response likelihood show strong oscillations, modulating at approximately 2 cycles per stride (~2 Hz) with a marked phase of optimal performance aligned with the swing phase of each step.

At the participant level, Bayesian inference of population prevalence reveals highly prevalent oscillations in visual detection performance that cluster in two idiosyncratic frequency ranges (2 or 4 cycles per stride), with a strong phase alignment across participants.
