Spot N Sync
According to a classical control-theoretic perspective on self-movement, our brains pilot our bodies to achieve goals. On this view, synchronizing self-motion with a moving target is a straightforward control problem: the error between body and target is minimized by some combination of responsive (feedback) and predictive (feedforward) control. However, not all self-movement can be described in terms of goals. A second, more subtle driver of movement is the reconciliation of multi-sensory prediction errors: when subjected to visual-proprioceptive conflict, our bodies respond spontaneously and unconsciously to reduce or resolve it.
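The classical view can be sketched in a few lines of code. This is a minimal, hypothetical illustration, not our model: a one-dimensional tracker combines a predictive (feedforward) term that extrapolates the target's motion with a responsive (feedback) correction toward it. The gains `k_fb` and `k_ff` are illustrative names, not parameters from our research.

```python
# Hypothetical sketch of the classical control view: tracking a moving
# target with a mix of predictive (feedforward) and responsive (feedback)
# control. All names and gains are illustrative.

def track(targets, k_fb=0.4, k_ff=1.0):
    """Return the body's position over time while tracking `targets`."""
    x = 0.0
    prev_target = targets[0]
    trajectory = []
    for target in targets:
        # predictive component: extrapolate where the target will be next
        predicted = target + k_ff * (target - prev_target)
        # responsive component: close a fraction of the remaining error
        x += k_fb * (predicted - x)
        prev_target = target
        trajectory.append(x)
    return trajectory
```

For a stationary target, the feedback term alone drives the error geometrically toward zero; the feedforward term only matters once the target moves.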
These two drivers of movement are unified within the theoretical framework of Active Inference. This theory proposes that self-movement is always in service of resolving or avoiding prediction error, and that the intent to move is best understood as a prediction of a certain proprioceptive experience that is spontaneously fulfilled through self-motion.
Active Inference offers an elegant model of synchronization of, for example, a hand movement with a visual target: we predict that our experienced hand motion will align with the target and with the image of our own hand, and our resulting movement unfolds to minimize a weighted sum of the prediction errors from these two sources. This perspective on movement may offer us new insights into movement-related neurodiversity.
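The weighted-sum idea above can be made concrete with a toy sketch. Under the stated assumption that movement descends a weighted sum of squared prediction errors (one visual, one proprioceptive), a single gradient step per frame already produces target-following behavior. The weights `w_vis` and `w_prop` and the step size are illustrative, not values from our studies.

```python
# Toy sketch of active-inference-style synchronization in one dimension:
# hand position x moves to reduce a weighted sum of squared prediction
# errors, E = w_vis * (x - target)**2 + w_prop * (x - predicted_hand)**2.
# Parameter names and values are hypothetical.

def synchronize(targets, w_vis=0.7, w_prop=0.3, step=0.5):
    """Return hand positions produced by gradient descent on E per frame."""
    x = 0.0
    trajectory = []
    for target in targets:
        predicted_hand = x  # proprioceptive prediction: hand where it last was
        grad = 2 * w_vis * (x - target) + 2 * w_prop * (x - predicted_hand)
        x -= step * grad  # move to reduce the weighted prediction error
        trajectory.append(x)
    return trajectory
```

Raising `w_vis` relative to `w_prop` makes the hand chase the visual target more aggressively; the precision weighting is exactly the knob the prediction-error-sensitivity hypothesis concerns.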
It has been suggested that people with autism make proprioceptive judgments that are less affected by visual-proprioceptive conflict. Separately, it has been demonstrated that people with autism are less responsive to timing perturbations when synchronizing finger taps with an auditory pacing sequence. Both findings could follow from reduced sensitivity to sensory prediction errors. We aim to explore this hypothesis using a visual synchronization task in VR.
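One standard way to quantify responsiveness to timing perturbations is a linear phase-correction model of sensorimotor synchronization, in which each tap corrects a fraction alpha of the previous tap-to-tone asynchrony. The sketch below is illustrative only (names and values are assumptions): a smaller alpha, one possible signature of reduced sensitivity to prediction errors, yields slower recovery from a perturbation in the pacing sequence.

```python
import random

# Minimal linear phase-correction sketch for paced finger tapping:
# asynchrony a_n+1 = (1 - alpha) * a_n + perturbation + motor noise.
# alpha is the phase-correction gain; all values here are illustrative.

def simulate_tapping(alpha, n_taps=50, perturb_at=10, perturb_ms=50.0,
                     noise_sd=0.0):
    """Return tap-to-tone asynchronies (ms), with one timing perturbation."""
    asynchronies = [0.0]
    a = 0.0
    for n in range(1, n_taps):
        shift = perturb_ms if n == perturb_at else 0.0
        # correct a fraction alpha of the previous asynchrony each tap
        a = (1 - alpha) * a + shift + random.gauss(0.0, noise_sd)
        asynchronies.append(a)
    return asynchronies
```

Comparing, say, `alpha = 0.5` with `alpha = 0.1` shows the perturbation decaying within a few taps in the first case and lingering for dozens in the second, which is the qualitative pattern the reduced-responsiveness finding describes.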
Through our research we hope to understand:
- How perception and action couple in real time during this dynamic, changing task
- Whether participants' prediction and adaptation improve with practice
- Whether task performance improves, and whether gains are retained and transfer
Visual Perception and Motor Execution in VR
Research on motor impairments in individuals with ASD has grown in recent years. These motor impairments are thought to be caused by difficulties in processing global motion perception, which lead to inefficient sensorimotor integration and difficulty engaging in dynamic everyday activities, from catching a ball to driving a car.
We are designing a simple dynamic movement task with varying visual and spatial complexity in an immersive virtual reality environment, to assess visual and motor function in children with ASD.
Loom is an experiment that connects physiological behavior to motivation during cooperation between individuals on the Autism Spectrum and another player. We've designed the experiment as a two-player cooperation game in which players replicate a patterned, colored block model, but each player can interact with only one specific type of block. We use the VIVE Pro Eye headset to measure eye movements and pupillometry.
We recognize that a multiplayer network is an essential next step for cooperative testing. Our goal is to establish a fast network for the Multiplayer VR Community Center that links multiple Vive Pro Eye headsets and places players in the same virtual room, regardless of their physical location in the real world. This VR Community Center will also serve as the central hub for the VR assessments and interventions developed in our lab.
Interested in participating in our study?
Fill out the Study Recruitment Form!