This is the setup of an experiment where I investigated the
role of sensory information in action correction.
It consisted of a custom-made cube with two embedded mini pistons controlled via an Arduino.
Participants had to grasp the cube with their right hand while, in some trials, one of the cube's sides unexpectedly expanded.
I tracked the participant's hand position online through a motion capture system.
When the hand crossed a spatial threshold, the cube opened.
Link to the paper
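The trigger logic can be sketched as follows. This is a minimal illustration in Python, not the actual experiment code (which ran on a motion capture system driving an Arduino); the threshold value and all function names are assumptions.

```python
# Illustrative sketch: open the cube the first time the tracked hand
# crosses a spatial threshold along the reach axis.
# THRESHOLD_Y and run_trial are hypothetical names, not the real code.

THRESHOLD_Y = 0.30  # assumed threshold, in metres from the start position


def run_trial(hand_positions):
    """Scan a stream of hand positions (one coordinate per sample).

    Return the index of the first sample past the threshold, i.e. the
    moment the cube would be commanded to open, or None if the hand
    never crosses it.
    """
    for i, y in enumerate(hand_positions):
        if y >= THRESHOLD_Y:
            return i  # here the real setup would signal the Arduino
    return None
```

In the real setup, the return point would instead send a command to the Arduino driving the pistons, so the perturbation is time-locked to the reach rather than to trial onset.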
In the next video, I show the goggles that participants wore during the experiment. These were controlled via Matlab
to modulate the participants' vision throughout the trial.
This custom-made eyetracker is part of an experimental setup where I investigated the modulation of movement kinematics when grasping objects in peripheral and central vision.
The eyetracker was used to check whether participants shifted their gaze toward the target object when it was positioned in peripheral vision.
The precision was 1.5 degrees. Data were sampled via Matlab.
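The gaze check can be sketched as a threshold test against the eyetracker's precision. This is an illustrative Python version (the actual sampling and analysis ran in Matlab); the data layout and function name are assumptions.

```python
# Hedged sketch: flag a trial as containing a gaze shift when any gaze
# sample deviates from fixation by more than the measurement precision
# (1.5 degrees, as reported for this eyetracker). gaze_shifted is a
# hypothetical name for illustration.

PRECISION_DEG = 1.5  # reported precision of the custom eyetracker


def gaze_shifted(fixation_deg, gaze_samples_deg, margin_deg=PRECISION_DEG):
    """Return True if any gaze sample deviates from the fixation angle
    by more than the measurement precision, i.e. a detectable shift
    toward (or away from) the peripheral target."""
    return any(abs(g - fixation_deg) > margin_deg for g in gaze_samples_deg)
```

Trials flagged this way could then be excluded, ensuring the target really stayed in peripheral vision during the grasp.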