Auditory Motion Lab

Scientific supervisor: Marc van Wanrooij

Go to technical info

Introduction

Location of Auditory Motion Lab

In the Auditory Motion Lab (aka Robot Arm) in the Huygens basement (floor A -2), we investigate how the human auditory system tracks a moving sound source in any direction. A speaker (with LED), attached to the end of a two-link robotic arm equipped with two independently controlled motors, can be moved along a spherical surface around the subject, while any type of sound can be played. The subject actively follows the sound source (in total darkness), while we accurately measure head and eye orientation. The subject is comfortably seated in a chair, equipped with a third motor, which can induce whole-body rotations in the horizontal plane.

In this way we can measure spatial behavior (gaze control) under a variety of dynamic audio-vestibular conditions, in which the sound moves around the subject at a velocity relative to the head that also depends on the subject's whole-body motion. We can thus dissociate the contributions of active and passive head movements to the perception and pursuit of moving sounds (and lights). The lab is equipped with a high-end binocular eye-tracker system (EyeSeeCam), which records eye movements at a 500 Hz sampling rate per eye. Head movements are recorded through the same system.
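To make the geometry concrete, the sketch below is a hypothetical illustration (not the lab's software) of how the head-relative azimuth and velocity of the sound source follow from the world-fixed speaker position, the chair (whole-body) rotation, and the active head-on-chair rotation. The function name, angle conventions, and sample velocities are assumptions chosen for the example.

 import numpy as np
 
 # Hypothetical illustration, not lab code: head-relative sound azimuth.
 # Assumptions: all angles in degrees, a common world-fixed azimuth
 # convention, and all signals sampled on the same time base.
 
 def head_relative_azimuth(speaker_az, chair_az, head_on_chair_az):
     """Azimuth of the sound source relative to the head.
 
     speaker_az       : world-fixed azimuth of the speaker (deg)
     chair_az         : whole-body (chair) rotation in the horizontal plane (deg)
     head_on_chair_az : active head rotation relative to the chair (deg)
     """
     rel = speaker_az - (chair_az + head_on_chair_az)
     # Wrap to (-180, 180] so leftward/rightward offsets stay symmetric.
     return (rel + 180.0) % 360.0 - 180.0
 
 # Example: speaker moving at 20 deg/s, chair counter-rotating at 10 deg/s,
 # head kept still: the head-relative source velocity is about 30 deg/s.
 t = np.linspace(0.0, 2.0, 1001)   # time (s)
 speaker = 20.0 * t                # deg
 chair = -10.0 * t                 # deg
 head = np.zeros_like(t)           # deg
 
 rel = head_relative_azimuth(speaker, chair, head)
 vel = np.gradient(rel, t)         # head-relative velocity (deg/s)
 print(round(vel.mean(), 1))       # ~30.0

The same bookkeeping applies to passive conditions: with the head fixed to the chair, the vestibularly induced chair rotation alone changes the head-relative source velocity, which is what allows active and passive contributions to be dissociated.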

The equipment can also be used to test auditory motion perception in patients using a hearing device, such as a cochlear implant, a hearing aid, or a bone-conduction device. Companies producing such devices could gain more insight into how their products steer brain activity and perception, and audiological centers can use the technique for the same reasons.

Experiment information

  • If you want to book the Auditory Motion Lab, please use QReserve.
