<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://wiki.biophysics.science.ru.nl/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Lof</id>
	<title>biophysics - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="https://wiki.biophysics.science.ru.nl/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Lof"/>
	<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/Special:Contributions/Lof"/>
	<updated>2026-04-05T18:37:07Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.43.0</generator>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=TDT_RZ6&amp;diff=4907</id>
		<title>TDT RZ6</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=TDT_RZ6&amp;diff=4907"/>
		<updated>2026-04-02T07:52:16Z</updated>

		<summary type="html">&lt;p&gt;Lof: /* Programming */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[File:Tucker_Davis_RZ6.png|thumb|TDT RZ6 Multi I/O Processor]]&lt;br /&gt;
==Description==&lt;br /&gt;
The &#039;&#039;&#039;TDT RZ6 Multi I/O Processor&#039;&#039;&#039; is a research tool well suited for (PhD) students building neuroscience and other experimental setups. Its key capabilities are:&lt;br /&gt;
&lt;br /&gt;
# &#039;&#039;&#039;Signal Processing Power:&#039;&#039;&#039; The RZ6 performs real-time signal processing, making it suitable for experiments that require precise timing and complex data manipulation.&lt;br /&gt;
# &#039;&#039;&#039;Multimodal Data Acquisition:&#039;&#039;&#039; It can simultaneously acquire multiple types of data, such as neural signals, analog inputs, and digital events, allowing for comprehensive experimental monitoring.&lt;br /&gt;
# &#039;&#039;&#039;Customizable Experimentation:&#039;&#039;&#039; The BIOX toolbox enables flexible programming for students designing and controlling experiments with a high degree of specificity.&lt;br /&gt;
# &#039;&#039;&#039;Synchronization:&#039;&#039;&#039; The RZ6 I/O can be used for synchronizing with other devices, ensuring precise timing between various components of an experimental setup.&lt;br /&gt;
# &#039;&#039;&#039;Stimulation Capabilities:&#039;&#039;&#039; Students can employ the RZ6 to deliver precisely timed stimuli, making it valuable for a wide range of experiments involving sensory or behavioral responses.&lt;br /&gt;
# &#039;&#039;&#039;MATLAB Integration:&#039;&#039;&#039; The RZ6 can be controlled from MATLAB, facilitating seamless integration and data analysis.&lt;br /&gt;
# &#039;&#039;&#039;Reliability:&#039;&#039;&#039; It has a reputation for robustness and durability that students can rely on for consistent, high-quality data collection.&lt;br /&gt;
&lt;br /&gt;
Biophysics has developed software for the RZ6 called [[BIOX]], which provides an easy-to-use Matlab interface.&lt;br /&gt;
&lt;br /&gt;
==Technical info==&lt;br /&gt;
[[File: EEG_NIRS_RZ6_architecture.jpg|thumb|RZ6 architecture]]&lt;br /&gt;
Relevant manuals from TDT:&lt;br /&gt;
* Overview: https://www.tdt.com/docs/.&lt;br /&gt;
* RZ6: https://www.tdt.com/files/manuals/hardware/RZ6.pdf.&lt;br /&gt;
* PM2Relay: https://www.tdt.com/files/manuals/hardware/PM2R.pdf. &lt;br /&gt;
* RPvdsEx: http://www.tdt.com/files/manuals/RPvdsEx_Manual.pdf. &lt;br /&gt;
* ActiveX: http://www.tdt.com/files/manuals/ActiveX_User_Reference.pdf.&lt;br /&gt;
&lt;br /&gt;
TDT ActiveX controls enable real-time control of TDT System 3 hardware from Matlab. See page 5 of the ActiveX manual for example code that uses Matlab to get a circuit running on the RZ6. Examples can be found in C:\TDT\ActiveX\ActXExamples\matlab.&lt;br /&gt;
&lt;br /&gt;
===Digital I/O===&lt;br /&gt;
[[File: RZ6_DB25_Digital_IO_pinout.jpg|thumb|DB25 Digital I/O pinout]]&lt;br /&gt;
[[File: PP_RZ6_Digital_IO_connections.png|thumb|PP RZ6 Digital I/O pinout]]&lt;br /&gt;
The RZ6 has a DB25 connector for digital I/O. A custom patch panel &#039;PP RZ6 Digital-I/O&#039; is available that splits the I/O into a DB25 connector for multiplexer control, a DB25 connector for a response box and eight BNC connectors for separate I/O bits.&lt;br /&gt;
&lt;br /&gt;
====Multiplexer control====&lt;br /&gt;
&lt;br /&gt;
Byte-C is used for multiplexer control. Up to four PM2R multiplexers can be controlled via this output. The first four bits of byte-C code for the channel number; the fifth and sixth bit code for the PM2R device ID (0-3). Only one channel can be open at a time on each PM2R. The seventh bit opens the channel and the eighth bit closes any open channel.&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
! Bit number !! Integer value !! Function&lt;br /&gt;
|-&lt;br /&gt;
| 0 || 1 || Bit 1 (least significant bit) of channel number&lt;br /&gt;
|-&lt;br /&gt;
| 1 || 2 || Bit 2 of channel number&lt;br /&gt;
|-&lt;br /&gt;
| 2 || 4 || Bit 3 of channel number&lt;br /&gt;
|-&lt;br /&gt;
| 3 || 8 || Bit 4 (most significant bit) of channel number&lt;br /&gt;
|-&lt;br /&gt;
| 4 || 16 || Least significant bit of device number&lt;br /&gt;
|-&lt;br /&gt;
| 5 || 32 || Most significant bit of device number&lt;br /&gt;
|-&lt;br /&gt;
| 6 || 64 || Turns on the channel of the specified device&lt;br /&gt;
|-&lt;br /&gt;
| 7 || 128 || Turns off all channels on the specified device only&lt;br /&gt;
|}&lt;br /&gt;
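The bit layout above can be sketched as follows; this is an illustrative Python snippet (not part of BIOX or the TDT software) that composes a byte-C value from a device ID and channel number:&lt;br /&gt;

```python
def pm2r_control_byte(device_id, channel, turn_on=True):
    # Compose a byte-C value for PM2R multiplexer control.
    # Bits 0-3: channel number; bits 4-5: device ID (0-3);
    # bit 6: open the channel; bit 7: close any open channel.
    if device_id not in range(4):
        raise ValueError('device_id must be 0-3')
    if channel not in range(16):
        raise ValueError('channel must be 0-15')
    value = channel | device_id * 16  # device ID shifted into bits 4-5
    if turn_on:
        value = value | 64   # bit 6: open the channel
    else:
        value = value | 128  # bit 7: close any open channel
    return value

# Example: opening channel 5 on PM2R device 2 gives 5 + 32 + 64 = 101.
```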
&lt;br /&gt;
====Response Box====&lt;br /&gt;
%todo&lt;br /&gt;
&lt;br /&gt;
====Digital I/O breakout====&lt;br /&gt;
The Digital I/O of the RZ6 has 24 digital lines forming bytes A, B and C. For these digital lines we made a breakout panel named PP RZ6 Digital I/O. This panel has a DSub25-M connector for hooking up to the RZ6, a DSub25-M connector for connecting PM2R multiplexers to the RZ6 and a DSub25-F connector for connecting a Response Box to the RZ6. The input bits A4..A7 and output bits B4..B7 each have a BNC connector. The output bits can be used for sending triggers to other devices; the input bits can be used for receiving triggers (e.g. from a pushbutton).&lt;br /&gt;
&lt;br /&gt;
==Programming==&lt;br /&gt;
&lt;br /&gt;
===RPvdsEx===&lt;br /&gt;
The RZ6 is programmed in RPvdsEx, a graphical development tool from Tucker Davis Technologies.&lt;br /&gt;
&lt;br /&gt;
===Matlab interface (ActiveX control)===&lt;br /&gt;
The current (as of March 2024) Matlab interface for the TDT devices is based on the function &amp;quot;actxserver&amp;quot;. Previously &amp;quot;actxcontrol&amp;quot; was used, but that function is deprecated and will become obsolete in the near future.&lt;br /&gt;
&lt;br /&gt;
====Installing ActiveX control====&lt;br /&gt;
&lt;br /&gt;
*Go to the website of Tucker Davis Technologies and navigate to support\downloads.&lt;br /&gt;
*Download &#039;ActiveX Controls&#039;&lt;br /&gt;
*Run the executable&lt;br /&gt;
*When asked for a password use the password &#039;spider&#039;.&lt;br /&gt;
&lt;br /&gt;
====Matlab functions==== &lt;br /&gt;
&lt;br /&gt;
The following device driver functions are available in the biofysica toolbox: &lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
function [module, err, errstr] = RZ6(number,circuit)&lt;br /&gt;
function [module, err, errstr] = ZBUS(nRacks)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;quot;Module&amp;quot; is an object with device specific functionality. &amp;quot;number&amp;quot; (or &amp;quot;nRacks&amp;quot;) is given in order to distinguish between different hardware of the same type. &amp;quot;circuit&amp;quot; is the filename of the program that should be uploaded to the device.&lt;br /&gt;
&amp;quot;err&amp;quot; gives an integer and &amp;quot;errstr&amp;quot; the corresponding error message. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
Error codes:&lt;br /&gt;
 0 ==&amp;gt; all devices: no error&lt;br /&gt;
-1 ==&amp;gt; all devices: failed to connect&lt;br /&gt;
-2 ==&amp;gt; RZ6: failed to load circuit; zBus: failed to reset&lt;br /&gt;
-3 ==&amp;gt; zBus: failed to flush IO&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For more on the device drivers see [https://www.tdt.com/files/manuals/ActiveX_User_Reference.pdf ActiveX_User_Reference.pdf].&lt;br /&gt;
&lt;br /&gt;
===BIOX toolbox===&lt;br /&gt;
We have developed a toolbox that can perform tasks for a large number of different experiments. It consists of RPvdsEx code for the RZ6 and a set of easy-to-use Matlab functions.&lt;br /&gt;
* see [[BIOX]]&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=EyeSeeCam_SCI&amp;diff=4906</id>
		<title>EyeSeeCam SCI</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=EyeSeeCam_SCI&amp;diff=4906"/>
		<updated>2026-04-01T11:30:20Z</updated>

		<summary type="html">&lt;p&gt;Lof: /* Matlab coding */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[File:EyeSeeCam_SCI2.png|thumb|EyeSeeCam SCI]]&lt;br /&gt;
==Description==&lt;br /&gt;
The EyeSeeCam SCI is a combined eye tracker and head movement tracker. The head tracking is done with an IMU and the eye tracking is done with cameras and software that tracks the pupils of both eyes. The tracking data can be read out directly in Matlab. When a dedicated computer is used for reading out the EyeSeeCam, the data is converted to an LSL stream. &lt;br /&gt;
&lt;br /&gt;
The EyeSeeCam SCI can record at 250 or 500 Hz. We typically record at 500 Hz.&lt;br /&gt;
&lt;br /&gt;
==Setup==&lt;br /&gt;
&lt;br /&gt;
* The EyeSeeCam is connected to a NUC with two USB-C cables.&lt;br /&gt;
* The NUC is connected via ethernet with a Mac Mini.&lt;br /&gt;
* The Mac Mini runs an EyeSeeCam SCI program.&lt;br /&gt;
* The Mac Mini runs an LSL server for data transfer to a Windows computer.&lt;br /&gt;
* The Windows computer runs Matlab and connects to the LSL server on the Mac Mini via an LSL stream.&lt;br /&gt;
&lt;br /&gt;
==Startup Instructions==&lt;br /&gt;
&lt;br /&gt;
* Connect the EyeSeeCam to the NUC&lt;br /&gt;
* Remove both lens covers of the EyeSeeCam&lt;br /&gt;
* Start the NUC&lt;br /&gt;
* Start the Mac Mini&lt;br /&gt;
* Login on the Mac Mini to &#039;EyeSeeCam&#039; with password &#039;Gimbal&#039;&lt;br /&gt;
* Start the LSL script on the Mac Mini (it is on the desktop)&lt;br /&gt;
* Start the EyeSeeCam SCI program on the Mac Mini&lt;br /&gt;
* Start a Matlab program on the Windows computer (e.g. EG_program.m)&lt;br /&gt;
* Press the &amp;lt;begin&amp;gt; button in the EyeSeeCam SCI program on the Mac Mini (in the top right corner)&lt;br /&gt;
* Let the subject put on the EyeSeeCam&lt;br /&gt;
* Press &amp;lt;prepare&amp;gt; in the EyeSeeCam SCI program on the Mac Mini (shows the traces)&lt;br /&gt;
* Press &amp;lt;start&amp;gt; in the EyeSeeCam SCI program on the Mac Mini (starts collecting data)&lt;br /&gt;
* Start the experiment&lt;br /&gt;
&lt;br /&gt;
==Matlab coding==&lt;br /&gt;
&lt;br /&gt;
The EyeSeeCam is connected to a Mac Mini, which runs an LSL server (for more info about LSL see [[LabStreamingLayer]]). To get data from the EyeSeeCam you create an LSL stream and session in Matlab.&lt;br /&gt;
&lt;br /&gt;
Here is an example:&lt;br /&gt;
&lt;br /&gt;
 type = &#039;EyeSeeCam @ dcn-pl04&#039;;&lt;br /&gt;
 name = &#039;EyeSeeCam SCI Data&#039;;&lt;br /&gt;
 lslString = sprintf(&#039;type=&#039;&#039;%s&#039;&#039; and name=&#039;&#039;%s&#039;&#039;&#039;, type, name);&lt;br /&gt;
 info=lsl_resolver(lslString);&lt;br /&gt;
 lslStream = lsl_istream(info{1});&lt;br /&gt;
&lt;br /&gt;
 session = lsl_session;&lt;br /&gt;
 session.add_stream(lslStream);&lt;br /&gt;
&lt;br /&gt;
 session.start;&lt;br /&gt;
 ...wait some time...&lt;br /&gt;
 session.stop;&lt;br /&gt;
 data = lslStream.read;&lt;br /&gt;
&lt;br /&gt;
==Construction of a gaze trace==&lt;br /&gt;
The data from EyeSeeCam is recorded in different coordinate systems than the double polar system that we use to present our targets. The head and eye tracking data each have their own coordinate system. To accurately analyze our data we have to transform these coordinate systems to double polar coordinates in the lab frame. In order to relate the EyeSeeCam coordinates to the lab coordinates, we first need to record a starting gaze position for each trial. This means that during the recording, the subject has to look at the center speaker (0,0) when the trial starts. &lt;br /&gt;
&lt;br /&gt;
From the EyeSeeCam data we can construct a head movement trace and an eye movement trace. These can be combined to create a gaze trace.&lt;br /&gt;
&lt;br /&gt;
In Matlab we can use the class &#039;EyeSeeCamSci_Trace&#039; for calculating head, eye and gaze traces in RAS, HVF and DP coordinates.&lt;br /&gt;
&lt;br /&gt;
===Defining a coordinate system===&lt;br /&gt;
&lt;br /&gt;
For the purpose of interpreting the EyeSeeCam data we define the starting direction as the direction in which the subject looks at the beginning of each trial. The recording of the data should start when the subject is looking in this direction. &amp;lt;br&amp;gt;&lt;br /&gt;
The X, Y and Z of the EyeSeeCam data are in terms of RAS coordinates:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
X = anterior&lt;br /&gt;
Y = -right&lt;br /&gt;
Z = superior&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
* When using &#039;EyeSeeCamSci_Trace&#039;, the head and gaze traces in RAS coordinates are relative to the starting position of the subject. &lt;br /&gt;
* The HVF and DP coordinates of the eye trace are only valid when the head stays in the starting position.&lt;br /&gt;
&lt;br /&gt;
===Head movement===&lt;br /&gt;
The EyeSeeCam SCI provides angular velocities for rotations around the X, Y and Z axes. For our analyses we want position data, so we have to calculate it from these angular velocities.&lt;br /&gt;
 &lt;br /&gt;
The head rotation axis X, Y and Z are defined with respect to the EyeSeeCam itself. &lt;br /&gt;
&lt;br /&gt;
*&#039;&#039;The EyeSeeCam data is &#039;&#039;&#039;not&#039;&#039;&#039; in the lab frame!&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
In order to create a head movement trace we have to calculate rotation matrices (delta_Rx, delta_Ry and delta_Rz) in EyeSeeCam coordinates for every time step. &lt;br /&gt;
&lt;br /&gt;
Since the time steps are about 2 ms (or 4 ms), the rotations for each time step are small. Therefore the order in which delta_Rx, delta_Ry and delta_Rz are multiplied is not important and we calculate delta_R(t) in EyeSeeCam coordinates by &lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
delta_R(t) = delta_Rx * delta_Ry * delta_Rz.&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
To get the total rotation matrix in lab coordinates at time t we have to iterate over all the time steps &lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
R(t=0) = identity matrix&lt;br /&gt;
for t = 0 to t = tmax&lt;br /&gt;
   R(t) = R(t-delta_t) * delta_R(t)&lt;br /&gt;
end &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
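The iteration above can be sketched in plain Python (an illustrative, language-neutral version; in practice the Matlab class &#039;EyeSeeCamSci_Trace&#039; performs this calculation):&lt;br /&gt;

```python
import math

def rot_x(a):
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def rot_y(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def rot_z(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def integrate_head_rotation(vx, vy, vz, dt):
    # vx, vy, vz: per-sample angular velocities (rad/s) in EyeSeeCam
    # coordinates; dt: time step (s). Returns R(t) for every sample by
    # accumulating the small per-step rotations:
    #   R(t) = R(t - dt) * delta_Rx * delta_Ry * delta_Rz
    R = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]  # R(t=0) = identity
    trace = []
    for wx, wy, wz in zip(vx, vy, vz):
        delta_R = matmul(matmul(rot_x(wx * dt), rot_y(wy * dt)),
                         rot_z(wz * dt))
        R = matmul(R, delta_R)
        trace.append(R)
    return trace
```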
&lt;br /&gt;
===Eye movement===&lt;br /&gt;
&lt;br /&gt;
The eye movements are given by rotations in the same coordinate system as the head movements.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
X = anterior (torsion of the pupil in the gaze direction)&lt;br /&gt;
Y = -right&lt;br /&gt;
Z = superior&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The torsion is not important for the gaze direction and is therefore excluded from the calculation.&lt;br /&gt;
&lt;br /&gt;
Azimuth can be defined as a rotation about the Z axis and elevation as a counter-rotation about the Y axis.&lt;br /&gt;
&lt;br /&gt;
Since the eye gaze is given as a rotation angle in OCS coordinates and the eye and head tracker coordinate frames are fixed relative to each other, we can easily transform OCS coordinates to EyeSeeCam coordinates. We can then create rotation matrices in EyeSeeCam coordinates for the azimuth and elevation of the gaze. Since the torsion is excluded, the rotation matrices for azimuth and elevation can be multiplied in any order to get the eye rotation matrix in EyeSeeCam coordinates.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eye_Rz(t) = Rz(eye_azimuth(t))&lt;br /&gt;
eye_Ry(t) = Ry(-eye_elevation(t))&lt;br /&gt;
eye_R(t)  = eye_Ry(t) * eye_Rz(t)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Gaze movement===&lt;br /&gt;
&lt;br /&gt;
The gaze movement is the combined head and eye movement.&lt;br /&gt;
To create the total rotation matrix that represents the &#039;gaze&#039; rotation in lab coordinates, the two rotation matrices for head and eye movement have to be multiplied in the correct order. We have:&lt;br /&gt;
&lt;br /&gt;
A rotation matrix for the head in lab coordinates.&lt;br /&gt;
A rotation matrix for the eyes in head coordinates.&lt;br /&gt;
&lt;br /&gt;
Multiply the rotation matrix of the head in lab coordinates by the rotation matrix of the eyes in head coordinates:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
R_gaze(t) = head_R(t) * eye_R(t)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Finally, multiply R_gaze(t) by the starting gaze vector to get the gaze (in lab coordinates) at time t. We assume that the starting gaze is in the forward direction, which is the X direction in lab coordinates.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
startingGaze = [1; 0; 0]&lt;br /&gt;
gaze(t) = R_gaze(t) * startingGaze&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
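Putting the pieces together, the gaze computation can be sketched as follows (an illustrative Python version; head_R is assumed to come from the head-movement iteration and eye_R from the eye rotation matrices described above):&lt;br /&gt;

```python
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def matvec(R, v):
    return [sum(R[i][k] * v[k] for k in range(3)) for i in range(3)]

def gaze_direction(head_R, eye_R):
    # head_R: head rotation matrix in lab coordinates.
    # eye_R:  eye rotation matrix in head coordinates.
    # R_gaze(t) = head_R(t) * eye_R(t), applied to the forward
    # starting gaze (the X direction in lab coordinates).
    starting_gaze = [1.0, 0.0, 0.0]
    R_gaze = matmul(head_R, eye_R)
    return matvec(R_gaze, starting_gaze)
```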
&lt;br /&gt;
==Matlab code for converting raw data==&lt;br /&gt;
&lt;br /&gt;
===Functions for converting EyeSeeCam data to double polar data===&lt;br /&gt;
There is one function for head movement, one for eye movement and one for gaze movement. The gaze movement combines head and eye movement.&lt;br /&gt;
&lt;br /&gt;
The functions take head rotation speeds and eye rotations (in degrees) as input and return objects containing the traces (head, eye and gaze respectively) in double polar coordinates.&lt;br /&gt;
&lt;br /&gt;
* headTrace_DP = EyeSeeCamSciHeadRotationData2Trace_DP(Vx_deg, Vy_deg, Vz_deg)&lt;br /&gt;
* eyeTrace_DP  = EyeSeeCamSciEyeRotationData2Trace_DP(Zocs_deg, Yocs_deg)&lt;br /&gt;
* gazeTrace_DP = EyeSeeCamSciHeadAndEyeRotationData2Trace_DP(Vx_deg, Vy_deg, Vz_deg, Zocs_deg, Yocs_deg)&lt;br /&gt;
&lt;br /&gt;
Vx, Vy and Vz are the head rotation speeds in EyeSeeCam coordinates.&lt;br /&gt;
Zocs and Yocs are the eye rotations in OCS coordinates.&lt;br /&gt;
&lt;br /&gt;
===EyeSeeCam data===&lt;br /&gt;
The recorded data is read from an LSL stream.&lt;br /&gt;
&lt;br /&gt;
The lsldata is a struct and has the following fields:&lt;br /&gt;
*escdata&lt;br /&gt;
*escmetadata&lt;br /&gt;
*escstr&lt;br /&gt;
*evdata0&lt;br /&gt;
*evdata1&lt;br /&gt;
*evdata2&lt;br /&gt;
*evdata3&lt;br /&gt;
*evdata4&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The field lsldata.escdata contains the eye and head data as a matrix with a row for every parameter (63 parameters in total). The field lsldata.escmetadata.channel lists the names of all parameters.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%Eye data&lt;br /&gt;
&lt;br /&gt;
RightEyePosX = 46;&lt;br /&gt;
RightEyePosY = 47;&lt;br /&gt;
RightEyePosZ = 48;&lt;br /&gt;
&lt;br /&gt;
xeye = lsldata.escdata.Data(46,:);       % right 46, left 32&lt;br /&gt;
yeye = lsldata.escdata.Data(47,:);       % right 47, left 33&lt;br /&gt;
zeye = lsldata.escdata.Data(48,:);       % right 48, left 34&lt;br /&gt;
&lt;br /&gt;
% Head data&lt;br /&gt;
HeadInertialVelXCal = 27;&lt;br /&gt;
HeadInertialVelYCal = 29;&lt;br /&gt;
HeadInertialVelZCal = 31;&lt;br /&gt;
&lt;br /&gt;
xh = lsldata.escdata.Data(27,:);       % calibrated torsion velocity data HEAD&lt;br /&gt;
yh = lsldata.escdata.Data(29,:);       % calibrated vertical velocity data HEAD&lt;br /&gt;
zh = lsldata.escdata.Data(31,:);       % calibrated horizontal velocity data HEAD&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=How_to&amp;diff=4905</id>
		<title>How to</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=How_to&amp;diff=4905"/>
		<updated>2026-04-01T11:29:38Z</updated>

		<summary type="html">&lt;p&gt;Lof: /* Software */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Lab==&lt;br /&gt;
*How to make a lab [[QReserve|reservation]]?&lt;br /&gt;
*How to report an [[Issues Tracking|issue]]?&lt;br /&gt;
*How to work with [[Ethics &amp;amp; Subjects|subjects]]?&lt;br /&gt;
*How to use a [[Lab journal|lab journal]]?&lt;br /&gt;
&lt;br /&gt;
==Hardware==&lt;br /&gt;
*How to use a [[DCN LED controller|DCN LED controller]]?&lt;br /&gt;
*How to use the [[PLC LED controller specifications|PLC LED controller]]?&lt;br /&gt;
*How to use a [[Digital Event Recorder]]?&lt;br /&gt;
&lt;br /&gt;
==Software==&lt;br /&gt;
*How to use [[Gitlab]]?&lt;br /&gt;
*How to use [[BIOX]]?&lt;br /&gt;
*How to use [[LabStreamingLayer|LabStreamingLayer (LSL)]]?&lt;br /&gt;
*How to use [[Coordinate systems]] in experiments?&lt;br /&gt;
*How to use [[Units in Matlab]]?&lt;br /&gt;
*How to make a [[Programming a GUI|Graphical User Interface (GUI)]] in Matlab?&lt;br /&gt;
*How to make [[Ripple Sounds]]?&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=LabStreamingLayer&amp;diff=4904</id>
		<title>LabStreamingLayer</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=LabStreamingLayer&amp;diff=4904"/>
		<updated>2026-04-01T11:28:47Z</updated>

		<summary type="html">&lt;p&gt;Lof: /* Some devices with LSL */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Introduction==&lt;br /&gt;
&lt;br /&gt;
LabStreamingLayer (LSL) is a framework and protocol designed for the real-time streaming of time-series data. It is primarily used in research and scientific applications, especially in fields such as neuroscience, psychology, and physiology, where the collection and synchronization of data from multiple sources are crucial.&lt;br /&gt;
&lt;br /&gt;
*Real-time Data Streaming: LSL allows the continuous and real-time transmission of data from various sources, such as sensors, recording devices, and software applications. This can include EEG, ECG, eye-tracking data, motion capture, and more.&lt;br /&gt;
&lt;br /&gt;
*Data Synchronization: LSL provides a mechanism for synchronizing data streams from multiple devices or applications.&lt;br /&gt;
&lt;br /&gt;
*Cross-Platform Support: LSL is platform-independent and can be used on various operating systems, including Windows, macOS, and Linux.&lt;br /&gt;
&lt;br /&gt;
*Language Support: LSL offers libraries and bindings for several programming languages, including Python, C/C++, Java, MATLAB, and others.&lt;br /&gt;
&lt;br /&gt;
*Flexible Data Types: It supports various data types, including numeric, string, and marker data, making it adaptable to different types of experiments and data formats.&lt;br /&gt;
&lt;br /&gt;
*Open Source: LabStreamingLayer is an open-source project, which means it is continuously developed and improved by a community of researchers and developers.&lt;br /&gt;
&lt;br /&gt;
*Network Capabilities: LSL supports both local data streaming (within a single computer) and network-based streaming.&lt;br /&gt;
&lt;br /&gt;
*Timestamps: LSL provides high-precision timestamps.&lt;br /&gt;
&lt;br /&gt;
==How to use LSL in Matlab==&lt;br /&gt;
In our biofysica repository in Gitlab we have [https://gitlab.science.ru.nl/marcw/biofysica/-/tree/master/liblsl/liblsl-Matlab?ref_type=heads matlab code] for accessing LSL devices. In the repository there are also examples for how to read the data from these devices.&lt;br /&gt;
&lt;br /&gt;
You can get a list of all available LSL-streams by executing the Matlab command:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
lsl_list&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Every stream has a type and a name. With the function &#039;&#039;&#039;lsl_resolver&#039;&#039;&#039; you can get the LSL info of available streams. You can look for a specific stream by specifying the stream type and/or the name in a string parameter:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
lsl_resolver(&#039;type=&#039;&#039;&amp;lt;stream type&amp;gt; @ &amp;lt;hostname&amp;gt;&#039;&#039; and name=&#039;&#039;&amp;lt;stream name&amp;gt;&#039;&#039;&#039;)  &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The hostname is the name of the computer to which an LSL-capable device (e.g. an eye tracker) is attached, or the name of an embedded computer as is the case for Digital Event Recorders.&lt;br /&gt;
&lt;br /&gt;
The following code example checks whether an LSL stream is available for the stream type &#039;&#039;&#039;Digital Events&#039;&#039;&#039; on host &#039;&#039;&#039;lslder04&#039;&#039;&#039; and the stream name &#039;&#039;&#039;Digital Events 1&#039;&#039;&#039;:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
lslInfo = lsl_resolver(&#039;type=&#039;&#039;Digital Events @ lslder04&#039;&#039; and name=&#039;&#039;Digital Events 1&#039;&#039;&#039;);&lt;br /&gt;
streamInfoList = lslInfo.list;&lt;br /&gt;
if isempty(streamInfoList)&lt;br /&gt;
   error(&#039;no streams found&#039;);&lt;br /&gt;
end &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
In this case the &#039;&#039;&#039;streamInfoList&#039;&#039;&#039; contains only one stream (or none). If you want to find all available streams you can use:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
lslInfo = lsl_resolver&lt;br /&gt;
streamInfoList = lslInfo.list;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The following code prints all the types and names of the streams that are found:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
for i = 1:size(streamInfoList ,1)&lt;br /&gt;
    fprintf(&#039;%d: name: &#039;&#039;%s&#039;&#039; type: &#039;&#039;%s&#039;&#039;\n&#039;,i,streamInfoList{i}.name,streamInfoList{i}.type);&lt;br /&gt;
end&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
You can get the n-th stream in streamInfoList by using &#039;&#039;&#039;lsl_istream&#039;&#039;&#039;:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
stream_n = lsl_istream(lslInfo{n});&lt;br /&gt;
stream_m = lsl_istream(lslInfo{m});&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The following code creates an &#039;&#039;&#039;lsl_session&#039;&#039;&#039; and adds two streams to the session. You can add as many streams as you like.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
mySession = lsl_session();&lt;br /&gt;
mySession.add_stream(stream_n);&lt;br /&gt;
mySession.add_stream(stream_m);&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
You can start the session, run your experiment, stop the session and read the data from the streams.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
mySession.start;&lt;br /&gt;
% do your experiment&lt;br /&gt;
mySession.stop;&lt;br /&gt;
data_n = stream_n.read;&lt;br /&gt;
data_m = stream_m.read;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
More information and examples can be found [https://gitlab.science.ru.nl/marcw/biofysica/-/blob/master/liblsl/liblsl-Matlab/examples/README.md?ref_type=heads here] on Gitlab.&lt;br /&gt;
&lt;br /&gt;
==Some devices with LSL==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! Device&lt;br /&gt;
! Stream Type&lt;br /&gt;
! Stream Name&lt;br /&gt;
|-&lt;br /&gt;
| Digital Event Recorder&lt;br /&gt;
| Digital Events @ &amp;lt;hostname&amp;gt;&lt;br /&gt;
| Digital Events X&lt;br /&gt;
|-&lt;br /&gt;
| Digital Event Recorder&lt;br /&gt;
| Markers @ &amp;lt;hostname&amp;gt;&lt;br /&gt;
| Digital Markers&lt;br /&gt;
|-&lt;br /&gt;
| Pupil Labs Eyetracker&lt;br /&gt;
| Pupil Capture @ &amp;lt;hostname&amp;gt;&lt;br /&gt;
| Pupil Primitive Data - Eye 0&lt;br /&gt;
|-&lt;br /&gt;
| Pupil Labs Eyetracker&lt;br /&gt;
| Pupil Capture @ &amp;lt;hostname&amp;gt;&lt;br /&gt;
| Pupil Python Representation - Eye 0&lt;br /&gt;
|-&lt;br /&gt;
| Pupil Labs Eyetracker&lt;br /&gt;
| Pupil Gaze @ &amp;lt;hostname&amp;gt;&lt;br /&gt;
| ?&lt;br /&gt;
|-&lt;br /&gt;
| OptiTrack Eyetracker&lt;br /&gt;
| OptiTrack Mocap @ &amp;lt;hostname&amp;gt;&lt;br /&gt;
| Labeled Markers&lt;br /&gt;
|-&lt;br /&gt;
| EyeSeeCam Eyetracker&lt;br /&gt;
| EyeSeeCam @ &amp;lt;hostname&amp;gt;&lt;br /&gt;
| ?&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==Matlab Example==&lt;br /&gt;
&lt;br /&gt;
*Open a stream to the EyeSeeCam&lt;br /&gt;
&lt;br /&gt;
 type = &#039;EyeSeeCam @ dcn-pl04&#039;;&lt;br /&gt;
 name = &#039;EyeSeeCam SCI Data&#039;;&lt;br /&gt;
 lslString = sprintf(&#039;type=&#039;&#039;%s&#039;&#039; and name=&#039;&#039;%s&#039;&#039;&#039;, type, name);&lt;br /&gt;
 info=lsl_resolver(lslString);&lt;br /&gt;
 lslStream = lsl_istream(info{1});&lt;br /&gt;
&lt;br /&gt;
*Open a session and add the stream to the session.&lt;br /&gt;
&lt;br /&gt;
 session = lsl_session;&lt;br /&gt;
 session.add_stream(lslStream);&lt;br /&gt;
&lt;br /&gt;
*Start and stop the session and get the data from the stream.&lt;br /&gt;
&lt;br /&gt;
 session.start;&lt;br /&gt;
 ....wait some time....&lt;br /&gt;
 session.stop;&lt;br /&gt;
 data = lslStream.read;&lt;br /&gt;
&lt;br /&gt;
In the directory &amp;lt;...biofysica\liblsl\liblsl-Matlab\examples&amp;gt; you can find more examples (with descriptions in the readme.md file).&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=LabStreamingLayer&amp;diff=4903</id>
		<title>LabStreamingLayer</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=LabStreamingLayer&amp;diff=4903"/>
		<updated>2026-04-01T11:25:41Z</updated>

		<summary type="html">&lt;p&gt;Lof: /* Some devices with LSL */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Introduction==&lt;br /&gt;
&lt;br /&gt;
LabStreamingLayer (LSL) is a framework and protocol designed for the real-time streaming of time-series data. It is primarily used in research and scientific applications, especially in fields such as neuroscience, psychology, and physiology, where the collection and synchronization of data from multiple sources are crucial.&lt;br /&gt;
&lt;br /&gt;
*Real-time Data Streaming: LSL allows the continuous and real-time transmission of data from various sources, such as sensors, recording devices, and software applications. This can include EEG, ECG, eye-tracking data, motion capture, and more.&lt;br /&gt;
&lt;br /&gt;
*Data Synchronization: LSL provides a mechanism for synchronizing data streams from multiple devices or applications.&lt;br /&gt;
&lt;br /&gt;
*Cross-Platform Support: LSL is platform-independent and can be used on various operating systems, including Windows, macOS, and Linux.&lt;br /&gt;
&lt;br /&gt;
*Language Support: LSL offers libraries and bindings for several programming languages, including Python, C/C++, Java, MATLAB, and others.&lt;br /&gt;
&lt;br /&gt;
*Flexible Data Types: It supports various data types, including numeric, string, and marker data, making it adaptable to different types of experiments and data formats.&lt;br /&gt;
&lt;br /&gt;
*Open Source: LabStreamingLayer is an open-source project, which means it is continuously developed and improved by a community of researchers and developers.&lt;br /&gt;
&lt;br /&gt;
*Network Capabilities: LSL supports both local data streaming (within a single computer) and network-based streaming.&lt;br /&gt;
&lt;br /&gt;
*Timestamps: LSL provides high-precision timestamps.&lt;br /&gt;
&lt;br /&gt;
==How to use LSL in Matlab==&lt;br /&gt;
In our biofysica repository in Gitlab we have [https://gitlab.science.ru.nl/marcw/biofysica/-/tree/master/liblsl/liblsl-Matlab?ref_type=heads matlab code] for accessing LSL devices. In the repository there are also examples for how to read the data from these devices.&lt;br /&gt;
&lt;br /&gt;
You can get a list of all available LSL-streams by executing the Matlab command:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
lsl_list&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Every stream has a type and a name. With the function &#039;&#039;&#039;lsl_resolver&#039;&#039;&#039; you can get the LSL info of available streams. You can look for a specific stream by specifying the stream type and/or the name in a string parameter:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
lsl_resolver(&#039;type=&#039;&#039;&amp;lt;stream type&amp;gt; @ &amp;lt;hostname&amp;gt;&#039;&#039; and name=&#039;&#039;&amp;lt;stream name&amp;gt;&#039;&#039;&#039;)  &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The hostname is the name of the computer to which an LSL-capable device (e.g. an eye tracker) is attached, or the name of an embedded computer, as is the case for Digital Event Recorders.&lt;br /&gt;
&lt;br /&gt;
The following code example checks whether an LSL stream is available for the stream type &#039;&#039;&#039;Digital Events&#039;&#039;&#039; on host &#039;&#039;&#039;lslder04&#039;&#039;&#039; and the stream name &#039;&#039;&#039;Digital Events 1&#039;&#039;&#039;:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
lslInfo = lsl_resolver(&#039;type=&#039;&#039;Digital Events @ lslder04&#039;&#039; and name=&#039;&#039;Digital Events 1&#039;&#039;&#039;);&lt;br /&gt;
streamInfoList = lslInfo.list;&lt;br /&gt;
if isempty(streamInfoList)&lt;br /&gt;
   error(&#039;no streams found&#039;);&lt;br /&gt;
end &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
In this case the &#039;&#039;&#039;streamInfoList&#039;&#039;&#039; contains only one stream (or none). If you want to find all available streams you can use:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
lslInfo = lsl_resolver&lt;br /&gt;
streamInfoList = lslInfo.list;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The following code prints all the types and names of the streams that are found:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
for i = 1:size(streamInfoList,1)&lt;br /&gt;
    fprintf(&#039;%d: name: &#039;&#039;%s&#039;&#039; type: &#039;&#039;%s&#039;&#039;\n&#039;,i,streamInfoList{i}.name,streamInfoList{i}.type);&lt;br /&gt;
end&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
You can open the n-th resolved stream by using &#039;&#039;&#039;lsl_istream&#039;&#039;&#039;:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
stream_n = lsl_istream(lslInfo{n});&lt;br /&gt;
stream_m = lsl_istream(lslInfo{m});&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The following code creates an &#039;&#039;&#039;lsl_session&#039;&#039;&#039; and adds two streams to the session. You can add as many streams as you like.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
mySession = lsl_session();&lt;br /&gt;
mySession.add_stream(stream_n);&lt;br /&gt;
mySession.add_stream(stream_m);&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
You can then start the session, run your experiment, stop the session, and read the data from the streams.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
mySession.start;&lt;br /&gt;
% do your experiment&lt;br /&gt;
mySession.stop;&lt;br /&gt;
data_n = stream_n.read;&lt;br /&gt;
data_m = stream_m.read;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
More information and examples can be found [https://gitlab.science.ru.nl/marcw/biofysica/-/blob/master/liblsl/liblsl-Matlab/examples/README.md?ref_type=heads here] on Gitlab.&lt;br /&gt;
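&lt;br /&gt;
Putting the steps together, a minimal end-to-end sketch (reusing the &#039;&#039;&#039;Digital Events 1&#039;&#039;&#039; stream on host &#039;&#039;&#039;lslder04&#039;&#039;&#039; from the example above; replace these with your own stream type, name and host) could look like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
% Sketch: resolve, record and read a single LSL stream.&lt;br /&gt;
lslInfo = lsl_resolver(&#039;type=&#039;&#039;Digital Events @ lslder04&#039;&#039; and name=&#039;&#039;Digital Events 1&#039;&#039;&#039;);&lt;br /&gt;
streamInfoList = lslInfo.list;&lt;br /&gt;
if isempty(streamInfoList)&lt;br /&gt;
    error(&#039;no streams found&#039;);&lt;br /&gt;
end&lt;br /&gt;
&lt;br /&gt;
myStream  = lsl_istream(lslInfo{1});&lt;br /&gt;
mySession = lsl_session();&lt;br /&gt;
mySession.add_stream(myStream);&lt;br /&gt;
&lt;br /&gt;
mySession.start;&lt;br /&gt;
% ... run your experiment ...&lt;br /&gt;
mySession.stop;&lt;br /&gt;
data = myStream.read;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;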
&lt;br /&gt;
==Some devices with LSL==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! Device&lt;br /&gt;
! Stream Type&lt;br /&gt;
! Stream Name&lt;br /&gt;
|-&lt;br /&gt;
| Digital Event Recorder&lt;br /&gt;
| Digital Events @ &amp;lt;hostname&amp;gt;&lt;br /&gt;
| Digital Events X&lt;br /&gt;
|-&lt;br /&gt;
| Digital Event Recorder&lt;br /&gt;
| Markers @ &amp;lt;hostname&amp;gt;&lt;br /&gt;
| Digital Markers&lt;br /&gt;
|-&lt;br /&gt;
| Pupil Labs Eyetracker&lt;br /&gt;
| Pupil Capture @ &amp;lt;hostname&amp;gt;&lt;br /&gt;
| Pupil Primitive Data - Eye 0&lt;br /&gt;
|-&lt;br /&gt;
| Pupil Labs Eyetracker&lt;br /&gt;
| Pupil Capture @ &amp;lt;hostname&amp;gt;&lt;br /&gt;
| Pupil Python Representation - Eye 0&lt;br /&gt;
|-&lt;br /&gt;
| Pupil Labs Eyetracker&lt;br /&gt;
| Pupil Gaze @ &amp;lt;hostname&amp;gt;&lt;br /&gt;
| ?&lt;br /&gt;
|-&lt;br /&gt;
| OptiTrack Eyetracker&lt;br /&gt;
| OptiTrack Mocap @ &amp;lt;hostname&amp;gt;&lt;br /&gt;
| Labeled Markers&lt;br /&gt;
|-&lt;br /&gt;
| EyeSeeCam Eyetracker&lt;br /&gt;
| EyeSeeCam @ &amp;lt;hostname&amp;gt;&lt;br /&gt;
| ?&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=EyeSeeCam_SCI&amp;diff=4902</id>
		<title>EyeSeeCam SCI</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=EyeSeeCam_SCI&amp;diff=4902"/>
		<updated>2026-04-01T11:23:29Z</updated>

		<summary type="html">&lt;p&gt;Lof: /* Matlab programming */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[File:EyeSeeCam_SCI2.png|thumb|EyeSeeCam SCI]]&lt;br /&gt;
==Description==&lt;br /&gt;
The EyeSeeCam SCI is a combined eye tracker and head movement tracker. The head tracking is done with an IMU and the eye tracking with cameras and software that track the pupils of both eyes. The tracking data can be read out directly in Matlab. When a dedicated computer is used for reading out the EyeSeeCam, the data is converted to an LSL stream. &lt;br /&gt;
&lt;br /&gt;
The EyeSeeCam SCI can record at 250 and 500 Hz; we typically record at 500 Hz.&lt;br /&gt;
&lt;br /&gt;
==Setup==&lt;br /&gt;
&lt;br /&gt;
* The EyeSeeCam is connected to a NUC with two USB-C cables.&lt;br /&gt;
* The NUC is connected via Ethernet to a Mac Mini.&lt;br /&gt;
* The Mac Mini runs the EyeSeeCam SCI program.&lt;br /&gt;
* The Mac Mini runs an LSL server for data transfer to a Windows computer.&lt;br /&gt;
* The Windows computer runs Matlab and connects to the LSL server on the Mac Mini via an LSL stream.&lt;br /&gt;
&lt;br /&gt;
==Startup Instructions==&lt;br /&gt;
&lt;br /&gt;
* Connect the EyeSeeCam to the NUC&lt;br /&gt;
* Remove both lens covers of the EyeSeeCam&lt;br /&gt;
* Start the NUC&lt;br /&gt;
* Start the Mac Mini&lt;br /&gt;
* Log in on the Mac Mini as &#039;EyeSeeCam&#039; with password &#039;Gimbal&#039;&lt;br /&gt;
* Start the LSL script on the Mac Mini (it is on the desktop)&lt;br /&gt;
* Start the EyeSeeCam SCI program on the Mac Mini&lt;br /&gt;
* Start a Matlab program on the Windows computer (e.g. EG_program.m)&lt;br /&gt;
* Press the &amp;lt;begin&amp;gt; button in the EyeSeeCam SCI program on the Mac Mini (in the top right corner)&lt;br /&gt;
* Let the subject put on the EyeSeeCam&lt;br /&gt;
* Press &amp;lt;prepare&amp;gt; in the EyeSeeCam SCI program on the Mac Mini (shows the traces)&lt;br /&gt;
* Press &amp;lt;start&amp;gt; in the EyeSeeCam SCI program on the Mac Mini (starts collecting data)&lt;br /&gt;
* Start the experiment&lt;br /&gt;
&lt;br /&gt;
==Matlab coding==&lt;br /&gt;
&lt;br /&gt;
The EyeSeeCam is connected to a Mac Mini, which runs an LSL server (for more information about LSL see [[Lab Streaming Layer]]). To get data from the EyeSeeCam you create an LSL stream and session in Matlab.&lt;br /&gt;
&lt;br /&gt;
Here is an example:&lt;br /&gt;
&lt;br /&gt;
 type = &#039;EyeSeeCam @ dcn-pl04&#039;;&lt;br /&gt;
 name = &#039;EyeSeeCam SCI Data&#039;;&lt;br /&gt;
 lslString = sprintf(&#039;type=&#039;&#039;%s&#039;&#039; and name=&#039;&#039;%s&#039;&#039;&#039;, type, name);&lt;br /&gt;
 info=lsl_resolver(lslString);&lt;br /&gt;
 lslStream = lsl_istream(info{1});&lt;br /&gt;
&lt;br /&gt;
 session = lsl_session;&lt;br /&gt;
 session.add_stream(lslStream);&lt;br /&gt;
&lt;br /&gt;
 session.start;&lt;br /&gt;
 ...wait some time...&lt;br /&gt;
 session.stop;&lt;br /&gt;
 data = lslStream.read;&lt;br /&gt;
&lt;br /&gt;
==Construction of a gaze trace==&lt;br /&gt;
The data from EyeSeeCam is recorded in different coordinate systems than the double polar system that we use to present our targets. The head and eye tracking data each have their own coordinate system. To accurately analyze our data we have to transform these coordinate systems to double polar coordinates in the lab frame. In order to relate the EyeSeeCam coordinates to the lab coordinates, we first need to record a starting gaze position for each trial. This means that during the recording, the subject has to look at the center speaker (0,0) when the trial starts. &lt;br /&gt;
&lt;br /&gt;
From the EyeSeeCam data we can construct a head movement trace and an eye movement trace. These can be combined to create a gaze trace.&lt;br /&gt;
&lt;br /&gt;
In Matlab we can use the class &#039;EyeSeeCamSci_Trace&#039; for calculating head, eye and gaze traces in RAS, HVF and DP coordinates.&lt;br /&gt;
&lt;br /&gt;
===Defining a coordinate system===&lt;br /&gt;
&lt;br /&gt;
For the interpretation of the EyeSeeCam data we define the starting direction as the direction in which the subject looks at the beginning of each trial. The recording of the data should start when the subject is looking in this direction. &amp;lt;br&amp;gt;&lt;br /&gt;
The X, Y and Z of the EyeSeeCam data are in terms of RAS coordinates:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
X = anterior&lt;br /&gt;
Y = -right&lt;br /&gt;
Z = superior&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
* When using &#039;EyeSeeCamSci_Trace&#039;, the head and gaze trace values in RAS coordinates are relative to the starting position of the subject.&lt;br /&gt;
* The HVF and DP coordinates of the eye trace are only valid when the head stays in the starting position.&lt;br /&gt;
&lt;br /&gt;
===Head movement===&lt;br /&gt;
The EyeSeeCam SCI provides angular velocities for rotations around the X, Y and Z axes. For our analyses we want position data, so we have to calculate it from these angular velocities.&lt;br /&gt;
&lt;br /&gt;
The head rotation axes X, Y and Z are defined with respect to the EyeSeeCam itself.&lt;br /&gt;
&lt;br /&gt;
*&#039;&#039;The EyeSeeCam data is &#039;&#039;&#039;not&#039;&#039;&#039; in the lab frame!&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
In order to create a head movement trace we have to calculate rotation matrices (delta_Rx, delta_Ry and delta_Rz) in EyeSeeCam coordinates for every time step. &lt;br /&gt;
&lt;br /&gt;
Since the time steps are about 2 ms (or 4 ms), the rotation for each time step is small. Therefore the order in which delta_Rx, delta_Ry and delta_Rz are multiplied is not important, and we calculate delta_R(t) in EyeSeeCam coordinates by&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
delta_R(t) = delta_Rx * delta_Ry * delta_Rz.&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
To get the total rotation matrix in lab coordinates at time t we have to iterate over all time steps:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
R(t=0) = identity matrix&lt;br /&gt;
for t = 0 to t = tmax&lt;br /&gt;
   R(t) = R(t-delta_t) * delta_R(t)&lt;br /&gt;
end &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
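&lt;br /&gt;
In Matlab the iteration above can be sketched as follows (a sketch, assuming vectors Vx_deg, Vy_deg and Vz_deg with angular velocities in deg/s and a sample interval dt in seconds; these variable names are hypothetical):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
% Sketch: integrate angular velocities into a rotation matrix per sample.&lt;br /&gt;
nSamples = numel(Vx_deg);&lt;br /&gt;
R = eye(3);                        % R(t=0) = identity matrix&lt;br /&gt;
for t = 1:nSamples&lt;br /&gt;
    ax = deg2rad(Vx_deg(t)) * dt;  % small rotation angle for this step&lt;br /&gt;
    ay = deg2rad(Vy_deg(t)) * dt;&lt;br /&gt;
    az = deg2rad(Vz_deg(t)) * dt;&lt;br /&gt;
    dRx = [1 0 0; 0 cos(ax) -sin(ax); 0 sin(ax) cos(ax)];&lt;br /&gt;
    dRy = [cos(ay) 0 sin(ay); 0 1 0; -sin(ay) 0 cos(ay)];&lt;br /&gt;
    dRz = [cos(az) -sin(az) 0; sin(az) cos(az) 0; 0 0 1];&lt;br /&gt;
    R = R * (dRx * dRy * dRz);     % R(t) = R(t-delta_t) * delta_R(t)&lt;br /&gt;
end&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;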
&lt;br /&gt;
===Eye movement===&lt;br /&gt;
&lt;br /&gt;
The eye movements are given by rotations in the same coordinate system as the head movements.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
X = anterior (torsion of the pupil in the gaze direction)&lt;br /&gt;
Y = -right&lt;br /&gt;
Z = superior&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The torsion is not important for the gaze direction and is therefore excluded from the calculation.&lt;br /&gt;
&lt;br /&gt;
Azimuth can be defined as a rotation about the Z axis and Elevation as a contra rotation about the Y axis.&lt;br /&gt;
&lt;br /&gt;
Since the eye gaze is given as rotation angles in OCS coordinates, and the eye and head tracker coordinate systems are fixed relative to each other, we can easily transform OCS coordinates to EyeSeeCam coordinates. We can then create rotation matrices in EyeSeeCam coordinates for the azimuth and elevation of the gaze. Since the torsion is excluded, the rotation matrices for azimuth and elevation can be multiplied in either order to get the eye rotation matrix in EyeSeeCam coordinates.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eye_Rz(t) = Rz(eye_azimuth(t))&lt;br /&gt;
eye_Ry(t) = Ry(-eye_elevation(t))&lt;br /&gt;
eye_R(t)  = eye_Ry(t) * eye_Rz(t)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
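&lt;br /&gt;
For a single sample this can be written out in Matlab as follows (a sketch; the scalar angles eye_azimuth_deg and eye_elevation_deg are hypothetical names for the azimuth and elevation in degrees):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
% Sketch: eye rotation matrix from azimuth and elevation (degrees).&lt;br /&gt;
a = deg2rad(eye_azimuth_deg);&lt;br /&gt;
e = deg2rad(-eye_elevation_deg);   % contra rotation about the Y axis&lt;br /&gt;
eye_Rz = [cos(a) -sin(a) 0; sin(a) cos(a) 0; 0 0 1];&lt;br /&gt;
eye_Ry = [cos(e) 0 sin(e); 0 1 0; -sin(e) 0 cos(e)];&lt;br /&gt;
eye_R  = eye_Ry * eye_Rz;          % order is free once torsion is excluded&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;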
&lt;br /&gt;
===Gaze movement===&lt;br /&gt;
&lt;br /&gt;
The gaze movement is the combined head and eye movement.&lt;br /&gt;
To create the total rotation matrix that represents the gaze rotation in lab coordinates, you need to multiply the two rotation matrices for the head and eye movement in the correct order. We have:&lt;br /&gt;
&lt;br /&gt;
* a rotation matrix for the head in lab coordinates,&lt;br /&gt;
* a rotation matrix for the eyes in head coordinates.&lt;br /&gt;
&lt;br /&gt;
Multiply the rotation matrix of the head in lab coordinates by the rotation matrix of the eyes in head coordinates:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
R_gaze(t) = head_R(t) * eye_R(t)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Finally, you can multiply R_gaze(t) by the starting gaze vector to get the gaze (in lab coordinates) at time t. We assume that the starting gaze is in the forward direction, which is the X direction in lab coordinates.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
startingGaze = [1; 0; 0]&lt;br /&gt;
gaze(t) = R_gaze(t) * startingGaze&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Matlab code for converting raw data==&lt;br /&gt;
&lt;br /&gt;
===Functions for converting EyeSeeCam data to double polar data===&lt;br /&gt;
There is one function for head movement, one for eye movement and one for gaze movement. The gaze movement combines head and eye movement.&lt;br /&gt;
&lt;br /&gt;
The functions take head rotation speeds and eye rotations (in degrees) as input and return objects containing the traces (head, eye and gaze respectively) in double polar coordinates.&lt;br /&gt;
&lt;br /&gt;
* headTrace_DP = EyeSeeCamSciHeadRotationData2Trace_DP(Vx_deg, Vy_deg, Vz_deg)&lt;br /&gt;
* eyeTrace_DP  = EyeSeeCamSciEyeRotationData2Trace_DP(Zocs_deg, Yocs_deg)&lt;br /&gt;
* gazeTrace_DP = EyeSeeCamSciHeadAndEyeRotationData2Trace_DP(Vx_deg, Vy_deg, Vz_deg, Zocs_deg, Yocs_deg)&lt;br /&gt;
&lt;br /&gt;
Vx, Vy and Vz are the head rotation speeds in EyeSeeCam coordinates.&lt;br /&gt;
Zocs and Yocs are the eye rotations in OCS coordinates.&lt;br /&gt;
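&lt;br /&gt;
A typical call sequence could look like this (a sketch, assuming the head velocities in deg/s and the eye rotations in degrees have already been extracted from the recorded data; the input variable names are hypothetical):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
% Sketch: convert raw head and eye rotation data to double polar traces.&lt;br /&gt;
headTrace_DP = EyeSeeCamSciHeadRotationData2Trace_DP(Vx_deg, Vy_deg, Vz_deg);&lt;br /&gt;
eyeTrace_DP  = EyeSeeCamSciEyeRotationData2Trace_DP(Zocs_deg, Yocs_deg);&lt;br /&gt;
gazeTrace_DP = EyeSeeCamSciHeadAndEyeRotationData2Trace_DP(Vx_deg, Vy_deg, Vz_deg, Zocs_deg, Yocs_deg);&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;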
&lt;br /&gt;
===EyeSeeCam data===&lt;br /&gt;
The recorded data is read from an LSL stream.&lt;br /&gt;
&lt;br /&gt;
The lsldata is a struct and has the following fields:&lt;br /&gt;
*escdata&lt;br /&gt;
*escmetadata&lt;br /&gt;
*escstr&lt;br /&gt;
*evdata0&lt;br /&gt;
*evdata1&lt;br /&gt;
*evdata2&lt;br /&gt;
*evdata3&lt;br /&gt;
*evdata4&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The lsldata.escdata field contains the eye and head data as a matrix with a row for every parameter (63 parameters in total). The field lsldata.escmetadata.channel lists the names of all parameters.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%Eye data&lt;br /&gt;
&lt;br /&gt;
RightEyePosX = 46;&lt;br /&gt;
RightEyePosY = 47;&lt;br /&gt;
RightEyePosZ = 48;&lt;br /&gt;
&lt;br /&gt;
xeye = lsldata.escdata.Data(46,:);       % right 46, left 32&lt;br /&gt;
yeye = lsldata.escdata.Data(47,:);       % right 47, left 33&lt;br /&gt;
zeye = lsldata.escdata.Data(48,:);       % right 48, left 34&lt;br /&gt;
&lt;br /&gt;
% Head data&lt;br /&gt;
HeadInertialVelXCal = 27;&lt;br /&gt;
HeadInertialVelYCal = 29;&lt;br /&gt;
HeadInertialVelZCal = 31;&lt;br /&gt;
&lt;br /&gt;
xh = lsldata.escdata.Data(27,:);       % calibrated torsion velocity data HEAD&lt;br /&gt;
yh = lsldata.escdata.Data(29,:);       % calibrated vertical velocity data HEAD&lt;br /&gt;
zh = lsldata.escdata.Data(31,:);       % calibrated horizontal velocity data HEAD&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=EyeSeeCam_SCI&amp;diff=4901</id>
		<title>EyeSeeCam SCI</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=EyeSeeCam_SCI&amp;diff=4901"/>
		<updated>2026-04-01T11:22:33Z</updated>

		<summary type="html">&lt;p&gt;Lof: /* EyeSeeCam data */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[File:EyeSeeCam_SCI2.png|thumb|EyeSeeCam SCI]]&lt;br /&gt;
==Description==&lt;br /&gt;
The EyeSeeCam SCI is a combined eye tracker and head movement tracker. The head tracking is done with an IMU and the eye tracking with cameras and software that track the pupils of both eyes. The tracking data can be read out directly in Matlab. When a dedicated computer is used for reading out the EyeSeeCam, the data is converted to an LSL stream. &lt;br /&gt;
&lt;br /&gt;
The EyeSeeCam SCI can record at 250 and 500 Hz; we typically record at 500 Hz.&lt;br /&gt;
&lt;br /&gt;
==Setup==&lt;br /&gt;
&lt;br /&gt;
* The EyeSeeCam is connected to a NUC with two USB-C cables.&lt;br /&gt;
* The NUC is connected via Ethernet to a Mac Mini.&lt;br /&gt;
* The Mac Mini runs the EyeSeeCam SCI program.&lt;br /&gt;
* The Mac Mini runs an LSL server for data transfer to a Windows computer.&lt;br /&gt;
* The Windows computer runs Matlab and connects to the LSL server on the Mac Mini via an LSL stream.&lt;br /&gt;
&lt;br /&gt;
==Startup Instructions==&lt;br /&gt;
&lt;br /&gt;
* Connect the EyeSeeCam to the NUC&lt;br /&gt;
* Remove both lens covers of the EyeSeeCam&lt;br /&gt;
* Start the NUC&lt;br /&gt;
* Start the Mac Mini&lt;br /&gt;
* Log in on the Mac Mini as &#039;EyeSeeCam&#039; with password &#039;Gimbal&#039;&lt;br /&gt;
* Start the LSL script on the Mac Mini (it is on the desktop)&lt;br /&gt;
* Start the EyeSeeCam SCI program on the Mac Mini&lt;br /&gt;
* Start a Matlab program on the Windows computer (e.g. EG_program.m)&lt;br /&gt;
* Press the &amp;lt;begin&amp;gt; button in the EyeSeeCam SCI program on the Mac Mini (in the top right corner)&lt;br /&gt;
* Let the subject put on the EyeSeeCam&lt;br /&gt;
* Press &amp;lt;prepare&amp;gt; in the EyeSeeCam SCI program on the Mac Mini (shows the traces)&lt;br /&gt;
* Press &amp;lt;start&amp;gt; in the EyeSeeCam SCI program on the Mac Mini (starts collecting data)&lt;br /&gt;
* Start the experiment&lt;br /&gt;
&lt;br /&gt;
==Matlab coding==&lt;br /&gt;
&lt;br /&gt;
The EyeSeeCam is connected to a Mac Mini, which runs an LSL server (for more information about LSL see [[Lab Streaming Layer]]). To get data from the EyeSeeCam you create an LSL stream and session in Matlab.&lt;br /&gt;
&lt;br /&gt;
Here is an example:&lt;br /&gt;
&lt;br /&gt;
 type = &#039;EyeSeeCam @ dcn-pl04&#039;;&lt;br /&gt;
 name = &#039;EyeSeeCam SCI Data&#039;;&lt;br /&gt;
 lslString = sprintf(&#039;type=&#039;&#039;%s&#039;&#039; and name=&#039;&#039;%s&#039;&#039;&#039;, type, name);&lt;br /&gt;
 info=lsl_resolver(lslString);&lt;br /&gt;
 lslStream = lsl_istream(info{1});&lt;br /&gt;
&lt;br /&gt;
 session = lsl_session;&lt;br /&gt;
 session.add_stream(lslStream);&lt;br /&gt;
&lt;br /&gt;
 session.start;&lt;br /&gt;
 ...wait some time...&lt;br /&gt;
 session.stop;&lt;br /&gt;
 data = lslStream.read;&lt;br /&gt;
&lt;br /&gt;
==Construction of a gaze trace==&lt;br /&gt;
The data from EyeSeeCam is recorded in different coordinate systems than the double polar system that we use to present our targets. The head and eye tracking data each have their own coordinate system. To accurately analyze our data we have to transform these coordinate systems to double polar coordinates in the lab frame. In order to relate the EyeSeeCam coordinates to the lab coordinates, we first need to record a starting gaze position for each trial. This means that during the recording, the subject has to look at the center speaker (0,0) when the trial starts. &lt;br /&gt;
&lt;br /&gt;
From the EyeSeeCam data we can construct a head movement trace and an eye movement trace. These can be combined to create a gaze trace.&lt;br /&gt;
&lt;br /&gt;
In Matlab we can use the class &#039;EyeSeeCamSci_Trace&#039; for calculating head, eye and gaze traces in RAS, HVF and DP coordinates.&lt;br /&gt;
&lt;br /&gt;
===Defining a coordinate system===&lt;br /&gt;
&lt;br /&gt;
For the interpretation of the EyeSeeCam data we define the starting direction as the direction in which the subject looks at the beginning of each trial. The recording of the data should start when the subject is looking in this direction. &amp;lt;br&amp;gt;&lt;br /&gt;
The X, Y and Z of the EyeSeeCam data are in terms of RAS coordinates:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
X = anterior&lt;br /&gt;
Y = -right&lt;br /&gt;
Z = superior&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
* When using &#039;EyeSeeCamSci_Trace&#039;, the head and gaze trace values in RAS coordinates are relative to the starting position of the subject.&lt;br /&gt;
* The HVF and DP coordinates of the eye trace are only valid when the head stays in the starting position.&lt;br /&gt;
&lt;br /&gt;
===Head movement===&lt;br /&gt;
The EyeSeeCam SCI provides angular velocities for rotations around the X, Y and Z axes. For our analyses we want position data, so we have to calculate it from these angular velocities.&lt;br /&gt;
&lt;br /&gt;
The head rotation axes X, Y and Z are defined with respect to the EyeSeeCam itself.&lt;br /&gt;
&lt;br /&gt;
*&#039;&#039;The EyeSeeCam data is &#039;&#039;&#039;not&#039;&#039;&#039; in the lab frame!&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
In order to create a head movement trace we have to calculate rotation matrices (delta_Rx, delta_Ry and delta_Rz) in EyeSeeCam coordinates for every time step. &lt;br /&gt;
&lt;br /&gt;
Since the time steps are about 2 ms (or 4 ms), the rotation for each time step is small. Therefore the order in which delta_Rx, delta_Ry and delta_Rz are multiplied is not important, and we calculate delta_R(t) in EyeSeeCam coordinates by&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
delta_R(t) = delta_Rx * delta_Ry * delta_Rz.&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
To get the total rotation matrix in lab coordinates at time t we have to iterate over all time steps:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
R(t=0) = identity matrix&lt;br /&gt;
for t = 0 to t = tmax&lt;br /&gt;
   R(t) = R(t-delta_t) * delta_R(t)&lt;br /&gt;
end &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Eye movement===&lt;br /&gt;
&lt;br /&gt;
The eye movements are given by rotations in the same coordinate system as the head movements.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
X = anterior (torsion of the pupil in the gaze direction)&lt;br /&gt;
Y = -right&lt;br /&gt;
Z = superior&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The torsion is not important for the gaze direction and is therefore excluded from the calculation.&lt;br /&gt;
&lt;br /&gt;
Azimuth can be defined as a rotation about the Z axis and Elevation as a contra rotation about the Y axis.&lt;br /&gt;
&lt;br /&gt;
Since the eye gaze is given as rotation angles in OCS coordinates, and the eye and head tracker coordinate systems are fixed relative to each other, we can easily transform OCS coordinates to EyeSeeCam coordinates. We can then create rotation matrices in EyeSeeCam coordinates for the azimuth and elevation of the gaze. Since the torsion is excluded, the rotation matrices for azimuth and elevation can be multiplied in either order to get the eye rotation matrix in EyeSeeCam coordinates.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eye_Rz(t) = Rz(eye_azimuth(t))&lt;br /&gt;
eye_Ry(t) = Ry(-eye_elevation(t))&lt;br /&gt;
eye_R(t)  = eye_Ry(t) * eye_Rz(t)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Gaze movement===&lt;br /&gt;
&lt;br /&gt;
The gaze movement is the combined head and eye movement.&lt;br /&gt;
To create the total rotation matrix that represents the gaze rotation in lab coordinates, you need to multiply the two rotation matrices for the head and eye movement in the correct order. We have:&lt;br /&gt;
&lt;br /&gt;
* a rotation matrix for the head in lab coordinates,&lt;br /&gt;
* a rotation matrix for the eyes in head coordinates.&lt;br /&gt;
&lt;br /&gt;
Multiply the rotation matrix of the head in lab coordinates by the rotation matrix of the eyes in head coordinates:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
R_gaze(t) = head_R(t) * eye_R(t)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Finally, you can multiply R_gaze(t) by the starting gaze vector to get the gaze (in lab coordinates) at time t. We assume that the starting gaze is in the forward direction, which is the X direction in lab coordinates.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
startingGaze = [1; 0; 0]&lt;br /&gt;
gaze(t) = R_gaze(t) * startingGaze&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Matlab programming==&lt;br /&gt;
&lt;br /&gt;
===Functions for converting EyeSeeCam data to double polar data===&lt;br /&gt;
There is one function for head movement, one for eye movement and one for gaze movement. The gaze movement combines head and eye movement.&lt;br /&gt;
&lt;br /&gt;
The functions take head rotation speeds and eye rotations (in degrees) as input and return objects containing the traces (head, eye and gaze respectively) in double polar coordinates.&lt;br /&gt;
&lt;br /&gt;
* headTrace_DP = EyeSeeCamSciHeadRotationData2Trace_DP(Vx_deg, Vy_deg, Vz_deg)&lt;br /&gt;
* eyeTrace_DP  = EyeSeeCamSciEyeRotationData2Trace_DP(Zocs_deg, Yocs_deg)&lt;br /&gt;
* gazeTrace_DP = EyeSeeCamSciHeadAndEyeRotationData2Trace_DP(Vx_deg, Vy_deg, Vz_deg, Zocs_deg, Yocs_deg)&lt;br /&gt;
&lt;br /&gt;
Vx, Vy and Vz are the head rotation speeds in EyeSeeCam coordinates.&lt;br /&gt;
Zocs and Yocs are the eye rotations in OCS coordinates.&lt;br /&gt;
&lt;br /&gt;
===EyeSeeCam data===&lt;br /&gt;
The recorded data is read from an LSL stream.&lt;br /&gt;
&lt;br /&gt;
The lsldata is a struct and has the following fields:&lt;br /&gt;
*escdata&lt;br /&gt;
*escmetadata&lt;br /&gt;
*escstr&lt;br /&gt;
*evdata0&lt;br /&gt;
*evdata1&lt;br /&gt;
*evdata2&lt;br /&gt;
*evdata3&lt;br /&gt;
*evdata4&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The lsldata.escdata field contains the eye and head data as a matrix with a row for every parameter (63 parameters in total). The field lsldata.escmetadata.channel lists the names of all parameters.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%Eye data&lt;br /&gt;
&lt;br /&gt;
RightEyePosX = 46;&lt;br /&gt;
RightEyePosY = 47;&lt;br /&gt;
RightEyePosZ = 48;&lt;br /&gt;
&lt;br /&gt;
xeye = lsldata.escdata.Data(46,:);       % right 46, left 32&lt;br /&gt;
yeye = lsldata.escdata.Data(47,:);       % right 47, left 33&lt;br /&gt;
zeye = lsldata.escdata.Data(48,:);       % right 48, left 34&lt;br /&gt;
&lt;br /&gt;
% Head data&lt;br /&gt;
HeadInertialVelXCal = 27;&lt;br /&gt;
HeadInertialVelYCal = 29;&lt;br /&gt;
HeadInertialVelZCal = 31;&lt;br /&gt;
&lt;br /&gt;
xh = lsldata.escdata.Data(27,:);       % calibrated torsion velocity data HEAD&lt;br /&gt;
yh = lsldata.escdata.Data(29,:);       % calibrated vertical velocity data HEAD&lt;br /&gt;
zh = lsldata.escdata.Data(31,:);       % calibrated horizontal velocity data HEAD&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=EyeSeeCam_SCI&amp;diff=4900</id>
		<title>EyeSeeCam SCI</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=EyeSeeCam_SCI&amp;diff=4900"/>
		<updated>2026-04-01T11:20:56Z</updated>

		<summary type="html">&lt;p&gt;Lof: /* Matlab coding */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[File:EyeSeeCam_SCI2.png|thumb|EyeSeeCam SCI]]&lt;br /&gt;
==Description==&lt;br /&gt;
The EyeSeeCam SCI is a combined eye tracker and head movement tracker. The head tracking is done with an IMU and the eye tracking with cameras and software that track the pupils of both eyes. The tracking data can be read out directly in Matlab. When a dedicated computer is used for reading out the EyeSeeCam, the data is converted to an LSL stream. &lt;br /&gt;
&lt;br /&gt;
The EyeSeeCam SCI can record at 250 and 500 Hz; we typically record at 500 Hz.&lt;br /&gt;
&lt;br /&gt;
==Setup==&lt;br /&gt;
&lt;br /&gt;
* The EyeSeeCam is connected to a NUC with two USB-C cables.&lt;br /&gt;
* The NUC is connected via Ethernet to a Mac Mini.&lt;br /&gt;
* The Mac Mini runs the EyeSeeCam SCI program.&lt;br /&gt;
* The Mac Mini runs an LSL server for data transfer to a Windows computer.&lt;br /&gt;
* The Windows computer runs Matlab and connects to the LSL server on the Mac Mini via an LSL stream.&lt;br /&gt;
&lt;br /&gt;
==Startup Instructions==&lt;br /&gt;
&lt;br /&gt;
* Connect the EyeSeeCam to the NUC&lt;br /&gt;
* Remove both lens covers of the EyeSeeCam&lt;br /&gt;
* Start the NUC&lt;br /&gt;
* Start the Mac Mini&lt;br /&gt;
* Log in on the Mac Mini as &#039;EyeSeeCam&#039; with password &#039;Gimbal&#039;&lt;br /&gt;
* Start the LSL script on the Mac Mini (it is on the desktop)&lt;br /&gt;
* Start the EyeSeeCam SCI program on the Mac Mini&lt;br /&gt;
* Start a Matlab program on the Windows computer (e.g. EG_program.m)&lt;br /&gt;
* Press the &amp;lt;begin&amp;gt; button in the EyeSeeCam SCI program on the Mac Mini (in the top right corner)&lt;br /&gt;
* Let the subject put on the EyeSeeCam&lt;br /&gt;
* Press &amp;lt;prepare&amp;gt; in the EyeSeeCam SCI program on the Mac Mini (shows the traces)&lt;br /&gt;
* Press &amp;lt;start&amp;gt; in the EyeSeeCam SCI program on the Mac Mini (starts collecting data)&lt;br /&gt;
* Start the experiment&lt;br /&gt;
&lt;br /&gt;
==Matlab coding==&lt;br /&gt;
&lt;br /&gt;
The EyeSeeCam is connected to a Mac Mini, which runs an LSL server (for more information about LSL see [[Lab Streaming Layer]]). To get data from the EyeSeeCam you create an LSL stream and session in Matlab.&lt;br /&gt;
&lt;br /&gt;
Here is an example:&lt;br /&gt;
&lt;br /&gt;
 type = &#039;EyeSeeCam @ dcn-pl04&#039;;&lt;br /&gt;
 name = &#039;EyeSeeCam SCI Data&#039;;&lt;br /&gt;
 lslString = sprintf(&#039;type=&#039;&#039;%s&#039;&#039; and name=&#039;&#039;%s&#039;&#039;&#039;, type, name);&lt;br /&gt;
 info=lsl_resolver(lslString);&lt;br /&gt;
 lslStream = lsl_istream(info{1});&lt;br /&gt;
&lt;br /&gt;
 session = lsl_session;&lt;br /&gt;
 session.add_stream(lslStream);&lt;br /&gt;
&lt;br /&gt;
 session.start;&lt;br /&gt;
 ...wait some time...&lt;br /&gt;
 session.stop;&lt;br /&gt;
 data = lslStream.read;&lt;br /&gt;
&lt;br /&gt;
==Construction of a gaze trace==&lt;br /&gt;
The data from EyeSeeCam is recorded in different coordinate systems than the double polar system that we use to present our targets. The head and eye tracking data each have their own coordinate system. To accurately analyze our data we have to transform these coordinate systems to double polar coordinates in the lab frame. In order to relate the EyeSeeCam coordinates to the lab coordinates, we first need to record a starting gaze position for each trial. This means that during the recording, the subject has to look at the center speaker (0,0) when the trial starts. &lt;br /&gt;
&lt;br /&gt;
From the EyeSeeCam data we can construct a head movement trace and an eye movement trace. These can be combined to create a gaze trace.&lt;br /&gt;
&lt;br /&gt;
In Matlab we can use the class &#039;EyeSeeCamSci_Trace&#039; for calculating head, eye and gaze traces in RAS, HVF and DP coordinates.&lt;br /&gt;
&lt;br /&gt;
===Defining a coordinate system===&lt;br /&gt;
&lt;br /&gt;
For the interpretation of the EyeSeeCam data we define the starting direction as the direction in which the subject looks at the beginning of each trial. The recording of the data should start when the subject is looking in this direction. &amp;lt;br&amp;gt;&lt;br /&gt;
The X, Y and Z of the EyeSeeCam data are in terms of RAS coordinates:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
X = anterior&lt;br /&gt;
Y = -right&lt;br /&gt;
Z = superior&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
* When using &#039;EyeSeeCamSci_Trace&#039;, the head and gaze traces in RAS coordinates are relative to the starting position of the subject.&lt;br /&gt;
* The HVF and DP coordinates of the eye trace are only valid when the head stays in the starting position.&lt;br /&gt;
&lt;br /&gt;
===Head movement===&lt;br /&gt;
EyeSeeCam_Sci provides angular velocities for rotations around the X, Y and Z axes. For our analyses we want position data, so we have to calculate it from these angular velocities.&lt;br /&gt;
 &lt;br /&gt;
The head rotation axes X, Y and Z are defined with respect to the EyeSeeCam itself.&lt;br /&gt;
&lt;br /&gt;
*&#039;&#039;The EyeSeeCam data is &#039;&#039;&#039;not&#039;&#039;&#039; in the lab frame!&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
In order to create a head movement trace we have to calculate rotation matrices (delta_Rx, delta_Ry and delta_Rz) in EyeSeeCam coordinates for every time step. &lt;br /&gt;
&lt;br /&gt;
Since the time steps are about 2 ms (or 4 ms), the rotations in each time step are small. Therefore the order in which delta_Rx, delta_Ry and delta_Rz are multiplied is not important, and we calculate delta_R(t) in EyeSeeCam coordinates by&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
delta_R(t) = delta_Rx * delta_Ry * delta_Rz.&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
To get the total rotation matrix in lab coordinates at time t we have to iterate over all the time steps:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
R(t=0) = identity matrix&lt;br /&gt;
for t = 0 to t = tmax&lt;br /&gt;
   R(t) = R(t-delta_t) * delta_R(t)&lt;br /&gt;
end &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
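&lt;br /&gt;
In Matlab this head trace calculation could be sketched as follows. This is a minimal sketch: the variable names Vx_deg, Vy_deg, Vz_deg (angular velocities in deg/s) and the sample interval dt are assumptions for illustration, not names from the recorded data.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
dt = 0.002;                          % assumed sample interval in seconds (500 Hz)&lt;br /&gt;
R  = eye(3);                         % R(t=0) = identity matrix&lt;br /&gt;
for t = 1:numel(Vx_deg)&lt;br /&gt;
    ax = deg2rad(Vx_deg(t)) * dt;    % small rotation angle around X in this step&lt;br /&gt;
    ay = deg2rad(Vy_deg(t)) * dt;&lt;br /&gt;
    az = deg2rad(Vz_deg(t)) * dt;&lt;br /&gt;
    delta_Rx = [1 0 0; 0 cos(ax) -sin(ax); 0 sin(ax) cos(ax)];&lt;br /&gt;
    delta_Ry = [cos(ay) 0 sin(ay); 0 1 0; -sin(ay) 0 cos(ay)];&lt;br /&gt;
    delta_Rz = [cos(az) -sin(az) 0; sin(az) cos(az) 0; 0 0 1];&lt;br /&gt;
    R = R * (delta_Rx * delta_Ry * delta_Rz);    % accumulate: R(t) = R(t-dt) * delta_R(t)&lt;br /&gt;
end&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;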
&lt;br /&gt;
===Eye movement===&lt;br /&gt;
&lt;br /&gt;
The eye movements are given by rotations in the same coordinate system as the head movements.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
X = anterior (torsion of the pupil in the gaze direction)&lt;br /&gt;
Y = -right&lt;br /&gt;
Z = superior&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The torsion is not important for the gaze direction and is therefore excluded from the calculation.&lt;br /&gt;
&lt;br /&gt;
Azimuth can be defined as a rotation about the Z axis and elevation as a counter-rotation about the Y axis.&lt;br /&gt;
&lt;br /&gt;
Since the eye gaze is given as rotation angles in OCS coordinates and the eye and head tracker coordinate systems are fixed relative to each other, we can easily transform OCS coordinates to EyeSeeCam coordinates. We can then create rotation matrices in EyeSeeCam coordinates for the azimuth and elevation of the gaze. Since torsion is excluded, the rotation matrices for azimuth and elevation can be multiplied in either order to get the eye rotation matrix in EyeSeeCam coordinates.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eye_Rz(t) = Rz(eye_azimuth(t))&lt;br /&gt;
eye_Ry(t) = Ry(-eye_elevation(t))&lt;br /&gt;
eye_R(t)  = eye_Ry(t) * eye_Rz(t)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
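&lt;br /&gt;
Here Rz and Ry are the standard rotation matrices about the Z and Y axes. As a Matlab sketch for one sample (eye_azimuth and eye_elevation, in degrees, are assumed names for the angles after conversion from OCS coordinates):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
az = deg2rad(eye_azimuth(t));&lt;br /&gt;
el = -deg2rad(eye_elevation(t));                           % counter-rotation about Y&lt;br /&gt;
eye_Rz = [cos(az) -sin(az) 0; sin(az) cos(az) 0; 0 0 1];   % azimuth about Z&lt;br /&gt;
eye_Ry = [cos(el) 0 sin(el); 0 1 0; -sin(el) 0 cos(el)];&lt;br /&gt;
eye_R  = eye_Ry * eye_Rz;                                  % eye rotation matrix&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;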
&lt;br /&gt;
===Gaze movement===&lt;br /&gt;
&lt;br /&gt;
The gaze movement is the combined head and eye movement.&lt;br /&gt;
To create the total rotation matrix that represents the &#039;gaze&#039; rotation in lab coordinates, the two rotation matrices for head and eye movement have to be multiplied in the correct order. We have:&lt;br /&gt;
&lt;br /&gt;
* A rotation matrix for the head in lab coordinates.&lt;br /&gt;
* A rotation matrix for the eyes in head coordinates.&lt;br /&gt;
&lt;br /&gt;
Multiply the rotation matrix of the head (in lab coordinates) by the rotation matrix of the eyes (in head coordinates):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
R_gaze(t) = head_R(t) * eye_R(t)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Finally, multiply R_gaze(t) with the starting gaze vector to get the gaze direction (in lab coordinates) at time t. We assume that the starting gaze is in the forward direction, which is the X direction in lab coordinates.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
startingGaze = [1; 0; 0]&lt;br /&gt;
gaze(t) = R_gaze(t) * startingGaze&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Matlab programming==&lt;br /&gt;
&lt;br /&gt;
===Functions for converting EyeSeeCam data to double polar data===&lt;br /&gt;
There is one function for head movement, one for eye movement and one for gaze movement. The gaze movement combines head and eye movement.&lt;br /&gt;
&lt;br /&gt;
The functions take head rotation speeds and eye rotations (in degrees) as input and return objects containing the traces (head, eye and gaze, respectively) in double polar coordinates.&lt;br /&gt;
&lt;br /&gt;
* headTrace_DP = EyeSeeCamSciHeadRotationData2Trace_DP(Vx_deg, Vy_deg, Vz_deg)&lt;br /&gt;
* eyeTrace_DP  = EyeSeeCamSciEyeRotationData2Trace_DP(Zocs_deg, Yocs_deg)&lt;br /&gt;
* gazeTrace_DP = EyeSeeCamSciHeadAndEyeRotationData2Trace_DP(Vx_deg, Vy_deg, Vz_deg, Zocs_deg, Yocs_deg)&lt;br /&gt;
&lt;br /&gt;
Vx, Vy and Vz are the head rotation speeds in EyeSeeCam coordinates.&lt;br /&gt;
Zocs and Yocs are the eye rotations in OCS coordinates.&lt;br /&gt;
&lt;br /&gt;
===EyeSeeCam data===&lt;br /&gt;
The recorded data is read from an LSL stream.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
% todo: LSL streaming parameters&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The lsldata is a struct and has the following fields:&lt;br /&gt;
*escdata&lt;br /&gt;
*escmetadata&lt;br /&gt;
*escstr&lt;br /&gt;
*evdata0&lt;br /&gt;
*evdata1&lt;br /&gt;
*evdata2&lt;br /&gt;
*evdata3&lt;br /&gt;
*evdata4&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The field lsldata.escdata contains the eye and head data as a matrix with a row for every parameter (63 parameters in total). The field lsldata.escmetadata.channel lists the names of all parameters.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%Eye data&lt;br /&gt;
&lt;br /&gt;
RightEyePosX = 46;&lt;br /&gt;
RightEyePosY = 47;&lt;br /&gt;
RightEyePosZ = 48;&lt;br /&gt;
&lt;br /&gt;
xeye = lsldata.escdata.Data(46,:);       % right 46, left 32&lt;br /&gt;
yeye = lsldata.escdata.Data(47,:);       % right 47, left 33&lt;br /&gt;
zeye = lsldata.escdata.Data(48,:);       % right 48, left 34&lt;br /&gt;
&lt;br /&gt;
% Head data&lt;br /&gt;
HeadInertialVelXCal = 27;&lt;br /&gt;
HeadInertialVelYCal = 29;&lt;br /&gt;
HeadInertialVelZCal = 31;&lt;br /&gt;
&lt;br /&gt;
xh = lsldata.escdata.Data(27,:);       % calibrated torsional velocity data HEAD&lt;br /&gt;
yh = lsldata.escdata.Data(29,:);       % calibrated vertical velocity data HEAD&lt;br /&gt;
zh = lsldata.escdata.Data(31,:);       % calibrated horizontal velocity data HEAD&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=EyeSeeCam_SCI&amp;diff=4899</id>
		<title>EyeSeeCam SCI</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=EyeSeeCam_SCI&amp;diff=4899"/>
		<updated>2026-04-01T11:20:36Z</updated>

		<summary type="html">&lt;p&gt;Lof: /* Matlab coding */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[File:EyeSeeCam_SCI2.png|thumb|EyeSeeCam SCI]]&lt;br /&gt;
==Description==&lt;br /&gt;
The EyeSeeCam SCI is a combined eye tracker and head movement tracker. The head tracking is done with an IMU, and the eye tracking is done with cameras and software that tracks the pupils of both eyes. The tracking data can be read out directly in Matlab. In case a dedicated computer is used for reading out the EyeSeeCam, the data is converted to an LSL stream.&lt;br /&gt;
&lt;br /&gt;
The EyeSeeCam SCI can record at 250 and 500 Hz. We typically record at 500 Hz.&lt;br /&gt;
&lt;br /&gt;
==Setup==&lt;br /&gt;
&lt;br /&gt;
* The EyeSeeCam is connected to a NUC with two USB-C cables.&lt;br /&gt;
* The NUC is connected via ethernet with a Mac Mini.&lt;br /&gt;
* The Mac Mini runs an EyeSeeCam SCI program.&lt;br /&gt;
* The Mac Mini runs an LSL server for data transfer to a Windows computer.&lt;br /&gt;
* The Windows computer runs Matlab and connects with an LSL stream to the LSL server on the Mac Mini.&lt;br /&gt;
&lt;br /&gt;
==Startup Instructions==&lt;br /&gt;
&lt;br /&gt;
* Connect the EyeSeeCam to the NUC&lt;br /&gt;
* Remove both lens covers of the EyeSeeCam&lt;br /&gt;
* Start the NUC&lt;br /&gt;
* Start the Mac Mini&lt;br /&gt;
* Login on the Mac Mini to &#039;EyeSeeCam&#039; with password &#039;Gimbal&#039;&lt;br /&gt;
* Start the LSL script on the Mac Mini (it is on the desktop)&lt;br /&gt;
* Start the EyeSeeCam SCI program on the Mac Mini&lt;br /&gt;
* Start a Matlab program on the Windows computer (e.g. EG_program.m)&lt;br /&gt;
* Press the &amp;lt;begin&amp;gt; button in the EyeSeeCam SCI program on the Mac Mini (in the top right corner)&lt;br /&gt;
* Let the subject put on the EyeSeeCam&lt;br /&gt;
* Press &amp;lt;prepare&amp;gt; in the EyeSeeCam SCI program on the Mac Mini (shows the traces)&lt;br /&gt;
* Press &amp;lt;start&amp;gt; in the EyeSeeCam SCI program on the Mac Mini (starts collecting data)&lt;br /&gt;
* Start the experiment&lt;br /&gt;
&lt;br /&gt;
==Matlab coding==&lt;br /&gt;
&lt;br /&gt;
The EyeSeeCam is connected to a Mac Mini, which runs an LSL server (for more information about LSL see [[Lab Streaming Layer]]). To get data from the EyeSeeCam, you create an lslStream and a session in Matlab.&lt;br /&gt;
&lt;br /&gt;
 type = &#039;EyeSeeCam @ dcn-pl04&#039;;&lt;br /&gt;
 name = &#039;EyeSeeCam SCI Data&#039;;&lt;br /&gt;
 lslString = sprintf(&#039;type=&#039;&#039;%s&#039;&#039; and name=&#039;&#039;%s&#039;&#039;&#039;, type, name);&lt;br /&gt;
 info=lsl_resolver(lslString);&lt;br /&gt;
 lslStream = lsl_istream(info{1});&lt;br /&gt;
&lt;br /&gt;
 session = lsl_session;&lt;br /&gt;
 session.add_stream(lslStream);&lt;br /&gt;
&lt;br /&gt;
 session.start;&lt;br /&gt;
 % ...wait some time...&lt;br /&gt;
 session.stop;&lt;br /&gt;
 data = lslStream.read;&lt;br /&gt;
&lt;br /&gt;
==Construction of a gaze trace==&lt;br /&gt;
The EyeSeeCam data are recorded in coordinate systems different from the double polar system that we use to present our targets; the head and eye tracking data each have their own coordinate system. To analyze our data accurately, we have to transform these coordinate systems to double polar coordinates in the lab frame. To relate the EyeSeeCam coordinates to the lab coordinates, we first need to record a starting gaze position for each trial: during the recording, the subject has to look at the center speaker (0,0) when the trial starts.&lt;br /&gt;
&lt;br /&gt;
From the EyeSeeCam data we can construct a head movement trace and an eye movement trace. These can be combined to create a gaze trace.&lt;br /&gt;
&lt;br /&gt;
In Matlab we can use the class &#039;EyeSeeCamSci_Trace&#039; for calculating head, eye and gaze traces in RAS, HVF and DP coordinates.&lt;br /&gt;
&lt;br /&gt;
===Defining a coordinate system===&lt;br /&gt;
&lt;br /&gt;
For the interpretation of the EyeSeeCam data we define the starting direction as the direction in which the subject looks at the beginning of each trial. The recording of the data should start when the subject is looking in this direction. &amp;lt;br&amp;gt;&lt;br /&gt;
The X, Y and Z of the EyeSeeCam data are in terms of RAS coordinates:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
X = anterior&lt;br /&gt;
Y = -right&lt;br /&gt;
Z = superior&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
* When using &#039;EyeSeeCamSci_Trace&#039;, the head and gaze traces in RAS coordinates are relative to the starting position of the subject.&lt;br /&gt;
* The HVF and DP coordinates of the eye trace are only valid when the head stays in the starting position.&lt;br /&gt;
&lt;br /&gt;
===Head movement===&lt;br /&gt;
EyeSeeCam_Sci provides angular velocities for rotations around the X, Y and Z axes. For our analyses we want position data, so we have to calculate it from these angular velocities.&lt;br /&gt;
 &lt;br /&gt;
The head rotation axes X, Y and Z are defined with respect to the EyeSeeCam itself.&lt;br /&gt;
&lt;br /&gt;
*&#039;&#039;The EyeSeeCam data is &#039;&#039;&#039;not&#039;&#039;&#039; in the lab frame!&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
In order to create a head movement trace we have to calculate rotation matrices (delta_Rx, delta_Ry and delta_Rz) in EyeSeeCam coordinates for every time step. &lt;br /&gt;
&lt;br /&gt;
Since the time steps are about 2 ms (or 4 ms), the rotations in each time step are small. Therefore the order in which delta_Rx, delta_Ry and delta_Rz are multiplied is not important, and we calculate delta_R(t) in EyeSeeCam coordinates by&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
delta_R(t) = delta_Rx * delta_Ry * delta_Rz.&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
To get the total rotation matrix in lab coordinates at time t we have to iterate over all the time steps:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
R(t=0) = identity matrix&lt;br /&gt;
for t = 0 to t = tmax&lt;br /&gt;
   R(t) = R(t-delta_t) * delta_R(t)&lt;br /&gt;
end &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Eye movement===&lt;br /&gt;
&lt;br /&gt;
The eye movements are given by rotations in the same coordinate system as the head movements.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
X = anterior (torsion of the pupil in the gaze direction)&lt;br /&gt;
Y = -right&lt;br /&gt;
Z = superior&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The torsion is not important for the gaze direction and is therefore excluded from the calculation.&lt;br /&gt;
&lt;br /&gt;
Azimuth can be defined as a rotation about the Z axis and elevation as a counter-rotation about the Y axis.&lt;br /&gt;
&lt;br /&gt;
Since the eye gaze is given as rotation angles in OCS coordinates and the eye and head tracker coordinate systems are fixed relative to each other, we can easily transform OCS coordinates to EyeSeeCam coordinates. We can then create rotation matrices in EyeSeeCam coordinates for the azimuth and elevation of the gaze. Since torsion is excluded, the rotation matrices for azimuth and elevation can be multiplied in either order to get the eye rotation matrix in EyeSeeCam coordinates.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eye_Rz(t) = Rz(eye_azimuth(t))&lt;br /&gt;
eye_Ry(t) = Ry(-eye_elevation(t))&lt;br /&gt;
eye_R(t)  = eye_Ry(t) * eye_Rz(t)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Gaze movement===&lt;br /&gt;
&lt;br /&gt;
The gaze movement is the combined head and eye movement.&lt;br /&gt;
To create the total rotation matrix that represents the &#039;gaze&#039; rotation in lab coordinates, the two rotation matrices for head and eye movement have to be multiplied in the correct order. We have:&lt;br /&gt;
&lt;br /&gt;
* A rotation matrix for the head in lab coordinates.&lt;br /&gt;
* A rotation matrix for the eyes in head coordinates.&lt;br /&gt;
&lt;br /&gt;
Multiply the rotation matrix of the head (in lab coordinates) by the rotation matrix of the eyes (in head coordinates):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
R_gaze(t) = head_R(t) * eye_R(t)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Finally, multiply R_gaze(t) with the starting gaze vector to get the gaze direction (in lab coordinates) at time t. We assume that the starting gaze is in the forward direction, which is the X direction in lab coordinates.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
startingGaze = [1; 0; 0]&lt;br /&gt;
gaze(t) = R_gaze(t) * startingGaze&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Matlab programming==&lt;br /&gt;
&lt;br /&gt;
===Functions for converting EyeSeeCam data to double polar data===&lt;br /&gt;
There is one function for head movement, one for eye movement and one for gaze movement. The gaze movement combines head and eye movement.&lt;br /&gt;
&lt;br /&gt;
The functions take head rotation speeds and eye rotations (in degrees) as input and return objects containing the traces (head, eye and gaze, respectively) in double polar coordinates.&lt;br /&gt;
&lt;br /&gt;
* headTrace_DP = EyeSeeCamSciHeadRotationData2Trace_DP(Vx_deg, Vy_deg, Vz_deg)&lt;br /&gt;
* eyeTrace_DP  = EyeSeeCamSciEyeRotationData2Trace_DP(Zocs_deg, Yocs_deg)&lt;br /&gt;
* gazeTrace_DP = EyeSeeCamSciHeadAndEyeRotationData2Trace_DP(Vx_deg, Vy_deg, Vz_deg, Zocs_deg, Yocs_deg)&lt;br /&gt;
&lt;br /&gt;
Vx, Vy and Vz are the head rotation speeds in EyeSeeCam coordinates.&lt;br /&gt;
Zocs and Yocs are the eye rotations in OCS coordinates.&lt;br /&gt;
&lt;br /&gt;
===EyeSeeCam data===&lt;br /&gt;
The recorded data is read from an LSL stream.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
% todo: LSL streaming parameters&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The lsldata is a struct and has the following fields:&lt;br /&gt;
*escdata&lt;br /&gt;
*escmetadata&lt;br /&gt;
*escstr&lt;br /&gt;
*evdata0&lt;br /&gt;
*evdata1&lt;br /&gt;
*evdata2&lt;br /&gt;
*evdata3&lt;br /&gt;
*evdata4&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The field lsldata.escdata contains the eye and head data as a matrix with a row for every parameter (63 parameters in total). The field lsldata.escmetadata.channel lists the names of all parameters.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%Eye data&lt;br /&gt;
&lt;br /&gt;
RightEyePosX = 46;&lt;br /&gt;
RightEyePosY = 47;&lt;br /&gt;
RightEyePosZ = 48;&lt;br /&gt;
&lt;br /&gt;
xeye = lsldata.escdata.Data(46,:);       % right 46, left 32&lt;br /&gt;
yeye = lsldata.escdata.Data(47,:);       % right 47, left 33&lt;br /&gt;
zeye = lsldata.escdata.Data(48,:);       % right 48, left 34&lt;br /&gt;
&lt;br /&gt;
% Head data&lt;br /&gt;
HeadInertialVelXCal = 27;&lt;br /&gt;
HeadInertialVelYCal = 29;&lt;br /&gt;
HeadInertialVelZCal = 31;&lt;br /&gt;
&lt;br /&gt;
xh = lsldata.escdata.Data(27,:);       % calibrated torsional velocity data HEAD&lt;br /&gt;
yh = lsldata.escdata.Data(29,:);       % calibrated vertical velocity data HEAD&lt;br /&gt;
zh = lsldata.escdata.Data(31,:);       % calibrated horizontal velocity data HEAD&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=EyeSeeCam_SCI&amp;diff=4898</id>
		<title>EyeSeeCam SCI</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=EyeSeeCam_SCI&amp;diff=4898"/>
		<updated>2026-04-01T11:19:53Z</updated>

		<summary type="html">&lt;p&gt;Lof: /* Matlab coding */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[File:EyeSeeCam_SCI2.png|thumb|EyeSeeCam SCI]]&lt;br /&gt;
==Description==&lt;br /&gt;
The EyeSeeCam SCI is a combined eye tracker and head movement tracker. The head tracking is done with an IMU, and the eye tracking is done with cameras and software that tracks the pupils of both eyes. The tracking data can be read out directly in Matlab. In case a dedicated computer is used for reading out the EyeSeeCam, the data is converted to an LSL stream.&lt;br /&gt;
&lt;br /&gt;
The EyeSeeCam SCI can record at 250 and 500 Hz. We typically record at 500 Hz.&lt;br /&gt;
&lt;br /&gt;
==Setup==&lt;br /&gt;
&lt;br /&gt;
* The EyeSeeCam is connected to a NUC with two USB-C cables.&lt;br /&gt;
* The NUC is connected via ethernet with a Mac Mini.&lt;br /&gt;
* The Mac Mini runs an EyeSeeCam SCI program.&lt;br /&gt;
* The Mac Mini runs an LSL server for data transfer to a Windows computer.&lt;br /&gt;
* The Windows computer runs Matlab and connects with an LSL stream to the LSL server on the Mac Mini.&lt;br /&gt;
&lt;br /&gt;
==Startup Instructions==&lt;br /&gt;
&lt;br /&gt;
* Connect the EyeSeeCam to the NUC&lt;br /&gt;
* Remove both lens covers of the EyeSeeCam&lt;br /&gt;
* Start the NUC&lt;br /&gt;
* Start the Mac Mini&lt;br /&gt;
* Login on the Mac Mini to &#039;EyeSeeCam&#039; with password &#039;Gimbal&#039;&lt;br /&gt;
* Start the LSL script on the Mac Mini (it is on the desktop)&lt;br /&gt;
* Start the EyeSeeCam SCI program on the Mac Mini&lt;br /&gt;
* Start a Matlab program on the Windows computer (e.g. EG_program.m)&lt;br /&gt;
* Press the &amp;lt;begin&amp;gt; button in the EyeSeeCam SCI program on the Mac Mini (in the top right corner)&lt;br /&gt;
* Let the subject put on the EyeSeeCam&lt;br /&gt;
* Press &amp;lt;prepare&amp;gt; in the EyeSeeCam SCI program on the Mac Mini (shows the traces)&lt;br /&gt;
* Press &amp;lt;start&amp;gt; in the EyeSeeCam SCI program on the Mac Mini (starts collecting data)&lt;br /&gt;
* Start the experiment&lt;br /&gt;
&lt;br /&gt;
==Matlab coding==&lt;br /&gt;
&lt;br /&gt;
The EyeSeeCam is connected to a Mac Mini, which runs an LSL server (for more information about LSL see [[Lab Streaming Layer]]). In Matlab you create an lslStream and a session to get the data from the EyeSeeCam.&lt;br /&gt;
&lt;br /&gt;
 type = &#039;EyeSeeCam @ dcn-pl04&#039;;&lt;br /&gt;
 name = &#039;EyeSeeCam SCI Data&#039;;&lt;br /&gt;
 lslString = sprintf(&#039;type=&#039;&#039;%s&#039;&#039; and name=&#039;&#039;%s&#039;&#039;&#039;, type, name);&lt;br /&gt;
 info=lsl_resolver(lslString);&lt;br /&gt;
 lslStream = lsl_istream(info{1});&lt;br /&gt;
&lt;br /&gt;
 session = lsl_session;&lt;br /&gt;
 session.add_stream(lslStream);&lt;br /&gt;
&lt;br /&gt;
 session.start;&lt;br /&gt;
 % ...wait some time...&lt;br /&gt;
 session.stop;&lt;br /&gt;
 data = lslStream.read;&lt;br /&gt;
&lt;br /&gt;
==Construction of a gaze trace==&lt;br /&gt;
The EyeSeeCam data are recorded in coordinate systems different from the double polar system that we use to present our targets; the head and eye tracking data each have their own coordinate system. To analyze our data accurately, we have to transform these coordinate systems to double polar coordinates in the lab frame. To relate the EyeSeeCam coordinates to the lab coordinates, we first need to record a starting gaze position for each trial: during the recording, the subject has to look at the center speaker (0,0) when the trial starts.&lt;br /&gt;
&lt;br /&gt;
From the EyeSeeCam data we can construct a head movement trace and an eye movement trace. These can be combined to create a gaze trace.&lt;br /&gt;
&lt;br /&gt;
In Matlab we can use the class &#039;EyeSeeCamSci_Trace&#039; for calculating head, eye and gaze traces in RAS, HVF and DP coordinates.&lt;br /&gt;
&lt;br /&gt;
===Defining a coordinate system===&lt;br /&gt;
&lt;br /&gt;
For the interpretation of the EyeSeeCam data we define the starting direction as the direction in which the subject looks at the beginning of each trial. The recording of the data should start when the subject is looking in this direction. &amp;lt;br&amp;gt;&lt;br /&gt;
The X, Y and Z of the EyeSeeCam data are in terms of RAS coordinates:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
X = anterior&lt;br /&gt;
Y = -right&lt;br /&gt;
Z = superior&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
* When using &#039;EyeSeeCamSci_Trace&#039;, the head and gaze traces in RAS coordinates are relative to the starting position of the subject.&lt;br /&gt;
* The HVF and DP coordinates of the eye trace are only valid when the head stays in the starting position.&lt;br /&gt;
&lt;br /&gt;
===Head movement===&lt;br /&gt;
EyeSeeCam_Sci provides angular velocities for rotations around the X, Y and Z axes. For our analyses we want position data, so we have to calculate it from these angular velocities.&lt;br /&gt;
 &lt;br /&gt;
The head rotation axes X, Y and Z are defined with respect to the EyeSeeCam itself.&lt;br /&gt;
&lt;br /&gt;
*&#039;&#039;The EyeSeeCam data is &#039;&#039;&#039;not&#039;&#039;&#039; in the lab frame!&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
In order to create a head movement trace we have to calculate rotation matrices (delta_Rx, delta_Ry and delta_Rz) in EyeSeeCam coordinates for every time step. &lt;br /&gt;
&lt;br /&gt;
Since the time steps are about 2 ms (or 4 ms), the rotations in each time step are small. Therefore the order in which delta_Rx, delta_Ry and delta_Rz are multiplied is not important, and we calculate delta_R(t) in EyeSeeCam coordinates by&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
delta_R(t) = delta_Rx * delta_Ry * delta_Rz.&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
To get the total rotation matrix in lab coordinates at time t we have to iterate over all the time steps:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
R(t=0) = identity matrix&lt;br /&gt;
for t = 0 to t = tmax&lt;br /&gt;
   R(t) = R(t-delta_t) * delta_R(t)&lt;br /&gt;
end &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Eye movement===&lt;br /&gt;
&lt;br /&gt;
The eye movements are given by rotations in the same coordinate system as the head movements.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
X = anterior (torsion of the pupil in the gaze direction)&lt;br /&gt;
Y = -right&lt;br /&gt;
Z = superior&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The torsion is not important for the gaze direction and is therefore excluded from the calculation.&lt;br /&gt;
&lt;br /&gt;
Azimuth can be defined as a rotation about the Z axis and elevation as a counter-rotation about the Y axis.&lt;br /&gt;
&lt;br /&gt;
Since the eye gaze is given as rotation angles in OCS coordinates and the eye and head tracker coordinate systems are fixed relative to each other, we can easily transform OCS coordinates to EyeSeeCam coordinates. We can then create rotation matrices in EyeSeeCam coordinates for the azimuth and elevation of the gaze. Since torsion is excluded, the rotation matrices for azimuth and elevation can be multiplied in either order to get the eye rotation matrix in EyeSeeCam coordinates.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eye_Rz(t) = Rz(eye_azimuth(t))&lt;br /&gt;
eye_Ry(t) = Ry(-eye_elevation(t))&lt;br /&gt;
eye_R(t)  = eye_Ry(t) * eye_Rz(t)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Gaze movement===&lt;br /&gt;
&lt;br /&gt;
The gaze movement is the combined head and eye movement.&lt;br /&gt;
To create the total rotation matrix that represents the &#039;gaze&#039; rotation in lab coordinates, the two rotation matrices for head and eye movement have to be multiplied in the correct order. We have:&lt;br /&gt;
&lt;br /&gt;
* A rotation matrix for the head in lab coordinates.&lt;br /&gt;
* A rotation matrix for the eyes in head coordinates.&lt;br /&gt;
&lt;br /&gt;
Multiply the rotation matrix of the head (in lab coordinates) by the rotation matrix of the eyes (in head coordinates):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
R_gaze(t) = head_R(t) * eye_R(t)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Finally, multiply R_gaze(t) with the starting gaze vector to get the gaze direction (in lab coordinates) at time t. We assume that the starting gaze is in the forward direction, which is the X direction in lab coordinates.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
startingGaze = [1; 0; 0]&lt;br /&gt;
gaze(t) = R_gaze(t) * startingGaze&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Matlab programming==&lt;br /&gt;
&lt;br /&gt;
===Functions for converting EyeSeeCam data to double polar data===&lt;br /&gt;
There is one function for head movement, one for eye movement and one for gaze movement. The gaze movement combines head and eye movement.&lt;br /&gt;
&lt;br /&gt;
The functions take head rotation speeds and eye rotations (in degrees) as input and return objects containing the traces (head, eye and gaze, respectively) in double polar coordinates.&lt;br /&gt;
&lt;br /&gt;
* headTrace_DP = EyeSeeCamSciHeadRotationData2Trace_DP(Vx_deg, Vy_deg, Vz_deg)&lt;br /&gt;
* eyeTrace_DP  = EyeSeeCamSciEyeRotationData2Trace_DP(Zocs_deg, Yocs_deg)&lt;br /&gt;
* gazeTrace_DP = EyeSeeCamSciHeadAndEyeRotationData2Trace_DP(Vx_deg, Vy_deg, Vz_deg, Zocs_deg, Yocs_deg)&lt;br /&gt;
&lt;br /&gt;
Vx, Vy and Vz are the head rotation speeds in EyeSeeCam coordinates.&lt;br /&gt;
Zocs and Yocs are the eye rotations in OCS coordinates.&lt;br /&gt;
&lt;br /&gt;
===EyeSeeCam data===&lt;br /&gt;
The recorded data is read from an LSL stream.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
% todo: LSL streaming parameters&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The lsldata is a struct and has the following fields:&lt;br /&gt;
*escdata&lt;br /&gt;
*escmetadata&lt;br /&gt;
*escstr&lt;br /&gt;
*evdata0&lt;br /&gt;
*evdata1&lt;br /&gt;
*evdata2&lt;br /&gt;
*evdata3&lt;br /&gt;
*evdata4&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The field lsldata.escdata contains the eye and head data as a matrix with a row for every parameter (63 parameters in total). The field lsldata.escmetadata.channel lists the names of all parameters.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%Eye data&lt;br /&gt;
&lt;br /&gt;
RightEyePosX = 46;&lt;br /&gt;
RightEyePosY = 47;&lt;br /&gt;
RightEyePosZ = 48;&lt;br /&gt;
&lt;br /&gt;
xeye = lsldata.escdata.Data(46,:);       % right 46, left 32&lt;br /&gt;
yeye = lsldata.escdata.Data(47,:);       % right 47, left 33&lt;br /&gt;
zeye = lsldata.escdata.Data(48,:);       % right 48, left 34&lt;br /&gt;
&lt;br /&gt;
% Head data&lt;br /&gt;
HeadInertialVelXCal = 27;&lt;br /&gt;
HeadInertialVelYCal = 29;&lt;br /&gt;
HeadInertialVelZCal = 31;&lt;br /&gt;
&lt;br /&gt;
xh = lsldata.escdata.Data(27,:);       % calibrated torsional velocity data HEAD&lt;br /&gt;
yh = lsldata.escdata.Data(29,:);       % calibrated vertical velocity data HEAD&lt;br /&gt;
zh = lsldata.escdata.Data(31,:);       % calibrated horizontal velocity data HEAD&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=EyeSeeCam_SCI&amp;diff=4897</id>
		<title>EyeSeeCam SCI</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=EyeSeeCam_SCI&amp;diff=4897"/>
		<updated>2026-04-01T11:18:29Z</updated>

		<summary type="html">&lt;p&gt;Lof: /* Matlab coding */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[File:EyeSeeCam_SCI2.png|thumb|EyeSeeCam SCI]]&lt;br /&gt;
==Description==&lt;br /&gt;
The EyeSeeCam SCI is a combined eye tracker and head movement tracker. The head tracking is done with an IMU, and the eye tracking is done with cameras and software that tracks the pupils of both eyes. The tracking data can be read out directly in Matlab. In case a dedicated computer is used for reading out the EyeSeeCam, the data is converted to an LSL stream.&lt;br /&gt;
&lt;br /&gt;
The EyeSeeCam SCI can record at 250 and 500 Hz. We typically record at 500 Hz.&lt;br /&gt;
&lt;br /&gt;
==Setup==&lt;br /&gt;
&lt;br /&gt;
* The EyeSeeCam is connected to a NUC with two USB-C cables.&lt;br /&gt;
* The NUC is connected via ethernet with a Mac Mini.&lt;br /&gt;
* The Mac Mini runs an EyeSeeCam SCI program.&lt;br /&gt;
* The Mac Mini runs an LSL server for data transfer to a Windows computer.&lt;br /&gt;
* The Windows computer runs Matlab and connects to the LSL server on the Mac Mini via an LSL stream.&lt;br /&gt;
&lt;br /&gt;
==Startup Instructions==&lt;br /&gt;
&lt;br /&gt;
* Connect the EyeSeeCam to the NUC&lt;br /&gt;
* Remove both lens covers of the EyeSeeCam&lt;br /&gt;
* Start the NUC&lt;br /&gt;
* Start the Mac Mini&lt;br /&gt;
* Log in on the Mac Mini as &#039;EyeSeeCam&#039; with password &#039;Gimbal&#039;&lt;br /&gt;
* Start the LSL script on the Mac Mini (it is on the desktop)&lt;br /&gt;
* Start the EyeSeeCam SCI program on the Mac Mini&lt;br /&gt;
* Start a Matlab program on the Windows computer (e.g. EG_program.m)&lt;br /&gt;
* Press the &amp;lt;begin&amp;gt; button in the EyeSeeCam SCI program on the Mac Mini (in the top right corner)&lt;br /&gt;
* Let the subject put on the EyeSeeCam&lt;br /&gt;
* Press &amp;lt;prepare&amp;gt; in the EyeSeeCam SCI program on the Mac Mini (shows the traces)&lt;br /&gt;
* Press &amp;lt;start&amp;gt; in the EyeSeeCam SCI program on the Mac Mini (starts collecting data)&lt;br /&gt;
* Start the experiment&lt;br /&gt;
&lt;br /&gt;
==Matlab coding==&lt;br /&gt;
&lt;br /&gt;
The EyeSeeCam is connected to a Mac Mini, on which an LSL server runs (for more info about LSL, see [[Lab Streaming Layer]]).&lt;br /&gt;
&lt;br /&gt;
 type = &#039;EyeSeeCam @ dcn-pl04&#039;;&lt;br /&gt;
 name = &#039;EyeSeeCam SCI Data&#039;;&lt;br /&gt;
 lslString = sprintf(&#039;type=&#039;&#039;%s&#039;&#039; and name=&#039;&#039;%s&#039;&#039;&#039;, type, name);&lt;br /&gt;
 info=lsl_resolver(lslString);&lt;br /&gt;
 lslStream = lsl_istream(info{1});&lt;br /&gt;
&lt;br /&gt;
 session = lsl_session;&lt;br /&gt;
 session.add_stream(lslStream);&lt;br /&gt;
&lt;br /&gt;
 session.start;&lt;br /&gt;
 ...wait some time...&lt;br /&gt;
 session.stop;&lt;br /&gt;
 data = lslStream.read;&lt;br /&gt;
&lt;br /&gt;
==Construction of a gaze trace==&lt;br /&gt;
The data from EyeSeeCam is recorded in different coordinate systems than the double polar system that we use to present our targets. The head and eye tracking data each have their own coordinate system. To accurately analyze our data we have to transform these coordinate systems to double polar coordinates in the lab frame. In order to relate the EyeSeeCam coordinates to the lab coordinates, we first need to record a starting gaze position for each trial. This means that during the recording, the subject has to look at the center speaker (0,0) when the trial starts. &lt;br /&gt;
&lt;br /&gt;
From the EyeSeeCam data we can construct a head movement trace and an eye movement trace. These can be combined to create a gaze trace.&lt;br /&gt;
&lt;br /&gt;
In Matlab we can use the class &#039;EyeSeeCamSci_Trace&#039; for calculating head, eye and gaze traces in RAS, HVF and DP coordinates.&lt;br /&gt;
&lt;br /&gt;
===Defining a coordinate system===&lt;br /&gt;
&lt;br /&gt;
For the purpose of interpreting the EyeSeeCam data, we define the starting direction as the direction in which the subject looks at the beginning of each trial. The recording of the data should start when the subject is looking in this direction. &amp;lt;br&amp;gt;&lt;br /&gt;
The X, Y and Z of the EyeSeeCam data are in terms of RAS coordinates:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
X = anterior&lt;br /&gt;
Y = -right&lt;br /&gt;
Z = superior&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
* When using &#039;EyeSeeCamSci_Trace&#039;, the head and gaze trace values in RAS coordinates are relative to the starting position of the subject. &lt;br /&gt;
* The HVF and DP coordinates of the eye trace are only valid when the head stays in the starting position.&lt;br /&gt;
&lt;br /&gt;
===Head movement===&lt;br /&gt;
The EyeSeeCam SCI provides angular velocities for rotations around the X, Y and Z axes. For our analyses we want position data, so we have to calculate it from these angular velocities.&lt;br /&gt;
 &lt;br /&gt;
The head rotation axes X, Y and Z are defined with respect to the EyeSeeCam itself. &lt;br /&gt;
&lt;br /&gt;
*&#039;&#039;The EyeSeeCam data is &#039;&#039;&#039;not&#039;&#039;&#039; in the lab frame!&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
In order to create a head movement trace we have to calculate rotation matrices (delta_Rx, delta_Ry and delta_Rz) in EyeSeeCam coordinates for every time step. &lt;br /&gt;
&lt;br /&gt;
Since the time steps are about 2 ms (or 4 ms), the rotation for each time step is small. Therefore the order in which delta_Rx, delta_Ry and delta_Rz are multiplied is not important, and we calculate delta_R(t) in EyeSeeCam coordinates as &lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
delta_R(t) = delta_Rx * delta_Ry * delta_Rz.&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
To get the total rotation matrix in lab coordinates at time t, we have to iterate over all the time steps: &lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
R(t=0) = identity matrix&lt;br /&gt;
for t = 0 to t = tmax&lt;br /&gt;
   R(t) = R(t-delta_t) * delta_R(t)&lt;br /&gt;
end &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
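The integration loop above can be sketched numerically. The following is a NumPy sketch, not the Matlab class used in the lab; the function names are ours, and it assumes angular velocities in rad/s at a fixed time step.&lt;br /&gt;

```python
import numpy as np

def rot_x(a):
    # Rotation matrix about the X axis by angle a (radians)
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    # Rotation matrix about the Y axis
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):
    # Rotation matrix about the Z axis
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def integrate_head_rotation(vx, vy, vz, dt):
    # R(t) = R(t - dt) * delta_R(t), starting from the identity matrix.
    # vx, vy, vz: angular velocities (rad/s) per sample; dt: time step (s).
    R = np.eye(3)
    history = []
    for wx, wy, wz in zip(vx, vy, vz):
        delta_R = rot_x(wx * dt) @ rot_y(wy * dt) @ rot_z(wz * dt)
        R = R @ delta_R
        history.append(R)
    return history
```

For a pure rotation about one axis, the accumulated matrix should match a single rotation by the summed angle, which is a convenient sanity check.&lt;br /&gt;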
&lt;br /&gt;
===Eye movement===&lt;br /&gt;
&lt;br /&gt;
The eye movements are given by rotations in the same coordinate system as the head movements.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
X = anterior (torsion of the pupil in the gaze direction)&lt;br /&gt;
Y = -right&lt;br /&gt;
Z = superior&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The torsion is not important for the gaze direction and is therefore excluded from the calculation.&lt;br /&gt;
&lt;br /&gt;
Azimuth can be defined as a rotation about the Z axis and elevation as a counter-rotation about the Y axis.&lt;br /&gt;
&lt;br /&gt;
Since the eye gaze is given as rotation angles in OCS coordinates, and the eye and head tracker coordinates are fixed relative to each other, we can easily transform OCS coordinates to EyeSeeCam coordinates. We can then create rotation matrices in EyeSeeCam coordinates for the azimuth and elevation of the gaze. Since the torsion is excluded, the rotation matrices for azimuth and elevation can be multiplied in either order to get the eye rotation matrix in EyeSeeCam coordinates.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eye_Rz(t) = Rz(eye_azimuth(t))&lt;br /&gt;
eye_Ry(t) = Ry(-eye_elevation(t))&lt;br /&gt;
eye_R(t)  = eye_Ry(t) * eye_Rz(t)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
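As a concrete illustration, the eye rotation above can be written as follows. This is again a NumPy sketch under our own naming; it assumes azimuth and elevation arrive in degrees.&lt;br /&gt;

```python
import numpy as np

def rot_y(a):
    # Rotation matrix about the Y axis by angle a (radians)
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):
    # Rotation matrix about the Z axis
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def eye_rotation(azimuth_deg, elevation_deg):
    # eye_R = Ry(-elevation) * Rz(azimuth); torsion is ignored.
    az = np.deg2rad(azimuth_deg)
    el = np.deg2rad(elevation_deg)
    return rot_y(-el) @ rot_z(az)
```

Applied to the forward vector [1, 0, 0], an azimuth of 90 degrees yields [0, 1, 0] and an elevation of 90 degrees yields [0, 0, 1], consistent with X = anterior and Z = superior.&lt;br /&gt;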
&lt;br /&gt;
===Gaze movement===&lt;br /&gt;
&lt;br /&gt;
The gaze movement is the combined head and eye movement.&lt;br /&gt;
To create the total rotation matrix that represents the &#039;gaze&#039; rotation in lab coordinates, the head and eye rotation matrices have to be multiplied in the correct order. We have:&lt;br /&gt;
&lt;br /&gt;
* a rotation matrix for the head in lab coordinates,&lt;br /&gt;
* a rotation matrix for the eyes in head coordinates.&lt;br /&gt;
&lt;br /&gt;
Multiply the rotation matrix of the head (in lab coordinates) by the rotation matrix of the eyes (in head coordinates):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
R_gaze(t) = head_R(t) * eye_R(t)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Lastly, multiply R_gaze(t) with the starting gaze vector to get the gaze (in lab coordinates) at time t. We assume that the starting gaze is in the forward direction, which is the X direction in lab coordinates.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
startingGaze = [1; 0; 0]&lt;br /&gt;
gaze(t) = R_gaze(t) * startingGaze&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
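The composition above can be sketched in the same way (NumPy, our own naming):&lt;br /&gt;

```python
import numpy as np

def gaze_direction(head_R, eye_R, starting_gaze=None):
    # Gaze in lab coordinates: the eye rotation (head frame) is applied
    # first, then the head rotation (lab frame).
    if starting_gaze is None:
        starting_gaze = np.array([1.0, 0.0, 0.0])  # forward (X) in the lab
    return head_R @ eye_R @ starting_gaze
```

With both matrices equal to the identity, the gaze stays at the forward starting direction; rotating only the head rotates the gaze with it.&lt;br /&gt;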
&lt;br /&gt;
==Matlab programming==&lt;br /&gt;
&lt;br /&gt;
===Functions for converting EyeSeeCam data to double polar data===&lt;br /&gt;
There is one function for head movement, one for eye movement and one for gaze movement. The gaze movement combines head and eye movement.&lt;br /&gt;
&lt;br /&gt;
The functions take head rotation speeds and eye rotations (in degrees) as input and return objects containing the traces (head, eye and gaze, respectively) in double polar coordinates.&lt;br /&gt;
&lt;br /&gt;
* headTrace_DP = EyeSeeCamSciHeadRotationData2Trace_DP(Vx_deg, Vy_deg, Vz_deg)&lt;br /&gt;
* eyeTrace_DP  = EyeSeeCamSciEyeRotationData2Trace_DP(Zocs_deg, Yocs_deg)&lt;br /&gt;
* gazeTrace_DP = EyeSeeCamSciHeadAndEyeRotationData2Trace_DP(Vx_deg, Vy_deg, Vz_deg, Zocs_deg, Yocs_deg)&lt;br /&gt;
&lt;br /&gt;
Vx, Vy and Vz are the head rotation speeds in EyeSeeCam coordinates.&lt;br /&gt;
Zocs and Yocs are the eye rotations in OCS coordinates.&lt;br /&gt;
&lt;br /&gt;
===EyeSeeCam data===&lt;br /&gt;
The recorded data is read from an LSL stream.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
% todo: LSL streaming parameters&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The lsldata is a struct and has the following fields:&lt;br /&gt;
*escdata&lt;br /&gt;
*escmetadata&lt;br /&gt;
*escstr&lt;br /&gt;
*evdata0&lt;br /&gt;
*evdata1&lt;br /&gt;
*evdata2&lt;br /&gt;
*evdata3&lt;br /&gt;
*evdata4&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The lsldata.escdata field contains the eye and head data: a matrix with a row for every parameter. There are 63 parameters in total. The field lsldata.escmetadata.channel lists the names of all the parameters.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%Eye data&lt;br /&gt;
&lt;br /&gt;
RightEyePosX = 46;&lt;br /&gt;
RightEyePosY = 47;&lt;br /&gt;
RightEyePosZ = 48;&lt;br /&gt;
&lt;br /&gt;
xeye = lsldata.escdata.Data(46,:);       % right 46, left 32&lt;br /&gt;
yeye = lsldata.escdata.Data(47,:);       % right 47, left 33&lt;br /&gt;
zeye = lsldata.escdata.Data(48,:);       % right 48, left 34&lt;br /&gt;
&lt;br /&gt;
% Head data&lt;br /&gt;
HeadInertialVelXCal = 27;&lt;br /&gt;
HeadInertialVelYCal = 29;&lt;br /&gt;
HeadInertialVelZCal = 31;&lt;br /&gt;
&lt;br /&gt;
xh = lsldata.escdata.Data(27,:);       % calibrated torsion velocity data HEAD&lt;br /&gt;
yh = lsldata.escdata.Data(29,:);       % calibrated vertical velocity data HEAD&lt;br /&gt;
zh = lsldata.escdata.Data(31,:);       % calibrated horizontal velocity data HEAD&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=EyeSeeCam_SCI&amp;diff=4896</id>
		<title>EyeSeeCam SCI</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=EyeSeeCam_SCI&amp;diff=4896"/>
		<updated>2026-04-01T11:17:58Z</updated>

		<summary type="html">&lt;p&gt;Lof: /* Matlab coding */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[File:EyeSeeCam_SCI2.png|thumb|EyeSeeCam SCI]]&lt;br /&gt;
==Description==&lt;br /&gt;
The EyeSeeCam SCI is a combined eye tracker and head movement tracker. The head tracking is done with an IMU, and the eye tracking is done with cameras and software that tracks the pupils of both eyes. The tracking data can be read out directly in Matlab. In case a dedicated computer is used for reading out the EyeSeeCam, the data is converted to an LSL stream. &lt;br /&gt;
&lt;br /&gt;
The EyeSeeCam SCI can record at 250 or 500 Hz; we typically record at 500 Hz.&lt;br /&gt;
&lt;br /&gt;
==Setup==&lt;br /&gt;
&lt;br /&gt;
* The EyeSeeCam is connected to a NUC with two USB-C cables.&lt;br /&gt;
* The NUC is connected via ethernet with a Mac Mini.&lt;br /&gt;
* The Mac Mini runs an EyeSeeCam SCI program.&lt;br /&gt;
* The Mac Mini runs an LSL server for data transfer to a Windows computer.&lt;br /&gt;
* The Windows computer runs Matlab and connects to the LSL server on the Mac Mini via an LSL stream.&lt;br /&gt;
&lt;br /&gt;
==Startup Instructions==&lt;br /&gt;
&lt;br /&gt;
* Connect the EyeSeeCam to the NUC&lt;br /&gt;
* Remove both lens covers of the EyeSeeCam&lt;br /&gt;
* Start the NUC&lt;br /&gt;
* Start the Mac Mini&lt;br /&gt;
* Log in on the Mac Mini as &#039;EyeSeeCam&#039; with password &#039;Gimbal&#039;&lt;br /&gt;
* Start the LSL script on the Mac Mini (it is on the desktop)&lt;br /&gt;
* Start the EyeSeeCam SCI program on the Mac Mini&lt;br /&gt;
* Start a Matlab program on the Windows computer (e.g. EG_program.m)&lt;br /&gt;
* Press the &amp;lt;begin&amp;gt; button in the EyeSeeCam SCI program on the Mac Mini (in the top right corner)&lt;br /&gt;
* Let the subject put on the EyeSeeCam&lt;br /&gt;
* Press &amp;lt;prepare&amp;gt; in the EyeSeeCam SCI program on the Mac Mini (shows the traces)&lt;br /&gt;
* Press &amp;lt;start&amp;gt; in the EyeSeeCam SCI program on the Mac Mini (starts collecting data)&lt;br /&gt;
* Start the experiment&lt;br /&gt;
&lt;br /&gt;
==Matlab coding==&lt;br /&gt;
&lt;br /&gt;
The EyeSeeCam is connected to a Mac Mini, on which an LSL server runs (see [[Lab Streaming Layer]] for information about LSL).&lt;br /&gt;
&lt;br /&gt;
 type = &#039;EyeSeeCam @ dcn-pl04&#039;;&lt;br /&gt;
 name = &#039;EyeSeeCam SCI Data&#039;;&lt;br /&gt;
 lslString = sprintf(&#039;type=&#039;&#039;%s&#039;&#039; and name=&#039;&#039;%s&#039;&#039;&#039;, type, name);&lt;br /&gt;
 info=lsl_resolver(lslString);&lt;br /&gt;
 lslStream = lsl_istream(info{1});&lt;br /&gt;
&lt;br /&gt;
 session = lsl_session;&lt;br /&gt;
 session.add_stream(lslStream);&lt;br /&gt;
&lt;br /&gt;
 session.start;&lt;br /&gt;
 ...wait some time...&lt;br /&gt;
 session.stop;&lt;br /&gt;
 data = lslStream.read;&lt;br /&gt;
&lt;br /&gt;
==Construction of a gaze trace==&lt;br /&gt;
The data from EyeSeeCam is recorded in different coordinate systems than the double polar system that we use to present our targets. The head and eye tracking data each have their own coordinate system. To accurately analyze our data we have to transform these coordinate systems to double polar coordinates in the lab frame. In order to relate the EyeSeeCam coordinates to the lab coordinates, we first need to record a starting gaze position for each trial. This means that during the recording, the subject has to look at the center speaker (0,0) when the trial starts. &lt;br /&gt;
&lt;br /&gt;
From the EyeSeeCam data we can construct a head movement trace and an eye movement trace. These can be combined to create a gaze trace.&lt;br /&gt;
&lt;br /&gt;
In Matlab we can use the class &#039;EyeSeeCamSci_Trace&#039; for calculating head, eye and gaze traces in RAS, HVF and DP coordinates.&lt;br /&gt;
&lt;br /&gt;
===Defining a coordinate system===&lt;br /&gt;
&lt;br /&gt;
For the purpose of interpreting the EyeSeeCam data, we define the starting direction as the direction in which the subject looks at the beginning of each trial. The recording of the data should start when the subject is looking in this direction. &amp;lt;br&amp;gt;&lt;br /&gt;
The X, Y and Z of the EyeSeeCam data are in terms of RAS coordinates:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
X = anterior&lt;br /&gt;
Y = -right&lt;br /&gt;
Z = superior&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
* When using &#039;EyeSeeCamSci_Trace&#039;, the head and gaze trace values in RAS coordinates are relative to the starting position of the subject. &lt;br /&gt;
* The HVF and DP coordinates of the eye trace are only valid when the head stays in the starting position.&lt;br /&gt;
&lt;br /&gt;
===Head movement===&lt;br /&gt;
The EyeSeeCam SCI provides angular velocities for rotations around the X, Y and Z axes. For our analyses we want position data, so we have to calculate it from these angular velocities.&lt;br /&gt;
 &lt;br /&gt;
The head rotation axes X, Y and Z are defined with respect to the EyeSeeCam itself. &lt;br /&gt;
&lt;br /&gt;
*&#039;&#039;The EyeSeeCam data is &#039;&#039;&#039;not&#039;&#039;&#039; in the lab frame!&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
In order to create a head movement trace we have to calculate rotation matrices (delta_Rx, delta_Ry and delta_Rz) in EyeSeeCam coordinates for every time step. &lt;br /&gt;
&lt;br /&gt;
Since the time steps are about 2 ms (or 4 ms), the rotation for each time step is small. Therefore the order in which delta_Rx, delta_Ry and delta_Rz are multiplied is not important, and we calculate delta_R(t) in EyeSeeCam coordinates as &lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
delta_R(t) = delta_Rx * delta_Ry * delta_Rz.&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
To get the total rotation matrix in lab coordinates at time t, we have to iterate over all the time steps: &lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
R(t=0) = identity matrix&lt;br /&gt;
for t = 0 to t = tmax&lt;br /&gt;
   R(t) = R(t-delta_t) * delta_R(t)&lt;br /&gt;
end &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Eye movement===&lt;br /&gt;
&lt;br /&gt;
The eye movements are given by rotations in the same coordinate system as the head movements.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
X = anterior (torsion of the pupil in the gaze direction)&lt;br /&gt;
Y = -right&lt;br /&gt;
Z = superior&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The torsion is not important for the gaze direction and is therefore excluded from the calculation.&lt;br /&gt;
&lt;br /&gt;
Azimuth can be defined as a rotation about the Z axis and elevation as a counter-rotation about the Y axis.&lt;br /&gt;
&lt;br /&gt;
Since the eye gaze is given as rotation angles in OCS coordinates, and the eye and head tracker coordinates are fixed relative to each other, we can easily transform OCS coordinates to EyeSeeCam coordinates. We can then create rotation matrices in EyeSeeCam coordinates for the azimuth and elevation of the gaze. Since the torsion is excluded, the rotation matrices for azimuth and elevation can be multiplied in either order to get the eye rotation matrix in EyeSeeCam coordinates.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eye_Rz(t) = Rz(eye_azimuth(t))&lt;br /&gt;
eye_Ry(t) = Ry(-eye_elevation(t))&lt;br /&gt;
eye_R(t)  = eye_Ry(t) * eye_Rz(t)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Gaze movement===&lt;br /&gt;
&lt;br /&gt;
The gaze movement is the combined head and eye movement.&lt;br /&gt;
To create the total rotation matrix that represents the &#039;gaze&#039; rotation in lab coordinates, the head and eye rotation matrices have to be multiplied in the correct order. We have:&lt;br /&gt;
&lt;br /&gt;
* a rotation matrix for the head in lab coordinates,&lt;br /&gt;
* a rotation matrix for the eyes in head coordinates.&lt;br /&gt;
&lt;br /&gt;
Multiply the rotation matrix of the head (in lab coordinates) by the rotation matrix of the eyes (in head coordinates):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
R_gaze(t) = head_R(t) * eye_R(t)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Lastly, multiply R_gaze(t) with the starting gaze vector to get the gaze (in lab coordinates) at time t. We assume that the starting gaze is in the forward direction, which is the X direction in lab coordinates.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
startingGaze = [1; 0; 0]&lt;br /&gt;
gaze(t) = R_gaze(t) * startingGaze&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Matlab programming==&lt;br /&gt;
&lt;br /&gt;
===Functions for converting EyeSeeCam data to double polar data===&lt;br /&gt;
There is one function for head movement, one for eye movement and one for gaze movement. The gaze movement combines head and eye movement.&lt;br /&gt;
&lt;br /&gt;
The functions take head rotation speeds and eye rotations (in degrees) as input and return objects containing the traces (head, eye and gaze, respectively) in double polar coordinates.&lt;br /&gt;
&lt;br /&gt;
* headTrace_DP = EyeSeeCamSciHeadRotationData2Trace_DP(Vx_deg, Vy_deg, Vz_deg)&lt;br /&gt;
* eyeTrace_DP  = EyeSeeCamSciEyeRotationData2Trace_DP(Zocs_deg, Yocs_deg)&lt;br /&gt;
* gazeTrace_DP = EyeSeeCamSciHeadAndEyeRotationData2Trace_DP(Vx_deg, Vy_deg, Vz_deg, Zocs_deg, Yocs_deg)&lt;br /&gt;
&lt;br /&gt;
Vx, Vy and Vz are the head rotation speeds in EyeSeeCam coordinates.&lt;br /&gt;
Zocs and Yocs are the eye rotations in OCS coordinates.&lt;br /&gt;
&lt;br /&gt;
===EyeSeeCam data===&lt;br /&gt;
The recorded data is read from an LSL stream.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
% todo: LSL streaming parameters&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The lsldata is a struct and has the following fields:&lt;br /&gt;
*escdata&lt;br /&gt;
*escmetadata&lt;br /&gt;
*escstr&lt;br /&gt;
*evdata0&lt;br /&gt;
*evdata1&lt;br /&gt;
*evdata2&lt;br /&gt;
*evdata3&lt;br /&gt;
*evdata4&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The lsldata.escdata field contains the eye and head data: a matrix with a row for every parameter. There are 63 parameters in total. The field lsldata.escmetadata.channel lists the names of all the parameters.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%Eye data&lt;br /&gt;
&lt;br /&gt;
RightEyePosX = 46;&lt;br /&gt;
RightEyePosY = 47;&lt;br /&gt;
RightEyePosZ = 48;&lt;br /&gt;
&lt;br /&gt;
xeye = lsldata.escdata.Data(46,:);       % right 46, left 32&lt;br /&gt;
yeye = lsldata.escdata.Data(47,:);       % right 47, left 33&lt;br /&gt;
zeye = lsldata.escdata.Data(48,:);       % right 48, left 34&lt;br /&gt;
&lt;br /&gt;
% Head data&lt;br /&gt;
HeadInertialVelXCal = 27;&lt;br /&gt;
HeadInertialVelYCal = 29;&lt;br /&gt;
HeadInertialVelZCal = 31;&lt;br /&gt;
&lt;br /&gt;
xh = lsldata.escdata.Data(27,:);       % calibrated torsion velocity data HEAD&lt;br /&gt;
yh = lsldata.escdata.Data(29,:);       % calibrated vertical velocity data HEAD&lt;br /&gt;
zh = lsldata.escdata.Data(31,:);       % calibrated horizontal velocity data HEAD&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=Lab_Streaming_Layer&amp;diff=4895</id>
		<title>Lab Streaming Layer</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=Lab_Streaming_Layer&amp;diff=4895"/>
		<updated>2026-04-01T11:15:40Z</updated>

		<summary type="html">&lt;p&gt;Lof: /* Matlab Example */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Introduction==&lt;br /&gt;
Lab Streaming Layer (LSL) is an open‑source software framework designed to make it easy to send, receive, and synchronize data streams in real time. It acts like a universal “data highway” that different devices and programs can use to communicate with each other.&lt;br /&gt;
&lt;br /&gt;
LSL is commonly used in research settings—especially in neuroscience, psychology, and human‑computer interaction—to collect data from multiple sources at the same time. For example, you can stream EEG signals, motion‑tracking data, eye‑tracking data, and experiment events through LSL and keep them perfectly time‑aligned.&lt;br /&gt;
It is typically used to:&lt;br /&gt;
* Connect different sensors and software tools without worrying about compatibility&lt;br /&gt;
* Record synchronized data from multiple devices&lt;br /&gt;
* Build experiments that require real‑time data exchange&lt;br /&gt;
* Store all incoming data in a single, well‑organized format&lt;br /&gt;
&lt;br /&gt;
==LSL in Matlab (biofysica toolbox)==&lt;br /&gt;
&lt;br /&gt;
Generic lsl functions can be found in the biofysica toolbox in the directory &amp;lt;..biofysica\liblsl\liblsl-Matlab&amp;gt;. &lt;br /&gt;
*Each lsl-device can have one or more lsl-streams.&lt;br /&gt;
*An lsl-stream is identified by a type and a name. &lt;br /&gt;
*The function lsl_resolver checks if it can find the requested lslStream on the intranet.&lt;br /&gt;
*When the stream is found you must add it to a session. &lt;br /&gt;
*The session controls the actual data-acquisition with start and stop. &lt;br /&gt;
*The data is read from each stream object.&lt;br /&gt;
&lt;br /&gt;
==Matlab Example==&lt;br /&gt;
&lt;br /&gt;
*Open a stream to the EyeSeeCam&lt;br /&gt;
&lt;br /&gt;
 type = &#039;EyeSeeCam @ dcn-pl04&#039;;&lt;br /&gt;
 name = &#039;EyeSeeCam SCI Data&#039;;&lt;br /&gt;
 lslString = sprintf(&#039;type= &#039;&#039;%s&#039;&#039; and name= &#039;&#039;%s&#039;&#039;&#039;, type, name);&lt;br /&gt;
 info=lsl_resolver(lslString);&lt;br /&gt;
 lslStream = lsl_istream(info{1});&lt;br /&gt;
&lt;br /&gt;
*Open a session and add the stream to the session.&lt;br /&gt;
&lt;br /&gt;
 session = lsl_session;&lt;br /&gt;
 session.add_stream(lslStream);&lt;br /&gt;
&lt;br /&gt;
*Start and stop the session and get the data from the stream.&lt;br /&gt;
&lt;br /&gt;
 session.start;&lt;br /&gt;
 ....wait some time....&lt;br /&gt;
 session.stop;&lt;br /&gt;
 data = lslStream.read;&lt;br /&gt;
&lt;br /&gt;
In the directory &amp;lt;...biofysica\liblsl\liblsl-Matlab\examples&amp;gt; you can find more examples (with descriptions in the readme.md file).&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=Lab_Streaming_Layer&amp;diff=4894</id>
		<title>Lab Streaming Layer</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=Lab_Streaming_Layer&amp;diff=4894"/>
		<updated>2026-04-01T11:15:16Z</updated>

		<summary type="html">&lt;p&gt;Lof: /* LSL in Matlab */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Introduction==&lt;br /&gt;
Lab Streaming Layer (LSL) is an open‑source software framework designed to make it easy to send, receive, and synchronize data streams in real time. It acts like a universal “data highway” that different devices and programs can use to communicate with each other.&lt;br /&gt;
&lt;br /&gt;
LSL is commonly used in research settings—especially in neuroscience, psychology, and human‑computer interaction—to collect data from multiple sources at the same time. For example, you can stream EEG signals, motion‑tracking data, eye‑tracking data, and experiment events through LSL and keep them perfectly time‑aligned.&lt;br /&gt;
It is typically used to:&lt;br /&gt;
* Connect different sensors and software tools without worrying about compatibility&lt;br /&gt;
* Record synchronized data from multiple devices&lt;br /&gt;
* Build experiments that require real‑time data exchange&lt;br /&gt;
* Store all incoming data in a single, well‑organized format&lt;br /&gt;
&lt;br /&gt;
==LSL in Matlab (biofysica toolbox)==&lt;br /&gt;
&lt;br /&gt;
Generic lsl functions can be found in the biofysica toolbox in the directory &amp;lt;..biofysica\liblsl\liblsl-Matlab&amp;gt;. &lt;br /&gt;
*Each lsl-device can have one or more lsl-streams.&lt;br /&gt;
*An lsl-stream is identified by a type and a name. &lt;br /&gt;
*The function lsl_resolver checks if it can find the requested lslStream on the intranet.&lt;br /&gt;
*When the stream is found you must add it to a session. &lt;br /&gt;
*The session controls the actual data-acquisition with start and stop. &lt;br /&gt;
*The data is read from each stream object.&lt;br /&gt;
&lt;br /&gt;
==Matlab Example==&lt;br /&gt;
&lt;br /&gt;
*Open a stream to the EyeSeeCam&lt;br /&gt;
&lt;br /&gt;
 type = &#039;EyeSeeCam @ dcn-pl04&#039;;&lt;br /&gt;
 name = &#039;EyeSeeCam SCI Data&#039;;&lt;br /&gt;
 %type = &#039;Digital Events @ lslder01&#039;;  % alternative stream (commented out so the EyeSeeCam stream above is used)&lt;br /&gt;
 %name = &#039;Digital Events 1&#039;;&lt;br /&gt;
 lslString = sprintf(&#039;type= &#039;&#039;%s&#039;&#039; and name= &#039;&#039;%s&#039;&#039;&#039;, type, name);&lt;br /&gt;
 info=lsl_resolver(lslString);&lt;br /&gt;
 lslStream = lsl_istream(info{1});&lt;br /&gt;
&lt;br /&gt;
*Open a session and add the stream to the session.&lt;br /&gt;
&lt;br /&gt;
 session = lsl_session;&lt;br /&gt;
 session.add_stream(lslStream);&lt;br /&gt;
&lt;br /&gt;
*Start and stop the session and get the data from the stream.&lt;br /&gt;
&lt;br /&gt;
 session.start;&lt;br /&gt;
 ....wait some time....&lt;br /&gt;
 session.stop;&lt;br /&gt;
 data = lslStream.read;&lt;br /&gt;
&lt;br /&gt;
In the directory &amp;lt;...biofysica\liblsl\liblsl-Matlab\examples&amp;gt; you can find more examples (with descriptions in the readme.md file).&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=Lab_Streaming_Layer&amp;diff=4893</id>
		<title>Lab Streaming Layer</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=Lab_Streaming_Layer&amp;diff=4893"/>
		<updated>2026-04-01T11:14:44Z</updated>

		<summary type="html">&lt;p&gt;Lof: /* LSL in Matlab */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Introduction==&lt;br /&gt;
Lab Streaming Layer (LSL) is an open‑source software framework designed to make it easy to send, receive, and synchronize data streams in real time. It acts like a universal “data highway” that different devices and programs can use to communicate with each other.&lt;br /&gt;
&lt;br /&gt;
LSL is commonly used in research settings—especially in neuroscience, psychology, and human‑computer interaction—to collect data from multiple sources at the same time. For example, you can stream EEG signals, motion‑tracking data, eye‑tracking data, and experiment events through LSL and keep them perfectly time‑aligned.&lt;br /&gt;
It is typically used to:&lt;br /&gt;
* Connect different sensors and software tools without worrying about compatibility&lt;br /&gt;
* Record synchronized data from multiple devices&lt;br /&gt;
* Build experiments that require real‑time data exchange&lt;br /&gt;
* Store all incoming data in a single, well‑organized format&lt;br /&gt;
&lt;br /&gt;
==LSL in Matlab==&lt;br /&gt;
&lt;br /&gt;
Generic lsl functions can be found in the biofysica toolbox in the directory &amp;lt;..biofysica\liblsl\liblsl-Matlab&amp;gt;. &lt;br /&gt;
*Each lsl-device can have one or more lsl-streams.&lt;br /&gt;
*An lsl-stream is identified by a type and a name. &lt;br /&gt;
*The function lsl_resolver checks if it can find the requested lslStream on the intranet.&lt;br /&gt;
*When the stream is found you must add it to a session. &lt;br /&gt;
*The session controls the actual data-acquisition with start and stop. &lt;br /&gt;
*The data is read from each stream object.&lt;br /&gt;
&lt;br /&gt;
==Matlab Example==&lt;br /&gt;
&lt;br /&gt;
*Open a stream to the EyeSeeCam&lt;br /&gt;
&lt;br /&gt;
 type = &#039;EyeSeeCam @ dcn-pl04&#039;;&lt;br /&gt;
 name = &#039;EyeSeeCam SCI Data&#039;;&lt;br /&gt;
 %type = &#039;Digital Events @ lslder01&#039;;  % alternative stream (commented out so the EyeSeeCam stream above is used)&lt;br /&gt;
 %name = &#039;Digital Events 1&#039;;&lt;br /&gt;
 lslString = sprintf(&#039;type= &#039;&#039;%s&#039;&#039; and name= &#039;&#039;%s&#039;&#039;&#039;, type, name);&lt;br /&gt;
 info=lsl_resolver(lslString);&lt;br /&gt;
 lslStream = lsl_istream(info{1});&lt;br /&gt;
&lt;br /&gt;
*Open a session and add the stream to the session.&lt;br /&gt;
&lt;br /&gt;
 session = lsl_session;&lt;br /&gt;
 session.add_stream(lslStream);&lt;br /&gt;
&lt;br /&gt;
*Start and stop the session and get the data from the stream.&lt;br /&gt;
&lt;br /&gt;
 session.start;&lt;br /&gt;
 ....wait some time....&lt;br /&gt;
 session.stop;&lt;br /&gt;
 data = lslStream.read;&lt;br /&gt;
&lt;br /&gt;
In the directory &amp;lt;...biofysica\liblsl\liblsl-Matlab\examples&amp;gt; you can find more examples (with descriptions in the readme.md file).&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=Lab_Streaming_Layer&amp;diff=4892</id>
		<title>Lab Streaming Layer</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=Lab_Streaming_Layer&amp;diff=4892"/>
		<updated>2026-04-01T11:13:10Z</updated>

		<summary type="html">&lt;p&gt;Lof: /* Matlab Example */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Introduction==&lt;br /&gt;
Lab Streaming Layer (LSL) is an open‑source software framework designed to make it easy to send, receive, and synchronize data streams in real time. It acts like a universal “data highway” that different devices and programs can use to communicate with each other.&lt;br /&gt;
&lt;br /&gt;
LSL is commonly used in research settings—especially in neuroscience, psychology, and human‑computer interaction—to collect data from multiple sources at the same time. For example, you can stream EEG signals, motion‑tracking data, eye‑tracking data, and experiment events through LSL and keep them perfectly time‑aligned.&lt;br /&gt;
It is typically used to:&lt;br /&gt;
- Connect different sensors and software tools without worrying about compatibility&lt;br /&gt;
- Record synchronized data from multiple devices&lt;br /&gt;
- Build experiments that require real‑time data exchange&lt;br /&gt;
- Store all incoming data in a single, well‑organized format&lt;br /&gt;
&lt;br /&gt;
==LSL in Matlab==&lt;br /&gt;
&lt;br /&gt;
Generic lsl functions can be found in the biofysica toolbox in the directory &amp;lt;..biofysica\liblsl\liblsl-Matlab&amp;gt;. &lt;br /&gt;
*Each lsl-device can have one or more lsl-streams.&lt;br /&gt;
*An lsl-stream is identified by a type and a name.&lt;br /&gt;
*The function lsl_resolver checks whether the requested lsl-stream can be found on the intranet.&lt;br /&gt;
*When the stream is found, add it to a session.&lt;br /&gt;
*The session controls the actual data acquisition with start and stop.&lt;br /&gt;
*The data is read from the stream.&lt;br /&gt;
&lt;br /&gt;
==Matlab Example==&lt;br /&gt;
&lt;br /&gt;
*Open a stream to the EyeSeeCam&lt;br /&gt;
&lt;br /&gt;
 type = &#039;EyeSeeCam @ dcn-pl04&#039;;&lt;br /&gt;
 name = &#039;EyeSeeCam SCI Data&#039;;&lt;br /&gt;
 % alternatively, for the digital event recorder:&lt;br /&gt;
 % type = &#039;Digital Events @ lslder01&#039;;&lt;br /&gt;
 % name = &#039;Digital Events 1&#039;;&lt;br /&gt;
 lslString = sprintf(&#039;type= &#039;&#039;%s&#039;&#039; and name= &#039;&#039;%s&#039;&#039;&#039;, type, name);&lt;br /&gt;
 info=lsl_resolver(lslString);&lt;br /&gt;
 lslStream = lsl_istream(info{1});&lt;br /&gt;
&lt;br /&gt;
*Open a session and add the stream to the session.&lt;br /&gt;
&lt;br /&gt;
 session = lsl_session;&lt;br /&gt;
 session.add_stream(lslStream);&lt;br /&gt;
&lt;br /&gt;
*Start and stop the session and get the data from the stream.&lt;br /&gt;
&lt;br /&gt;
 session.start;&lt;br /&gt;
 pause(5);  % e.g. acquire data for 5 seconds&lt;br /&gt;
 session.stop;&lt;br /&gt;
 data = lslStream.read;&lt;br /&gt;
&lt;br /&gt;
In the directory &amp;lt;...biofysica\liblsl\liblsl-Matlab\examples&amp;gt; you can find more examples (with descriptions in the readme.md file).&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=Lab_Streaming_Layer&amp;diff=4891</id>
		<title>Lab Streaming Layer</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=Lab_Streaming_Layer&amp;diff=4891"/>
		<updated>2026-04-01T11:11:34Z</updated>

		<summary type="html">&lt;p&gt;Lof: /* Matlab Example */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Introduction==&lt;br /&gt;
Lab Streaming Layer (LSL) is an open‑source software framework designed to make it easy to send, receive, and synchronize data streams in real time. It acts like a universal “data highway” that different devices and programs can use to communicate with each other.&lt;br /&gt;
&lt;br /&gt;
LSL is commonly used in research settings—especially in neuroscience, psychology, and human‑computer interaction—to collect data from multiple sources at the same time. For example, you can stream EEG signals, motion‑tracking data, eye‑tracking data, and experiment events through LSL and keep them perfectly time‑aligned.&lt;br /&gt;
It is typically used to:&lt;br /&gt;
- Connect different sensors and software tools without worrying about compatibility&lt;br /&gt;
- Record synchronized data from multiple devices&lt;br /&gt;
- Build experiments that require real‑time data exchange&lt;br /&gt;
- Store all incoming data in a single, well‑organized format&lt;br /&gt;
&lt;br /&gt;
==LSL in Matlab==&lt;br /&gt;
&lt;br /&gt;
Generic lsl functions can be found in the biofysica toolbox in the directory &amp;lt;..biofysica\liblsl\liblsl-Matlab&amp;gt;. &lt;br /&gt;
*Each lsl-device can have one or more lsl-streams.&lt;br /&gt;
*An lsl-stream is identified by a type and a name. &lt;br /&gt;
*The function lsl_resolver checks if it can find the requested lslStream on the intranet. &lt;br /&gt;
*When the stream is found you must add it to the session. &lt;br /&gt;
*The session controls the actual data-acquisition with start and stop. &lt;br /&gt;
*The data is read from the stream.&lt;br /&gt;
&lt;br /&gt;
==Matlab Example==&lt;br /&gt;
&lt;br /&gt;
*Open a stream to the EyeSeeCam&lt;br /&gt;
&lt;br /&gt;
 type = &#039;EyeSeeCam @ dcn-pl04&#039;;&lt;br /&gt;
 name = &#039;EyeSeeCam SCI Data&#039;;&lt;br /&gt;
 % alternatively, for the digital event recorder:&lt;br /&gt;
 % type = &#039;Digital Events @ lslder01&#039;;&lt;br /&gt;
 % name = &#039;Digital Events 1&#039;;&lt;br /&gt;
 lslString = sprintf(&#039;type= &#039;&#039;%s&#039;&#039; and name= &#039;&#039;%s&#039;&#039;&#039;, type, name);&lt;br /&gt;
 info=lsl_resolver(lslString);&lt;br /&gt;
 lslStream = lsl_istream(info{1});&lt;br /&gt;
&lt;br /&gt;
*Open a session and add the stream to the session.&lt;br /&gt;
&lt;br /&gt;
 session = lsl_session;&lt;br /&gt;
 session.add_stream(lslStream);&lt;br /&gt;
&lt;br /&gt;
*Start and stop the session and get the data from the stream.&lt;br /&gt;
&lt;br /&gt;
 session.start;&lt;br /&gt;
 ....wait some time....&lt;br /&gt;
 session.stop;&lt;br /&gt;
 data = lslStream.read;&lt;br /&gt;
&lt;br /&gt;
In the directory &amp;lt;...biofysica\liblsl\liblsl-Matlab\examples&amp;gt; you can find more examples.&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=Lab_Streaming_Layer&amp;diff=4890</id>
		<title>Lab Streaming Layer</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=Lab_Streaming_Layer&amp;diff=4890"/>
		<updated>2026-04-01T11:11:02Z</updated>

		<summary type="html">&lt;p&gt;Lof: /* LSL in Matlab */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Introduction==&lt;br /&gt;
Lab Streaming Layer (LSL) is an open‑source software framework designed to make it easy to send, receive, and synchronize data streams in real time. It acts like a universal “data highway” that different devices and programs can use to communicate with each other.&lt;br /&gt;
&lt;br /&gt;
LSL is commonly used in research settings—especially in neuroscience, psychology, and human‑computer interaction—to collect data from multiple sources at the same time. For example, you can stream EEG signals, motion‑tracking data, eye‑tracking data, and experiment events through LSL and keep them perfectly time‑aligned.&lt;br /&gt;
It is typically used to:&lt;br /&gt;
- Connect different sensors and software tools without worrying about compatibility&lt;br /&gt;
- Record synchronized data from multiple devices&lt;br /&gt;
- Build experiments that require real‑time data exchange&lt;br /&gt;
- Store all incoming data in a single, well‑organized format&lt;br /&gt;
&lt;br /&gt;
==LSL in Matlab==&lt;br /&gt;
&lt;br /&gt;
Generic lsl functions can be found in the biofysica toolbox in the directory &amp;lt;..biofysica\liblsl\liblsl-Matlab&amp;gt;. &lt;br /&gt;
*Each lsl-device can have one or more lsl-streams.&lt;br /&gt;
*An lsl-stream is identified by a type and a name. &lt;br /&gt;
*The function lsl_resolver checks if it can find the requested lslStream on the intranet. &lt;br /&gt;
*When the stream is found you must add it to the session. &lt;br /&gt;
*The session controls the actual data-acquisition with start and stop. &lt;br /&gt;
*The data is read from the stream.&lt;br /&gt;
&lt;br /&gt;
==Matlab Example==&lt;br /&gt;
&lt;br /&gt;
*Open a stream to the EyeSeeCam&lt;br /&gt;
&lt;br /&gt;
 type = &#039;EyeSeeCam @ dcn-pl04&#039;;&lt;br /&gt;
 name = &#039;EyeSeeCam SCI Data&#039;;&lt;br /&gt;
 % alternatively, for the digital event recorder:&lt;br /&gt;
 % type = &#039;Digital Events @ lslder01&#039;;&lt;br /&gt;
 % name = &#039;Digital Events 1&#039;;&lt;br /&gt;
 lslString = sprintf(&#039;type= &#039;&#039;%s&#039;&#039; and name= &#039;&#039;%s&#039;&#039;&#039;, type, name);&lt;br /&gt;
 info=lsl_resolver(lslString);&lt;br /&gt;
 lslStream = lsl_istream(info{1});&lt;br /&gt;
&lt;br /&gt;
*Open a session and add the stream to the session.&lt;br /&gt;
&lt;br /&gt;
 session = lsl_session;&lt;br /&gt;
 session.add_stream(lslStream);&lt;br /&gt;
&lt;br /&gt;
*Start and stop the session and get the data from the stream.&lt;br /&gt;
&lt;br /&gt;
 session.start;&lt;br /&gt;
 ....wait some time....&lt;br /&gt;
 session.stop;&lt;br /&gt;
 data = lslStream.read;&lt;br /&gt;
&lt;br /&gt;
In the directory &amp;lt;...biofysica\liblsl\liblsl-Matlab\examples&amp;gt;&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=Lab_Streaming_Layer&amp;diff=4889</id>
		<title>Lab Streaming Layer</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=Lab_Streaming_Layer&amp;diff=4889"/>
		<updated>2026-04-01T11:10:53Z</updated>

		<summary type="html">&lt;p&gt;Lof: /* Matlab Example */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Introduction==&lt;br /&gt;
Lab Streaming Layer (LSL) is an open‑source software framework designed to make it easy to send, receive, and synchronize data streams in real time. It acts like a universal “data highway” that different devices and programs can use to communicate with each other.&lt;br /&gt;
&lt;br /&gt;
LSL is commonly used in research settings—especially in neuroscience, psychology, and human‑computer interaction—to collect data from multiple sources at the same time. For example, you can stream EEG signals, motion‑tracking data, eye‑tracking data, and experiment events through LSL and keep them perfectly time‑aligned.&lt;br /&gt;
It is typically used to:&lt;br /&gt;
- Connect different sensors and software tools without worrying about compatibility&lt;br /&gt;
- Record synchronized data from multiple devices&lt;br /&gt;
- Build experiments that require real‑time data exchange&lt;br /&gt;
- Store all incoming data in a single, well‑organized format&lt;br /&gt;
&lt;br /&gt;
==LSL in Matlab==&lt;br /&gt;
&lt;br /&gt;
Generic lsl functions can be found in the biofysica toolbox in the directory &amp;lt;..\liblsl\liblsl-Matlab&amp;gt;. &lt;br /&gt;
*Each lsl-device can have one or more lsl-streams.&lt;br /&gt;
*An lsl-stream is identified by a type and a name. &lt;br /&gt;
*The function lsl_resolver checks if it can find the requested lslStream on the intranet. &lt;br /&gt;
*When the stream is found you must add it to the session. &lt;br /&gt;
*The session controls the actual data-acquisition with start and stop. &lt;br /&gt;
*The data is read from the stream.&lt;br /&gt;
&lt;br /&gt;
==Matlab Example==&lt;br /&gt;
&lt;br /&gt;
*Open a stream to the EyeSeeCam&lt;br /&gt;
&lt;br /&gt;
 type = &#039;EyeSeeCam @ dcn-pl04&#039;;&lt;br /&gt;
 name = &#039;EyeSeeCam SCI Data&#039;;&lt;br /&gt;
 % alternatively, for the digital event recorder:&lt;br /&gt;
 % type = &#039;Digital Events @ lslder01&#039;;&lt;br /&gt;
 % name = &#039;Digital Events 1&#039;;&lt;br /&gt;
 lslString = sprintf(&#039;type= &#039;&#039;%s&#039;&#039; and name= &#039;&#039;%s&#039;&#039;&#039;, type, name);&lt;br /&gt;
 info=lsl_resolver(lslString);&lt;br /&gt;
 lslStream = lsl_istream(info{1});&lt;br /&gt;
&lt;br /&gt;
*Open a session and add the stream to the session.&lt;br /&gt;
&lt;br /&gt;
 session = lsl_session;&lt;br /&gt;
 session.add_stream(lslStream);&lt;br /&gt;
&lt;br /&gt;
*Start and stop the session and get the data from the stream.&lt;br /&gt;
&lt;br /&gt;
 session.start;&lt;br /&gt;
 ....wait some time....&lt;br /&gt;
 session.stop;&lt;br /&gt;
 data = lslStream.read;&lt;br /&gt;
&lt;br /&gt;
In the directory &amp;lt;...biofysica\liblsl\liblsl-Matlab\examples&amp;gt;&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=Lab_Streaming_Layer&amp;diff=4888</id>
		<title>Lab Streaming Layer</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=Lab_Streaming_Layer&amp;diff=4888"/>
		<updated>2026-04-01T11:09:51Z</updated>

		<summary type="html">&lt;p&gt;Lof: /* Matlab Example */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Introduction==&lt;br /&gt;
Lab Streaming Layer (LSL) is an open‑source software framework designed to make it easy to send, receive, and synchronize data streams in real time. It acts like a universal “data highway” that different devices and programs can use to communicate with each other.&lt;br /&gt;
&lt;br /&gt;
LSL is commonly used in research settings—especially in neuroscience, psychology, and human‑computer interaction—to collect data from multiple sources at the same time. For example, you can stream EEG signals, motion‑tracking data, eye‑tracking data, and experiment events through LSL and keep them perfectly time‑aligned.&lt;br /&gt;
It is typically used to:&lt;br /&gt;
- Connect different sensors and software tools without worrying about compatibility&lt;br /&gt;
- Record synchronized data from multiple devices&lt;br /&gt;
- Build experiments that require real‑time data exchange&lt;br /&gt;
- Store all incoming data in a single, well‑organized format&lt;br /&gt;
&lt;br /&gt;
==LSL in Matlab==&lt;br /&gt;
&lt;br /&gt;
Generic lsl functions can be found in the biofysica toolbox in the directory &amp;lt;..\liblsl\liblsl-Matlab&amp;gt;. &lt;br /&gt;
*Each lsl-device can have one or more lsl-streams.&lt;br /&gt;
*An lsl-stream is identified by a type and a name. &lt;br /&gt;
*The function lsl_resolver checks if it can find the requested lslStream on the intranet. &lt;br /&gt;
*When the stream is found you must add it to the session. &lt;br /&gt;
*The session controls the actual data-acquisition with start and stop. &lt;br /&gt;
*The data is read from the stream.&lt;br /&gt;
&lt;br /&gt;
==Matlab Example==&lt;br /&gt;
&lt;br /&gt;
*Open a stream to the EyeSeeCam&lt;br /&gt;
&lt;br /&gt;
 type = &#039;EyeSeeCam @ dcn-pl04&#039;;&lt;br /&gt;
 name = &#039;EyeSeeCam SCI Data&#039;;&lt;br /&gt;
 % alternatively, for the digital event recorder:&lt;br /&gt;
 % type = &#039;Digital Events @ lslder01&#039;;&lt;br /&gt;
 % name = &#039;Digital Events 1&#039;;&lt;br /&gt;
 lslString = sprintf(&#039;type= &#039;&#039;%s&#039;&#039; and name= &#039;&#039;%s&#039;&#039;&#039;, type, name);&lt;br /&gt;
 info=lsl_resolver(lslString);&lt;br /&gt;
 lslStream = lsl_istream(info{1});&lt;br /&gt;
&lt;br /&gt;
*Open a session and add the stream to the session.&lt;br /&gt;
&lt;br /&gt;
 session = lsl_session;&lt;br /&gt;
 session.add_stream(lslStream);&lt;br /&gt;
&lt;br /&gt;
*Start and stop the session and get the data from the stream.&lt;br /&gt;
&lt;br /&gt;
 session.start;&lt;br /&gt;
 ....wait some time....&lt;br /&gt;
 session.stop;&lt;br /&gt;
 data = lslStream.read;&lt;br /&gt;
&lt;br /&gt;
In&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=Lab_Streaming_Layer&amp;diff=4887</id>
		<title>Lab Streaming Layer</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=Lab_Streaming_Layer&amp;diff=4887"/>
		<updated>2026-04-01T11:04:54Z</updated>

		<summary type="html">&lt;p&gt;Lof: /* LSL in Matlab */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Introduction==&lt;br /&gt;
Lab Streaming Layer (LSL) is an open‑source software framework designed to make it easy to send, receive, and synchronize data streams in real time. It acts like a universal “data highway” that different devices and programs can use to communicate with each other.&lt;br /&gt;
&lt;br /&gt;
LSL is commonly used in research settings—especially in neuroscience, psychology, and human‑computer interaction—to collect data from multiple sources at the same time. For example, you can stream EEG signals, motion‑tracking data, eye‑tracking data, and experiment events through LSL and keep them perfectly time‑aligned.&lt;br /&gt;
It is typically used to:&lt;br /&gt;
- Connect different sensors and software tools without worrying about compatibility&lt;br /&gt;
- Record synchronized data from multiple devices&lt;br /&gt;
- Build experiments that require real‑time data exchange&lt;br /&gt;
- Store all incoming data in a single, well‑organized format&lt;br /&gt;
&lt;br /&gt;
==LSL in Matlab==&lt;br /&gt;
&lt;br /&gt;
Generic lsl functions can be found in the biofysica toolbox in the directory &amp;lt;..\liblsl\liblsl-Matlab&amp;gt;. &lt;br /&gt;
*Each lsl-device can have one or more lsl-streams.&lt;br /&gt;
*An lsl-stream is identified by a type and a name. &lt;br /&gt;
*The function lsl_resolver checks if it can find the requested lslStream on the intranet. &lt;br /&gt;
*When the stream is found you must add it to the session. &lt;br /&gt;
*The session controls the actual data-acquisition with start and stop. &lt;br /&gt;
*The data is read from the stream.&lt;br /&gt;
&lt;br /&gt;
==Matlab Example==&lt;br /&gt;
&lt;br /&gt;
 type = &#039;EyeSeeCam @ dcn-pl04&#039;;&lt;br /&gt;
 name = &#039;EyeSeeCam SCI Data&#039;;&lt;br /&gt;
 lslSession = lsl_session;&lt;br /&gt;
 % alternatively, for the digital event recorder:&lt;br /&gt;
 % type = &#039;Digital Events @ lslder01&#039;;&lt;br /&gt;
 % name = &#039;Digital Events 1&#039;;&lt;br /&gt;
 lslString = sprintf(&#039;type= &#039;&#039;%s&#039;&#039; and name= &#039;&#039;%s&#039;&#039;&#039;, type, name);&lt;br /&gt;
 info=lsl_resolver(lslString);&lt;br /&gt;
 lslStreams = lsl_istream(info{1});&lt;br /&gt;
 lslSession.add_stream(lslStreams);&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=Lab_Streaming_Layer&amp;diff=4886</id>
		<title>Lab Streaming Layer</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=Lab_Streaming_Layer&amp;diff=4886"/>
		<updated>2026-04-01T11:04:16Z</updated>

		<summary type="html">&lt;p&gt;Lof: /* LSL in Matlab */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Introduction==&lt;br /&gt;
Lab Streaming Layer (LSL) is an open‑source software framework designed to make it easy to send, receive, and synchronize data streams in real time. It acts like a universal “data highway” that different devices and programs can use to communicate with each other.&lt;br /&gt;
&lt;br /&gt;
LSL is commonly used in research settings—especially in neuroscience, psychology, and human‑computer interaction—to collect data from multiple sources at the same time. For example, you can stream EEG signals, motion‑tracking data, eye‑tracking data, and experiment events through LSL and keep them perfectly time‑aligned.&lt;br /&gt;
It is typically used to:&lt;br /&gt;
- Connect different sensors and software tools without worrying about compatibility&lt;br /&gt;
- Record synchronized data from multiple devices&lt;br /&gt;
- Build experiments that require real‑time data exchange&lt;br /&gt;
- Store all incoming data in a single, well‑organized format&lt;br /&gt;
&lt;br /&gt;
==LSL in Matlab==&lt;br /&gt;
&lt;br /&gt;
Generic lsl functions can be found in the biofysica toolbox in the directory &amp;lt;..\liblsl&amp;gt;. &lt;br /&gt;
*Each lsl-device can have one or more lsl-streams.&lt;br /&gt;
*An lsl-stream is identified by a type and a name. &lt;br /&gt;
*The function lsl_resolver checks if it can find the requested lslStream on the intranet. &lt;br /&gt;
*When the stream is found you must add it to the session. &lt;br /&gt;
*The session controls the actual data-acquisition with start and stop. &lt;br /&gt;
*The data is read from the stream.&lt;br /&gt;
&lt;br /&gt;
==Matlab Example==&lt;br /&gt;
&lt;br /&gt;
 type = &#039;EyeSeeCam @ dcn-pl04&#039;;&lt;br /&gt;
 name = &#039;EyeSeeCam SCI Data&#039;;&lt;br /&gt;
 lslSession = lsl_session;&lt;br /&gt;
 % alternatively, for the digital event recorder:&lt;br /&gt;
 % type = &#039;Digital Events @ lslder01&#039;;&lt;br /&gt;
 % name = &#039;Digital Events 1&#039;;&lt;br /&gt;
 lslString = sprintf(&#039;type= &#039;&#039;%s&#039;&#039; and name= &#039;&#039;%s&#039;&#039;&#039;, type, name);&lt;br /&gt;
 info=lsl_resolver(lslString);&lt;br /&gt;
 lslStreams = lsl_istream(info{1});&lt;br /&gt;
 lslSession.add_stream(lslStreams);&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=Lab_Streaming_Layer&amp;diff=4885</id>
		<title>Lab Streaming Layer</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=Lab_Streaming_Layer&amp;diff=4885"/>
		<updated>2026-04-01T11:02:39Z</updated>

		<summary type="html">&lt;p&gt;Lof: /* Matlab Example */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Introduction==&lt;br /&gt;
Lab Streaming Layer (LSL) is an open‑source software framework designed to make it easy to send, receive, and synchronize data streams in real time. It acts like a universal “data highway” that different devices and programs can use to communicate with each other.&lt;br /&gt;
&lt;br /&gt;
LSL is commonly used in research settings—especially in neuroscience, psychology, and human‑computer interaction—to collect data from multiple sources at the same time. For example, you can stream EEG signals, motion‑tracking data, eye‑tracking data, and experiment events through LSL and keep them perfectly time‑aligned.&lt;br /&gt;
It is typically used to:&lt;br /&gt;
- Connect different sensors and software tools without worrying about compatibility&lt;br /&gt;
- Record synchronized data from multiple devices&lt;br /&gt;
- Build experiments that require real‑time data exchange&lt;br /&gt;
- Store all incoming data in a single, well‑organized format&lt;br /&gt;
&lt;br /&gt;
==LSL in Matlab==&lt;br /&gt;
&lt;br /&gt;
Generic lsl functions can be found in the biofysica toolbox. &lt;br /&gt;
*Each lsl-device can have one or more lsl-streams.&lt;br /&gt;
*An lsl-stream is identified by a type and a name. &lt;br /&gt;
*The function lsl_resolver checks if it can find the requested lslStream on the intranet. &lt;br /&gt;
*When the stream is found you must add it to the session. &lt;br /&gt;
*The session controls the actual data-acquisition with start and stop. &lt;br /&gt;
*The data is read from the stream.&lt;br /&gt;
&lt;br /&gt;
==Matlab Example==&lt;br /&gt;
&lt;br /&gt;
 type = &#039;EyeSeeCam @ dcn-pl04&#039;;&lt;br /&gt;
 name = &#039;EyeSeeCam SCI Data&#039;;&lt;br /&gt;
 lslSession = lsl_session;&lt;br /&gt;
 % alternatively, for the digital event recorder:&lt;br /&gt;
 % type = &#039;Digital Events @ lslder01&#039;;&lt;br /&gt;
 % name = &#039;Digital Events 1&#039;;&lt;br /&gt;
 lslString = sprintf(&#039;type= &#039;&#039;%s&#039;&#039; and name= &#039;&#039;%s&#039;&#039;&#039;, type, name);&lt;br /&gt;
 info=lsl_resolver(lslString);&lt;br /&gt;
 lslStreams = lsl_istream(info{1});&lt;br /&gt;
 lslSession.add_stream(lslStreams);&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=Lab_Streaming_Layer&amp;diff=4884</id>
		<title>Lab Streaming Layer</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=Lab_Streaming_Layer&amp;diff=4884"/>
		<updated>2026-04-01T11:01:05Z</updated>

		<summary type="html">&lt;p&gt;Lof: /* LSL in Matlab */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Introduction==&lt;br /&gt;
Lab Streaming Layer (LSL) is an open‑source software framework designed to make it easy to send, receive, and synchronize data streams in real time. It acts like a universal “data highway” that different devices and programs can use to communicate with each other.&lt;br /&gt;
&lt;br /&gt;
LSL is commonly used in research settings—especially in neuroscience, psychology, and human‑computer interaction—to collect data from multiple sources at the same time. For example, you can stream EEG signals, motion‑tracking data, eye‑tracking data, and experiment events through LSL and keep them perfectly time‑aligned.&lt;br /&gt;
It is typically used to:&lt;br /&gt;
- Connect different sensors and software tools without worrying about compatibility&lt;br /&gt;
- Record synchronized data from multiple devices&lt;br /&gt;
- Build experiments that require real‑time data exchange&lt;br /&gt;
- Store all incoming data in a single, well‑organized format&lt;br /&gt;
&lt;br /&gt;
==LSL in Matlab==&lt;br /&gt;
&lt;br /&gt;
Generic lsl functions can be found in the biofysica toolbox. &lt;br /&gt;
*Each lsl-device can have one or more lsl-streams.&lt;br /&gt;
*An lsl-stream is identified by a type and a name. &lt;br /&gt;
*The function lsl_resolver checks if it can find the requested lslStream on the intranet. &lt;br /&gt;
*When the stream is found you must add it to the session. &lt;br /&gt;
*The session controls the actual data-acquisition with start and stop. &lt;br /&gt;
*The data is read from the stream.&lt;br /&gt;
&lt;br /&gt;
==Matlab Example==&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=Lab_Streaming_Layer&amp;diff=4883</id>
		<title>Lab Streaming Layer</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=Lab_Streaming_Layer&amp;diff=4883"/>
		<updated>2026-04-01T11:00:41Z</updated>

		<summary type="html">&lt;p&gt;Lof: Created page with &amp;quot;==Introduction== Lab Streaming Layer (LSL) is an open‑source software framework designed to make it easy to send, receive, and synchronize data streams in real time. It acts like a universal “data highway” that different devices and programs can use to communicate with each other.  LSL is commonly used in research settings—especially in neuroscience, psychology, and human‑computer interaction—to collect data from multiple sources at the same time. For example...&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Introduction==&lt;br /&gt;
Lab Streaming Layer (LSL) is an open‑source software framework designed to make it easy to send, receive, and synchronize data streams in real time. It acts like a universal “data highway” that different devices and programs can use to communicate with each other.&lt;br /&gt;
&lt;br /&gt;
LSL is commonly used in research settings—especially in neuroscience, psychology, and human‑computer interaction—to collect data from multiple sources at the same time. For example, you can stream EEG signals, motion‑tracking data, eye‑tracking data, and experiment events through LSL and keep them perfectly time‑aligned.&lt;br /&gt;
It is typically used to:&lt;br /&gt;
- Connect different sensors and software tools without worrying about compatibility&lt;br /&gt;
- Record synchronized data from multiple devices&lt;br /&gt;
- Build experiments that require real‑time data exchange&lt;br /&gt;
- Store all incoming data in a single, well‑organized format&lt;br /&gt;
&lt;br /&gt;
==LSL in Matlab==&lt;br /&gt;
&lt;br /&gt;
Generic lsl functions can be found in the biofysica toolbox. &lt;br /&gt;
*Each lsl-device can have one or more lsl-streams.&lt;br /&gt;
*An lsl-stream is identified by a type and a name. &lt;br /&gt;
*The function lsl_resolver checks if it can find the requested lslStream on the intranet. &lt;br /&gt;
*When the stream is found you must add it to the session. &lt;br /&gt;
*The session controls the actual data-acquisition with start and stop. &lt;br /&gt;
*The data is read from the stream.&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=How_to&amp;diff=4882</id>
		<title>How to</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=How_to&amp;diff=4882"/>
		<updated>2026-04-01T10:46:39Z</updated>

		<summary type="html">&lt;p&gt;Lof: /* Software */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Lab==&lt;br /&gt;
*How to make a lab [[QReserve|reservation]]?&lt;br /&gt;
*How to report an [[Issues Tracking|issue]]?&lt;br /&gt;
*How to work with [[Ethics &amp;amp; Subjects|subjects]]?&lt;br /&gt;
*How to use a [[Lab journal|lab journal]]?&lt;br /&gt;
&lt;br /&gt;
==Hardware==&lt;br /&gt;
*How to use a [[DCN LED controller|DCN LED controller]]?&lt;br /&gt;
*How to use the [[PLC LED controller specifications|PLC LED controller]]?&lt;br /&gt;
*How to use a [[Digital Event Recorder]]?&lt;br /&gt;
&lt;br /&gt;
==Software==&lt;br /&gt;
*How to use [[Gitlab]]?&lt;br /&gt;
*How to use [[BIOX]]?&lt;br /&gt;
*How to use [[LabStreamingLayer|LabStreamingLayer (LSL)]]?&lt;br /&gt;
*How to use [[Coordinate systems]] in experiments?&lt;br /&gt;
*How to use [[Units in Matlab]]?&lt;br /&gt;
*How to make a [[Programming a GUI|Graphical User Interface (GUI)]] in Matlab?&lt;br /&gt;
*How to make [[Ripple Sounds]]?&lt;br /&gt;
*How to use [[Lab Streaming Layer]] (aka: LSL)&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=EyeSeeCam_SCI&amp;diff=4881</id>
		<title>EyeSeeCam SCI</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=EyeSeeCam_SCI&amp;diff=4881"/>
		<updated>2026-04-01T10:45:29Z</updated>

		<summary type="html">&lt;p&gt;Lof: /* Matlab coding */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[File:EyeSeeCam_SCI2.png|thumb|EyeSeeCam SCI]]&lt;br /&gt;
==Description==&lt;br /&gt;
The EyeSeeCam SCI is a combined eye tracker and head movement tracker. Head tracking is done with an IMU; eye tracking is done with cameras and software that tracks the pupils of both eyes. The tracking data can be read out directly in Matlab. When a dedicated computer is used for reading out the EyeSeeCam, the data is converted to an LSL stream. &lt;br /&gt;
&lt;br /&gt;
The EyeSeeCam SCI can record at 250 and 500 Hz; we typically record at 500 Hz.&lt;br /&gt;
&lt;br /&gt;
==Setup==&lt;br /&gt;
&lt;br /&gt;
* The EyeSeeCam is connected to a NUC with two USB-C cables.&lt;br /&gt;
* The NUC is connected via ethernet with a Mac Mini.&lt;br /&gt;
* The Mac Mini runs an EyeSeeCam SCI program.&lt;br /&gt;
* The Mac Mini runs an LSL server for data transfer to a Windows computer.&lt;br /&gt;
* The Windows computer runs Matlab and connects to the LSL server on the Mac Mini via an LSL stream.&lt;br /&gt;
&lt;br /&gt;
==Startup Instructions==&lt;br /&gt;
&lt;br /&gt;
* Connect the EyeSeeCam to the NUC&lt;br /&gt;
* Remove both lens covers of the EyeSeeCam&lt;br /&gt;
* Start the NUC&lt;br /&gt;
* Start the Mac Mini&lt;br /&gt;
* Login on the Mac Mini to &#039;EyeSeeCam&#039; with password &#039;Gimbal&#039;&lt;br /&gt;
* Start the LSL script on the Mac Mini (it is on the desktop)&lt;br /&gt;
* Start the EyeSeeCam SCI program on the Mac Mini&lt;br /&gt;
* Start a Matlab program on the Windows computer (e.g. EG_program.m)&lt;br /&gt;
* Press the &amp;lt;begin&amp;gt; button in the EyeSeeCam SCI program on the Mac Mini (in the top right corner)&lt;br /&gt;
* Let the subject put on the EyeSeeCam&lt;br /&gt;
* Press &amp;lt;prepare&amp;gt; in the EyeSeeCam SCI program on the Mac Mini (shows the traces)&lt;br /&gt;
* Press &amp;lt;start&amp;gt; in the EyeSeeCam SCI program on the Mac Mini (starts collecting data)&lt;br /&gt;
* Start the experiment&lt;br /&gt;
&lt;br /&gt;
==Matlab coding==&lt;br /&gt;
&lt;br /&gt;
The following snippet opens the EyeSeeCam LSL stream, acquires data for a while, and reads it out:&lt;br /&gt;
&lt;br /&gt;
 type = &#039;EyeSeeCam @ dcn-pl04&#039;;&lt;br /&gt;
 name = &#039;EyeSeeCam SCI Data&#039;;&lt;br /&gt;
 lslString = sprintf(&#039;type=&#039;&#039;%s&#039;&#039; and name=&#039;&#039;%s&#039;&#039;&#039;, type, name);&lt;br /&gt;
 info=lsl_resolver(lslString);&lt;br /&gt;
 lslStream = lsl_istream(info{1});&lt;br /&gt;
&lt;br /&gt;
 session = lsl_session;&lt;br /&gt;
 session.add_stream(lslStream);&lt;br /&gt;
&lt;br /&gt;
 session.start;&lt;br /&gt;
 ...wait some time...&lt;br /&gt;
 session.stop;&lt;br /&gt;
 data = lslStream.read;&lt;br /&gt;
&lt;br /&gt;
==Construction of a gaze trace==&lt;br /&gt;
The data from EyeSeeCam is recorded in different coordinate systems than the double polar system that we use to present our targets. The head and eye tracking data each have their own coordinate system. To accurately analyze our data we have to transform these coordinate systems to double polar coordinates in the lab frame. In order to relate the EyeSeeCam coordinates to the lab coordinates, we first need to record a starting gaze position for each trial. This means that during the recording, the subject has to look at the center speaker (0,0) when the trial starts. &lt;br /&gt;
&lt;br /&gt;
From the EyeSeeCam data we can construct a head movement trace and an eye movement trace. These can be combined to create a gaze trace.&lt;br /&gt;
&lt;br /&gt;
In Matlab we can use the class &#039;EyeSeeCamSci_Trace&#039; for calculating head, eye and gaze traces in RAS, HVF and DP coordinates.&lt;br /&gt;
&lt;br /&gt;
===Defining a coordinate system===&lt;br /&gt;
&lt;br /&gt;
For the purpose of interpreting the EyeSeeCam data we define the starting direction as the direction in which the subject looks at the beginning of each trial. The recording of the data should start when the subject is looking in this direction. &amp;lt;br&amp;gt;&lt;br /&gt;
The X, Y and Z of the EyeSeeCam data are in terms of RAS coordinates:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
X = anterior&lt;br /&gt;
Y = -right&lt;br /&gt;
Z = superior&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
* When using &#039;EyeSeeCamSci_Trace&#039;, head and gaze traces in RAS coordinates are relative to the starting position of the subject. &lt;br /&gt;
* The HVF and DP coordinates of the eye trace are only valid when the head stays in the starting position.&lt;br /&gt;
&lt;br /&gt;
===Head movement===&lt;br /&gt;
EyeSeeCam_Sci provides angular velocities for rotations around the X, Y and Z axes. For our analyses we want position data, so we have to calculate it from these angular velocities.&lt;br /&gt;
 &lt;br /&gt;
The head rotation axes X, Y and Z are defined with respect to the EyeSeeCam itself. &lt;br /&gt;
&lt;br /&gt;
*&#039;&#039;The EyeSeeCam data is &#039;&#039;&#039;not&#039;&#039;&#039; in the lab frame!&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
In order to create a head movement trace we have to calculate rotation matrices (delta_Rx, delta_Ry and delta_Rz) in EyeSeeCam coordinates for every time step. &lt;br /&gt;
&lt;br /&gt;
Since the time steps are about 2 ms (or 4 ms), the rotations for each time step are small. Therefore the order in which delta_Rx, delta_Ry and delta_Rz are multiplied is not important, and we calculate delta_R(t) in EyeSeeCam coordinates by &lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
delta_R(t) = delta_Rx * delta_Ry * delta_Rz.&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
To get the total rotation matrix in lab coordinates at time t we have to iterate over all the time steps &lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
R(t=0) = identity matrix&lt;br /&gt;
for t = 0 to t = tmax&lt;br /&gt;
   R(t) = R(t-delta_t) * delta_R(t)&lt;br /&gt;
end &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
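As an illustration, the integration loop above can be sketched numerically. This is a Python sketch, not the lab Matlab code; the function names and the 500 Hz time step are assumptions for the example.&lt;br /&gt;

```python
import math

def rot_x(a):
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def rot_y(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def rot_z(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def matmul(A, B):
    # 3x3 matrix product
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def integrate_head_rotation(vx, vy, vz, dt):
    """Accumulate per-step rotations delta_R(t) into R(t).

    vx, vy, vz: angular velocities (rad/s) around the sensor X, Y and Z axes.
    dt: time step in seconds (about 0.002 s at 500 Hz).
    Returns one 3x3 rotation matrix per time step.
    """
    R = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]  # R(t=0) = identity
    trace = []
    for wx, wy, wz in zip(vx, vy, vz):
        # Small-angle steps, so the multiplication order of the three
        # axis rotations is irrelevant to first order.
        dR = matmul(matmul(rot_x(wx * dt), rot_y(wy * dt)), rot_z(wz * dt))
        R = matmul(R, dR)  # R(t) = R(t - delta_t) * delta_R(t)
        trace.append(R)
    return trace
```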
&lt;br /&gt;
===Eye movement===&lt;br /&gt;
&lt;br /&gt;
The eye movements are given by rotations in the same coordinate system as the head movements.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
X = anterior (torsion of the pupil in the gaze direction)&lt;br /&gt;
Y = -right&lt;br /&gt;
Z = superior&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The torsion is not important for the gaze direction and is therefore excluded from the calculation.&lt;br /&gt;
&lt;br /&gt;
Azimuth can be defined as a rotation about the Z axis and elevation as a counter-rotation about the Y axis.&lt;br /&gt;
&lt;br /&gt;
Since the eye gaze is given as a rotation angle in OCS coordinates and the eye and head tracker coordinate systems are fixed relative to each other, we can easily transform OCS coordinates to EyeSeeCam coordinates. We can then create rotation matrices in EyeSeeCam coordinates for the azimuth and elevation of the gaze. Since torsion is excluded, the rotation matrices for azimuth and elevation can be multiplied in any order to get the eye rotation matrix in EyeSeeCam coordinates.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eye_Rz(t) = Rz(eye_azimuth(t))&lt;br /&gt;
eye_Ry(t) = Ry(-eye_elevation(t))&lt;br /&gt;
eye_R(t)  = eye_Ry(t) * eye_Rz(t)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
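A minimal numerical sketch of the eye rotation composition above, in Python rather than the lab Matlab code; the helper names and degree inputs are illustrative assumptions.&lt;br /&gt;

```python
import math

def rot_y(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def rot_z(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def matmul(A, B):
    # 3x3 matrix product
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def eye_rotation(azimuth_deg, elevation_deg):
    """Eye rotation matrix in EyeSeeCam coordinates, torsion ignored.

    Azimuth is a rotation about Z, elevation a counter-rotation about Y.
    """
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    return matmul(rot_y(-el), rot_z(az))  # eye_R = eye_Ry * eye_Rz
```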
&lt;br /&gt;
===Gaze movement===&lt;br /&gt;
&lt;br /&gt;
The gaze movement is the combined head and eye movement.&lt;br /&gt;
To create the total rotation matrix that represents the &#039;gaze&#039; rotation in lab coordinates, the two rotation matrices for head and eye movement must be multiplied in the correct order. We have:&lt;br /&gt;
&lt;br /&gt;
* a rotation matrix for the head in lab coordinates,&lt;br /&gt;
* a rotation matrix for the eyes in head coordinates.&lt;br /&gt;
&lt;br /&gt;
Multiply the rotation matrix of the head in lab coordinates by the rotation matrix of the eyes in head coordinates:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
R_gaze(t) = head_R(t) * eye_R(t)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Lastly, multiply R_gaze(t) with the starting gaze vector to get the gaze (in lab coordinates) at time t. We assume that the starting gaze is in the forward direction, which is the X direction in lab coordinates.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
startingGaze = [1; 0; 0]&lt;br /&gt;
gaze(t) = R_gaze(t) * startingGaze&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
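Putting the pieces together, a small Python example (hypothetical angles, not the lab Matlab code) applies R_gaze to the starting gaze vector:&lt;br /&gt;

```python
import math

def rot_z(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def matmul(A, B):
    # 3x3 matrix product
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def matvec(A, v):
    # 3x3 matrix times 3-vector
    return [sum(A[i][k] * v[k] for k in range(3)) for i in range(3)]

# Hypothetical example: head turned 30 deg and eyes 15 deg further,
# both about the Z (superior) axis.
head_R = rot_z(math.radians(30))   # head rotation in lab coordinates
eye_R = rot_z(math.radians(15))    # eye rotation in head coordinates
R_gaze = matmul(head_R, eye_R)     # R_gaze = head_R * eye_R

starting_gaze = [1.0, 0.0, 0.0]    # forward = lab X direction
gaze = matvec(R_gaze, starting_gaze)
```

Here the gaze ends up rotated 45 degrees about Z, as expected for the combined head and eye rotation.&lt;br /&gt;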
&lt;br /&gt;
==Matlab programming==&lt;br /&gt;
&lt;br /&gt;
===Functions for converting EyeSeeCam data to double polar data===&lt;br /&gt;
There is one function for head movement, one for eye movement and one for gaze movement. The gaze movement combines head and eye movement.&lt;br /&gt;
&lt;br /&gt;
The functions take head rotation speeds and eye rotations (in degrees) as input and give objects containing the traces (head, eye and gaze respectively) in double polar coordinates as output.&lt;br /&gt;
&lt;br /&gt;
* headTrace_DP = EyeSeeCamSciHeadRotationData2Trace_DP(Vx_deg, Vy_deg, Vz_deg)&lt;br /&gt;
* eyeTrace_DP  = EyeSeeCamSciEyeRotationData2Trace_DP(Zocs_deg, Yocs_deg)&lt;br /&gt;
* gazeTrace_DP = EyeSeeCamSciHeadAndEyeRotationData2Trace_DP(Vx_deg, Vy_deg, Vz_deg, Zocs_deg, Yocs_deg)&lt;br /&gt;
&lt;br /&gt;
The Vx, Vy and Vz are the head rotation speeds in EyeSeeCam coordinates.&lt;br /&gt;
The Zocs and Yocs are the eye rotations in OCS coordinates.&lt;br /&gt;
&lt;br /&gt;
===EyeSeeCam data===&lt;br /&gt;
The recorded data is read from an LSL stream.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
% todo: LSL streaming parameters&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The lsldata is a struct and has the following fields:&lt;br /&gt;
*escdata&lt;br /&gt;
*escmetadata&lt;br /&gt;
*escstr&lt;br /&gt;
*evdata0&lt;br /&gt;
*evdata1&lt;br /&gt;
*evdata2&lt;br /&gt;
*evdata3&lt;br /&gt;
*evdata4&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The lsldata.escdata contains the eye and head data. The data contains a matrix with a row for every parameter; there are 63 parameters in total. The field lsldata.escmetadata.channel lists the names of all the parameters.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%Eye data&lt;br /&gt;
&lt;br /&gt;
RightEyePosX = 46;&lt;br /&gt;
RightEyePosY = 47;&lt;br /&gt;
RightEyePosZ = 48;&lt;br /&gt;
&lt;br /&gt;
xeye = lsldata.escdata.Data(46,:);       % right 46, left 32&lt;br /&gt;
yeye = lsldata.escdata.Data(47,:);       % right 47, left 33&lt;br /&gt;
zeye = lsldata.escdata.Data(48,:);       % right 48, left 34&lt;br /&gt;
&lt;br /&gt;
% Head data&lt;br /&gt;
HeadInertialVelXCal = 27;&lt;br /&gt;
HeadInertialVelYCal = 29;&lt;br /&gt;
HeadInertialVelZCal = 31;&lt;br /&gt;
&lt;br /&gt;
xh = lsldata.escdata.Data(27,:);       % calibrated torsion velocity data HEAD&lt;br /&gt;
yh = lsldata.escdata.Data(29,:);       % calibrated vertical velocity data HEAD&lt;br /&gt;
zh = lsldata.escdata.Data(31,:);       % calibrated horizontal velocity data HEAD&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=EyeSeeCam_SCI&amp;diff=4880</id>
		<title>EyeSeeCam SCI</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=EyeSeeCam_SCI&amp;diff=4880"/>
		<updated>2026-04-01T10:44:33Z</updated>

		<summary type="html">&lt;p&gt;Lof: /* Matlab coding */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[File:EyeSeeCam_SCI2.png|thumb|EyeSeeCam SCI]]&lt;br /&gt;
==Description==&lt;br /&gt;
The EyeSeeCam SCI is a combined eye tracker and head movement tracker. The head tracking is done with an IMU and the eye tracking is done with cameras and software that tracks the pupils of both eyes. The tracking data can be read out directly in Matlab. When a dedicated computer is used for reading out the EyeSeeCam, the data is converted to an LSL stream. &lt;br /&gt;
&lt;br /&gt;
The EyeSeeCam SCI can record at 250 or 500 Hz; we typically record at 500 Hz.&lt;br /&gt;
&lt;br /&gt;
==Setup==&lt;br /&gt;
&lt;br /&gt;
* The EyeSeeCam is connected to a NUC with two USB-C cables.&lt;br /&gt;
* The NUC is connected via Ethernet to a Mac Mini.&lt;br /&gt;
* The Mac Mini runs an EyeSeeCam SCI program.&lt;br /&gt;
* The Mac Mini runs an LSL server for data transfer to a Windows computer.&lt;br /&gt;
* The Windows computer runs Matlab and connects with an LSL stream to the LSL server on the Mac Mini.&lt;br /&gt;
&lt;br /&gt;
==Startup Instructions==&lt;br /&gt;
&lt;br /&gt;
* Connect the EyeSeeCam to the NUC&lt;br /&gt;
* Remove both lens covers of the EyeSeeCam&lt;br /&gt;
* Start the NUC&lt;br /&gt;
* Start the Mac Mini&lt;br /&gt;
* Login on the Mac Mini to &#039;EyeSeeCam&#039; with password &#039;Gimbal&#039;&lt;br /&gt;
* Start the LSL script on the Mac Mini (on the desktop)&lt;br /&gt;
* Start the EyeSeeCam SCI program on the Mac Mini&lt;br /&gt;
* Start a Matlab program on the Windows computer (e.g. EG_program.m)&lt;br /&gt;
* Press the &amp;lt;begin&amp;gt; button in the EyeSeeCam SCI program on the Mac Mini (in the top right corner)&lt;br /&gt;
* Let the subject put on the EyeSeeCam&lt;br /&gt;
* Press &amp;lt;prepare&amp;gt; in the EyeSeeCam SCI program on the Mac Mini (shows the traces)&lt;br /&gt;
* Press &amp;lt;start&amp;gt; in the EyeSeeCam SCI program on the Mac Mini (starts collecting data)&lt;br /&gt;
* Start the experiment&lt;br /&gt;
&lt;br /&gt;
==Matlab coding==&lt;br /&gt;
Generic LSL functions can be found in the biofysica toolbox. &lt;br /&gt;
*Each lsl-device can have one or more lsl-streams.&lt;br /&gt;
*An lsl-stream is identified by a type and a name. &lt;br /&gt;
*The function lsl_resolver checks if it can find the requested lslStream on the intranet. &lt;br /&gt;
*When the stream is found you must add it to the session. &lt;br /&gt;
*The session controls the actual data-acquisition with start and stop. &lt;br /&gt;
*The data is read from the stream. &lt;br /&gt;
&lt;br /&gt;
 type = &#039;EyeSeeCam @ dcn-pl04&#039;;&lt;br /&gt;
 name = &#039;EyeSeeCam SCI Data&#039;;&lt;br /&gt;
 lslString = sprintf(&#039;type=&#039;&#039;%s&#039;&#039; and name=&#039;&#039;%s&#039;&#039;&#039;, type, name);&lt;br /&gt;
 info=lsl_resolver(lslString);&lt;br /&gt;
 lslStream = lsl_istream(info{1});&lt;br /&gt;
&lt;br /&gt;
 session = lsl_session;&lt;br /&gt;
 session.add_stream(lslStream);&lt;br /&gt;
&lt;br /&gt;
 session.start;&lt;br /&gt;
 ...wait some time...&lt;br /&gt;
 session.stop;&lt;br /&gt;
 data = lslStream.read;&lt;br /&gt;
&lt;br /&gt;
==Construction of a gaze trace==&lt;br /&gt;
The data from EyeSeeCam is recorded in different coordinate systems than the double polar system that we use to present our targets. The head and eye tracking data each have their own coordinate system. To accurately analyze our data we have to transform these coordinate systems to double polar coordinates in the lab frame. In order to relate the EyeSeeCam coordinates to the lab coordinates, we first need to record a starting gaze position for each trial. This means that during the recording, the subject has to look at the center speaker (0,0) when the trial starts. &lt;br /&gt;
&lt;br /&gt;
From the EyeSeeCam data we can construct a head movement trace and an eye movement trace. These can be combined to create a gaze trace.&lt;br /&gt;
&lt;br /&gt;
In Matlab we can use the class &#039;EyeSeeCamSci_Trace&#039; for calculating head, eye and gaze traces in RAS, HVF and DP coordinates.&lt;br /&gt;
&lt;br /&gt;
===Defining a coordinate system===&lt;br /&gt;
&lt;br /&gt;
For the purpose of interpreting the EyeSeeCam data we define the starting direction as the direction in which the subject looks at the beginning of each trial. The recording of the data should start when the subject is looking in this direction. &amp;lt;br&amp;gt;&lt;br /&gt;
The X, Y and Z of the EyeSeeCam data are in terms of RAS coordinates:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
X = anterior&lt;br /&gt;
Y = -right&lt;br /&gt;
Z = superior&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
* When using &#039;EyeSeeCamSci_Trace&#039;, head and gaze traces in RAS coordinates are relative to the starting position of the subject. &lt;br /&gt;
* The HVF and DP coordinates of the eye trace are only valid when the head stays in the starting position.&lt;br /&gt;
&lt;br /&gt;
===Head movement===&lt;br /&gt;
EyeSeeCam_Sci provides angular velocities for rotations around the X, Y and Z axes. For our analyses we want position data, so we have to calculate it from these angular velocities.&lt;br /&gt;
 &lt;br /&gt;
The head rotation axes X, Y and Z are defined with respect to the EyeSeeCam itself. &lt;br /&gt;
&lt;br /&gt;
*&#039;&#039;The EyeSeeCam data is &#039;&#039;&#039;not&#039;&#039;&#039; in the lab frame!&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
In order to create a head movement trace we have to calculate rotation matrices (delta_Rx, delta_Ry and delta_Rz) in EyeSeeCam coordinates for every time step. &lt;br /&gt;
&lt;br /&gt;
Since the time steps are about 2 ms (or 4 ms), the rotations for each time step are small. Therefore the order in which delta_Rx, delta_Ry and delta_Rz are multiplied is not important, and we calculate delta_R(t) in EyeSeeCam coordinates by &lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
delta_R(t) = delta_Rx * delta_Ry * delta_Rz.&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
To get the total rotation matrix in lab coordinates at time t we have to iterate over all the time steps &lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
R(t=0) = identity matrix&lt;br /&gt;
for t = 0 to t = tmax&lt;br /&gt;
   R(t) = R(t-delta_t) * delta_R(t)&lt;br /&gt;
end &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Eye movement===&lt;br /&gt;
&lt;br /&gt;
The eye movements are given by rotations in the same coordinate system as the head movements.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
X = anterior (torsion of the pupil in the gaze direction)&lt;br /&gt;
Y = -right&lt;br /&gt;
Z = superior&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The torsion is not important for the gaze direction and is therefore excluded from the calculation.&lt;br /&gt;
&lt;br /&gt;
Azimuth can be defined as a rotation about the Z axis and elevation as a counter-rotation about the Y axis.&lt;br /&gt;
&lt;br /&gt;
Since the eye gaze is given as a rotation angle in OCS coordinates and the eye and head tracker coordinate systems are fixed relative to each other, we can easily transform OCS coordinates to EyeSeeCam coordinates. We can then create rotation matrices in EyeSeeCam coordinates for the azimuth and elevation of the gaze. Since torsion is excluded, the rotation matrices for azimuth and elevation can be multiplied in any order to get the eye rotation matrix in EyeSeeCam coordinates.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eye_Rz(t) = Rz(eye_azimuth(t))&lt;br /&gt;
eye_Ry(t) = Ry(-eye_elevation(t))&lt;br /&gt;
eye_R(t)  = eye_Ry(t) * eye_Rz(t)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Gaze movement===&lt;br /&gt;
&lt;br /&gt;
The gaze movement is the combined head and eye movement.&lt;br /&gt;
To create the total rotation matrix that represents the &#039;gaze&#039; rotation in lab coordinates, the two rotation matrices for head and eye movement must be multiplied in the correct order. We have:&lt;br /&gt;
&lt;br /&gt;
* a rotation matrix for the head in lab coordinates,&lt;br /&gt;
* a rotation matrix for the eyes in head coordinates.&lt;br /&gt;
&lt;br /&gt;
Multiply the rotation matrix of the head in lab coordinates by the rotation matrix of the eyes in head coordinates:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
R_gaze(t) = head_R(t) * eye_R(t)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Lastly, multiply R_gaze(t) with the starting gaze vector to get the gaze (in lab coordinates) at time t. We assume that the starting gaze is in the forward direction, which is the X direction in lab coordinates.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
startingGaze = [1; 0; 0]&lt;br /&gt;
gaze(t) = R_gaze(t) * startingGaze&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Matlab programming==&lt;br /&gt;
&lt;br /&gt;
===Functions for converting EyeSeeCam data to double polar data===&lt;br /&gt;
There is one function for head movement, one for eye movement and one for gaze movement. The gaze movement combines head and eye movement.&lt;br /&gt;
&lt;br /&gt;
The functions take head rotation speeds and eye rotations (in degrees) as input and give objects containing the traces (head, eye and gaze respectively) in double polar coordinates as output.&lt;br /&gt;
&lt;br /&gt;
* headTrace_DP = EyeSeeCamSciHeadRotationData2Trace_DP(Vx_deg, Vy_deg, Vz_deg)&lt;br /&gt;
* eyeTrace_DP  = EyeSeeCamSciEyeRotationData2Trace_DP(Zocs_deg, Yocs_deg)&lt;br /&gt;
* gazeTrace_DP = EyeSeeCamSciHeadAndEyeRotationData2Trace_DP(Vx_deg, Vy_deg, Vz_deg, Zocs_deg, Yocs_deg)&lt;br /&gt;
&lt;br /&gt;
The Vx, Vy and Vz are the head rotation speeds in EyeSeeCam coordinates.&lt;br /&gt;
The Zocs and Yocs are the eye rotations in OCS coordinates.&lt;br /&gt;
&lt;br /&gt;
===EyeSeeCam data===&lt;br /&gt;
The recorded data is read from an LSL stream.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
% todo: LSL streaming parameters&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The lsldata is a struct and has the following fields:&lt;br /&gt;
*escdata&lt;br /&gt;
*escmetadata&lt;br /&gt;
*escstr&lt;br /&gt;
*evdata0&lt;br /&gt;
*evdata1&lt;br /&gt;
*evdata2&lt;br /&gt;
*evdata3&lt;br /&gt;
*evdata4&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The lsldata.escdata contains the eye and head data. The data contains a matrix with a row for every parameter; there are 63 parameters in total. The field lsldata.escmetadata.channel lists the names of all the parameters.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%Eye data&lt;br /&gt;
&lt;br /&gt;
RightEyePosX = 46;&lt;br /&gt;
RightEyePosY = 47;&lt;br /&gt;
RightEyePosZ = 48;&lt;br /&gt;
&lt;br /&gt;
xeye = lsldata.escdata.Data(46,:);       % right 46, left 32&lt;br /&gt;
yeye = lsldata.escdata.Data(47,:);       % right 47, left 33&lt;br /&gt;
zeye = lsldata.escdata.Data(48,:);       % right 48, left 34&lt;br /&gt;
&lt;br /&gt;
% Head data&lt;br /&gt;
HeadInertialVelXCal = 27;&lt;br /&gt;
HeadInertialVelYCal = 29;&lt;br /&gt;
HeadInertialVelZCal = 31;&lt;br /&gt;
&lt;br /&gt;
xh = lsldata.escdata.Data(27,:);       % calibrated torsion velocity data HEAD&lt;br /&gt;
yh = lsldata.escdata.Data(29,:);       % calibrated vertical velocity data HEAD&lt;br /&gt;
zh = lsldata.escdata.Data(31,:);       % calibrated horizontal velocity data HEAD&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=EyeSeeCam_SCI&amp;diff=4879</id>
		<title>EyeSeeCam SCI</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=EyeSeeCam_SCI&amp;diff=4879"/>
		<updated>2026-04-01T10:44:22Z</updated>

		<summary type="html">&lt;p&gt;Lof: /* Matlab coding */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[File:EyeSeeCam_SCI2.png|thumb|EyeSeeCam SCI]]&lt;br /&gt;
==Description==&lt;br /&gt;
The EyeSeeCam SCI is a combined eye tracker and head movement tracker. The head tracking is done with an IMU and the eye tracking is done with cameras and software that tracks the pupils of both eyes. The tracking data can be read out directly in Matlab. When a dedicated computer is used for reading out the EyeSeeCam, the data is converted to an LSL stream. &lt;br /&gt;
&lt;br /&gt;
The EyeSeeCam SCI can record at 250 or 500 Hz; we typically record at 500 Hz.&lt;br /&gt;
&lt;br /&gt;
==Setup==&lt;br /&gt;
&lt;br /&gt;
* The EyeSeeCam is connected to a NUC with two USB-C cables.&lt;br /&gt;
* The NUC is connected via Ethernet to a Mac Mini.&lt;br /&gt;
* The Mac Mini runs an EyeSeeCam SCI program.&lt;br /&gt;
* The Mac Mini runs an LSL server for data transfer to a Windows computer.&lt;br /&gt;
* The Windows computer runs Matlab and connects with an LSL stream to the LSL server on the Mac Mini.&lt;br /&gt;
&lt;br /&gt;
==Startup Instructions==&lt;br /&gt;
&lt;br /&gt;
* Connect the EyeSeeCam to the NUC&lt;br /&gt;
* Remove both lens covers of the EyeSeeCam&lt;br /&gt;
* Start the NUC&lt;br /&gt;
* Start the Mac Mini&lt;br /&gt;
* Login on the Mac Mini to &#039;EyeSeeCam&#039; with password &#039;Gimbal&#039;&lt;br /&gt;
* Start the LSL script on the Mac Mini (on the desktop)&lt;br /&gt;
* Start the EyeSeeCam SCI program on the Mac Mini&lt;br /&gt;
* Start a Matlab program on the Windows computer (e.g. EG_program.m)&lt;br /&gt;
* Press the &amp;lt;begin&amp;gt; button in the EyeSeeCam SCI program on the Mac Mini (in the top right corner)&lt;br /&gt;
* Let the subject put on the EyeSeeCam&lt;br /&gt;
* Press &amp;lt;prepare&amp;gt; in the EyeSeeCam SCI program on the Mac Mini (shows the traces)&lt;br /&gt;
* Press &amp;lt;start&amp;gt; in the EyeSeeCam SCI program on the Mac Mini (starts collecting data)&lt;br /&gt;
* Start the experiment&lt;br /&gt;
&lt;br /&gt;
==Matlab coding==&lt;br /&gt;
Generic LSL functions can be found in the biofysica toolbox. &lt;br /&gt;
*Each lsl-device can have one or more lsl-streams.&lt;br /&gt;
*An lsl-stream is identified by a type and a name. &lt;br /&gt;
*The function lsl_resolver checks if it can find the requested lslStream on the intranet. &lt;br /&gt;
*When the stream is found you must add it to the session. &lt;br /&gt;
*The session controls the actual data-acquisition with start and stop. &lt;br /&gt;
*The data is read from the stream. &lt;br /&gt;
&lt;br /&gt;
 type = &#039;EyeSeeCam @ dcn-pl04&#039;;&lt;br /&gt;
 name = &#039;EyeSeeCam SCI Data&#039;;&lt;br /&gt;
 lslString = sprintf(&#039;type=&#039;&#039;%s&#039;&#039; and name=&#039;&#039;%s&#039;&#039;&#039;, type, name);&lt;br /&gt;
 info=lsl_resolver(lslString);&lt;br /&gt;
 lslStream = lsl_istream(info{1});&lt;br /&gt;
&lt;br /&gt;
 session = lsl_session;&lt;br /&gt;
 session.add_stream(lslStream);&lt;br /&gt;
&lt;br /&gt;
 session.start;&lt;br /&gt;
 ...wait some time...&lt;br /&gt;
 session.stop;&lt;br /&gt;
 data = lslStream.read;&lt;br /&gt;
&lt;br /&gt;
==Construction of a gaze trace==&lt;br /&gt;
The data from EyeSeeCam is recorded in different coordinate systems than the double polar system that we use to present our targets. The head and eye tracking data each have their own coordinate system. To accurately analyze our data we have to transform these coordinate systems to double polar coordinates in the lab frame. In order to relate the EyeSeeCam coordinates to the lab coordinates, we first need to record a starting gaze position for each trial. This means that during the recording, the subject has to look at the center speaker (0,0) when the trial starts. &lt;br /&gt;
&lt;br /&gt;
From the EyeSeeCam data we can construct a head movement trace and an eye movement trace. These can be combined to create a gaze trace.&lt;br /&gt;
&lt;br /&gt;
In Matlab we can use the class &#039;EyeSeeCamSci_Trace&#039; for calculating head, eye and gaze traces in RAS, HVF and DP coordinates.&lt;br /&gt;
&lt;br /&gt;
===Defining a coordinate system===&lt;br /&gt;
&lt;br /&gt;
For the purpose of interpreting the EyeSeeCam data we define the starting direction as the direction in which the subject looks at the beginning of each trial. The recording of the data should start when the subject is looking in this direction. &amp;lt;br&amp;gt;&lt;br /&gt;
The X, Y and Z of the EyeSeeCam data are in terms of RAS coordinates:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
X = anterior&lt;br /&gt;
Y = -right&lt;br /&gt;
Z = superior&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
* When using &#039;EyeSeeCamSci_Trace&#039;, head and gaze traces in RAS coordinates are relative to the starting position of the subject. &lt;br /&gt;
* The HVF and DP coordinates of the eye trace are only valid when the head stays in the starting position.&lt;br /&gt;
&lt;br /&gt;
===Head movement===&lt;br /&gt;
EyeSeeCam_Sci provides angular velocities for rotations around the X, Y and Z axes. For our analyses we want position data, so we have to calculate it from these angular velocities.&lt;br /&gt;
 &lt;br /&gt;
The head rotation axes X, Y and Z are defined with respect to the EyeSeeCam itself. &lt;br /&gt;
&lt;br /&gt;
*&#039;&#039;The EyeSeeCam data is &#039;&#039;&#039;not&#039;&#039;&#039; in the lab frame!&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
In order to create a head movement trace we have to calculate rotation matrices (delta_Rx, delta_Ry and delta_Rz) in EyeSeeCam coordinates for every time step. &lt;br /&gt;
&lt;br /&gt;
Since the time steps are about 2 ms (or 4 ms), the rotations for each time step are small. Therefore the order in which delta_Rx, delta_Ry and delta_Rz are multiplied is not important, and we calculate delta_R(t) in EyeSeeCam coordinates by &lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
delta_R(t) = delta_Rx * delta_Ry * delta_Rz.&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
To get the total rotation matrix in lab coordinates at time t we have to iterate over all the time steps &lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
R(t=0) = identity matrix&lt;br /&gt;
for t = 0 to t = tmax&lt;br /&gt;
   R(t) = R(t-delta_t) * delta_R(t)&lt;br /&gt;
end &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Eye movement===&lt;br /&gt;
&lt;br /&gt;
The eye movements are given by rotations in the same coordinate system as the head movements.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
X = anterior (torsion of the pupil in the gaze direction)&lt;br /&gt;
Y = -right&lt;br /&gt;
Z = superior&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The torsion is not important for the gaze direction and is therefore excluded from the calculation.&lt;br /&gt;
&lt;br /&gt;
Azimuth can be defined as a rotation about the Z axis and elevation as a counter-rotation about the Y axis.&lt;br /&gt;
&lt;br /&gt;
Since the eye gaze is given as a rotation angle in OCS coordinates and the eye and head tracker coordinate systems are fixed relative to each other, we can easily transform OCS coordinates to EyeSeeCam coordinates. We can then create rotation matrices in EyeSeeCam coordinates for the azimuth and elevation of the gaze. Since torsion is excluded, the rotation matrices for azimuth and elevation can be multiplied in any order to get the eye rotation matrix in EyeSeeCam coordinates.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eye_Rz(t) = Rz(eye_azimuth(t))&lt;br /&gt;
eye_Ry(t) = Ry(-eye_elevation(t))&lt;br /&gt;
eye_R(t)  = eye_Ry(t) * eye_Rz(t)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Gaze movement===&lt;br /&gt;
&lt;br /&gt;
The gaze movement is the combined head and eye movement.&lt;br /&gt;
To create the total rotation matrix that represents the &#039;gaze&#039; rotation in lab coordinates, the two rotation matrices for head and eye movement must be multiplied in the correct order. We have:&lt;br /&gt;
&lt;br /&gt;
* a rotation matrix for the head in lab coordinates,&lt;br /&gt;
* a rotation matrix for the eyes in head coordinates.&lt;br /&gt;
&lt;br /&gt;
Multiply the rotation matrix of the head in lab coordinates by the rotation matrix of the eyes in head coordinates:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
R_gaze(t) = head_R(t) * eye_R(t)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Lastly, multiply R_gaze(t) with the starting gaze vector to get the gaze (in lab coordinates) at time t. We assume that the starting gaze is in the forward direction, which is the X direction in lab coordinates.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
startingGaze = [1; 0; 0]&lt;br /&gt;
gaze(t) = R_gaze(t) * startingGaze&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Matlab programming==&lt;br /&gt;
&lt;br /&gt;
===Functions for converting EyeSeeCam data to double polar data===&lt;br /&gt;
There is one function for head movement, one for eye movement and one for gaze movement. The gaze movement combines head and eye movement.&lt;br /&gt;
&lt;br /&gt;
The functions take head rotation speeds and eye rotations (in degrees) as input and give objects containing the traces (head, eye and gaze respectively) in double polar coordinates as output.&lt;br /&gt;
&lt;br /&gt;
* headTrace_DP = EyeSeeCamSciHeadRotationData2Trace_DP(Vx_deg, Vy_deg, Vz_deg)&lt;br /&gt;
* eyeTrace_DP  = EyeSeeCamSciEyeRotationData2Trace_DP(Zocs_deg, Yocs_deg)&lt;br /&gt;
* gazeTrace_DP = EyeSeeCamSciHeadAndEyeRotationData2Trace_DP(Vx_deg, Vy_deg, Vz_deg, Zocs_deg, Yocs_deg)&lt;br /&gt;
&lt;br /&gt;
The Vx, Vy and Vz are the head rotation speeds in EyeSeeCam coordinates.&lt;br /&gt;
The Zocs and Yocs are the eye rotations in OCS coordinates.&lt;br /&gt;
&lt;br /&gt;
===EyeSeeCam data===&lt;br /&gt;
The recorded data is read from an LSL stream.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
% todo: LSL streaming parameters&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The lsldata is a struct and has the following fields:&lt;br /&gt;
*escdata&lt;br /&gt;
*escmetadata&lt;br /&gt;
*escstr&lt;br /&gt;
*evdata0&lt;br /&gt;
*evdata1&lt;br /&gt;
*evdata2&lt;br /&gt;
*evdata3&lt;br /&gt;
*evdata4&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The lsldata.escdata contains the eye and head data. The data contains a matrix with a row for every parameter; there are 63 parameters in total. The field lsldata.escmetadata.channel lists the names of all the parameters.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%Eye data&lt;br /&gt;
&lt;br /&gt;
RightEyePosX = 46;&lt;br /&gt;
RightEyePosY = 47;&lt;br /&gt;
RightEyePosZ = 48;&lt;br /&gt;
&lt;br /&gt;
xeye = lsldata.escdata.Data(46,:);       % right 46, left 32&lt;br /&gt;
yeye = lsldata.escdata.Data(47,:);       % right 47, left 33&lt;br /&gt;
zeye = lsldata.escdata.Data(48,:);       % right 48, left 34&lt;br /&gt;
&lt;br /&gt;
% Head data&lt;br /&gt;
HeadInertialVelXCal = 27;&lt;br /&gt;
HeadInertialVelYCal = 29;&lt;br /&gt;
HeadInertialVelZCal = 31;&lt;br /&gt;
&lt;br /&gt;
xh = lsldata.escdata.Data(27,:);       % calibrated torsion velocity data HEAD&lt;br /&gt;
yh = lsldata.escdata.Data(29,:);       % calibrated vertical velocity data HEAD&lt;br /&gt;
zh = lsldata.escdata.Data(31,:);       % calibrated horizontal velocity data HEAD&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=EyeSeeCam_SCI&amp;diff=4878</id>
		<title>EyeSeeCam SCI</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=EyeSeeCam_SCI&amp;diff=4878"/>
		<updated>2026-04-01T10:41:45Z</updated>

		<summary type="html">&lt;p&gt;Lof: /* Matlab coding */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[File:EyeSeeCam_SCI2.png|thumb|EyeSeeCam SCI]]&lt;br /&gt;
==Description==&lt;br /&gt;
The EyeSeeCam SCI is a combined eye tracker and head movement tracker. The head tracking is done with an IMU and the eye tracking is done with cameras and software that tracks the pupils of both eyes. The tracking data can be read out directly in Matlab. When a dedicated computer is used for reading out the EyeSeeCam, the data is converted to an LSL stream. &lt;br /&gt;
&lt;br /&gt;
The EyeSeeCam SCI can record at 250 and 500 Hz. We typically record at 500 Hz.&lt;br /&gt;
&lt;br /&gt;
==Setup==&lt;br /&gt;
&lt;br /&gt;
* The EyeSeeCam is connected to a NUC with two USB-C cables.&lt;br /&gt;
* The NUC is connected via ethernet with a Mac Mini.&lt;br /&gt;
* The Mac Mini runs an EyeSeeCam SCI program.&lt;br /&gt;
* The Mac Mini runs an LSL server for data transfer to a Windows computer.&lt;br /&gt;
* The Windows computer runs Matlab and connects to the LSL server on the Mac Mini with an LSL stream.&lt;br /&gt;
&lt;br /&gt;
==Startup Instructions==&lt;br /&gt;
&lt;br /&gt;
* Connect the EyeSeeCam to the NUC&lt;br /&gt;
* Remove both lens covers of the EyeSeeCam&lt;br /&gt;
* Start the NUC&lt;br /&gt;
* Start the Mac Mini&lt;br /&gt;
* Login on the Mac Mini to &#039;EyeSeeCam&#039; with password &#039;Gimbal&#039;&lt;br /&gt;
* Start the LSL script on the Mac Mini (it is on the desktop)&lt;br /&gt;
* Start the EyeSeeCam SCI program on the Mac Mini&lt;br /&gt;
* Start a Matlab program on the Windows computer (e.g. EG_program.m)&lt;br /&gt;
* Press the &amp;lt;begin&amp;gt; button in the EyeSeeCam SCI program on the Mac Mini (in the top right corner)&lt;br /&gt;
* Let the subject put on the EyeSeeCam&lt;br /&gt;
* Press &amp;lt;prepare&amp;gt; in the EyeSeeCam SCI program on the Mac Mini (shows the traces)&lt;br /&gt;
* Press &amp;lt;start&amp;gt; in the EyeSeeCam SCI program on the Mac Mini (starts collecting data)&lt;br /&gt;
* Start the experiment&lt;br /&gt;
&lt;br /&gt;
==Matlab coding==&lt;br /&gt;
An LSL stream is identified by a type and a name. The function lsl_resolver checks whether it can find the requested stream on the intranet. When the stream is found you must add it to the session. The session controls the actual data acquisition with start and stop. The data is read from the stream. &lt;br /&gt;
&lt;br /&gt;
 type = &#039;EyeSeeCam @ dcn-pl04&#039;;&lt;br /&gt;
 name = &#039;EyeSeeCam SCI Data&#039;;&lt;br /&gt;
 lslString = sprintf(&#039;type=&#039;&#039;%s&#039;&#039; and name=&#039;&#039;%s&#039;&#039;&#039;, type, name);&lt;br /&gt;
 info=lsl_resolver(lslString);&lt;br /&gt;
 lslStream = lsl_istream(info{1});&lt;br /&gt;
&lt;br /&gt;
 session = lsl_session;&lt;br /&gt;
 session.add_stream(lslStream);&lt;br /&gt;
&lt;br /&gt;
 session.start;&lt;br /&gt;
 ...wait some time...&lt;br /&gt;
 session.stop;&lt;br /&gt;
 data = lslStream.read;&lt;br /&gt;
&lt;br /&gt;
==Construction of a gaze trace==&lt;br /&gt;
The data from EyeSeeCam is recorded in different coordinate systems than the double polar system that we use to present our targets. The head and eye tracking data each have their own coordinate system. To accurately analyze our data we have to transform these coordinate systems to double polar coordinates in the lab frame. In order to relate the EyeSeeCam coordinates to the lab coordinates, we first need to record a starting gaze position for each trial. This means that during the recording, the subject has to look at the center speaker (0,0) when the trial starts. &lt;br /&gt;
&lt;br /&gt;
From the EyeSeeCam data we can construct a head movement trace and an eye movement trace. These can be combined to create a gaze trace.&lt;br /&gt;
&lt;br /&gt;
In Matlab we can use the class &#039;EyeSeeCamSci_Trace&#039; for calculating head, eye and gaze traces in RAS, HVF and DP coordinates.&lt;br /&gt;
&lt;br /&gt;
===Defining a coordinate system===&lt;br /&gt;
&lt;br /&gt;
For the purpose of interpreting the EyeSeeCam data we define the starting direction as the direction the subject looks in at the beginning of each trial. The recording of the data should start when the subject is looking in this direction. &amp;lt;br&amp;gt;&lt;br /&gt;
The X, Y and Z of the EyeSeeCam data are in terms of RAS coordinates:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
X = anterior&lt;br /&gt;
Y = -right&lt;br /&gt;
Z = superior&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
* When using &#039;EyeSeeCamSci_Trace&#039;, head and gaze traces in RAS coordinates are relative to the starting position of the subject. &lt;br /&gt;
* The HVF and DP coordinates of the eye trace are only valid when the head stays in the starting position.&lt;br /&gt;
&lt;br /&gt;
===Head movement===&lt;br /&gt;
EyeSeeCam_Sci provides us with angular velocities for rotations around the X, Y and Z axes. For our analyses we want position data, so we have to calculate the position data from these angular velocities.&lt;br /&gt;
 &lt;br /&gt;
The head rotation axes X, Y and Z are defined with respect to the EyeSeeCam itself. &lt;br /&gt;
&lt;br /&gt;
*&#039;&#039;The EyeSeeCam data is &#039;&#039;&#039;not&#039;&#039;&#039; in the lab frame!&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
In order to create a head movement trace we have to calculate rotation matrices (delta_Rx, delta_Ry and delta_Rz) in EyeSeeCam coordinates for every time step. &lt;br /&gt;
&lt;br /&gt;
Since the time steps are about 2 ms (or 4 ms), the rotations for each time step are small. Therefore the order in which delta_Rx, delta_Ry and delta_Rz are multiplied is not important, and we calculate delta_R(t) in EyeSeeCam coordinates by &lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
delta_R(t) = delta_Rx * delta_Ry * delta_Rz.&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
To get the total rotation matrix in lab coordinates at time t we have to iterate over all the time steps &lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
R(t=0) = identity matrix&lt;br /&gt;
for t = 0 to t = tmax&lt;br /&gt;
   R(t) = R(t-delta_t) * delta_R(t)&lt;br /&gt;
end &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
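&lt;br /&gt;
A minimal Matlab sketch of this integration (assumptions: the angular velocities Vx_deg, Vy_deg and Vz_deg are in degrees per second and the sampling rate is 500 Hz; check both against your recording settings):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
dt = 1/500;                       % sample interval (assumed 500 Hz recording)&lt;br /&gt;
n  = numel(Vx_deg);&lt;br /&gt;
headR = zeros(3,3,n);&lt;br /&gt;
R = eye(3);                       % R(t=0) = identity matrix&lt;br /&gt;
for t = 1:n&lt;br /&gt;
    ax = deg2rad(Vx_deg(t))*dt;   % small rotation angle for this time step&lt;br /&gt;
    ay = deg2rad(Vy_deg(t))*dt;&lt;br /&gt;
    az = deg2rad(Vz_deg(t))*dt;&lt;br /&gt;
    dRx = [1 0 0; 0 cos(ax) -sin(ax); 0 sin(ax) cos(ax)];&lt;br /&gt;
    dRy = [cos(ay) 0 sin(ay); 0 1 0; -sin(ay) 0 cos(ay)];&lt;br /&gt;
    dRz = [cos(az) -sin(az) 0; sin(az) cos(az) 0; 0 0 1];&lt;br /&gt;
    R = R * (dRx * dRy * dRz);    % order is unimportant for small angles&lt;br /&gt;
    headR(:,:,t) = R;             % total rotation matrix at time t&lt;br /&gt;
end&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;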
&lt;br /&gt;
===Eye movement===&lt;br /&gt;
&lt;br /&gt;
The eye movements are given by rotations in the same coordinate system as the head movements.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
X = anterior (torsion of the pupil in the gaze direction)&lt;br /&gt;
Y = -right&lt;br /&gt;
Z = superior&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The torsion is not important for the gaze direction and is therefore excluded from the calculation.&lt;br /&gt;
&lt;br /&gt;
Azimuth can be defined as a rotation about the Z axis and elevation as a counter-rotation about the Y axis.&lt;br /&gt;
&lt;br /&gt;
Since the eye gaze is given as rotation angles in OCS coordinates and the eye and head tracker coordinate frames are fixed relative to each other, we can easily transform OCS coordinates to EyeSeeCam coordinates. We can then create rotation matrices in EyeSeeCam coordinates for the azimuth and elevation of the gaze. Since the torsion is excluded, the rotation matrices for azimuth and elevation can be multiplied in any order to get the eye rotation matrix in EyeSeeCam coordinates.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eye_Rz(t) = Rz(eye_azimuth(t))&lt;br /&gt;
eye_Ry(t) = Ry(-eye_elevation(t))&lt;br /&gt;
eye_R(t)  = eye_Ry(t) * eye_Rz(t)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
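&lt;br /&gt;
In Matlab the elementary rotation matrices Rz and Ry used above can be written as anonymous functions (a sketch; angles assumed to be in degrees, as elsewhere on this page):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
Rz = @(a) [cosd(a) -sind(a) 0; sind(a) cosd(a) 0; 0 0 1];  % rotation about the Z axis&lt;br /&gt;
Ry = @(a) [cosd(a) 0 sind(a); 0 1 0; -sind(a) 0 cosd(a)];  % rotation about the Y axis&lt;br /&gt;
eye_R = Ry(-eye_elevation) * Rz(eye_azimuth);              % eye rotation matrix&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;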
&lt;br /&gt;
===Gaze movement===&lt;br /&gt;
&lt;br /&gt;
The gaze movement is the combined head and eye movement.&lt;br /&gt;
To create the total rotation matrix that represents the &#039;gaze&#039; rotation in lab coordinates, the two rotation matrices for head and eye movement must be multiplied in the correct order. We have:&lt;br /&gt;
&lt;br /&gt;
* A rotation matrix for the head in lab coordinates.&lt;br /&gt;
* A rotation matrix for the eyes in head coordinates.&lt;br /&gt;
&lt;br /&gt;
First, multiply the rotation matrix of the head in lab coordinates by the rotation matrix of the eyes in head coordinates.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
R_gaze(t) = head_R(t) * eye_R(t)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Finally, multiply R_gaze(t) with the starting gaze vector to get the gaze (in lab coordinates) at time t. We assume that the starting gaze is in the forward direction, which is the X direction in lab coordinates.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
startingGaze = [1; 0; 0]&lt;br /&gt;
gaze(t) = R_gaze(t) * startingGaze&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
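&lt;br /&gt;
As a sketch, the steps above can be combined per time step (head_R and eye_R as computed in the previous sections; the azimuth/elevation extraction below uses a simple polar convention, and the sign conventions of the RAS frame should be verified against a known fixation):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
startingGaze = [1; 0; 0];               % forward = X direction in lab coordinates&lt;br /&gt;
for t = 1:n&lt;br /&gt;
    g = head_R(:,:,t) * eye_R(:,:,t) * startingGaze;  % gaze vector at time t&lt;br /&gt;
    azimuth(t)   = atan2d(g(2), g(1));  % rotation about the Z axis&lt;br /&gt;
    elevation(t) = asind(g(3));         % elevation above the horizontal plane&lt;br /&gt;
end&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;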
&lt;br /&gt;
==Matlab programming==&lt;br /&gt;
&lt;br /&gt;
===Functions for converting EyeSeeCam data to double polar data===&lt;br /&gt;
There is one function for head movement, one for eye movement and one for gaze movement. The gaze movement combines head and eye movement.&lt;br /&gt;
&lt;br /&gt;
The functions take head rotation speeds and eye rotations (in degrees) as input and return objects containing the traces (head, eye and gaze, respectively) in double polar coordinates.&lt;br /&gt;
&lt;br /&gt;
* headTrace_DP = EyeSeeCamSciHeadRotationData2Trace_DP(Vx_deg, Vy_deg, Vz_deg)&lt;br /&gt;
* eyeTrace_DP  = EyeSeeCamSciEyeRotationData2Trace_DP(Zocs_deg, Yocs_deg)&lt;br /&gt;
* gazeTrace_DP = EyeSeeCamSciHeadAndEyeRotationData2Trace_DP(Vx_deg, Vy_deg, Vz_deg, Zocs_deg, Yocs_deg)&lt;br /&gt;
&lt;br /&gt;
The Vx, Vy and Vz are the head rotation speeds in EyeSeeCam coordinates.&lt;br /&gt;
The Zocs and Yocs are the eye rotations in OCS coordinates.&lt;br /&gt;
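&lt;br /&gt;
For example (a sketch: the channel indices follow the channel listing in the next section, and the mapping of these channels onto the function arguments is an assumption that should be checked against lsldata.escmetadata.channel):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
Vx_deg = lsldata.escdata.Data(27,:);    % calibrated head velocities (assumed channels)&lt;br /&gt;
Vy_deg = lsldata.escdata.Data(29,:);&lt;br /&gt;
Vz_deg = lsldata.escdata.Data(31,:);&lt;br /&gt;
Zocs_deg = lsldata.escdata.Data(48,:);  % right eye, Z (assumed)&lt;br /&gt;
Yocs_deg = lsldata.escdata.Data(47,:);  % right eye, Y (assumed)&lt;br /&gt;
&lt;br /&gt;
gazeTrace_DP = EyeSeeCamSciHeadAndEyeRotationData2Trace_DP(Vx_deg, Vy_deg, Vz_deg, Zocs_deg, Yocs_deg);&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;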
&lt;br /&gt;
===EyeSeeCam data===&lt;br /&gt;
The recorded data is read from an LSL stream.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
% todo: LSL streaming parameters&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The lsldata is a struct and has the following fields:&lt;br /&gt;
*escdata&lt;br /&gt;
*escmetadata&lt;br /&gt;
*escstr&lt;br /&gt;
*evdata0&lt;br /&gt;
*evdata1&lt;br /&gt;
*evdata2&lt;br /&gt;
*evdata3&lt;br /&gt;
*evdata4&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The lsldata.escdata field contains the eye and head data as a matrix with a row for every parameter; there are 63 parameters in total. The field lsldata.escmetadata.channel lists the names of all parameters.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%Eye data&lt;br /&gt;
&lt;br /&gt;
RightEyePosX = 46;&lt;br /&gt;
RightEyePosY = 47;&lt;br /&gt;
RightEyePosZ = 48;&lt;br /&gt;
&lt;br /&gt;
xeye = lsldata.escdata.Data(46,:);       % right 46, left 32&lt;br /&gt;
yeye = lsldata.escdata.Data(47,:);       % right 47, left 33&lt;br /&gt;
zeye = lsldata.escdata.Data(48,:);       % right 48, left 34&lt;br /&gt;
&lt;br /&gt;
% Head data&lt;br /&gt;
HeadInertialVelXCal = 27;&lt;br /&gt;
HeadInertialVelYCal = 29;&lt;br /&gt;
HeadInertialVelZCal = 31;&lt;br /&gt;
&lt;br /&gt;
xh = lsldata.escdata.Data(27,:);       % calibrated torsion velocity data HEAD&lt;br /&gt;
yh = lsldata.escdata.Data(29,:);       % calibrated vertical velocity data HEAD&lt;br /&gt;
zh = lsldata.escdata.Data(31,:);       % calibrated horizontal velocity data HEAD&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=EyeSeeCam_SCI&amp;diff=4877</id>
		<title>EyeSeeCam SCI</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=EyeSeeCam_SCI&amp;diff=4877"/>
		<updated>2026-04-01T10:41:09Z</updated>

		<summary type="html">&lt;p&gt;Lof: /* Matlab coding */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[File:EyeSeeCam_SCI2.png|thumb|EyeSeeCam SCI]]&lt;br /&gt;
==Description==&lt;br /&gt;
The EyeSeeCam SCI is a combined eye tracker and head movement tracker. The head tracking is done with an IMU and the eye tracking is done with cameras and software that tracks the pupils of both eyes. The tracking data can be read out directly in Matlab. When a dedicated computer is used for reading out the EyeSeeCam, the data is converted to an LSL stream. &lt;br /&gt;
&lt;br /&gt;
The EyeSeeCam SCI can record at 250 and 500 Hz. We typically record at 500 Hz.&lt;br /&gt;
&lt;br /&gt;
==Setup==&lt;br /&gt;
&lt;br /&gt;
* The EyeSeeCam is connected to a NUC with two USB-C cables.&lt;br /&gt;
* The NUC is connected via ethernet with a Mac Mini.&lt;br /&gt;
* The Mac Mini runs an EyeSeeCam SCI program.&lt;br /&gt;
* The Mac Mini runs an LSL server for data transfer to a Windows computer.&lt;br /&gt;
* The Windows computer runs Matlab and connects to the LSL server on the Mac Mini with an LSL stream.&lt;br /&gt;
&lt;br /&gt;
==Startup Instructions==&lt;br /&gt;
&lt;br /&gt;
* Connect the EyeSeeCam to the NUC&lt;br /&gt;
* Remove both lens covers of the EyeSeeCam&lt;br /&gt;
* Start the NUC&lt;br /&gt;
* Start the Mac Mini&lt;br /&gt;
* Login on the Mac Mini to &#039;EyeSeeCam&#039; with password &#039;Gimbal&#039;&lt;br /&gt;
* Start the LSL script on the Mac Mini (it is on the desktop)&lt;br /&gt;
* Start the EyeSeeCam SCI program on the Mac Mini&lt;br /&gt;
* Start a Matlab program on the Windows computer (e.g. EG_program.m)&lt;br /&gt;
* Press the &amp;lt;begin&amp;gt; button in the EyeSeeCam SCI program on the Mac Mini (in the top right corner)&lt;br /&gt;
* Let the subject put on the EyeSeeCam&lt;br /&gt;
* Press &amp;lt;prepare&amp;gt; in the EyeSeeCam SCI program on the Mac Mini (shows the traces)&lt;br /&gt;
* Press &amp;lt;start&amp;gt; in the EyeSeeCam SCI program on the Mac Mini (starts collecting data)&lt;br /&gt;
* Start the experiment&lt;br /&gt;
&lt;br /&gt;
==Matlab coding==&lt;br /&gt;
The function lsl_resolver checks if it can find the requested lslStream on the intranet. When the stream is found you must add it to the session. The session controls the actual data-acquisition with start and stop. The data is read from the stream. &lt;br /&gt;
&lt;br /&gt;
 type = &#039;EyeSeeCam @ dcn-pl04&#039;;&lt;br /&gt;
 name = &#039;EyeSeeCam SCI Data&#039;;&lt;br /&gt;
 lslString = sprintf(&#039;type=&#039;&#039;%s&#039;&#039; and name=&#039;&#039;%s&#039;&#039;&#039;, type, name);&lt;br /&gt;
 info=lsl_resolver(lslString);&lt;br /&gt;
 lslStream = lsl_istream(info{1});&lt;br /&gt;
&lt;br /&gt;
 session = lsl_session;&lt;br /&gt;
 session.add_stream(lslStream);&lt;br /&gt;
&lt;br /&gt;
 session.start;&lt;br /&gt;
 ...wait some time...&lt;br /&gt;
 session.stop;&lt;br /&gt;
 data = lslStream.read;&lt;br /&gt;
&lt;br /&gt;
==Construction of a gaze trace==&lt;br /&gt;
The data from EyeSeeCam is recorded in different coordinate systems than the double polar system that we use to present our targets. The head and eye tracking data each have their own coordinate system. To accurately analyze our data we have to transform these coordinate systems to double polar coordinates in the lab frame. In order to relate the EyeSeeCam coordinates to the lab coordinates, we first need to record a starting gaze position for each trial. This means that during the recording, the subject has to look at the center speaker (0,0) when the trial starts. &lt;br /&gt;
&lt;br /&gt;
From the EyeSeeCam data we can construct a head movement trace and an eye movement trace. These can be combined to create a gaze trace.&lt;br /&gt;
&lt;br /&gt;
In Matlab we can use the class &#039;EyeSeeCamSci_Trace&#039; for calculating head, eye and gaze traces in RAS, HVF and DP coordinates.&lt;br /&gt;
&lt;br /&gt;
===Defining a coordinate system===&lt;br /&gt;
&lt;br /&gt;
For the purpose of interpreting the EyeSeeCam data we define the starting direction as the direction the subject looks in at the beginning of each trial. The recording of the data should start when the subject is looking in this direction. &amp;lt;br&amp;gt;&lt;br /&gt;
The X, Y and Z of the EyeSeeCam data are in terms of RAS coordinates:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
X = anterior&lt;br /&gt;
Y = -right&lt;br /&gt;
Z = superior&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
* When using &#039;EyeSeeCamSci_Trace&#039;, head and gaze traces in RAS coordinates are relative to the starting position of the subject. &lt;br /&gt;
* The HVF and DP coordinates of the eye trace are only valid when the head stays in the starting position.&lt;br /&gt;
&lt;br /&gt;
===Head movement===&lt;br /&gt;
EyeSeeCam_Sci provides us with angular velocities for rotations around the X, Y and Z axes. For our analyses we want position data, so we have to calculate the position data from these angular velocities.&lt;br /&gt;
 &lt;br /&gt;
The head rotation axes X, Y and Z are defined with respect to the EyeSeeCam itself. &lt;br /&gt;
&lt;br /&gt;
*&#039;&#039;The EyeSeeCam data is &#039;&#039;&#039;not&#039;&#039;&#039; in the lab frame!&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
In order to create a head movement trace we have to calculate rotation matrices (delta_Rx, delta_Ry and delta_Rz) in EyeSeeCam coordinates for every time step. &lt;br /&gt;
&lt;br /&gt;
Since the time steps are about 2 ms (or 4 ms), the rotations for each time step are small. Therefore the order in which delta_Rx, delta_Ry and delta_Rz are multiplied is not important, and we calculate delta_R(t) in EyeSeeCam coordinates by &lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
delta_R(t) = delta_Rx * delta_Ry * delta_Rz.&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
To get the total rotation matrix in lab coordinates at time t we have to iterate over all the time steps &lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
R(t=0) = identity matrix&lt;br /&gt;
for t = 0 to t = tmax&lt;br /&gt;
   R(t) = R(t-delta_t) * delta_R(t)&lt;br /&gt;
end &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Eye movement===&lt;br /&gt;
&lt;br /&gt;
The eye movements are given by rotations in the same coordinate system as the head movements.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
X = anterior (torsion of the pupil in the gaze direction)&lt;br /&gt;
Y = -right&lt;br /&gt;
Z = superior&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The torsion is not important for the gaze direction and is therefore excluded from the calculation.&lt;br /&gt;
&lt;br /&gt;
Azimuth can be defined as a rotation about the Z axis and elevation as a counter-rotation about the Y axis.&lt;br /&gt;
&lt;br /&gt;
Since the eye gaze is given as rotation angles in OCS coordinates and the eye and head tracker coordinate frames are fixed relative to each other, we can easily transform OCS coordinates to EyeSeeCam coordinates. We can then create rotation matrices in EyeSeeCam coordinates for the azimuth and elevation of the gaze. Since the torsion is excluded, the rotation matrices for azimuth and elevation can be multiplied in any order to get the eye rotation matrix in EyeSeeCam coordinates.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eye_Rz(t) = Rz(eye_azimuth(t))&lt;br /&gt;
eye_Ry(t) = Ry(-eye_elevation(t))&lt;br /&gt;
eye_R(t)  = eye_Ry(t) * eye_Rz(t)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Gaze movement===&lt;br /&gt;
&lt;br /&gt;
The gaze movement is the combined head and eye movement.&lt;br /&gt;
To create the total rotation matrix that represents the &#039;gaze&#039; rotation in lab coordinates, the two rotation matrices for head and eye movement must be multiplied in the correct order. We have:&lt;br /&gt;
&lt;br /&gt;
* A rotation matrix for the head in lab coordinates.&lt;br /&gt;
* A rotation matrix for the eyes in head coordinates.&lt;br /&gt;
&lt;br /&gt;
First, multiply the rotation matrix of the head in lab coordinates by the rotation matrix of the eyes in head coordinates.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
R_gaze(t) = head_R(t) * eye_R(t)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Finally, multiply R_gaze(t) with the starting gaze vector to get the gaze (in lab coordinates) at time t. We assume that the starting gaze is in the forward direction, which is the X direction in lab coordinates.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
startingGaze = [1; 0; 0]&lt;br /&gt;
gaze(t) = R_gaze(t) * startingGaze&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Matlab programming==&lt;br /&gt;
&lt;br /&gt;
===Functions for converting EyeSeeCam data to double polar data===&lt;br /&gt;
There is one function for head movement, one for eye movement and one for gaze movement. The gaze movement combines head and eye movement.&lt;br /&gt;
&lt;br /&gt;
The functions take head rotation speeds and eye rotations (in degrees) as input and return objects containing the traces (head, eye and gaze, respectively) in double polar coordinates.&lt;br /&gt;
&lt;br /&gt;
* headTrace_DP = EyeSeeCamSciHeadRotationData2Trace_DP(Vx_deg, Vy_deg, Vz_deg)&lt;br /&gt;
* eyeTrace_DP  = EyeSeeCamSciEyeRotationData2Trace_DP(Zocs_deg, Yocs_deg)&lt;br /&gt;
* gazeTrace_DP = EyeSeeCamSciHeadAndEyeRotationData2Trace_DP(Vx_deg, Vy_deg, Vz_deg, Zocs_deg, Yocs_deg)&lt;br /&gt;
&lt;br /&gt;
The Vx, Vy and Vz are the head rotation speeds in EyeSeeCam coordinates.&lt;br /&gt;
The Zocs and Yocs are the eye rotations in OCS coordinates.&lt;br /&gt;
&lt;br /&gt;
===EyeSeeCam data===&lt;br /&gt;
The recorded data is read from an LSL stream.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
% todo: LSL streaming parameters&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The lsldata is a struct and has the following fields:&lt;br /&gt;
*escdata&lt;br /&gt;
*escmetadata&lt;br /&gt;
*escstr&lt;br /&gt;
*evdata0&lt;br /&gt;
*evdata1&lt;br /&gt;
*evdata2&lt;br /&gt;
*evdata3&lt;br /&gt;
*evdata4&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The lsldata.escdata field contains the eye and head data as a matrix with a row for every parameter; there are 63 parameters in total. The field lsldata.escmetadata.channel lists the names of all parameters.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%Eye data&lt;br /&gt;
&lt;br /&gt;
RightEyePosX = 46;&lt;br /&gt;
RightEyePosY = 47;&lt;br /&gt;
RightEyePosZ = 48;&lt;br /&gt;
&lt;br /&gt;
xeye = lsldata.escdata.Data(46,:);       % right 46, left 32&lt;br /&gt;
yeye = lsldata.escdata.Data(47,:);       % right 47, left 33&lt;br /&gt;
zeye = lsldata.escdata.Data(48,:);       % right 48, left 34&lt;br /&gt;
&lt;br /&gt;
% Head data&lt;br /&gt;
HeadInertialVelXCal = 27;&lt;br /&gt;
HeadInertialVelYCal = 29;&lt;br /&gt;
HeadInertialVelZCal = 31;&lt;br /&gt;
&lt;br /&gt;
xh = lsldata.escdata.Data(27,:);       % calibrated torsion velocity data HEAD&lt;br /&gt;
yh = lsldata.escdata.Data(29,:);       % calibrated vertical velocity data HEAD&lt;br /&gt;
zh = lsldata.escdata.Data(31,:);       % calibrated horizontal velocity data HEAD&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=EyeSeeCam_SCI&amp;diff=4876</id>
		<title>EyeSeeCam SCI</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=EyeSeeCam_SCI&amp;diff=4876"/>
		<updated>2026-04-01T10:40:35Z</updated>

		<summary type="html">&lt;p&gt;Lof: /* Matlab coding */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[File:EyeSeeCam_SCI2.png|thumb|EyeSeeCam SCI]]&lt;br /&gt;
==Description==&lt;br /&gt;
The EyeSeeCam SCI is a combined eye tracker and head movement tracker. The head tracking is done with an IMU and the eye tracking is done with cameras and software that tracks the pupils of both eyes. The tracking data can be read out directly in Matlab. When a dedicated computer is used for reading out the EyeSeeCam, the data is converted to an LSL stream. &lt;br /&gt;
&lt;br /&gt;
The EyeSeeCam SCI can record at 250 and 500 Hz. We typically record at 500 Hz.&lt;br /&gt;
&lt;br /&gt;
==Setup==&lt;br /&gt;
&lt;br /&gt;
* The EyeSeeCam is connected to a NUC with two USB-C cables.&lt;br /&gt;
* The NUC is connected via ethernet with a Mac Mini.&lt;br /&gt;
* The Mac Mini runs an EyeSeeCam SCI program.&lt;br /&gt;
* The Mac Mini runs an LSL server for data transfer to a Windows computer.&lt;br /&gt;
* The Windows computer runs Matlab and connects to the LSL server on the Mac Mini with an LSL stream.&lt;br /&gt;
&lt;br /&gt;
==Startup Instructions==&lt;br /&gt;
&lt;br /&gt;
* Connect the EyeSeeCam to the NUC&lt;br /&gt;
* Remove both lens covers of the EyeSeeCam&lt;br /&gt;
* Start the NUC&lt;br /&gt;
* Start the Mac Mini&lt;br /&gt;
* Login on the Mac Mini to &#039;EyeSeeCam&#039; with password &#039;Gimbal&#039;&lt;br /&gt;
* Start the LSL script on the Mac Mini (it is on the desktop)&lt;br /&gt;
* Start the EyeSeeCam SCI program on the Mac Mini&lt;br /&gt;
* Start a Matlab program on the Windows computer (e.g. EG_program.m)&lt;br /&gt;
* Press the &amp;lt;begin&amp;gt; button in the EyeSeeCam SCI program on the Mac Mini (in the top right corner)&lt;br /&gt;
* Let the subject put on the EyeSeeCam&lt;br /&gt;
* Press &amp;lt;prepare&amp;gt; in the EyeSeeCam SCI program on the Mac Mini (shows the traces)&lt;br /&gt;
* Press &amp;lt;start&amp;gt; in the EyeSeeCam SCI program on the Mac Mini (starts collecting data)&lt;br /&gt;
* Start the experiment&lt;br /&gt;
&lt;br /&gt;
==Matlab coding==&lt;br /&gt;
The function lsl_resolver checks if it can find the requested lslStream on the intranet. When the stream is found it is added to the session. The session controls the actual data-acquisition with start and stop. The data is read from the stream. &lt;br /&gt;
&lt;br /&gt;
 type = &#039;EyeSeeCam @ dcn-pl04&#039;;&lt;br /&gt;
 name = &#039;EyeSeeCam SCI Data&#039;;&lt;br /&gt;
 lslString = sprintf(&#039;type=&#039;&#039;%s&#039;&#039; and name=&#039;&#039;%s&#039;&#039;&#039;, type, name);&lt;br /&gt;
 info=lsl_resolver(lslString);&lt;br /&gt;
 lslStream = lsl_istream(info{1});&lt;br /&gt;
&lt;br /&gt;
 session = lsl_session;&lt;br /&gt;
 session.add_stream(lslStream);&lt;br /&gt;
&lt;br /&gt;
 session.start;&lt;br /&gt;
 ...wait some time...&lt;br /&gt;
 session.stop;&lt;br /&gt;
 data = lslStream.read;&lt;br /&gt;
&lt;br /&gt;
==Construction of a gaze trace==&lt;br /&gt;
The data from EyeSeeCam is recorded in different coordinate systems than the double polar system that we use to present our targets. The head and eye tracking data each have their own coordinate system. To accurately analyze our data we have to transform these coordinate systems to double polar coordinates in the lab frame. In order to relate the EyeSeeCam coordinates to the lab coordinates, we first need to record a starting gaze position for each trial. This means that during the recording, the subject has to look at the center speaker (0,0) when the trial starts. &lt;br /&gt;
&lt;br /&gt;
From the EyeSeeCam data we can construct a head movement trace and an eye movement trace. These can be combined to create a gaze trace.&lt;br /&gt;
&lt;br /&gt;
In Matlab we can use the class &#039;EyeSeeCamSci_Trace&#039; for calculating head, eye and gaze traces in RAS, HVF and DP coordinates.&lt;br /&gt;
&lt;br /&gt;
===Defining a coordinate system===&lt;br /&gt;
&lt;br /&gt;
For the purpose of interpreting the EyeSeeCam data we define the starting direction as the direction the subject looks in at the beginning of each trial. The recording of the data should start when the subject is looking in this direction. &amp;lt;br&amp;gt;&lt;br /&gt;
The X, Y and Z of the EyeSeeCam data are in terms of RAS coordinates:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
X = anterior&lt;br /&gt;
Y = -right&lt;br /&gt;
Z = superior&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
* When using &#039;EyeSeeCamSci_Trace&#039;, head and gaze traces in RAS coordinates are relative to the starting position of the subject. &lt;br /&gt;
* The HVF and DP coordinates of the eye trace are only valid when the head stays in the starting position.&lt;br /&gt;
&lt;br /&gt;
===Head movement===&lt;br /&gt;
EyeSeeCam_Sci provides us with angular velocities for rotations around the X, Y and Z axes. For our analyses we want position data, so we have to calculate the position data from these angular velocities.&lt;br /&gt;
 &lt;br /&gt;
The head rotation axes X, Y and Z are defined with respect to the EyeSeeCam itself. &lt;br /&gt;
&lt;br /&gt;
*&#039;&#039;The EyeSeeCam data is &#039;&#039;&#039;not&#039;&#039;&#039; in the lab frame!&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
In order to create a head movement trace we have to calculate rotation matrices (delta_Rx, delta_Ry and delta_Rz) in EyeSeeCam coordinates for every time step. &lt;br /&gt;
&lt;br /&gt;
Since the time steps are about 2 ms (or 4 ms), the rotations for each time step are small. Therefore the order in which delta_Rx, delta_Ry and delta_Rz are multiplied is not important, and we calculate delta_R(t) in EyeSeeCam coordinates by &lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
delta_R(t) = delta_Rx * delta_Ry * delta_Rz.&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
To get the total rotation matrix in lab coordinates at time t we have to iterate over all the time steps &lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
R(t=0) = identity matrix&lt;br /&gt;
for t = 0 to t = tmax&lt;br /&gt;
   R(t) = R(t-delta_t) * delta_R(t)&lt;br /&gt;
end &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Eye movement===&lt;br /&gt;
&lt;br /&gt;
The eye movements are given by rotations in the same coordinate system as the head movements.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
X = anterior (torsion of the pupil in the gaze direction)&lt;br /&gt;
Y = -right&lt;br /&gt;
Z = superior&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The torsion is not important for the gaze direction and is therefore excluded from the calculation.&lt;br /&gt;
&lt;br /&gt;
Azimuth can be defined as a rotation about the Z axis and elevation as a counter-rotation about the Y axis.&lt;br /&gt;
&lt;br /&gt;
Since the eye gaze is given as rotation angles in OCS coordinates and the eye and head tracker coordinate frames are fixed relative to each other, we can easily transform OCS coordinates to EyeSeeCam coordinates. We can then create rotation matrices in EyeSeeCam coordinates for the azimuth and elevation of the gaze. Since the torsion is excluded, the rotation matrices for azimuth and elevation can be multiplied in any order to get the eye rotation matrix in EyeSeeCam coordinates.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eye_Rz(t) = Rz(eye_azimuth(t))&lt;br /&gt;
eye_Ry(t) = Ry(-eye_elevation(t))&lt;br /&gt;
eye_R(t)  = eye_Ry(t) * eye_Rz(t)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Gaze movement===&lt;br /&gt;
&lt;br /&gt;
The gaze movement is the combined head and eye movement.&lt;br /&gt;
To create the total rotation matrix that represents the &#039;gaze&#039; rotation in lab coordinates, the two rotation matrices for head and eye movement must be multiplied in the correct order. We have:&lt;br /&gt;
&lt;br /&gt;
* A rotation matrix for the head in lab coordinates.&lt;br /&gt;
* A rotation matrix for the eyes in head coordinates.&lt;br /&gt;
&lt;br /&gt;
First, multiply the rotation matrix of the head in lab coordinates by the rotation matrix of the eyes in head coordinates.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
R_gaze(t) = head_R(t) * eye_R(t)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Lastly, multiply R_gaze(t) with the starting gaze vector to get the gaze (in lab coordinates) at time t. We assume that the starting gaze is in the forward direction, which is the X direction in lab coordinates.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
startingGaze = [1; 0; 0]&lt;br /&gt;
gaze(t) = R_gaze(t) * startingGaze&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
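&lt;br /&gt;
As a cross-check, the whole chain (eye_R from azimuth and elevation, R_gaze = head_R * eye_R, gaze = R_gaze * startingGaze) can be condensed into one small function. This is a pure-Python sketch for illustration, not the wiki's Matlab code; gaze_vector is our own name.&lt;br /&gt;

```python
import math

def Ry(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def Rz(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def gaze_vector(head_R, eye_azimuth_deg, eye_elevation_deg):
    """R_gaze = head_R * eye_R, then gaze = R_gaze * [1, 0, 0]."""
    eye_R = matmul(Ry(-math.radians(eye_elevation_deg)),
                   Rz(math.radians(eye_azimuth_deg)))
    R_gaze = matmul(head_R, eye_R)
    # Applying R_gaze to [1, 0, 0] selects its first column
    return [R_gaze[i][0] for i in range(3)]
```

With the head in the starting position (identity matrix), an eye azimuth of 90 degrees points the gaze along Y (left) and an elevation of 90 degrees points it along Z (up).&lt;br /&gt;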
&lt;br /&gt;
==Matlab programming==&lt;br /&gt;
&lt;br /&gt;
===Functions for converting EyeSeeCam data to double polar data===&lt;br /&gt;
There is one function for head movement, one for eye movement and one for gaze movement. The gaze movement combines head and eye movement.&lt;br /&gt;
&lt;br /&gt;
The functions take head rotation speeds and eye rotations (in degrees) as input and return objects containing the traces (head, eye and gaze respectively) in double polar coordinates.&lt;br /&gt;
&lt;br /&gt;
* headTrace_DP = EyeSeeCamSciHeadRotationData2Trace_DP(Vx_deg, Vy_deg, Vz_deg)&lt;br /&gt;
* eyeTrace_DP  = EyeSeeCamSciEyeRotationData2Trace_DP(Zocs_deg, Yocs_deg)&lt;br /&gt;
* gazeTrace_DP = EyeSeeCamSciHeadAndEyeRotationData2Trace_DP(Vx_deg, Vy_deg, Vz_deg, Zocs_deg, Yocs_deg)&lt;br /&gt;
&lt;br /&gt;
The Vx, Vy and Vz are the head rotation speeds in the EyeSeeCam coordinates.&lt;br /&gt;
The Zocs and Yocs are the eye rotations in OCS coordinates.&lt;br /&gt;
&lt;br /&gt;
===EyeSeeCam data===&lt;br /&gt;
The recorded data is read from an LSL stream.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
% todo: LSL streaming parameters&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The lsldata is a struct and has the following fields:&lt;br /&gt;
*escdata&lt;br /&gt;
*escmetadata&lt;br /&gt;
*escstr&lt;br /&gt;
*evdata0&lt;br /&gt;
*evdata1&lt;br /&gt;
*evdata2&lt;br /&gt;
*evdata3&lt;br /&gt;
*evdata4&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The field lsldata.escdata contains the eye and head data as a matrix with one row per parameter; there are 63 parameters in total. The field lsldata.escmetadata.channel lists the names of all parameters.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%Eye data&lt;br /&gt;
&lt;br /&gt;
RightEyePosX = 46;&lt;br /&gt;
RightEyePosY = 47;&lt;br /&gt;
RightEyePosZ = 48;&lt;br /&gt;
&lt;br /&gt;
xeye = lsldata.escdata.Data(46,:);       % right 46, left 32&lt;br /&gt;
yeye = lsldata.escdata.Data(47,:);       % right 47, left 33&lt;br /&gt;
zeye = lsldata.escdata.Data(48,:);       % right 48, left 34&lt;br /&gt;
&lt;br /&gt;
% Head data&lt;br /&gt;
HeadInertialVelXCal = 27;&lt;br /&gt;
HeadInertialVelYCal = 29;&lt;br /&gt;
HeadInertialVelZCal = 31;&lt;br /&gt;
&lt;br /&gt;
xh = lsldata.escdata.Data(27,:);       % calibrated torsion velocity data HEAD&lt;br /&gt;
yh = lsldata.escdata.Data(29,:);       % calibrated vertical velocity data HEAD&lt;br /&gt;
zh = lsldata.escdata.Data(31,:);       % calibrated horizontal velocity data HEAD&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=EyeSeeCam_SCI&amp;diff=4875</id>
		<title>EyeSeeCam SCI</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=EyeSeeCam_SCI&amp;diff=4875"/>
		<updated>2026-04-01T10:37:15Z</updated>

		<summary type="html">&lt;p&gt;Lof: /* Matlab coding */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[File:EyeSeeCam_SCI2.png|thumb|EyeSeeCam SCI]]&lt;br /&gt;
==Description==&lt;br /&gt;
The EyeSeeCam SCI is a combined eye tracker and head movement tracker. The head tracking is done with an IMU and the eye tracking is done with cameras and software that tracks the pupils of both eyes. The tracking data can be read out directly in Matlab. When a dedicated computer is used for reading out the EyeSeeCam, the data is converted to an LSL stream. &lt;br /&gt;
&lt;br /&gt;
The EyeSeeCam SCI can record at 250 and 500 Hz. We typically record at 500 Hz.&lt;br /&gt;
&lt;br /&gt;
==Setup==&lt;br /&gt;
&lt;br /&gt;
* The EyeSeeCam is connected to a NUC with two USB-C cables.&lt;br /&gt;
* The NUC is connected via ethernet with a Mac Mini.&lt;br /&gt;
* The Mac Mini runs an EyeSeeCam SCI program.&lt;br /&gt;
* The Mac Mini runs an LSL server for data transfer to a Windows computer.&lt;br /&gt;
* The Windows computer runs Matlab and connects with an LSL stream to the LSL server on the Mac Mini.&lt;br /&gt;
&lt;br /&gt;
==Startup Instructions==&lt;br /&gt;
&lt;br /&gt;
* Connect the EyeSeeCam to the NUC&lt;br /&gt;
* Remove both lens covers of the EyeSeeCam&lt;br /&gt;
* Start the NUC&lt;br /&gt;
* Start the Mac Mini&lt;br /&gt;
* Login on the Mac Mini to &#039;EyeSeeCam&#039; with password &#039;Gimbal&#039;&lt;br /&gt;
* Start the LSL script on the Mac Mini (it is on the desktop)&lt;br /&gt;
* Start the EyeSeeCam SCI program on the Mac Mini&lt;br /&gt;
* Start a Matlab program on the Windows computer (e.g. EG_program.m)&lt;br /&gt;
* Press the &amp;lt;begin&amp;gt; button in the EyeSeeCam SCI program on the Mac Mini (in the top right corner)&lt;br /&gt;
* Let the subject put on the EyeSeeCam&lt;br /&gt;
* Press &amp;lt;prepare&amp;gt; in the EyeSeeCam SCI program on the Mac Mini (shows the traces)&lt;br /&gt;
* Press &amp;lt;start&amp;gt; in the EyeSeeCam SCI program on the Mac Mini (starts collecting data)&lt;br /&gt;
* Start the experiment&lt;br /&gt;
&lt;br /&gt;
==Matlab coding==&lt;br /&gt;
&lt;br /&gt;
 type = &#039;EyeSeeCam @ dcn-pl04&#039;;&lt;br /&gt;
 name = &#039;EyeSeeCam SCI Data&#039;;&lt;br /&gt;
 lslString = sprintf(&#039;type=&#039;&#039;%s&#039;&#039; and name=&#039;&#039;%s&#039;&#039;&#039;, type, name);&lt;br /&gt;
 info=lsl_resolver(lslString);&lt;br /&gt;
 lslStream = lsl_istream(info{1});&lt;br /&gt;
&lt;br /&gt;
 session = lsl_session;&lt;br /&gt;
 session.add_stream(lslStream);&lt;br /&gt;
&lt;br /&gt;
 session.start;&lt;br /&gt;
 ...wait some time...&lt;br /&gt;
 session.stop;&lt;br /&gt;
 data = lslStream.read;&lt;br /&gt;
&lt;br /&gt;
==Construction of a gaze trace==&lt;br /&gt;
The data from EyeSeeCam is recorded in different coordinate systems than the double polar system that we use to present our targets. The head and eye tracking data each have their own coordinate system. To accurately analyze our data we have to transform these coordinate systems to double polar coordinates in the lab frame. In order to relate the EyeSeeCam coordinates to the lab coordinates, we first need to record a starting gaze position for each trial. This means that during the recording, the subject has to look at the center speaker (0,0) when the trial starts. &lt;br /&gt;
&lt;br /&gt;
From the EyeSeeCam data we can construct a head movement trace and an eye movement trace. These can be combined to create a gaze trace.&lt;br /&gt;
&lt;br /&gt;
In Matlab we can use the class &#039;EyeSeeCamSci_Trace&#039; for calculating head, eye and gaze traces in RAS, HVF and DP coordinates.&lt;br /&gt;
&lt;br /&gt;
===Defining a coordinate system===&lt;br /&gt;
&lt;br /&gt;
For the interpretation of the EyeSeeCam data we define the starting direction as the direction in which the subject looks at the beginning of each trial. The recording of the data should start when the subject is looking in this direction. &amp;lt;br&amp;gt;&lt;br /&gt;
The X, Y and Z of the EyeSeeCam data are in terms of RAS coordinates:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
X = anterior&lt;br /&gt;
Y = -right&lt;br /&gt;
Z = superior&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
* When using &#039;EyeSeeCamSci_Trace&#039;, head and gaze traces in RAS coordinates are relative to the starting position of the subject. &lt;br /&gt;
* The HVF and DP coordinates of the eye trace are only valid when the head stays in the starting position.&lt;br /&gt;
&lt;br /&gt;
===Head movement===&lt;br /&gt;
EyeSeeCam_Sci provides angular velocities for rotations around the X, Y and Z axes. For our analyses we want position data, so we have to calculate the position data from these angular velocities.&lt;br /&gt;
 &lt;br /&gt;
The head rotation axes X, Y and Z are defined with respect to the EyeSeeCam itself. &lt;br /&gt;
&lt;br /&gt;
*&#039;&#039;The EyeSeeCam data is &#039;&#039;&#039;not&#039;&#039;&#039; in the lab frame!&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
In order to create a head movement trace we have to calculate rotation matrices (delta_Rx, delta_Ry and delta_Rz) in EyeSeeCam coordinates for every time step. &lt;br /&gt;
&lt;br /&gt;
Since the time steps are about 2 ms (or 4 ms), the rotations in each time step are small. Therefore the order in which delta_Rx, delta_Ry and delta_Rz are multiplied is not important, and we calculate delta_R(t) in EyeSeeCam coordinates by &lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
delta_R(t) = delta_Rx * delta_Ry * delta_Rz.&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
To get the total rotation matrix in lab coordinates at time t we have to iterate over all the time steps: &lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
R(t=0) = identity matrix&lt;br /&gt;
for t = 0 to t = tmax&lt;br /&gt;
   R(t) = R(t-delta_t) * delta_R(t)&lt;br /&gt;
end &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Eye movement===&lt;br /&gt;
&lt;br /&gt;
The eye movements are given by rotations in the same coordinate system as the head movements.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
X = anterior (torsion of the pupil in the gaze direction)&lt;br /&gt;
Y = -right&lt;br /&gt;
Z = superior&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The torsion is not important for the gaze direction and is therefore excluded from the calculation.&lt;br /&gt;
&lt;br /&gt;
Azimuth can be defined as a rotation about the Z axis and elevation as a negative rotation about the Y axis.&lt;br /&gt;
&lt;br /&gt;
Since the eye gaze is given as rotation angles in OCS coordinates and the eye and head tracker coordinate systems are fixed relative to each other, we can easily transform OCS coordinates to the EyeSeeCam coordinates. We can then create rotation matrices in the EyeSeeCam coordinates for the azimuth and elevation of the gaze. Since the torsion is excluded, the rotation matrices for azimuth and elevation can be multiplied in any order to get the eye rotation matrix in EyeSeeCam coordinates.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eye_Rz(t) = Rz(eye_azimuth(t))&lt;br /&gt;
eye_Ry(t) = Ry(-eye_elevation(t))&lt;br /&gt;
eye_R(t)  = eye_Ry(t) * eye_Rz(t)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Gaze movement===&lt;br /&gt;
&lt;br /&gt;
The gaze movement is the combined head and eye movement.&lt;br /&gt;
To create the total rotation matrix that represents the gaze rotation in lab coordinates, you need to multiply two rotation matrices in the correct order:&lt;br /&gt;
&lt;br /&gt;
* the rotation matrix of the head in lab coordinates,&lt;br /&gt;
* the rotation matrix of the eyes in head coordinates.&lt;br /&gt;
&lt;br /&gt;
The head matrix comes first, so that the eye rotation is applied in head coordinates before the head rotation maps it into the lab frame:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
R_gaze(t) = head_R(t) * eye_R(t)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Lastly, multiply R_gaze(t) with the starting gaze vector to get the gaze (in lab coordinates) at time t. We assume that the starting gaze is in the forward direction, which is the X direction in lab coordinates.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
startingGaze = [1; 0; 0]&lt;br /&gt;
gaze(t) = R_gaze(t) * startingGaze&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Matlab programming==&lt;br /&gt;
&lt;br /&gt;
===Functions for converting EyeSeeCam data to double polar data===&lt;br /&gt;
There is one function for head movement, one for eye movement and one for gaze movement. The gaze movement combines head and eye movement.&lt;br /&gt;
&lt;br /&gt;
The functions take head rotation speeds and eye rotations (in degrees) as input and return objects containing the traces (head, eye and gaze respectively) in double polar coordinates.&lt;br /&gt;
&lt;br /&gt;
* headTrace_DP = EyeSeeCamSciHeadRotationData2Trace_DP(Vx_deg, Vy_deg, Vz_deg)&lt;br /&gt;
* eyeTrace_DP  = EyeSeeCamSciEyeRotationData2Trace_DP(Zocs_deg, Yocs_deg)&lt;br /&gt;
* gazeTrace_DP = EyeSeeCamSciHeadAndEyeRotationData2Trace_DP(Vx_deg, Vy_deg, Vz_deg, Zocs_deg, Yocs_deg)&lt;br /&gt;
&lt;br /&gt;
The Vx, Vy and Vz are the head rotation speeds in the EyeSeeCam coordinates.&lt;br /&gt;
The Zocs and Yocs are the eye rotations in OCS coordinates.&lt;br /&gt;
&lt;br /&gt;
===EyeSeeCam data===&lt;br /&gt;
The recorded data is read from an LSL stream.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
% todo: LSL streaming parameters&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The lsldata is a struct and has the following fields:&lt;br /&gt;
*escdata&lt;br /&gt;
*escmetadata&lt;br /&gt;
*escstr&lt;br /&gt;
*evdata0&lt;br /&gt;
*evdata1&lt;br /&gt;
*evdata2&lt;br /&gt;
*evdata3&lt;br /&gt;
*evdata4&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The field lsldata.escdata contains the eye and head data as a matrix with one row per parameter; there are 63 parameters in total. The field lsldata.escmetadata.channel lists the names of all parameters.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%Eye data&lt;br /&gt;
&lt;br /&gt;
RightEyePosX = 46;&lt;br /&gt;
RightEyePosY = 47;&lt;br /&gt;
RightEyePosZ = 48;&lt;br /&gt;
&lt;br /&gt;
xeye = lsldata.escdata.Data(46,:);       % right 46, left 32&lt;br /&gt;
yeye = lsldata.escdata.Data(47,:);       % right 47, left 33&lt;br /&gt;
zeye = lsldata.escdata.Data(48,:);       % right 48, left 34&lt;br /&gt;
&lt;br /&gt;
% Head data&lt;br /&gt;
HeadInertialVelXCal = 27;&lt;br /&gt;
HeadInertialVelYCal = 29;&lt;br /&gt;
HeadInertialVelZCal = 31;&lt;br /&gt;
&lt;br /&gt;
xh = lsldata.escdata.Data(27,:);       % calibrated torsion velocity data HEAD&lt;br /&gt;
yh = lsldata.escdata.Data(29,:);       % calibrated vertical velocity data HEAD&lt;br /&gt;
zh = lsldata.escdata.Data(31,:);       % calibrated horizontal velocity data HEAD&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=EyeSeeCam_SCI&amp;diff=4874</id>
		<title>EyeSeeCam SCI</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=EyeSeeCam_SCI&amp;diff=4874"/>
		<updated>2026-03-31T13:07:38Z</updated>

		<summary type="html">&lt;p&gt;Lof: /* Matlab coding */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[File:EyeSeeCam_SCI2.png|thumb|EyeSeeCam SCI]]&lt;br /&gt;
==Description==&lt;br /&gt;
The EyeSeeCam SCI is a combined eye tracker and head movement tracker. The head tracking is done with an IMU and the eye tracking is done with cameras and software that tracks the pupils of both eyes. The tracking data can be read out directly in Matlab. When a dedicated computer is used for reading out the EyeSeeCam, the data is converted to an LSL stream. &lt;br /&gt;
&lt;br /&gt;
The EyeSeeCam SCI can record at 250 and 500 Hz. We typically record at 500 Hz.&lt;br /&gt;
&lt;br /&gt;
==Setup==&lt;br /&gt;
&lt;br /&gt;
* The EyeSeeCam is connected to a NUC with two USB-C cables.&lt;br /&gt;
* The NUC is connected via ethernet with a Mac Mini.&lt;br /&gt;
* The Mac Mini runs an EyeSeeCam SCI program.&lt;br /&gt;
* The Mac Mini runs an LSL server for data transfer to a Windows computer.&lt;br /&gt;
* The Windows computer runs Matlab and connects with an LSL stream to the LSL server on the Mac Mini.&lt;br /&gt;
&lt;br /&gt;
==Startup Instructions==&lt;br /&gt;
&lt;br /&gt;
* Connect the EyeSeeCam to the NUC&lt;br /&gt;
* Remove both lens covers of the EyeSeeCam&lt;br /&gt;
* Start the NUC&lt;br /&gt;
* Start the Mac Mini&lt;br /&gt;
* Login on the Mac Mini to &#039;EyeSeeCam&#039; with password &#039;Gimbal&#039;&lt;br /&gt;
* Start the LSL script on the Mac Mini (it is on the desktop)&lt;br /&gt;
* Start the EyeSeeCam SCI program on the Mac Mini&lt;br /&gt;
* Start a Matlab program on the Windows computer (e.g. EG_program.m)&lt;br /&gt;
* Press the &amp;lt;begin&amp;gt; button in the EyeSeeCam SCI program on the Mac Mini (in the top right corner)&lt;br /&gt;
* Let the subject put on the EyeSeeCam&lt;br /&gt;
* Press &amp;lt;prepare&amp;gt; in the EyeSeeCam SCI program on the Mac Mini (shows the traces)&lt;br /&gt;
* Press &amp;lt;start&amp;gt; in the EyeSeeCam SCI program on the Mac Mini (starts collecting data)&lt;br /&gt;
* Start the experiment&lt;br /&gt;
&lt;br /&gt;
==Matlab coding==&lt;br /&gt;
&lt;br /&gt;
 type = &#039;EyeSeeCam @ dcn-pl04&#039;;&lt;br /&gt;
 name = &#039;EyeSeeCam SCI Data&#039;;&lt;br /&gt;
 lslString = sprintf(&#039;type=&#039;&#039;%s&#039;&#039; and name=&#039;&#039;%s&#039;&#039;&#039;, type, name);&lt;br /&gt;
 info=lsl_resolver(lslString);&lt;br /&gt;
 lslStream = lsl_istream(info{1});&lt;br /&gt;
 lslSession = lsl_session;&lt;br /&gt;
 lslSession.add_stream(lslStream);&lt;br /&gt;
&lt;br /&gt;
 lslSession.start;&lt;br /&gt;
 ...wait some time...&lt;br /&gt;
 lslSession.stop;&lt;br /&gt;
 data = lslStream.read;&lt;br /&gt;
&lt;br /&gt;
==Construction of a gaze trace==&lt;br /&gt;
The data from EyeSeeCam is recorded in different coordinate systems than the double polar system that we use to present our targets. The head and eye tracking data each have their own coordinate system. To accurately analyze our data we have to transform these coordinate systems to double polar coordinates in the lab frame. In order to relate the EyeSeeCam coordinates to the lab coordinates, we first need to record a starting gaze position for each trial. This means that during the recording, the subject has to look at the center speaker (0,0) when the trial starts. &lt;br /&gt;
&lt;br /&gt;
From the EyeSeeCam data we can construct a head movement trace and an eye movement trace. These can be combined to create a gaze trace.&lt;br /&gt;
&lt;br /&gt;
In Matlab we can use the class &#039;EyeSeeCamSci_Trace&#039; for calculating head, eye and gaze traces in RAS, HVF and DP coordinates.&lt;br /&gt;
&lt;br /&gt;
===Defining a coordinate system===&lt;br /&gt;
&lt;br /&gt;
For the interpretation of the EyeSeeCam data we define the starting direction as the direction in which the subject looks at the beginning of each trial. The recording of the data should start when the subject is looking in this direction. &amp;lt;br&amp;gt;&lt;br /&gt;
The X, Y and Z of the EyeSeeCam data are in terms of RAS coordinates:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
X = anterior&lt;br /&gt;
Y = -right&lt;br /&gt;
Z = superior&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
* When using &#039;EyeSeeCamSci_Trace&#039;, head and gaze traces in RAS coordinates are relative to the starting position of the subject. &lt;br /&gt;
* The HVF and DP coordinates of the eye trace are only valid when the head stays in the starting position.&lt;br /&gt;
&lt;br /&gt;
===Head movement===&lt;br /&gt;
EyeSeeCam_Sci provides angular velocities for rotations around the X, Y and Z axes. For our analyses we want position data, so we have to calculate the position data from these angular velocities.&lt;br /&gt;
 &lt;br /&gt;
The head rotation axes X, Y and Z are defined with respect to the EyeSeeCam itself. &lt;br /&gt;
&lt;br /&gt;
*&#039;&#039;The EyeSeeCam data is &#039;&#039;&#039;not&#039;&#039;&#039; in the lab frame!&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
In order to create a head movement trace we have to calculate rotation matrices (delta_Rx, delta_Ry and delta_Rz) in EyeSeeCam coordinates for every time step. &lt;br /&gt;
&lt;br /&gt;
Since the time steps are about 2 ms (or 4 ms), the rotations in each time step are small. Therefore the order in which delta_Rx, delta_Ry and delta_Rz are multiplied is not important, and we calculate delta_R(t) in EyeSeeCam coordinates by &lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
delta_R(t) = delta_Rx * delta_Ry * delta_Rz.&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
To get the total rotation matrix in lab coordinates at time t we have to iterate over all the time steps: &lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
R(t=0) = identity matrix&lt;br /&gt;
for t = 0 to t = tmax&lt;br /&gt;
   R(t) = R(t-delta_t) * delta_R(t)&lt;br /&gt;
end &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Eye movement===&lt;br /&gt;
&lt;br /&gt;
The eye movements are given by rotations in the same coordinate system as the head movements.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
X = anterior (torsion of the pupil in the gaze direction)&lt;br /&gt;
Y = -right&lt;br /&gt;
Z = superior&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The torsion is not important for the gaze direction and is therefore excluded from the calculation.&lt;br /&gt;
&lt;br /&gt;
Azimuth can be defined as a rotation about the Z axis and elevation as a negative rotation about the Y axis.&lt;br /&gt;
&lt;br /&gt;
Since the eye gaze is given as rotation angles in OCS coordinates and the eye and head tracker coordinate systems are fixed relative to each other, we can easily transform OCS coordinates to the EyeSeeCam coordinates. We can then create rotation matrices in the EyeSeeCam coordinates for the azimuth and elevation of the gaze. Since the torsion is excluded, the rotation matrices for azimuth and elevation can be multiplied in any order to get the eye rotation matrix in EyeSeeCam coordinates.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eye_Rz(t) = Rz(eye_azimuth(t))&lt;br /&gt;
eye_Ry(t) = Ry(-eye_elevation(t))&lt;br /&gt;
eye_R(t)  = eye_Ry(t) * eye_Rz(t)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Gaze movement===&lt;br /&gt;
&lt;br /&gt;
The gaze movement is the combined head and eye movement.&lt;br /&gt;
To create the total rotation matrix that represents the gaze rotation in lab coordinates, you need to multiply two rotation matrices in the correct order:&lt;br /&gt;
&lt;br /&gt;
* the rotation matrix of the head in lab coordinates,&lt;br /&gt;
* the rotation matrix of the eyes in head coordinates.&lt;br /&gt;
&lt;br /&gt;
The head matrix comes first, so that the eye rotation is applied in head coordinates before the head rotation maps it into the lab frame:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
R_gaze(t) = head_R(t) * eye_R(t)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Lastly, multiply R_gaze(t) with the starting gaze vector to get the gaze (in lab coordinates) at time t. We assume that the starting gaze is in the forward direction, which is the X direction in lab coordinates.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
startingGaze = [1; 0; 0]&lt;br /&gt;
gaze(t) = R_gaze(t) * startingGaze&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Matlab programming==&lt;br /&gt;
&lt;br /&gt;
===Functions for converting EyeSeeCam data to double polar data===&lt;br /&gt;
There is one function for head movement, one for eye movement and one for gaze movement. The gaze movement combines head and eye movement.&lt;br /&gt;
&lt;br /&gt;
The functions take head rotation speeds and eye rotations (in degrees) as input and return objects containing the traces (head, eye and gaze respectively) in double polar coordinates.&lt;br /&gt;
&lt;br /&gt;
* headTrace_DP = EyeSeeCamSciHeadRotationData2Trace_DP(Vx_deg, Vy_deg, Vz_deg)&lt;br /&gt;
* eyeTrace_DP  = EyeSeeCamSciEyeRotationData2Trace_DP(Zocs_deg, Yocs_deg)&lt;br /&gt;
* gazeTrace_DP = EyeSeeCamSciHeadAndEyeRotationData2Trace_DP(Vx_deg, Vy_deg, Vz_deg, Zocs_deg, Yocs_deg)&lt;br /&gt;
&lt;br /&gt;
The Vx, Vy and Vz are the head rotation speeds in the EyeSeeCam coordinates.&lt;br /&gt;
The Zocs and Yocs are the eye rotations in OCS coordinates.&lt;br /&gt;
&lt;br /&gt;
===EyeSeeCam data===&lt;br /&gt;
The recorded data is read from an LSL stream.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
% todo: LSL streaming parameters&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The lsldata is a struct and has the following fields:&lt;br /&gt;
*escdata&lt;br /&gt;
*escmetadata&lt;br /&gt;
*escstr&lt;br /&gt;
*evdata0&lt;br /&gt;
*evdata1&lt;br /&gt;
*evdata2&lt;br /&gt;
*evdata3&lt;br /&gt;
*evdata4&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The field lsldata.escdata contains the eye and head data as a matrix with one row per parameter; there are 63 parameters in total. The field lsldata.escmetadata.channel lists the names of all parameters.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%Eye data&lt;br /&gt;
&lt;br /&gt;
RightEyePosX = 46;&lt;br /&gt;
RightEyePosY = 47;&lt;br /&gt;
RightEyePosZ = 48;&lt;br /&gt;
&lt;br /&gt;
xeye = lsldata.escdata.Data(46,:);       % right 46, left 32&lt;br /&gt;
yeye = lsldata.escdata.Data(47,:);       % right 47, left 33&lt;br /&gt;
zeye = lsldata.escdata.Data(48,:);       % right 48, left 34&lt;br /&gt;
&lt;br /&gt;
% Head data&lt;br /&gt;
HeadInertialVelXCal = 27;&lt;br /&gt;
HeadInertialVelYCal = 29;&lt;br /&gt;
HeadInertialVelZCal = 31;&lt;br /&gt;
&lt;br /&gt;
xh = lsldata.escdata.Data(27,:);       % calibrated torsion velocity data HEAD&lt;br /&gt;
yh = lsldata.escdata.Data(29,:);       % calibrated vertical velocity data HEAD&lt;br /&gt;
zh = lsldata.escdata.Data(31,:);       % calibrated horizontal velocity data HEAD&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=EyeSeeCam_SCI&amp;diff=4873</id>
		<title>EyeSeeCam SCI</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=EyeSeeCam_SCI&amp;diff=4873"/>
		<updated>2026-03-31T13:06:33Z</updated>

		<summary type="html">&lt;p&gt;Lof: /* Matlab coding */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[File:EyeSeeCam_SCI2.png|thumb|EyeSeeCam SCI]]&lt;br /&gt;
==Description==&lt;br /&gt;
The EyeSeeCam SCI is a combined eye tracker and head movement tracker. The head tracking is done with an IMU and the eye tracking is done with cameras and software that tracks the pupils of both eyes. The tracking data can be read out directly in Matlab. When a dedicated computer is used for reading out the EyeSeeCam, the data is converted to an LSL stream. &lt;br /&gt;
&lt;br /&gt;
The EyeSeeCam SCI can record at 250 and 500 Hz. We typically record at 500 Hz.&lt;br /&gt;
&lt;br /&gt;
==Setup==&lt;br /&gt;
&lt;br /&gt;
* The EyeSeeCam is connected to a NUC with two USB-C cables.&lt;br /&gt;
* The NUC is connected via ethernet with a Mac Mini.&lt;br /&gt;
* The Mac Mini runs an EyeSeeCam SCI program.&lt;br /&gt;
* The Mac Mini runs an LSL server for data transfer to a Windows computer.&lt;br /&gt;
* The Windows computer runs Matlab and connects with an LSL stream to the LSL server on the Mac Mini.&lt;br /&gt;
&lt;br /&gt;
==Startup Instructions==&lt;br /&gt;
&lt;br /&gt;
* Connect the EyeSeeCam to the NUC&lt;br /&gt;
* Remove both lens covers of the EyeSeeCam&lt;br /&gt;
* Start the NUC&lt;br /&gt;
* Start the Mac Mini&lt;br /&gt;
* Login on the Mac Mini to &#039;EyeSeeCam&#039; with password &#039;Gimbal&#039;&lt;br /&gt;
* Start the LSL script on the Mac Mini (it is on the desktop)&lt;br /&gt;
* Start the EyeSeeCam SCI program on the Mac Mini&lt;br /&gt;
* Start a Matlab program on the Windows computer (e.g. EG_program.m)&lt;br /&gt;
* Press the &amp;lt;begin&amp;gt; button in the EyeSeeCam SCI program on the Mac Mini (in the top right corner)&lt;br /&gt;
* Let the subject put on the EyeSeeCam&lt;br /&gt;
* Press &amp;lt;prepare&amp;gt; in the EyeSeeCam SCI program on the Mac Mini (shows the traces)&lt;br /&gt;
* Press &amp;lt;start&amp;gt; in the EyeSeeCam SCI program on the Mac Mini (starts collecting data)&lt;br /&gt;
* Start the experiment&lt;br /&gt;
&lt;br /&gt;
==Matlab coding==&lt;br /&gt;
&lt;br /&gt;
 type = &#039;EyeSeeCam @ dcn-pl04&#039;;&lt;br /&gt;
 name = &#039;EyeSeeCam SCI Data&#039;;&lt;br /&gt;
 lslString = sprintf(&#039;type=&#039;&#039;%s&#039;&#039; and name=&#039;&#039;%s&#039;&#039;&#039;, type, name);&lt;br /&gt;
 info=lsl_resolver(lslString);&lt;br /&gt;
 lslStream = lsl_istream(info{1});&lt;br /&gt;
 lslSession = lsl_session;&lt;br /&gt;
 lslSession.add_stream(lslStream);&lt;br /&gt;
&lt;br /&gt;
 data = lslStream.read&lt;br /&gt;
&lt;br /&gt;
==Construction of a gaze trace==&lt;br /&gt;
The data from EyeSeeCam is recorded in different coordinate systems than the double polar system that we use to present our targets. The head and eye tracking data each have their own coordinate system. To accurately analyze our data we have to transform these coordinate systems to double polar coordinates in the lab frame. In order to relate the EyeSeeCam coordinates to the lab coordinates, we first need to record a starting gaze position for each trial. This means that during the recording, the subject has to look at the center speaker (0,0) when the trial starts. &lt;br /&gt;
&lt;br /&gt;
From the EyeSeeCam data we can construct a head movement trace and an eye movement trace. These can be combined to create a gaze trace.&lt;br /&gt;
&lt;br /&gt;
In Matlab we can use the class &#039;EyeSeeCamSci_Trace&#039; for calculating head, eye and gaze traces in RAS, HVF and DP coordinates.&lt;br /&gt;
&lt;br /&gt;
===Defining a coordinate system===&lt;br /&gt;
&lt;br /&gt;
For the interpretation of the EyeSeeCam data we define the starting direction as the direction in which the subject looks at the beginning of each trial. The recording of the data should start when the subject is looking in this direction. &amp;lt;br&amp;gt;&lt;br /&gt;
The X, Y and Z of the EyeSeeCam data are in terms of RAS coordinates:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
X = anterior&lt;br /&gt;
Y = -right&lt;br /&gt;
Z = superior&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
* When using &#039;EyeSeeCamSci_Trace&#039;, head and gaze traces in RAS coordinates are relative to the starting position of the subject. &lt;br /&gt;
* The HVF and DP coordinates of the eye trace are only valid when the head stays in the starting position.&lt;br /&gt;
&lt;br /&gt;
===Head movement===&lt;br /&gt;
EyeSeeCam_Sci provides angular velocities for rotations around the X, Y and Z axes. For our analyses we want position data, so we have to calculate the position data from these angular velocities.&lt;br /&gt;
 &lt;br /&gt;
The head rotation axes X, Y and Z are defined with respect to the EyeSeeCam itself. &lt;br /&gt;
&lt;br /&gt;
*&#039;&#039;The EyeSeeCam data is &#039;&#039;&#039;not&#039;&#039;&#039; in the lab frame!&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
In order to create a head movement trace we have to calculate rotation matrices (delta_Rx, delta_Ry and delta_Rz) in EyeSeeCam coordinates for every time step. &lt;br /&gt;
&lt;br /&gt;
Since the time steps are about 2 ms (or 4 ms), the rotations in each time step are small. Therefore the order in which delta_Rx, delta_Ry and delta_Rz are multiplied is not important, and we calculate delta_R(t) in EyeSeeCam coordinates by &lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
delta_R(t) = delta_Rx * delta_Ry * delta_Rz.&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
To get the total rotation matrix in lab coordinates at time t, we have to iterate over all the time steps:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
R(t=0) = identity matrix&lt;br /&gt;
for t = delta_t to tmax&lt;br /&gt;
   R(t) = R(t - delta_t) * delta_R(t)&lt;br /&gt;
end&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
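The incremental integration described above can be sketched numerically. The following is a minimal illustration in Python with NumPy (the analysis code in this wiki is Matlab); the function names, the sampling step, and the rad/s units are assumptions for the sketch, not part of the EyeSeeCam API.

```python
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def integrate_head_rotation(vx, vy, vz, dt):
    """Accumulate per-step rotations delta_R(t) = delta_Rx * delta_Ry * delta_Rz
    into the total rotation matrix R(t) = R(t - dt) * delta_R(t).
    vx, vy, vz are angular velocities in rad/s, dt the time step in seconds."""
    R = np.eye(3)
    trace = []
    for wx, wy, wz in zip(vx, vy, vz):
        delta_R = rot_x(wx * dt) @ rot_y(wy * dt) @ rot_z(wz * dt)
        R = R @ delta_R  # small-angle steps, so the factor order barely matters
        trace.append(R.copy())
    return trace
```

Integrating a constant rotation speed of pi/2 rad/s about Z for one second at 2 ms steps ends with the head turned 90 degrees, as expected.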
&lt;br /&gt;
===Eye movement===&lt;br /&gt;
&lt;br /&gt;
The eye movements are given by rotations in the same coordinate system as the head movements.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
X = anterior (torsion of the pupil in the gaze direction)&lt;br /&gt;
Y = -right&lt;br /&gt;
Z = superior&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The torsion is not important for the gaze direction and is therefore excluded from the calculation.&lt;br /&gt;
&lt;br /&gt;
Azimuth can be defined as a rotation about the Z axis and Elevation as a contra rotation about the Y axis.&lt;br /&gt;
&lt;br /&gt;
Since the eye gaze is given as rotation angles in OCS coordinates and the eye and head tracker coordinate systems are fixed relative to each other, we can easily transform OCS coordinates to EyeSeeCam coordinates. We can then create rotation matrices in EyeSeeCam coordinates for the azimuth and elevation of the gaze. Since torsion is excluded, the rotation matrices for azimuth and elevation can be multiplied in either order to get the eye rotation matrix in EyeSeeCam coordinates.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eye_Rz(t) = Rz(eye_azimuth(t))&lt;br /&gt;
eye_Ry(t) = Ry(-eye_elevation(t))&lt;br /&gt;
eye_R(t)  = eye_Ry(t) * eye_Rz(t)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
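The azimuth/elevation construction above can be checked numerically. A Python/NumPy sketch for illustration (the wiki's own code is Matlab; the function name and the degree-valued inputs are assumptions):

```python
import numpy as np

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def eye_rotation(azimuth_deg, elevation_deg):
    """eye_R = Ry(-elevation) * Rz(azimuth); torsion about X is excluded."""
    az = np.deg2rad(azimuth_deg)
    el = np.deg2rad(elevation_deg)
    return rot_y(-el) @ rot_z(az)
```

With the forward gaze along +X, an azimuth of 90 degrees turns the gaze to +Y (left, since Y = -right) and an elevation of 90 degrees turns it to +Z (superior).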
&lt;br /&gt;
===Gaze movement===&lt;br /&gt;
&lt;br /&gt;
The gaze movement is the combined head and eye movement.&lt;br /&gt;
To create the total rotation matrix that represents the gaze rotation in lab coordinates, the two rotation matrices for head and eye movement must be multiplied in the correct order. We have:&lt;br /&gt;
&lt;br /&gt;
* a rotation matrix for the head in lab coordinates&lt;br /&gt;
* a rotation matrix for the eyes in head coordinates&lt;br /&gt;
&lt;br /&gt;
Multiply the rotation matrix of the head (in lab coordinates) by the rotation matrix of the eyes (in head coordinates):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
R_gaze(t) = head_R(t) * eye_R(t)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Finally, multiply R_gaze(t) by the starting gaze vector to get the gaze (in lab coordinates) at time t. We assume that the starting gaze is in the forward direction, which is the X direction in lab coordinates.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
startingGaze = [1; 0; 0]&lt;br /&gt;
gaze(t) = R_gaze(t) * startingGaze&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
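Combining the pieces, here is a minimal Python/NumPy sketch of the gaze computation (illustrative only; the names are hypothetical and the real pipeline is the Matlab class EyeSeeCamSci_Trace):

```python
import numpy as np

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def gaze_in_lab(head_R, eye_R):
    """R_gaze = head_R (lab frame) * eye_R (head frame), applied to the
    forward starting gaze [1, 0, 0] (anterior = +X)."""
    starting_gaze = np.array([1.0, 0.0, 0.0])
    return head_R @ eye_R @ starting_gaze

# Head turned 45 degrees left plus eyes a further 45 degrees left:
# the gaze ends up 90 degrees left, along +Y (Y = -right).
gaze = gaze_in_lab(rot_z(np.deg2rad(45.0)), rot_z(np.deg2rad(45.0)))
```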
&lt;br /&gt;
==Matlab programming==&lt;br /&gt;
&lt;br /&gt;
===Functions for converting EyeSeeCam data to double polar data===&lt;br /&gt;
There is one function for head movement, one for eye movement and one for gaze movement. The gaze movement combines head and eye movement.&lt;br /&gt;
&lt;br /&gt;
The functions take head rotation speeds and eye rotations (in degrees) as input and return objects containing the traces (head, eye and gaze, respectively) in double polar coordinates.&lt;br /&gt;
&lt;br /&gt;
* headTrace_DP = EyeSeeCamSciHeadRotationData2Trace_DP(Vx_deg, Vy_deg, Vz_deg)&lt;br /&gt;
* eyeTrace_DP  = EyeSeeCamSciEyeRotationData2Trace_DP(Zocs_deg, Yocs_deg)&lt;br /&gt;
* gazeTrace_DP = EyeSeeCamSciHeadAndEyeRotationData2Trace_DP(Vx_deg, Vy_deg, Vz_deg, Zocs_deg, Yocs_deg)&lt;br /&gt;
&lt;br /&gt;
Vx, Vy and Vz are the head rotation speeds in EyeSeeCam coordinates.&lt;br /&gt;
Zocs and Yocs are the eye rotations in OCS coordinates.&lt;br /&gt;
&lt;br /&gt;
===EyeSeeCam data===&lt;br /&gt;
The recorded data is read from an LSL stream.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
% todo: LSL streaming parameters&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The lsldata is a struct and has the following fields:&lt;br /&gt;
*escdata&lt;br /&gt;
*escmetadata&lt;br /&gt;
*escstr&lt;br /&gt;
*evdata0&lt;br /&gt;
*evdata1&lt;br /&gt;
*evdata2&lt;br /&gt;
*evdata3&lt;br /&gt;
*evdata4&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The field lsldata.escdata contains the eye and head data as a matrix with a row for every parameter; there are 63 parameters in total. The field lsldata.escmetadata.channel lists the names of all parameters.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%Eye data&lt;br /&gt;
&lt;br /&gt;
RightEyePosX = 46;&lt;br /&gt;
RightEyePosY = 47;&lt;br /&gt;
RightEyePosZ = 48;&lt;br /&gt;
&lt;br /&gt;
xeye = lsldata.escdata.Data(46,:);       % right 46, left 32&lt;br /&gt;
yeye = lsldata.escdata.Data(47,:);       % right 47, left 33&lt;br /&gt;
zeye = lsldata.escdata.Data(48,:);       % right 48, left 34&lt;br /&gt;
&lt;br /&gt;
% Head data&lt;br /&gt;
HeadInertialVelXCal = 27;&lt;br /&gt;
HeadInertialVelYCal = 29;&lt;br /&gt;
HeadInertialVelZCal = 31;&lt;br /&gt;
&lt;br /&gt;
xh = lsldata.escdata.Data(27,:);       % calibrated torsion velocity data HEAD&lt;br /&gt;
yh = lsldata.escdata.Data(29,:);       % calibrated vertical velocity data HEAD&lt;br /&gt;
zh = lsldata.escdata.Data(31,:);       % calibrated horizontal velocity data HEAD&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=EyeSeeCam_SCI&amp;diff=4872</id>
		<title>EyeSeeCam SCI</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=EyeSeeCam_SCI&amp;diff=4872"/>
		<updated>2026-03-31T13:03:19Z</updated>

		<summary type="html">&lt;p&gt;Lof: /* Startup Instructions */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[File:EyeSeeCam_SCI2.png|thumb|EyeSeeCam SCI]]&lt;br /&gt;
==Description==&lt;br /&gt;
The EyeSeeCam SCI is a combined eye tracker and head movement tracker. The head tracking is done with an IMU and the eye tracking with cameras and software that track the pupils of both eyes. The tracking data can be read out directly in Matlab. When a dedicated computer is used to read out the EyeSeeCam, the data is converted to an LSL stream.&lt;br /&gt;
&lt;br /&gt;
The EyeSeeCam SCI can record at 250 and 500 Hz; we typically record at 500 Hz.&lt;br /&gt;
&lt;br /&gt;
==Setup==&lt;br /&gt;
&lt;br /&gt;
* The EyeSeeCam is connected to a NUC with two USB-C cables.&lt;br /&gt;
* The NUC is connected via ethernet with a Mac Mini.&lt;br /&gt;
* The Mac Mini runs an EyeSeeCam SCI program.&lt;br /&gt;
* The Mac Mini runs an LSL server for data transfer to a Windows computer.&lt;br /&gt;
* The Windows computer runs Matlab and connects via an LSL stream to the LSL server on the Mac Mini.&lt;br /&gt;
&lt;br /&gt;
==Startup Instructions==&lt;br /&gt;
&lt;br /&gt;
* Connect the EyeSeeCam to the NUC&lt;br /&gt;
* Remove both lens covers of the EyeSeeCam&lt;br /&gt;
* Start the NUC&lt;br /&gt;
* Start the Mac Mini&lt;br /&gt;
* Login on the Mac Mini to &#039;EyeSeeCam&#039; with password &#039;Gimbal&#039;&lt;br /&gt;
* Start the LSL script on the Mac Mini (it is on the desktop)&lt;br /&gt;
* Start the EyeSeeCam SCI program on the Mac Mini&lt;br /&gt;
* Start a Matlab program on the Windows computer (e.g. EG_program.m)&lt;br /&gt;
* Press the &amp;lt;begin&amp;gt; button in the EyeSeeCam SCI program on the Mac Mini (in the top right corner)&lt;br /&gt;
* Let the subject put on the EyeSeeCam&lt;br /&gt;
* Press &amp;lt;prepare&amp;gt; in the EyeSeeCam SCI program on the Mac Mini (shows the traces)&lt;br /&gt;
* Press &amp;lt;start&amp;gt; in the EyeSeeCam SCI program on the Mac Mini (starts collecting data)&lt;br /&gt;
* Start the experiment&lt;br /&gt;
&lt;br /&gt;
==Matlab coding==&lt;br /&gt;
&lt;br /&gt;
 typeStr = &#039;EyeSeeCam @ dcn-pl04&#039;;&lt;br /&gt;
 nameStr = &#039;EyeSeeCam SCI Data&#039;;&lt;br /&gt;
 lslString = sprintf(&#039;type=&#039;&#039;%s&#039;&#039; and name=&#039;&#039;%s&#039;&#039;&#039;, typeStr, nameStr);&lt;br /&gt;
 info=lsl_resolver(lslString);&lt;br /&gt;
 lslStreams = lsl_istream(info{1});&lt;br /&gt;
 lslSession.add_stream(lslStreams);&lt;br /&gt;
&lt;br /&gt;
==Construction of a gaze trace==&lt;br /&gt;
The data from EyeSeeCam is recorded in different coordinate systems than the double polar system that we use to present our targets. The head and eye tracking data each have their own coordinate system. To accurately analyze our data we have to transform these coordinate systems to double polar coordinates in the lab frame. In order to relate the EyeSeeCam coordinates to the lab coordinates, we first need to record a starting gaze position for each trial. This means that during the recording, the subject has to look at the center speaker (0,0) when the trial starts. &lt;br /&gt;
&lt;br /&gt;
From the EyeSeeCam data we can construct a head movement trace and an eye movement trace. These can be combined to create a gaze trace.&lt;br /&gt;
&lt;br /&gt;
In Matlab we can use the class &#039;EyeSeeCamSci_Trace&#039; for calculating head, eye and gaze traces in RAS, HVF and DP coordinates.&lt;br /&gt;
&lt;br /&gt;
===Defining a coordinate system===&lt;br /&gt;
&lt;br /&gt;
For the purpose of interpreting the EyeSeeCam data, we define the starting direction as the direction in which the subject looks at the beginning of each trial. The data recording should start when the subject is looking in this direction.&amp;lt;br&amp;gt;&lt;br /&gt;
The X, Y and Z of the EyeSeeCam data are in terms of RAS coordinates:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
X = anterior&lt;br /&gt;
Y = -right&lt;br /&gt;
Z = superior&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
* When using &#039;EyeSeeCamSci_Trace&#039;, head and gaze traces in RAS coordinates are relative to the starting position of the subject.&lt;br /&gt;
* The HVF and DP coordinates of the eye trace are only valid when the head stays in the starting position.&lt;br /&gt;
&lt;br /&gt;
===Head movement===&lt;br /&gt;
EyeSeeCam_Sci provides angular velocities for rotations around the X, Y, and Z axes. For our analyses we need position data, so we have to calculate it from these angular velocities.&lt;br /&gt;
 &lt;br /&gt;
The head rotation axis X, Y and Z are defined with respect to the EyeSeeCam itself. &lt;br /&gt;
&lt;br /&gt;
*&#039;&#039;The EyeSeeCam data is &#039;&#039;&#039;not&#039;&#039;&#039; in the lab frame!&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
In order to create a head movement trace we have to calculate rotation matrices (delta_Rx, delta_Ry and delta_Rz) in EyeSeeCam coordinates for every time step. &lt;br /&gt;
&lt;br /&gt;
Since the time steps are about 2 ms (or 4 ms), the rotations for each time step are small. Therefore the order in which delta_Rx, delta_Ry and delta_Rz are multiplied is not important, and we calculate delta_R(t) in EyeSeeCam coordinates as&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
delta_R(t) = delta_Rx * delta_Ry * delta_Rz.&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
To get the total rotation matrix in lab coordinates at time t, we have to iterate over all the time steps:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
R(t=0) = identity matrix&lt;br /&gt;
for t = delta_t to tmax&lt;br /&gt;
   R(t) = R(t - delta_t) * delta_R(t)&lt;br /&gt;
end&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Eye movement===&lt;br /&gt;
&lt;br /&gt;
The eye movements are given by rotations in the same coordinate system as the head movements.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
X = anterior (torsion of the pupil in the gaze direction)&lt;br /&gt;
Y = -right&lt;br /&gt;
Z = superior&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The torsion is not important for the gaze direction and is therefore excluded from the calculation.&lt;br /&gt;
&lt;br /&gt;
Azimuth can be defined as a rotation about the Z axis and Elevation as a contra rotation about the Y axis.&lt;br /&gt;
&lt;br /&gt;
Since the eye gaze is given as rotation angles in OCS coordinates and the eye and head tracker coordinate systems are fixed relative to each other, we can easily transform OCS coordinates to EyeSeeCam coordinates. We can then create rotation matrices in EyeSeeCam coordinates for the azimuth and elevation of the gaze. Since torsion is excluded, the rotation matrices for azimuth and elevation can be multiplied in either order to get the eye rotation matrix in EyeSeeCam coordinates.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eye_Rz(t) = Rz(eye_azimuth(t))&lt;br /&gt;
eye_Ry(t) = Ry(-eye_elevation(t))&lt;br /&gt;
eye_R(t)  = eye_Ry(t) * eye_Rz(t)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Gaze movement===&lt;br /&gt;
&lt;br /&gt;
The gaze movement is the combined head and eye movement.&lt;br /&gt;
To create the total rotation matrix that represents the gaze rotation in lab coordinates, the two rotation matrices for head and eye movement must be multiplied in the correct order. We have:&lt;br /&gt;
&lt;br /&gt;
* a rotation matrix for the head in lab coordinates&lt;br /&gt;
* a rotation matrix for the eyes in head coordinates&lt;br /&gt;
&lt;br /&gt;
Multiply the rotation matrix of the head (in lab coordinates) by the rotation matrix of the eyes (in head coordinates):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
R_gaze(t) = head_R(t) * eye_R(t)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Finally, multiply R_gaze(t) by the starting gaze vector to get the gaze (in lab coordinates) at time t. We assume that the starting gaze is in the forward direction, which is the X direction in lab coordinates.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
startingGaze = [1; 0; 0]&lt;br /&gt;
gaze(t) = R_gaze(t) * startingGaze&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Matlab programming==&lt;br /&gt;
&lt;br /&gt;
===Functions for converting EyeSeeCam data to double polar data===&lt;br /&gt;
There is one function for head movement, one for eye movement and one for gaze movement. The gaze movement combines head and eye movement.&lt;br /&gt;
&lt;br /&gt;
The functions take head rotation speeds and eye rotations (in degrees) as input and return objects containing the traces (head, eye and gaze, respectively) in double polar coordinates.&lt;br /&gt;
&lt;br /&gt;
* headTrace_DP = EyeSeeCamSciHeadRotationData2Trace_DP(Vx_deg, Vy_deg, Vz_deg)&lt;br /&gt;
* eyeTrace_DP  = EyeSeeCamSciEyeRotationData2Trace_DP(Zocs_deg, Yocs_deg)&lt;br /&gt;
* gazeTrace_DP = EyeSeeCamSciHeadAndEyeRotationData2Trace_DP(Vx_deg, Vy_deg, Vz_deg, Zocs_deg, Yocs_deg)&lt;br /&gt;
&lt;br /&gt;
Vx, Vy and Vz are the head rotation speeds in EyeSeeCam coordinates.&lt;br /&gt;
Zocs and Yocs are the eye rotations in OCS coordinates.&lt;br /&gt;
&lt;br /&gt;
===EyeSeeCam data===&lt;br /&gt;
The recorded data is read from an LSL stream.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
% todo: LSL streaming parameters&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The lsldata is a struct and has the following fields:&lt;br /&gt;
*escdata&lt;br /&gt;
*escmetadata&lt;br /&gt;
*escstr&lt;br /&gt;
*evdata0&lt;br /&gt;
*evdata1&lt;br /&gt;
*evdata2&lt;br /&gt;
*evdata3&lt;br /&gt;
*evdata4&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The field lsldata.escdata contains the eye and head data as a matrix with a row for every parameter; there are 63 parameters in total. The field lsldata.escmetadata.channel lists the names of all parameters.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%Eye data&lt;br /&gt;
&lt;br /&gt;
RightEyePosX = 46;&lt;br /&gt;
RightEyePosY = 47;&lt;br /&gt;
RightEyePosZ = 48;&lt;br /&gt;
&lt;br /&gt;
xeye = lsldata.escdata.Data(46,:);       % right 46, left 32&lt;br /&gt;
yeye = lsldata.escdata.Data(47,:);       % right 47, left 33&lt;br /&gt;
zeye = lsldata.escdata.Data(48,:);       % right 48, left 34&lt;br /&gt;
&lt;br /&gt;
% Head data&lt;br /&gt;
HeadInertialVelXCal = 27;&lt;br /&gt;
HeadInertialVelYCal = 29;&lt;br /&gt;
HeadInertialVelZCal = 31;&lt;br /&gt;
&lt;br /&gt;
xh = lsldata.escdata.Data(27,:);       % calibrated torsion velocity data HEAD&lt;br /&gt;
yh = lsldata.escdata.Data(29,:);       % calibrated vertical velocity data HEAD&lt;br /&gt;
zh = lsldata.escdata.Data(31,:);       % calibrated horizontal velocity data HEAD&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=EyeSeeCam_SCI&amp;diff=4871</id>
		<title>EyeSeeCam SCI</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=EyeSeeCam_SCI&amp;diff=4871"/>
		<updated>2026-03-31T12:58:06Z</updated>

		<summary type="html">&lt;p&gt;Lof: /* Startup Instructions */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[File:EyeSeeCam_SCI2.png|thumb|EyeSeeCam SCI]]&lt;br /&gt;
==Description==&lt;br /&gt;
The EyeSeeCam SCI is a combined eye tracker and head movement tracker. The head tracking is done with an IMU and the eye tracking with cameras and software that track the pupils of both eyes. The tracking data can be read out directly in Matlab. When a dedicated computer is used to read out the EyeSeeCam, the data is converted to an LSL stream.&lt;br /&gt;
&lt;br /&gt;
The EyeSeeCam SCI can record at 250 and 500 Hz; we typically record at 500 Hz.&lt;br /&gt;
&lt;br /&gt;
==Setup==&lt;br /&gt;
&lt;br /&gt;
* The EyeSeeCam is connected to a NUC with two USB-C cables.&lt;br /&gt;
* The NUC is connected via ethernet with a Mac Mini.&lt;br /&gt;
* The Mac Mini runs an EyeSeeCam SCI program.&lt;br /&gt;
* The Mac Mini runs an LSL server for data transfer to a Windows computer.&lt;br /&gt;
* The Windows computer runs Matlab and connects via an LSL stream to the LSL server on the Mac Mini.&lt;br /&gt;
&lt;br /&gt;
==Startup Instructions==&lt;br /&gt;
&lt;br /&gt;
* Connect the EyeSeeCam to the NUC&lt;br /&gt;
* Remove both lens covers of the EyeSeeCam&lt;br /&gt;
* Start the NUC&lt;br /&gt;
* Start the Mac Mini&lt;br /&gt;
* Login on the Mac Mini to &#039;EyeSeeCam&#039; with password &#039;Gimbal&#039;&lt;br /&gt;
* Start the LSL script on the Mac Mini (it is on the desktop)&lt;br /&gt;
* Start the EyeSeeCam SCI program on the Mac Mini&lt;br /&gt;
* Start a Matlab program on the Windows computer (e.g. EG_program.m)&lt;br /&gt;
* Press the &amp;lt;begin&amp;gt; button in the EyeSeeCam SCI program on the Mac Mini (in the top right corner)&lt;br /&gt;
* Let the subject put on the EyeSeeCam&lt;br /&gt;
* Press &amp;lt;prepare&amp;gt; in the EyeSeeCam SCI program on the Mac Mini (shows the traces)&lt;br /&gt;
* Press &amp;lt;start&amp;gt; in the EyeSeeCam SCI program on the Mac Mini (starts collecting data)&lt;br /&gt;
* Start the experiment&lt;br /&gt;
&lt;br /&gt;
==Construction of a gaze trace==&lt;br /&gt;
The data from EyeSeeCam is recorded in different coordinate systems than the double polar system that we use to present our targets. The head and eye tracking data each have their own coordinate system. To accurately analyze our data we have to transform these coordinate systems to double polar coordinates in the lab frame. In order to relate the EyeSeeCam coordinates to the lab coordinates, we first need to record a starting gaze position for each trial. This means that during the recording, the subject has to look at the center speaker (0,0) when the trial starts. &lt;br /&gt;
&lt;br /&gt;
From the EyeSeeCam data we can construct a head movement trace and an eye movement trace. These can be combined to create a gaze trace.&lt;br /&gt;
&lt;br /&gt;
In Matlab we can use the class &#039;EyeSeeCamSci_Trace&#039; for calculating head, eye and gaze traces in RAS, HVF and DP coordinates.&lt;br /&gt;
&lt;br /&gt;
===Defining a coordinate system===&lt;br /&gt;
&lt;br /&gt;
For the purpose of interpreting the EyeSeeCam data, we define the starting direction as the direction in which the subject looks at the beginning of each trial. The data recording should start when the subject is looking in this direction.&amp;lt;br&amp;gt;&lt;br /&gt;
The X, Y and Z of the EyeSeeCam data are in terms of RAS coordinates:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
X = anterior&lt;br /&gt;
Y = -right&lt;br /&gt;
Z = superior&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
* When using &#039;EyeSeeCamSci_Trace&#039;, head and gaze traces in RAS coordinates are relative to the starting position of the subject.&lt;br /&gt;
* The HVF and DP coordinates of the eye trace are only valid when the head stays in the starting position.&lt;br /&gt;
&lt;br /&gt;
===Head movement===&lt;br /&gt;
EyeSeeCam_Sci provides angular velocities for rotations around the X, Y, and Z axes. For our analyses we need position data, so we have to calculate it from these angular velocities.&lt;br /&gt;
 &lt;br /&gt;
The head rotation axis X, Y and Z are defined with respect to the EyeSeeCam itself. &lt;br /&gt;
&lt;br /&gt;
*&#039;&#039;The EyeSeeCam data is &#039;&#039;&#039;not&#039;&#039;&#039; in the lab frame!&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
In order to create a head movement trace we have to calculate rotation matrices (delta_Rx, delta_Ry and delta_Rz) in EyeSeeCam coordinates for every time step. &lt;br /&gt;
&lt;br /&gt;
Since the time steps are about 2 ms (or 4 ms), the rotations for each time step are small. Therefore the order in which delta_Rx, delta_Ry and delta_Rz are multiplied is not important, and we calculate delta_R(t) in EyeSeeCam coordinates as&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
delta_R(t) = delta_Rx * delta_Ry * delta_Rz.&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
To get the total rotation matrix in lab coordinates at time t, we have to iterate over all the time steps:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
R(t=0) = identity matrix&lt;br /&gt;
for t = delta_t to tmax&lt;br /&gt;
   R(t) = R(t - delta_t) * delta_R(t)&lt;br /&gt;
end&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Eye movement===&lt;br /&gt;
&lt;br /&gt;
The eye movements are given by rotations in the same coordinate system as the head movements.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
X = anterior (torsion of the pupil in the gaze direction)&lt;br /&gt;
Y = -right&lt;br /&gt;
Z = superior&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The torsion is not important for the gaze direction and is therefore excluded from the calculation.&lt;br /&gt;
&lt;br /&gt;
Azimuth can be defined as a rotation about the Z axis and Elevation as a contra rotation about the Y axis.&lt;br /&gt;
&lt;br /&gt;
Since the eye gaze is given as rotation angles in OCS coordinates and the eye and head tracker coordinate systems are fixed relative to each other, we can easily transform OCS coordinates to EyeSeeCam coordinates. We can then create rotation matrices in EyeSeeCam coordinates for the azimuth and elevation of the gaze. Since torsion is excluded, the rotation matrices for azimuth and elevation can be multiplied in either order to get the eye rotation matrix in EyeSeeCam coordinates.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eye_Rz(t) = Rz(eye_azimuth(t))&lt;br /&gt;
eye_Ry(t) = Ry(-eye_elevation(t))&lt;br /&gt;
eye_R(t)  = eye_Ry(t) * eye_Rz(t)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Gaze movement===&lt;br /&gt;
&lt;br /&gt;
The gaze movement is the combined head and eye movement.&lt;br /&gt;
To create the total rotation matrix that represents the gaze rotation in lab coordinates, the two rotation matrices for head and eye movement must be multiplied in the correct order. We have:&lt;br /&gt;
&lt;br /&gt;
* a rotation matrix for the head in lab coordinates&lt;br /&gt;
* a rotation matrix for the eyes in head coordinates&lt;br /&gt;
&lt;br /&gt;
Multiply the rotation matrix of the head (in lab coordinates) by the rotation matrix of the eyes (in head coordinates):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
R_gaze(t) = head_R(t) * eye_R(t)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Finally, multiply R_gaze(t) by the starting gaze vector to get the gaze (in lab coordinates) at time t. We assume that the starting gaze is in the forward direction, which is the X direction in lab coordinates.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
startingGaze = [1; 0; 0]&lt;br /&gt;
gaze(t) = R_gaze(t) * startingGaze&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Matlab programming==&lt;br /&gt;
&lt;br /&gt;
===Functions for converting EyeSeeCam data to double polar data===&lt;br /&gt;
There is one function for head movement, one for eye movement and one for gaze movement. The gaze movement combines head and eye movement.&lt;br /&gt;
&lt;br /&gt;
The functions take head rotation speeds and eye rotations (in degrees) as input and return objects containing the traces (head, eye and gaze, respectively) in double polar coordinates.&lt;br /&gt;
&lt;br /&gt;
* headTrace_DP = EyeSeeCamSciHeadRotationData2Trace_DP(Vx_deg, Vy_deg, Vz_deg)&lt;br /&gt;
* eyeTrace_DP  = EyeSeeCamSciEyeRotationData2Trace_DP(Zocs_deg, Yocs_deg)&lt;br /&gt;
* gazeTrace_DP = EyeSeeCamSciHeadAndEyeRotationData2Trace_DP(Vx_deg, Vy_deg, Vz_deg, Zocs_deg, Yocs_deg)&lt;br /&gt;
&lt;br /&gt;
Vx, Vy and Vz are the head rotation speeds in EyeSeeCam coordinates.&lt;br /&gt;
Zocs and Yocs are the eye rotations in OCS coordinates.&lt;br /&gt;
&lt;br /&gt;
===EyeSeeCam data===&lt;br /&gt;
The recorded data is read from an LSL stream.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
% todo: LSL streaming parameters&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The lsldata is a struct and has the following fields:&lt;br /&gt;
*escdata&lt;br /&gt;
*escmetadata&lt;br /&gt;
*escstr&lt;br /&gt;
*evdata0&lt;br /&gt;
*evdata1&lt;br /&gt;
*evdata2&lt;br /&gt;
*evdata3&lt;br /&gt;
*evdata4&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The field lsldata.escdata contains the eye and head data as a matrix with a row for every parameter; there are 63 parameters in total. The field lsldata.escmetadata.channel lists the names of all parameters.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%Eye data&lt;br /&gt;
&lt;br /&gt;
RightEyePosX = 46;&lt;br /&gt;
RightEyePosY = 47;&lt;br /&gt;
RightEyePosZ = 48;&lt;br /&gt;
&lt;br /&gt;
xeye = lsldata.escdata.Data(46,:);       % right 46, left 32&lt;br /&gt;
yeye = lsldata.escdata.Data(47,:);       % right 47, left 33&lt;br /&gt;
zeye = lsldata.escdata.Data(48,:);       % right 48, left 34&lt;br /&gt;
&lt;br /&gt;
% Head data&lt;br /&gt;
HeadInertialVelXCal = 27;&lt;br /&gt;
HeadInertialVelYCal = 29;&lt;br /&gt;
HeadInertialVelZCal = 31;&lt;br /&gt;
&lt;br /&gt;
xh = lsldata.escdata.Data(27,:);       % calibrated torsion velocity data HEAD&lt;br /&gt;
yh = lsldata.escdata.Data(29,:);       % calibrated vertical velocity data HEAD&lt;br /&gt;
zh = lsldata.escdata.Data(31,:);       % calibrated horizontal velocity data HEAD&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=EyeSeeCam_SCI&amp;diff=4870</id>
		<title>EyeSeeCam SCI</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=EyeSeeCam_SCI&amp;diff=4870"/>
		<updated>2026-03-31T12:52:23Z</updated>

		<summary type="html">&lt;p&gt;Lof: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[File:EyeSeeCam_SCI2.png|thumb|EyeSeeCam SCI]]&lt;br /&gt;
==Description==&lt;br /&gt;
The EyeSeeCam SCI is a combined eye tracker and head movement tracker. The head tracking is done with an IMU and the eye tracking with cameras and software that track the pupils of both eyes. The tracking data can be read out directly in Matlab. When a dedicated computer is used to read out the EyeSeeCam, the data is converted to an LSL stream.&lt;br /&gt;
&lt;br /&gt;
The EyeSeeCam SCI can record at 250 and 500 Hz; we typically record at 500 Hz.&lt;br /&gt;
&lt;br /&gt;
==Setup==&lt;br /&gt;
&lt;br /&gt;
* The EyeSeeCam is connected to a NUC with two USB-C cables.&lt;br /&gt;
* The NUC is connected via ethernet with a Mac Mini.&lt;br /&gt;
* The Mac Mini runs an EyeSeeCam SCI program.&lt;br /&gt;
* The Mac Mini runs an LSL server for data transfer to a Windows computer.&lt;br /&gt;
* The Windows computer runs Matlab and connects via an LSL stream to the LSL server on the Mac Mini.&lt;br /&gt;
&lt;br /&gt;
==Startup Instructions==&lt;br /&gt;
&lt;br /&gt;
* Connect the EyeSeeCam to the NUC&lt;br /&gt;
* Remove both lens covers of the EyeSeeCam&lt;br /&gt;
* Start the NUC&lt;br /&gt;
* Start the Mac Mini&lt;br /&gt;
* Login to EyeSeeCam with &#039;Gimbal&#039;&lt;br /&gt;
* Start the LSL script&lt;br /&gt;
* Start the EyeSeeCam SCI program&lt;br /&gt;
* Start a Matlab program (e.g. EG_program.m)&lt;br /&gt;
* Press the &amp;lt;begin&amp;gt; button (in the top right corner)&lt;br /&gt;
* Let the subject put on the EyeSeeCam&lt;br /&gt;
* Press &amp;lt;prepare&amp;gt; (shows the traces)&lt;br /&gt;
* Press &amp;lt;start&amp;gt; (starts collecting data)&lt;br /&gt;
* Start the experiment&lt;br /&gt;
&lt;br /&gt;
==Construction of a gaze trace==&lt;br /&gt;
The data from EyeSeeCam is recorded in different coordinate systems than the double polar system that we use to present our targets. The head and eye tracking data each have their own coordinate system. To accurately analyze our data we have to transform these coordinate systems to double polar coordinates in the lab frame. In order to relate the EyeSeeCam coordinates to the lab coordinates, we first need to record a starting gaze position for each trial. This means that during the recording, the subject has to look at the center speaker (0,0) when the trial starts. &lt;br /&gt;
&lt;br /&gt;
From the EyeSeeCam data we can construct a head movement trace and an eye movement trace. These can be combined to create a gaze trace.&lt;br /&gt;
&lt;br /&gt;
In Matlab we can use the class &#039;EyeSeeCamSci_Trace&#039; for calculating head, eye and gaze traces in RAS, HVF and DP coordinates.&lt;br /&gt;
&lt;br /&gt;
===Defining a coordinate system===&lt;br /&gt;
&lt;br /&gt;
For the purpose of interpreting the EyeSeeCam data, we define the starting direction as the direction in which the subject looks at the beginning of each trial. The data recording should start when the subject is looking in this direction.&amp;lt;br&amp;gt;&lt;br /&gt;
The X, Y and Z of the EyeSeeCam data are in terms of RAS coordinates:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
X = anterior&lt;br /&gt;
Y = -right&lt;br /&gt;
Z = superior&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
* When using &#039;EyeSeeCamSci_Trace&#039;, head and gaze traces in RAS coordinates are relative to the starting position of the subject.&lt;br /&gt;
* The HVF and DP coordinates of the eye trace are only valid when the head stays in the starting position.&lt;br /&gt;
&lt;br /&gt;
===Head movement===&lt;br /&gt;
EyeSeeCam_Sci provides angular velocities for rotations around the X, Y, and Z axes. For our analyses we need position data, so we have to calculate it from these angular velocities.&lt;br /&gt;
 &lt;br /&gt;
The head rotation axis X, Y and Z are defined with respect to the EyeSeeCam itself. &lt;br /&gt;
&lt;br /&gt;
*&#039;&#039;The EyeSeeCam data is &#039;&#039;&#039;not&#039;&#039;&#039; in the lab frame!&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
In order to create a head movement trace we have to calculate rotation matrices (delta_Rx, delta_Ry and delta_Rz) in EyeSeeCam coordinates for every time step. &lt;br /&gt;
&lt;br /&gt;
Since the time steps are about 2 ms (or 4 ms), the rotation per time step is small. Therefore the order in which delta_Rx, delta_Ry and delta_Rz are multiplied is not important, and we calculate delta_R(t) in EyeSeeCam coordinates as &lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
delta_R(t) = delta_Rx * delta_Ry * delta_Rz.&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
To get the total rotation matrix in lab coordinates at time t we have to iterate over all time steps: &lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
R(t=0) = identity matrix&lt;br /&gt;
for t = 0 to t = tmax&lt;br /&gt;
   R(t) = R(t-delta_t) * delta_R(t)&lt;br /&gt;
end &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
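The lab's analysis code is MATLAB (&#039;EyeSeeCamSci_Trace&#039;); as an illustrative sketch only, the integration loop above can be written in Python/NumPy, with wx, wy, wz standing in for the calibrated head velocity channels (these names are assumptions, not the recorded channel names):&lt;br /&gt;

```python
import numpy as np

def small_rotation(wx, wy, wz, dt):
    """Rotation matrix for one time step. The per-step angles wx*dt,
    wy*dt, wz*dt are assumed small, so the multiplication order of
    the axis rotations does not matter."""
    ax, ay, az = wx * dt, wy * dt, wz * dt  # angles in radians
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(ax), -np.sin(ax)],
                   [0, np.sin(ax),  np.cos(ax)]])
    Ry = np.array([[ np.cos(ay), 0, np.sin(ay)],
                   [0, 1, 0],
                   [-np.sin(ay), 0, np.cos(ay)]])
    Rz = np.array([[np.cos(az), -np.sin(az), 0],
                   [np.sin(az),  np.cos(az), 0],
                   [0, 0, 1]])
    return Rx @ Ry @ Rz

def integrate_head(wx, wy, wz, dt):
    """Accumulate the per-step rotations into the total head
    rotation R(t), starting from the identity matrix."""
    R = np.eye(3)
    traces = []
    for vx, vy, vz in zip(wx, wy, wz):
        R = R @ small_rotation(vx, vy, vz, dt)
        traces.append(R)
    return traces
```

For example, a constant rotation speed of pi/2 rad/s about the Z axis integrated over one second rotates the forward direction [1, 0, 0] to approximately [0, 1, 0].&lt;br /&gt;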
&lt;br /&gt;
===Eye movement===&lt;br /&gt;
&lt;br /&gt;
The eye movements are given by rotations in the same coordinate system as the head movements.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
X = anterior (torsion of the pupil in the gaze direction)&lt;br /&gt;
Y = -right&lt;br /&gt;
Z = superior&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The torsion is not important for the gaze direction and is therefore excluded from the calculation.&lt;br /&gt;
&lt;br /&gt;
Azimuth can be defined as a rotation about the Z axis and elevation as a counter-rotation about the Y axis.&lt;br /&gt;
&lt;br /&gt;
Since the eye gaze is given as rotation angles in OCS coordinates, and the eye and head tracker coordinates are fixed relative to each other, we can easily transform OCS coordinates to EyeSeeCam coordinates. We can then create rotation matrices in EyeSeeCam coordinates for the azimuth and elevation of the gaze. Since torsion is excluded, the rotation matrices for azimuth and elevation can be multiplied in any order to get the eye rotation matrix in EyeSeeCam coordinates.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eye_Rz(t) = Rz(eye_azimuth(t))&lt;br /&gt;
eye_Ry(t) = Ry(-eye_elevation(t))&lt;br /&gt;
eye_R(t)  = eye_Ry(t) * eye_Rz(t)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
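The eye rotation above can be written out the same way. A Python/NumPy sketch (an illustrative translation, not the lab's MATLAB implementation), assuming azimuth and elevation arrive in degrees as in the analysis functions below:&lt;br /&gt;

```python
import numpy as np

def eye_rotation(azimuth_deg, elevation_deg):
    """Eye rotation matrix in EyeSeeCam coordinates: azimuth is a
    rotation about Z, elevation a counter-rotation about Y; torsion
    (rotation about X) is ignored, so Ry and Rz may be multiplied
    in either order."""
    az = np.radians(azimuth_deg)
    el = np.radians(-elevation_deg)  # counter-rotation about Y
    Rz = np.array([[np.cos(az), -np.sin(az), 0],
                   [np.sin(az),  np.cos(az), 0],
                   [0, 0, 1]])
    Ry = np.array([[ np.cos(el), 0, np.sin(el)],
                   [0, 1, 0],
                   [-np.sin(el), 0, np.cos(el)]])
    return Ry @ Rz
```

With this convention an azimuth of 90 degrees turns the anterior direction [1, 0, 0] into [0, 1, 0] (Y = -right), and an elevation of 90 degrees turns it into [0, 0, 1] (Z = superior).&lt;br /&gt;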
&lt;br /&gt;
===Gaze movement===&lt;br /&gt;
&lt;br /&gt;
The gaze movement is the combined head and eye movement.&lt;br /&gt;
To create the total rotation matrix that represents the &#039;gaze&#039; rotation in lab coordinates, we multiply the two rotation matrices for head and eye movement in the correct order. We have:&lt;br /&gt;
&lt;br /&gt;
*a rotation matrix for the head in lab coordinates&lt;br /&gt;
*a rotation matrix for the eyes in head coordinates&lt;br /&gt;
&lt;br /&gt;
The head rotation matrix (in lab coordinates) is multiplied by the eye rotation matrix (in head coordinates):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
R_gaze(t) = head_R(t) * eye_R(t)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Lastly, multiply R_gaze(t) with the starting gaze vector to get the gaze (in lab coordinates) at time t. We assume that the starting gaze is in the forward direction, which is the X direction in lab coordinates.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
startingGaze = [1; 0; 0]&lt;br /&gt;
gaze(t) = R_gaze(t) * startingGaze&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
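Putting the two steps together, a minimal Python/NumPy sketch (illustrative only; head_R and eye_R are assumed to be 3x3 rotation matrices computed as above, and the arcsin-based double-polar read-out is one common convention that may differ in sign or detail from the lab's exact definition):&lt;br /&gt;

```python
import numpy as np

def gaze_direction(head_R, eye_R):
    """Gaze unit vector in lab coordinates: rotate the forward
    starting gaze first by the eye-in-head rotation, then by the
    head-in-lab rotation."""
    starting_gaze = np.array([1.0, 0.0, 0.0])  # X = anterior
    return head_R @ eye_R @ starting_gaze

def to_double_polar(v):
    """One common double-polar read-out (an assumption, not the
    lab's exact definition): azimuth from the Y (leftward)
    component, elevation from the Z (superior) component, both in
    degrees. Sign conventions may differ per lab."""
    v = v / np.linalg.norm(v)
    azimuth = np.degrees(np.arcsin(v[1]))
    elevation = np.degrees(np.arcsin(v[2]))
    return azimuth, elevation
```

For example, with the head in the starting position (identity matrix) and an eye rotation of 90 degrees about Z, the gaze vector becomes [0, 1, 0], i.e. 90 degrees azimuth at 0 degrees elevation.&lt;br /&gt;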
&lt;br /&gt;
==Matlab programming==&lt;br /&gt;
&lt;br /&gt;
===Functions for converting EyeSeeCam data to double polar data===&lt;br /&gt;
There is one function for head movement, one for eye movement and one for gaze movement. The gaze movement combines head and eye movement.&lt;br /&gt;
&lt;br /&gt;
The functions take head rotation speeds and eye rotations (in degrees) as input and return objects containing the traces (head, eye and gaze, respectively) in double polar coordinates as output.&lt;br /&gt;
&lt;br /&gt;
* headTrace_DP = EyeSeeCamSciHeadRotationData2Trace_DP(Vx_deg, Vy_deg, Vz_deg)&lt;br /&gt;
* eyeTrace_DP  = EyeSeeCamSciEyeRotationData2Trace_DP(Zocs_deg, Yocs_deg)&lt;br /&gt;
* gazeTrace_DP = EyeSeeCamSciHeadAndEyeRotationData2Trace_DP(Vx_deg, Vy_deg, Vz_deg, Zocs_deg, Yocs_deg)&lt;br /&gt;
&lt;br /&gt;
Vx, Vy and Vz are the head rotation speeds in EyeSeeCam coordinates.&lt;br /&gt;
Zocs and Yocs are the eye rotations in OCS coordinates.&lt;br /&gt;
&lt;br /&gt;
===EyeSeeCam data===&lt;br /&gt;
The recorded data is read from an LSL stream.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
% todo: LSL streaming parameters&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The lsldata is a struct and has the following fields:&lt;br /&gt;
*escdata&lt;br /&gt;
*escmetadata&lt;br /&gt;
*escstr&lt;br /&gt;
*evdata0&lt;br /&gt;
*evdata1&lt;br /&gt;
*evdata2&lt;br /&gt;
*evdata3&lt;br /&gt;
*evdata4&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The field lsldata.escdata contains the eye and head data as a matrix with a row for every parameter (63 parameters in total). The field lsldata.escmetadata.channel lists the names of all parameters.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
%Eye data&lt;br /&gt;
&lt;br /&gt;
RightEyePosX = 46;&lt;br /&gt;
RightEyePosY = 47;&lt;br /&gt;
RightEyePosZ = 48;&lt;br /&gt;
&lt;br /&gt;
xeye = lsldata.escdata.Data(46,:);       % right 46, left 32&lt;br /&gt;
yeye = lsldata.escdata.Data(47,:);       % right 47, left 33&lt;br /&gt;
zeye = lsldata.escdata.Data(48,:);       % right 48, left 34&lt;br /&gt;
&lt;br /&gt;
% Head data&lt;br /&gt;
HeadInertialVelXCal = 27;&lt;br /&gt;
HeadInertialVelYCal = 29;&lt;br /&gt;
HeadInertialVelZCal = 31;&lt;br /&gt;
&lt;br /&gt;
xh = lsldata.escdata.Data(27,:);       % calibrated torsion velocity data HEAD&lt;br /&gt;
yh = lsldata.escdata.Data(29,:);       % calibrated vertical velocity data HEAD&lt;br /&gt;
zh = lsldata.escdata.Data(31,:);       % calibrated horizontal velocity data HEAD&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=Common_lab_equipment&amp;diff=4869</id>
		<title>Common lab equipment</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=Common_lab_equipment&amp;diff=4869"/>
		<updated>2026-03-30T07:55:32Z</updated>

		<summary type="html">&lt;p&gt;Lof: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Common lab equipment is equipment that does not belong to a certain setup, but can instead be used in different settings.&lt;br /&gt;
&lt;br /&gt;
==Active speakers==&lt;br /&gt;
* [[Tannoy Reveal 402 specifications|Tannoy Reveal 402]]&lt;br /&gt;
&lt;br /&gt;
==Audio amplifiers==&lt;br /&gt;
*[[ECLER MPA 4-80 Power Amplifier Specifications|ECLER MPA 4-80 Power Amplifier]]&lt;br /&gt;
&lt;br /&gt;
==Audio interfaces==&lt;br /&gt;
*[[MOTU Ultra Lite MK3]]&lt;br /&gt;
*[[MOTU Ultra Lite MK5]] (2x)&lt;br /&gt;
&lt;br /&gt;
==Audiological Headphones==&lt;br /&gt;
*[[Sennheiser HDA280 specifications|Sennheiser HDA280]]&lt;br /&gt;
*[[Sennheiser HDA300 specifications|Sennheiser HDA300]]&lt;br /&gt;
*[[Radioear 3045 specifications|RadioEar 3045 (P4493)]]&lt;br /&gt;
*[[Telephonics TDH-39P specifications|Telephonics TDH-39P]]&lt;br /&gt;
*[[Peltor H7A]]&lt;br /&gt;
&lt;br /&gt;
==Audio recorders==&lt;br /&gt;
*[[Zoom F1 Field Recorder]]&lt;br /&gt;
==Audiometers==&lt;br /&gt;
*[[Interacoustics Audiometer AD229e]]&lt;br /&gt;
*[[Interacoustics Audiometer AD629]]&lt;br /&gt;
&lt;br /&gt;
==Headphones==&lt;br /&gt;
*[[Sennheiser HD600 specifications|Sennheiser HD600]]&lt;br /&gt;
*[[Sennheiser HD100 specifications|Sennheiser HD100]]&lt;br /&gt;
*[[Beyerdynamic DT 770 PRO 80 Ohm specifications|Beyerdynamic DT 770 PRO]]&lt;br /&gt;
*[[AKG K271 MKII specifications|AKG K271 MKII]]&lt;br /&gt;
&lt;br /&gt;
==Dummy head==&lt;br /&gt;
*[[Binaural Enthousiasts B1-E Dummy Head]]&lt;br /&gt;
*[[Binaural Dummy Head Kemar]]&lt;br /&gt;
&lt;br /&gt;
==EEG recorders==&lt;br /&gt;
*[[TSMI SAGA]]&lt;br /&gt;
*[[TSMI Mobita]]&lt;br /&gt;
&lt;br /&gt;
==Electronic Workshop Equipment==&lt;br /&gt;
*[[Stanford Research Amplifier SR560 specifications|Stanford Research Amplifier SR560]]&lt;br /&gt;
*[[Thurlby Thandar Instruments TGP110|TTI TGP110 Pulse generator]]&lt;br /&gt;
*[[Rigol DP932E|Rigol DP932E Programmable Power Supply]]&lt;br /&gt;
*[[GW Instek SFG-2104|GW Instek SFG-2104 Function generator]]&lt;br /&gt;
*[[National Instruments VirtualBench|NI Virtual Bench]]&lt;br /&gt;
*[[Delta E018-0.6D Power Supply]]&lt;br /&gt;
*[[Kern 440-47N Scales|Kern 440-47N Scales]]&lt;br /&gt;
&lt;br /&gt;
==Event recorders==&lt;br /&gt;
*[[Digital Event Recorder|2 channel digital event recorder]]&lt;br /&gt;
*[[Digital Event Recorder|4 channel digital event recorder]]&lt;br /&gt;
*[[Digital Event Recorder|8 channel digital event recorder]]&lt;br /&gt;
&lt;br /&gt;
==Eye trackers==&lt;br /&gt;
*[[Pupil Labs Core|Pupil Labs Core (2x)]]&lt;br /&gt;
*[[Pupil Labs HTC Vive addon]]&lt;br /&gt;
*[[SR Research Eyelink 1000 plus]]&lt;br /&gt;
*[[SR Research Eyelink II]]&lt;br /&gt;
*[[EyeSeeCam SCI]]&lt;br /&gt;
&lt;br /&gt;
==fNIRS system==&lt;br /&gt;
*[[Artinis Brite 24 NIRS|Artinis Brite 24]]&lt;br /&gt;
*[[Artinis Oxymon NIRS|Artinis Oxymon]]&lt;br /&gt;
&lt;br /&gt;
==Head coil devices==&lt;br /&gt;
* [[Head coil glasses Auditory Sphere Lab]]&lt;br /&gt;
* [[Head coil glasses Auditory Motion Lab]]&lt;br /&gt;
* [[Head coil glasses Auditory Perception Lab]]&lt;br /&gt;
&lt;br /&gt;
==Insert Earphones==&lt;br /&gt;
*[[Etymotic Research Tuberphone ER-2 specifications|Etymotic Tuberphone ER-2]]&lt;br /&gt;
*[[Etymotic Research Tuberphone ER-3C specifications|Etymotic Tuberphone ER-3C]]&lt;br /&gt;
&lt;br /&gt;
==Head trackers==&lt;br /&gt;
*[[EM Field Head Tracking System specifications|EMF Head Tracking System]]&lt;br /&gt;
*[[EyeSeeCam SCI]]&lt;br /&gt;
&lt;br /&gt;
==Insert ear microphone==&lt;br /&gt;
*[[Etymotic Research ER-7C specifications|Etymotic ER-7C probe microphone]]&lt;br /&gt;
&lt;br /&gt;
==LED controllers==&lt;br /&gt;
*[[DCN LED controller|DCN LED controller]]&lt;br /&gt;
*[[PLC LED controller specifications|PLC LED controller]]&lt;br /&gt;
&lt;br /&gt;
==Microphone amplifiers==&lt;br /&gt;
*[[Behringer Xenyx 302 USB specifications|Behringer Xenyx 302 USB]]&lt;br /&gt;
*[[DAP audio PRE-202 specifications|DAP audio PRE-202 (3x)]]&lt;br /&gt;
&lt;br /&gt;
==Microphones==&lt;br /&gt;
*[[Behringer ECM8000 specifications|Behringer ECM8000 condenser microphone]]&lt;br /&gt;
&lt;br /&gt;
==Oscilloscopes==&lt;br /&gt;
*[[Tektronix MSO 2024]]&lt;br /&gt;
*[[Tektronix TDS 2004]]&lt;br /&gt;
*[[Tektronix TDS 2004B]]&lt;br /&gt;
*[[Tektronix TDS 2012]]&lt;br /&gt;
*[[Tektronix TBS 1064]]&lt;br /&gt;
[https://www.tek.com/en/documents/primer/oscilloscope-basics Oscilloscope basics]&lt;br /&gt;
&lt;br /&gt;
==Other equipment==&lt;br /&gt;
*[[Beamer Optoma EH319USTi|Optoma Beamer]]&lt;br /&gt;
*[[Field Analyser NFA 400 specifications|Field Analyser NFA 400]]&lt;br /&gt;
*[[Leica Disto D510 specifications|Leica Disto D510 laser distance meter]]&lt;br /&gt;
*[[Manhattan Numeric Pad USB]] (6x)&lt;br /&gt;
*[[Waterpik WP-660 Waterflosser]]&lt;br /&gt;
*[[Screen Trigger Device specifications|Screen Trigger Device]]&lt;br /&gt;
&lt;br /&gt;
==Response buttons==&lt;br /&gt;
* [[Buttonbox for RZ6]]&lt;br /&gt;
* [[Passive handheld push button]]&lt;br /&gt;
* [[Active handheld push button (5V output)]]&lt;br /&gt;
* [[Debouncer for passive push button]]&lt;br /&gt;
&lt;br /&gt;
==Sound level meters==&lt;br /&gt;
*[[Brüel &amp;amp; Kjear model 2250 Light|Brüel &amp;amp; Kjaer model 2250 type 1]]&lt;br /&gt;
*[[Multicomp Pro MP780905 specifications|Multicomp Pro MP780905 type 2]]&lt;br /&gt;
&lt;br /&gt;
==Passive speakers==&lt;br /&gt;
* [[Cambridge Audio Minx Min 12 specifications|Cambridge Audio Minx Min 12]]&lt;br /&gt;
* [[Avantone Pro Mixcube specifications|Avantone Pro Mixcube]]&lt;br /&gt;
* [[JBL Control One]]&lt;br /&gt;
&lt;br /&gt;
==Sound level meter calibration devices==&lt;br /&gt;
*[[Aco Pacific model 521 specifications|Aco Pacific model 521]]&lt;br /&gt;
*[[Brüel &amp;amp; Kjaer Sound Calibrator model 4231 specifications|Brüel &amp;amp; Kjaer model 4231]]&lt;br /&gt;
&lt;br /&gt;
==TDT equipment==&lt;br /&gt;
*[[TDT RZ6|RZ6]]&lt;br /&gt;
*[[TDT Series 3|Series 3]]&lt;br /&gt;
&lt;br /&gt;
==VR Headsets==&lt;br /&gt;
*[[HTC VIVE]]&lt;br /&gt;
*[[HTC VIVE Cosmos]]&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=Guidelines_for_Research_Software_Management&amp;diff=4868</id>
		<title>Guidelines for Research Software Management</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=Guidelines_for_Research_Software_Management&amp;diff=4868"/>
		<updated>2025-11-27T08:53:25Z</updated>

		<summary type="html">&lt;p&gt;Lof: Created page with &amp;quot;=Introduction to FAIR research software=  One of the pillars of open science is data. The guiding principle for research data is to make it open where possible and closed where necessary. Much has been achieved in making this a reality over the last few years.  Research software on the other hand, has been somewhat forgotten. However, it can be an integral part of a publication. Therefore, it is vitally important that code associated with scientific publications, is as r...&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=Introduction to FAIR research software=&lt;br /&gt;
&lt;br /&gt;
One of the pillars of open science is data. The guiding principle for research data is to make it open where possible and closed where necessary. Much has been achieved in making this a reality over the last few years.&lt;br /&gt;
&lt;br /&gt;
Research software, on the other hand, has been somewhat forgotten. However, it can be an integral part of a publication. Therefore, it is vitally important that code associated with scientific publications is findable, accessible, interoperable and reusable, irrespective of whether it concerns a simple script or a complete library.&lt;br /&gt;
&lt;br /&gt;
But how can code be published transparently? The FAIR principles of software and data management have been adopted by the Radboud and other universities to guide best practice for researchers in this area.&lt;br /&gt;
&lt;br /&gt;
FAIR stands for Findable, Accessible, Interoperable, and Reusable.&lt;br /&gt;
&lt;br /&gt;
[https://www.rug.nl/digital-competence-centre/research-data/research-software-management/fair-research-software?lang=en FAIR homepage]&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=General_research_information&amp;diff=4867</id>
		<title>General research information</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=General_research_information&amp;diff=4867"/>
		<updated>2025-11-27T08:53:18Z</updated>

		<summary type="html">&lt;p&gt;Lof: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==General==&lt;br /&gt;
*[[Good Laboratory Practice (GLP)]]&lt;br /&gt;
*[[Ethics &amp;amp; Subjects]]&lt;br /&gt;
*[[Guidelines for Research Data Management]]&lt;br /&gt;
*[[Guidelines for Research Software Management]]&lt;br /&gt;
==Practical==&lt;br /&gt;
*[[QReserve | Booking a lab]]&lt;br /&gt;
*[[Gitlab | Code on Gitlab]] &lt;br /&gt;
*[[Issues Tracking]]&lt;br /&gt;
*[[Network]]&lt;br /&gt;
*[[Student supervision]]&lt;br /&gt;
*[[Tutorials]]&lt;br /&gt;
&lt;br /&gt;
==Other==&lt;br /&gt;
*[[COVID-19]]&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=Guidelines_for_Research_Program_Management&amp;diff=4866</id>
		<title>Guidelines for Research Program Management</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=Guidelines_for_Research_Program_Management&amp;diff=4866"/>
		<updated>2025-11-27T08:53:01Z</updated>

		<summary type="html">&lt;p&gt;Lof: Blanked the page&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=Guidelines_for_Research_Program_Management&amp;diff=4865</id>
		<title>Guidelines for Research Program Management</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=Guidelines_for_Research_Program_Management&amp;diff=4865"/>
		<updated>2025-11-27T08:52:22Z</updated>

		<summary type="html">&lt;p&gt;Lof: /* Introduction to FAIR research software */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=Introduction to FAIR research software=&lt;br /&gt;
&lt;br /&gt;
One of the pillars of open science is data. The guiding principle for research data is to make it open where possible and closed where necessary. Much has been achieved in making this a reality over the last few years.&lt;br /&gt;
&lt;br /&gt;
Research software, on the other hand, has been somewhat forgotten. However, it can be an integral part of a publication. Therefore, it is vitally important that code associated with scientific publications is findable, accessible, interoperable and reusable, irrespective of whether it concerns a simple script or a complete library.&lt;br /&gt;
&lt;br /&gt;
But how can code be published transparently? The FAIR principles of software and data management have been adopted by the Radboud and other universities to guide best practice for researchers in this area.&lt;br /&gt;
&lt;br /&gt;
FAIR stands for Findable, Accessible, Interoperable, and Reusable.&lt;br /&gt;
&lt;br /&gt;
[https://www.rug.nl/digital-competence-centre/research-data/research-software-management/fair-research-software?lang=en FAIR homepage]&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=Guidelines_for_Research_Program_Management&amp;diff=4864</id>
		<title>Guidelines for Research Program Management</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=Guidelines_for_Research_Program_Management&amp;diff=4864"/>
		<updated>2025-11-27T08:49:35Z</updated>

		<summary type="html">&lt;p&gt;Lof: Created page with &amp;quot;=Introduction to FAIR research software=  One of the pillars of open science is data. The guiding principle for research data is to make it open where possible and closed where necessary. Much has been achieved in making this a reality over the last few years.  Research software on the other hand, has been somewhat forgotten. However, it can be an integral part of a publication. Therefore, it is vitally important that code associated with scientific publications, is as r...&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=Introduction to FAIR research software=&lt;br /&gt;
&lt;br /&gt;
One of the pillars of open science is data. The guiding principle for research data is to make it open where possible and closed where necessary. Much has been achieved in making this a reality over the last few years.&lt;br /&gt;
&lt;br /&gt;
Research software, on the other hand, has been somewhat forgotten. However, it can be an integral part of a publication. Therefore, it is vitally important that code associated with scientific publications is findable, accessible, interoperable and reusable, irrespective of whether it concerns a simple script or a complete library.&lt;br /&gt;
&lt;br /&gt;
But how can code be published transparently? The FAIR principles of software and data management have been adopted by the Radboud and other universities to guide best practice for researchers in this area.&lt;br /&gt;
&lt;br /&gt;
[https://www.rug.nl/digital-competence-centre/research-data/research-software-management/fair-research-software?lang=en]&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=General_research_information&amp;diff=4863</id>
		<title>General research information</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=General_research_information&amp;diff=4863"/>
		<updated>2025-11-27T08:47:37Z</updated>

		<summary type="html">&lt;p&gt;Lof: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==General==&lt;br /&gt;
*[[Good Laboratory Practice (GLP)]]&lt;br /&gt;
*[[Ethics &amp;amp; Subjects]]&lt;br /&gt;
*[[Guidelines for Research Data Management]]&lt;br /&gt;
*[[Guidelines for Research Program Management]]&lt;br /&gt;
==Practical==&lt;br /&gt;
*[[QReserve | Booking a lab]]&lt;br /&gt;
*[[Gitlab | Code on Gitlab]] &lt;br /&gt;
*[[Issues Tracking]]&lt;br /&gt;
*[[Network]]&lt;br /&gt;
*[[Student supervision]]&lt;br /&gt;
*[[Tutorials]]&lt;br /&gt;
&lt;br /&gt;
==Other==&lt;br /&gt;
*[[COVID-19]]&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=How_to&amp;diff=4862</id>
		<title>How to</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=How_to&amp;diff=4862"/>
		<updated>2025-11-25T10:13:24Z</updated>

		<summary type="html">&lt;p&gt;Lof: /* Software */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Lab==&lt;br /&gt;
*How to make a lab [[QReserve|reservation]]?&lt;br /&gt;
*How to report an [[Issues Tracking|issue]]?&lt;br /&gt;
*How to work with [[Ethics &amp;amp; Subjects|subjects]]?&lt;br /&gt;
*How to use a [[Lab journal|lab journal]]?&lt;br /&gt;
&lt;br /&gt;
==Hardware==&lt;br /&gt;
*How to use a [[DCN LED controller|DCN LED controller]]?&lt;br /&gt;
*How to use the [[PLC LED controller specifications|PLC LED controller]]?&lt;br /&gt;
*How to use a [[Digital Event Recorder]]?&lt;br /&gt;
&lt;br /&gt;
==Software==&lt;br /&gt;
*How to use [[Gitlab]]?&lt;br /&gt;
*How to use [[BIOX]]?&lt;br /&gt;
*How to use [[LabStreamingLayer|LabStreamingLayer (LSL)]]?&lt;br /&gt;
*How to use [[Coordinate systems]] in experiments?&lt;br /&gt;
*How to use [[Units in Matlab]]?&lt;br /&gt;
*How to make a [[Programming a GUI|Graphical User Interface (GUI)]] in Matlab?&lt;br /&gt;
*How to make [[Ripple Sounds]]?&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=How_to&amp;diff=4861</id>
		<title>How to</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=How_to&amp;diff=4861"/>
		<updated>2025-11-25T10:12:38Z</updated>

		<summary type="html">&lt;p&gt;Lof: /* Software */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Lab==&lt;br /&gt;
*How to make a lab [[QReserve|reservation]]?&lt;br /&gt;
*How to report an [[Issues Tracking|issue]]?&lt;br /&gt;
*How to work with [[Ethics &amp;amp; Subjects|subjects]]?&lt;br /&gt;
*How to use a [[Lab journal|lab journal]]?&lt;br /&gt;
&lt;br /&gt;
==Hardware==&lt;br /&gt;
*How to use a [[DCN LED controller|DCN LED controller]]?&lt;br /&gt;
*How to use the [[PLC LED controller specifications|PLC LED controller]]?&lt;br /&gt;
*How to use a [[Digital Event Recorder]]?&lt;br /&gt;
&lt;br /&gt;
==Software==&lt;br /&gt;
*How to use [[Gitlab]]?&lt;br /&gt;
*How to use [[BIOX]]?&lt;br /&gt;
*How to use [[LabStreamingLayer|LabStreamingLayer (LSL)]]?&lt;br /&gt;
*How to use [[Coordinate systems]] in experiments?&lt;br /&gt;
*How to use [[Units in Matlab]]?&lt;br /&gt;
*How to make a [[Programming a GUI|Graphical User Interface (GUI)]] in Matlab?&lt;br /&gt;
*How to make [[ripple sounds]]?&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=EXP_programs&amp;diff=4860</id>
		<title>EXP programs</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=EXP_programs&amp;diff=4860"/>
		<updated>2025-08-22T12:44:56Z</updated>

		<summary type="html">&lt;p&gt;Lof: /* Introduction */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Introduction==&lt;br /&gt;
&lt;br /&gt;
!!!!SOME PROGRAMS ARE STILL IN DEVELOPMENT!!!!&lt;br /&gt;
&lt;br /&gt;
EXP programs are MATLAB programs for each auditory lab, based on the EXP framework. The EXP framework is a MATLAB toolbox developed by [[Ruurd Lof]]. The framework is a collection of classes and functions that form the basis for lab programs. The toolbox is mainly programmed in an object-oriented style and is modular with respect to the hardware that can be used.&lt;br /&gt;
&lt;br /&gt;
Each lab program is created by defining a gui and a few sub classes of certain classes in the EXP framework.&lt;br /&gt;
&lt;br /&gt;
The following programs are available:&lt;br /&gt;
*TL_Program for the test lab&lt;br /&gt;
*MO_Program for mobile applications (laptop with sound card and event recorder)&lt;br /&gt;
*PL_Program for the auditory perception lab (patient lab)&lt;br /&gt;
*EG_Program for the EEG/NIRS lab &lt;br /&gt;
*RA_Program for the auditory pursuit lab (robot arm)&lt;br /&gt;
*VC_Program for the vestibular chair lab&lt;br /&gt;
&lt;br /&gt;
==EXP framework==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The main classes in the EXP framework are:&lt;br /&gt;
&lt;br /&gt;
    EXP_programParameters&lt;br /&gt;
    EXP_hardwareSystems &lt;br /&gt;
    EXP_experiment&lt;br /&gt;
    EXP_recordingsHandler &lt;br /&gt;
    EXP_experimentPlayer&lt;br /&gt;
    EXP_guiHandler&lt;br /&gt;
&lt;br /&gt;
Other classes represent hardware that can be used in the experiment:&lt;br /&gt;
&lt;br /&gt;
    EXP_bioxSystem&lt;br /&gt;
    EXP_eventRecorder &lt;br /&gt;
    EXP_DCN_ledController&lt;br /&gt;
&lt;br /&gt;
Each lab program has a dedicated GUI that is always responsive and displays sounds, sound locations, LED locations and acquisition results such as head movements for every trial.&lt;br /&gt;
&lt;br /&gt;
In each program GUI you select an experiment file that can consist of several blocks with trials.&lt;br /&gt;
&lt;br /&gt;
The program outputs a .mat file for every trial with a struct called &#039;trialInfo&#039;.&lt;br /&gt;
&lt;br /&gt;
==Lab program example==&lt;br /&gt;
&lt;br /&gt;
An example of a Lab program is the TestLabProgram. All programs have the same basic structure:&lt;br /&gt;
&lt;br /&gt;
Six objects (TestLab class names start with TL_) are created in a fixed order and linked by passing references between them. &lt;br /&gt;
 &lt;br /&gt;
    programPar      = TL_programParameters(version);&lt;br /&gt;
    hardwareSystems = TL_hardwareSystems(programPar);                      &lt;br /&gt;
    experiment      = TL_experiment(programPar);             &lt;br /&gt;
    trialRecordings = TL_recordingsHandler(programPar, experiment, hardwareSystems);     &lt;br /&gt;
    player          = TL_experimentPlayer(hardwareSystems, trialRecordings, programPar, experiment);&lt;br /&gt;
    guihandler      = TL_guiHandler(player, programPar, experiment, hardwareSystems);&lt;br /&gt;
&lt;br /&gt;
Finally, a GUI is launched with a link to the guihandler object.&lt;br /&gt;
    &lt;br /&gt;
    TL_Gui(guihandler);&lt;br /&gt;
&lt;br /&gt;
===Program parameters===&lt;br /&gt;
The programPar object ....&lt;br /&gt;
&lt;br /&gt;
===Hardware systems===&lt;br /&gt;
The hardwareSystems object ....&lt;br /&gt;
&lt;br /&gt;
===Experiment===&lt;br /&gt;
The experiment object ....&lt;br /&gt;
&lt;br /&gt;
===Trial recordings===&lt;br /&gt;
The trialRecordings object ....&lt;br /&gt;
&lt;br /&gt;
===Experiment player===&lt;br /&gt;
The experimentPlayer object ....&lt;br /&gt;
&lt;br /&gt;
===Gui handler===&lt;br /&gt;
The guiHandler object ....&lt;br /&gt;
&lt;br /&gt;
===The GUI===&lt;br /&gt;
The Gui object ...&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=NIRS-EEG_technical_info&amp;diff=4859</id>
		<title>NIRS-EEG technical info</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=NIRS-EEG_technical_info&amp;diff=4859"/>
		<updated>2025-06-04T12:03:06Z</updated>

		<summary type="html">&lt;p&gt;Lof: /* LEDs */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[File:NIRS-EEG_lab.png|thumb|Lab floor plan]]&lt;br /&gt;
[[File:Refa64.png|thumb|Refa64]]&lt;br /&gt;
[[File:Mobita32.png|thumb|Mobita32]]&lt;br /&gt;
[[NIRS-EEG lab|back to NIRS-EEG]]&lt;br /&gt;
== Booth ==&lt;br /&gt;
*Dimensions: LxWxH = 540x280x265cm&lt;br /&gt;
===Acoustics===&lt;br /&gt;
*Walls: pyramid type acoustic foam&lt;br /&gt;
*Floor: anti-fatigue rubber floor mat with holes&lt;br /&gt;
see [[Acoustic Materials]]&lt;br /&gt;
&lt;br /&gt;
== Rack ==&lt;br /&gt;
&lt;br /&gt;
*[[Tektronix TDS 2012|Oscilloscope]]&lt;br /&gt;
*[[8 channel digital event recorder|Digital event recorder (LSLDER04)]]&lt;br /&gt;
&lt;br /&gt;
===TDT===&lt;br /&gt;
* 1 RZ6  Multi-core I/O Processor (DSP)&lt;br /&gt;
* 2 PM2R multiplexer&lt;br /&gt;
* 1 RP2  Enhanced Real-Time Processor (DSP)&lt;br /&gt;
* 2 RA16 Medusa base station&lt;br /&gt;
&lt;br /&gt;
==Speakers==&lt;br /&gt;
A semi-circle of 31 [[Cambridge Audio Minx Min 12 specifications|Cambridge Audio Minx Min12 speakers]] is available for sound localization experiments:&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
|+ Speaker Position Table&lt;br /&gt;
|-&lt;br /&gt;
! Speaker position (degree) !! Device !! Channel !! Speaker position (degree) !! Device !! Channel&lt;br /&gt;
|-&lt;br /&gt;
|   ||   ||   || 0 || 0 || 0&lt;br /&gt;
|-&lt;br /&gt;
| -5 || 2 || 1 || 5 || 0 || 1&lt;br /&gt;
|-&lt;br /&gt;
| -10 || 2 || 2 || 10 || 0 || 2&lt;br /&gt;
|-&lt;br /&gt;
| -15 || 2 || 3 || 15 || 0 || 3&lt;br /&gt;
|-&lt;br /&gt;
| -20 || 2 || 4 || 20 || 0 || 4&lt;br /&gt;
|-&lt;br /&gt;
| -25 || 2 || 5 || 25 || 0 || 5&lt;br /&gt;
|-&lt;br /&gt;
| -30 || 2 || 6 || 30 || 0 || 6&lt;br /&gt;
|-&lt;br /&gt;
| -35 || 2 || 7 || 35 || 0 || 7&lt;br /&gt;
|-&lt;br /&gt;
| -40 || 2 || 8 || 40 || 0 || 8&lt;br /&gt;
|-&lt;br /&gt;
| -45 || 2 || 9 || 45 || 0 || 9&lt;br /&gt;
|-&lt;br /&gt;
| -50 || 2 || 10 || 50 || 0 || 10&lt;br /&gt;
|-&lt;br /&gt;
| -55 || 2 || 11 || 55 || 0 || 11&lt;br /&gt;
|-&lt;br /&gt;
| -60 || 2 || 12 || 60 || 0 || 12&lt;br /&gt;
|-&lt;br /&gt;
| -70 || 2 || 13 || 70 || 0 || 13&lt;br /&gt;
|-&lt;br /&gt;
| -80 || 2 || 14 || 80 || 0 || 14&lt;br /&gt;
|-&lt;br /&gt;
| -90 || 2 || 15 || 90 || 0 || 15&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==LEDs==&lt;br /&gt;
[[File:xxxxxxx.png|thumb|LED wiring scheme]] &lt;br /&gt;
%todo&lt;br /&gt;
&lt;br /&gt;
The system is triggered by the RZ6 output.&lt;br /&gt;
&lt;br /&gt;
===Parts===&lt;br /&gt;
*[[DCN LED controller]] (2x)&lt;br /&gt;
*LED mounting frames (32x)&lt;br /&gt;
*[[LED specifications|Red/Green LED’s]](32x)&lt;br /&gt;
*Mini-jack to mini-jack cables&lt;br /&gt;
&lt;br /&gt;
===LED positions===&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
|+ LED Position Table&lt;br /&gt;
|-&lt;br /&gt;
! LED position (degree) !! Device !! Channel !! LED position (degree) !! Device !! Channel&lt;br /&gt;
|-&lt;br /&gt;
|   ||   ||   || 0 || DCN-LED05 || 0&lt;br /&gt;
|-&lt;br /&gt;
| -5  || DCN-LED04 || 1 || 5 || DCN-LED05 || 1&lt;br /&gt;
|-&lt;br /&gt;
| -10 || DCN-LED04 || 2 || 10 || DCN-LED05|| 2&lt;br /&gt;
|-&lt;br /&gt;
| -15 || DCN-LED04 || 3 || 15 || DCN-LED05|| 3&lt;br /&gt;
|-&lt;br /&gt;
| -20 || DCN-LED04 || 4 || 20 || DCN-LED05 || 4&lt;br /&gt;
|-&lt;br /&gt;
| -25 || DCN-LED04 || 5 || 25 || DCN-LED05 || 5&lt;br /&gt;
|-&lt;br /&gt;
| -30 || DCN-LED04 || 6 || 30 || DCN-LED05 || 6&lt;br /&gt;
|-&lt;br /&gt;
| -35 || DCN-LED04 || 7 || 35 || DCN-LED05 || 7&lt;br /&gt;
|-&lt;br /&gt;
| -40 || DCN-LED04 || 8 || 40 || DCN-LED05 || 8&lt;br /&gt;
|-&lt;br /&gt;
| -45 || DCN-LED04 || 9 || 45 || DCN-LED05 || 9&lt;br /&gt;
|-&lt;br /&gt;
| -50 || DCN-LED04 || 10 || 50 || DCN-LED05 || 10&lt;br /&gt;
|-&lt;br /&gt;
| -55 || DCN-LED04 || 11 || 55 || DCN-LED05 || 11&lt;br /&gt;
|-&lt;br /&gt;
| -60 || DCN-LED04 || 12 || 60 || DCN-LED05 || 12&lt;br /&gt;
|-&lt;br /&gt;
| -70 || DCN-LED04 || 13 || 70 || DCN-LED05 || 13&lt;br /&gt;
|-&lt;br /&gt;
| -80 || DCN-LED04  || 14 || 80 || DCN-LED05 || 14&lt;br /&gt;
|-&lt;br /&gt;
| -90 || DCN-LED04 || 15 || 90 || DCN-LED05 || 15&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== EEG ==&lt;br /&gt;
* 32 channel water-based electrode system (TMSI, Mobita), including headcap(s)&lt;br /&gt;
* 72 channel REFA, including headcaps in 3 different sizes (TMSI)&lt;br /&gt;
* 16 channel REFA (TMSI)&lt;br /&gt;
* ring electrodes including separate caps for combined measurements with NIRS&lt;br /&gt;
* TMSI Polybench recording software&lt;br /&gt;
&lt;br /&gt;
== NIRS ==&lt;br /&gt;
* 2 x 24-channel systems (Oxymon, Artinis), each consisting of 8 transmitters and 4 receivers (split fibers)&lt;br /&gt;
* sampling rate up to 250 Hz&lt;br /&gt;
* 8 AD channels (according to Artinis, these are sampled at the same rate as all other channels; take care when sampling at low rates, and present your triggers for correspondingly longer durations)&lt;br /&gt;
* Oxysoft recording software&lt;br /&gt;
&lt;br /&gt;
== Disposables ==&lt;br /&gt;
Preparing subjects requires various disposables, such as alcohol, lotion, cotton pads, and electrolyte gel. See [[NIRS-EEG lab disposables]] for information about these items and where to order them when needed.&lt;br /&gt;
&lt;br /&gt;
== Kitchen ==&lt;br /&gt;
[[File:EEG_NIRS_kitchen.JPEG|thumb|EEG/NIRS kitchen]]&lt;br /&gt;
A small [[NIRS-EEG kitchen|kitchen]] is available for washing the hair of the subject and for cleaning caps and other equipment afterwards.&lt;br /&gt;
&lt;br /&gt;
== FIRST AID (EHBO) ==&lt;br /&gt;
A first-aid kit hangs on the wall above the [[NIRS-EEG kitchen|kitchen]], on the right.&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
	<entry>
		<id>https://wiki.biophysics.science.ru.nl/index.php?title=NIRS-EEG_technical_info&amp;diff=4858</id>
		<title>NIRS-EEG technical info</title>
		<link rel="alternate" type="text/html" href="https://wiki.biophysics.science.ru.nl/index.php?title=NIRS-EEG_technical_info&amp;diff=4858"/>
		<updated>2025-06-04T12:00:58Z</updated>

		<summary type="html">&lt;p&gt;Lof: /* LEDs */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[File:NIRS-EEG_lab.png|thumb|Lab floor plan]]&lt;br /&gt;
[[File:Refa64.png|thumb|Refa64]]&lt;br /&gt;
[[File:Mobita32.png|thumb|Mobita32]]&lt;br /&gt;
[[NIRS-EEG lab|back to NIRS-EEG]]&lt;br /&gt;
== Booth ==&lt;br /&gt;
*Dimensions: LxWxH = 540x280x265 cm&lt;br /&gt;
===Acoustics===&lt;br /&gt;
*Walls: pyramid type acoustic foam&lt;br /&gt;
*Floor: anti-fatigue rubber floor mat with holes&lt;br /&gt;
See [[Acoustic Materials]].&lt;br /&gt;
&lt;br /&gt;
== Rack ==&lt;br /&gt;
&lt;br /&gt;
*[[Tektronix TDS 2012|Oscilloscope]]&lt;br /&gt;
*[[8 channel digital event recorder|Digital event recorder (LSLDER04)]]&lt;br /&gt;
&lt;br /&gt;
===TDT===&lt;br /&gt;
* 1 RZ6  Multi-core I/O Processor (DSP)&lt;br /&gt;
* 2 PM2R multiplexers&lt;br /&gt;
* 1 RP2  Enhanced Real-Time Processor (DSP)&lt;br /&gt;
* 2 RA16 Medusa base stations&lt;br /&gt;
&lt;br /&gt;
==Speakers==&lt;br /&gt;
A semi-circle of 31 [[Cambridge Audio Minx Min 12 specifications|Cambridge Audio Minx Min12 speakers]] is available for sound localization experiments:&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
|+ Speaker Position Table&lt;br /&gt;
|-&lt;br /&gt;
! Speaker position (degree) !! Device !! Channel !! Speaker position (degree) !! Device !! Channel&lt;br /&gt;
|-&lt;br /&gt;
|   ||   ||   || 0 || 0 || 0&lt;br /&gt;
|-&lt;br /&gt;
| -5 || 2 || 1 || 5 || 0 || 1&lt;br /&gt;
|-&lt;br /&gt;
| -10 || 2 || 2 || 10 || 0 || 2&lt;br /&gt;
|-&lt;br /&gt;
| -15 || 2 || 3 || 15 || 0 || 3&lt;br /&gt;
|-&lt;br /&gt;
| -20 || 2 || 4 || 20 || 0 || 4&lt;br /&gt;
|-&lt;br /&gt;
| -25 || 2 || 5 || 25 || 0 || 5&lt;br /&gt;
|-&lt;br /&gt;
| -30 || 2 || 6 || 30 || 0 || 6&lt;br /&gt;
|-&lt;br /&gt;
| -35 || 2 || 7 || 35 || 0 || 7&lt;br /&gt;
|-&lt;br /&gt;
| -40 || 2 || 8 || 40 || 0 || 8&lt;br /&gt;
|-&lt;br /&gt;
| -45 || 2 || 9 || 45 || 0 || 9&lt;br /&gt;
|-&lt;br /&gt;
| -50 || 2 || 10 || 50 || 0 || 10&lt;br /&gt;
|-&lt;br /&gt;
| -55 || 2 || 11 || 55 || 0 || 11&lt;br /&gt;
|-&lt;br /&gt;
| -60 || 2 || 12 || 60 || 0 || 12&lt;br /&gt;
|-&lt;br /&gt;
| -70 || 2 || 13 || 70 || 0 || 13&lt;br /&gt;
|-&lt;br /&gt;
| -80 || 2 || 14 || 80 || 0 || 14&lt;br /&gt;
|-&lt;br /&gt;
| -90 || 2 || 15 || 90 || 0 || 15&lt;br /&gt;
|}&lt;br /&gt;
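The mapping in the Speaker Position Table can be sketched as a small lookup helper. This is a minimal illustration only: the function name is hypothetical, and reading the "Device" column as a multiplexer index is an assumption based on the rack list above, not something the page states.

```python
# Minimal sketch of the Speaker Position Table above: map a speaker
# azimuth in degrees to the (device, channel) pair the table lists.
# ASSUMPTION: "Device" is interpreted as a multiplexer index; the
# function name `speaker_channel` is hypothetical.

def speaker_channel(azimuth: int) -> tuple[int, int]:
    """Return (device, channel) for the speaker at `azimuth` degrees."""
    valid = set(range(-60, 65, 5)) | {-90, -80, -70, 70, 80, 90}
    if azimuth not in valid:
        raise ValueError(f"no speaker at {azimuth} degrees")
    device = 0 if azimuth >= 0 else 2  # positive side on device 0, negative on device 2
    mag = abs(azimuth)
    # Channels follow the table: 5-degree steps up to 60, then 10-degree steps.
    channel = 12 + (mag - 60) // 10 if mag > 60 else mag // 5
    return device, channel

# speaker_channel(0)   -> (0, 0)
# speaker_channel(-35) -> (2, 7)
# speaker_channel(80)  -> (0, 14)
```

The same degree-to-channel pattern applies to the LED Position Table below, with the device column replaced by the DCN-LED04/DCN-LED05 controllers.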
&lt;br /&gt;
==LEDs==&lt;br /&gt;
Standard red and green LEDs are mounted in the centers of the speakers:&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
|+ LED Position Table&lt;br /&gt;
|-&lt;br /&gt;
! LED position (degree) !! Device !! Channel !! LED position (degree) !! Device !! Channel&lt;br /&gt;
|-&lt;br /&gt;
|   ||   ||   || 0 || DCN-LED05 || 0&lt;br /&gt;
|-&lt;br /&gt;
| -5  || DCN-LED04 || 1 || 5 || DCN-LED05 || 1&lt;br /&gt;
|-&lt;br /&gt;
| -10 || DCN-LED04 || 2 || 10 || DCN-LED05 || 2&lt;br /&gt;
|-&lt;br /&gt;
| -15 || DCN-LED04 || 3 || 15 || DCN-LED05 || 3&lt;br /&gt;
|-&lt;br /&gt;
| -20 || DCN-LED04 || 4 || 20 || DCN-LED05 || 4&lt;br /&gt;
|-&lt;br /&gt;
| -25 || DCN-LED04 || 5 || 25 || DCN-LED05 || 5&lt;br /&gt;
|-&lt;br /&gt;
| -30 || DCN-LED04 || 6 || 30 || DCN-LED05 || 6&lt;br /&gt;
|-&lt;br /&gt;
| -35 || DCN-LED04 || 7 || 35 || DCN-LED05 || 7&lt;br /&gt;
|-&lt;br /&gt;
| -40 || DCN-LED04 || 8 || 40 || DCN-LED05 || 8&lt;br /&gt;
|-&lt;br /&gt;
| -45 || DCN-LED04 || 9 || 45 || DCN-LED05 || 9&lt;br /&gt;
|-&lt;br /&gt;
| -50 || DCN-LED04 || 10 || 50 || DCN-LED05 || 10&lt;br /&gt;
|-&lt;br /&gt;
| -55 || DCN-LED04 || 11 || 55 || DCN-LED05 || 11&lt;br /&gt;
|-&lt;br /&gt;
| -60 || DCN-LED04 || 12 || 60 || DCN-LED05 || 12&lt;br /&gt;
|-&lt;br /&gt;
| -70 || DCN-LED04 || 13 || 70 || DCN-LED05 || 13&lt;br /&gt;
|-&lt;br /&gt;
| -80 || DCN-LED04  || 14 || 80 || DCN-LED05 || 14&lt;br /&gt;
|-&lt;br /&gt;
| -90 || DCN-LED04 || 15 || 90 || DCN-LED05 || 15&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== EEG ==&lt;br /&gt;
* 32 channel water-based electrode system (TMSI, Mobita), including headcap(s)&lt;br /&gt;
* 72 channel REFA, including headcaps in 3 different sizes (TMSI)&lt;br /&gt;
* 16 channel REFA (TMSI)&lt;br /&gt;
* ring electrodes including separate caps for combined measurements with NIRS&lt;br /&gt;
* TMSI Polybench recording software&lt;br /&gt;
&lt;br /&gt;
== NIRS ==&lt;br /&gt;
* 2 x 24-channel systems (Oxymon, Artinis), each consisting of 8 transmitters and 4 receivers (split fibers)&lt;br /&gt;
* sampling rate up to 250 Hz&lt;br /&gt;
* 8 AD channels (according to Artinis, these are sampled at the same rate as all other channels; take care when sampling at low rates, and present your triggers for correspondingly longer durations)&lt;br /&gt;
* Oxysoft recording software&lt;br /&gt;
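The note on the AD channels can be turned into a quick rule of thumb: a trigger pulse must outlast the sample period to be captured reliably. A minimal sketch, in which the two-sample safety margin and the function name are assumptions rather than an Artinis recommendation:

```python
# Sketch of the trigger-duration rule implied by the AD-channel note:
# since triggers are sampled at the same rate as the optical channels,
# a pulse must span at least one sample period to be seen at all.
# ASSUMPTION: the two-sample default margin is illustrative only.

def min_trigger_duration_ms(sampling_rate_hz: float, margin_samples: int = 2) -> float:
    """Return a safe minimum trigger pulse duration in milliseconds."""
    return 1000.0 * margin_samples / sampling_rate_hz

# min_trigger_duration_ms(250) -> 8.0   (ms, at the maximum 250 Hz rate)
# min_trigger_duration_ms(10)  -> 200.0 (ms, at a low 10 Hz rate)
```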
&lt;br /&gt;
== Disposables ==&lt;br /&gt;
Preparing subjects requires various disposables, such as alcohol, lotion, cotton pads, and electrolyte gel. See [[NIRS-EEG lab disposables]] for information about these items and where to order them when needed.&lt;br /&gt;
&lt;br /&gt;
== Kitchen ==&lt;br /&gt;
[[File:EEG_NIRS_kitchen.JPEG|thumb|EEG/NIRS kitchen]]&lt;br /&gt;
A small [[NIRS-EEG kitchen|kitchen]] is available for washing the hair of the subject and for cleaning caps and other equipment afterwards.&lt;br /&gt;
&lt;br /&gt;
== FIRST AID (EHBO) ==&lt;br /&gt;
A first-aid kit hangs on the wall above the [[NIRS-EEG kitchen|kitchen]], on the right.&lt;/div&gt;</summary>
		<author><name>Lof</name></author>
	</entry>
</feed>