EyeSeeCam SCI
Description
The EyeSeeCam SCI is a combined eye tracker and head movement tracker.
Construction of a gaze trace
From the EyeSeeCam data we can construct a head movement trace and an eye movement trace. These can be combined to create a gaze trace.
Head movement
The EyeSeeCam gives angular velocities for rotations around the X, Y and Z axes. To create a head movement trace we calculate rotation matrices (delta_Rx, delta_Ry and delta_Rz) in EyeSeeCam coordinates for every time step.
Since the time steps are about 2 ms, the rotation in each time step is small. Therefore the order in which delta_Rx, delta_Ry and delta_Rz are multiplied is not important, and we calculate delta_R(t) in EyeSeeCam coordinates by
delta_R(t) = delta_Rx * delta_Ry * delta_Rz.
To get R(t) in lab coordinates we accumulate the incremental rotations:
R(t) = R(t-delta_t) * delta_R(t).
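The elementary rotation matrices are not spelled out above. As a minimal sketch, assuming a right-handed convention for the EyeSeeCam axes and angular velocities in rad/s (both assumptions to verify), the incremental rotation for one time step could be built like this:

% Elementary rotation matrices around the X, Y and Z axes
% (a right-handed convention is assumed here; verify against the EyeSeeCam axes)
Rx = @(a) [1 0 0; 0 cos(a) -sin(a); 0 sin(a) cos(a)];
Ry = @(a) [cos(a) 0 sin(a); 0 1 0; -sin(a) 0 cos(a)];
Rz = @(a) [cos(a) -sin(a) 0; sin(a) cos(a) 0; 0 0 1];

% Example for one 2 ms time step with made-up angular velocities in rad/s
delta_t = 0.002;
wx = 0.5; wy = -0.2; wz = 0.1;
delta_R = Rx(wx*delta_t) * Ry(wy*delta_t) * Rz(wz*delta_t);

R = eye(3);            % R at t = 0: no rotation yet
R = R * delta_R;       % R(t) = R(t - delta_t) * delta_R(t)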
Eye movement
Gaze movement
The gaze movement is the combined head and eye movement. To create the total rotation matrix that represents the gaze rotation in lab coordinates, the two rotation matrices have to be multiplied in the correct order. We have:
- A rotation matrix for the head in lab coordinates.
- A rotation matrix for the eyes in head coordinates.

The correct order of multiplication is as follows:
Multiply the rotation matrix of the head in lab coordinates by the rotation matrix of the eyes in head coordinates; the product is the gaze rotation matrix in lab coordinates.
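As a short sketch of this step (R_head and R_eye are placeholder names for the two matrices described above):

% R_head: head rotation in lab coordinates, R_eye: eye rotation in head coordinates
R_head = eye(3);                 % placeholder values for illustration
R_eye  = eye(3);

% Gaze rotation in lab coordinates: head-in-lab first, then eye-in-head
R_gaze = R_head * R_eye;

% Rotate a reference gaze direction into lab coordinates
% (the choice of [1;0;0] as "straight ahead" is an assumption)
gaze_dir = R_gaze * [1; 0; 0];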
Matlab programming
Matlab interface
The recorded data is read from an LSL stream.
% todo: LSL streaming parameters
The variable lsldata is a struct with the following fields:
- escdata
- escmetadata
- escstr
- evdata0
- evdata1
- evdata2
- evdata3
- evdata4
The field lsldata.escdata contains the eye and head data as a matrix with a row for every parameter; there are 63 parameters in total. The field lsldata.escmetadata.channel lists the names of all parameters.
% Eye data
RightEyePosX = 46;
RightEyePosY = 47;
RightEyePosZ = 48;
xeye = lsldata.escdata.Data(46,:); % right 46, left 32
yeye = lsldata.escdata.Data(47,:); % right 47, left 33
zeye = lsldata.escdata.Data(48,:); % right 48, left 34

% Head data
HeadInertialVelXCal = 27;
HeadInertialVelYCal = 29;
HeadInertialVelZCal = 31;
xh = lsldata.escdata.Data(27,:); % calibrated torsion velocity data HEAD
yh = lsldata.escdata.Data(29,:); % calibrated vertical velocity data HEAD
zh = lsldata.escdata.Data(31,:); % calibrated horizontal velocity data HEAD
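Instead of hard-coding the row numbers, the channel names in lsldata.escmetadata.channel could be used to look up the indices. This is only a sketch; it assumes the channel names are stored as a cell array of strings in the same order as the rows of lsldata.escdata.Data:

% Look up a parameter's row index by its channel name
% (assumes lsldata.escmetadata.channel is a cell array of names, one per row)
names = lsldata.escmetadata.channel;
idx   = find(strcmp(names, 'RightEyePosX'));   % expected to give 46
xeye  = lsldata.escdata.Data(idx, :);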
Converting rotation speed
% make a coordinate object
mycoordinates_XYZ = coordinates_XYZ(startingGaze);

% initialize R_total to the 3D unit matrix
R_total = [1,0,0; 0,1,0; 0,0,1];

% loop through all time steps
for i = timeRange
    % determine the angle changes between time t(i)-delta_t/2 and t(i)+delta_t/2
    delta_angleX = Vx(i)*delta_t;
    delta_angleY = Vy(i)*delta_t;
    delta_angleZ = Vz(i)*delta_t;

    % create rotation matrices
    delta_Rx = Rx(delta_angleX);
    delta_Ry = Ry(delta_angleY);
    delta_Rz = Rz(delta_angleZ);

    % multiply rotation matrices (order is not important if angles are small enough)
    delta_R = delta_Rx * delta_Ry * delta_Rz;

    % determine new R_total
    % rotation in device coordinates, order: R_total * delta_R
    R_total = R_total * delta_R;

    % rotate the starting point with R_total
    newpoint = R_total * startingGaze;

    % add the new position to the list of coordinates
    mycoordinates_XYZ.add(newpoint);
end

% transform XYZ to RAS coordinates with the EyeSeeCam definition
mycoordinates_RAS = transform_XYZ2RAS(mycoordinates_XYZ, definition_XYZ2RAS_EyeSeeCam_Sci);
mycoordinates_DP = transform_RAS2DP(mycoordinates_RAS);
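The loop above expects Vx, Vy, Vz, delta_t, timeRange and startingGaze to be defined. A possible way to fill these from the head velocity channels read earlier, assuming the calibrated velocities are given in deg/s (the units are an assumption to verify), could look like this:

% Feed the head velocity channels xh, yh, zh from lsldata into the loop above
delta_t = 0.002;                 % sampling interval of about 2 ms
Vx = xh * pi/180;                % convert deg/s to rad/s (units are an assumption)
Vy = yh * pi/180;
Vz = zh * pi/180;
timeRange = 1:numel(Vx);         % loop over all samples
startingGaze = [1; 0; 0];        % assumed reference direction: looking straight ahead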