Inertial Measurement Units II

Gordon Wetzstein, Stanford University

EE 267 Virtual Reality, Lecture 10
stanford.edu/class/ee267/

Polynesian Migration
(image: wikipedia)

Lecture Overview

• short review of coordinate systems, tracking in flatland, and accelerometer-only tracking

• rotations with Euler angles, axis & angle, and quaternions; gimbal lock
• 6-DOF IMU sensor fusion with quaternions
• primary goal: track orientation of head or device

• inertial sensors are required to determine pitch, yaw, and roll

from lecture 2:

  v_clip = M_proj ⋅ M_view ⋅ M_model ⋅ v

  (vertex in clip space = projection matrix ⋅ view matrix ⋅ model matrix ⋅ vertex)

  M_view = R ⋅ T(−eye)        (rotation ⋅ translation)

Euler angles

  M_view = R ⋅ T(−eye)

  R = R_z(−θ_z) ⋅ R_x(−θ_x) ⋅ R_y(−θ_y)        (θ_z: roll, θ_x: pitch, θ_y: yaw)

• 2 important coordinate systems: body/sensor frame and world/inertial frame
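To make this composition concrete, here is a minimal C++ sketch (not from the slides) that builds R = R_z(−θ_z) ⋅ R_x(−θ_x) ⋅ R_y(−θ_y) from the three Euler angles; the Mat3 type and the function names are illustrative assumptions.

  #include <array>
  #include <cmath>

  // Minimal 3x3 matrix type (row-major); illustrative only.
  using Mat3 = std::array<std::array<double, 3>, 3>;

  Mat3 multiply(const Mat3& a, const Mat3& b) {
      Mat3 c{};
      for (int i = 0; i < 3; ++i)
          for (int j = 0; j < 3; ++j)
              for (int k = 0; k < 3; ++k)
                  c[i][j] += a[i][k] * b[k][j];
      return c;
  }

  // Elementary rotations about the x, y, and z axes (angles in radians).
  Mat3 Rx(double a) { return {{{1,0,0},{0,std::cos(a),-std::sin(a)},{0,std::sin(a),std::cos(a)}}}; }
  Mat3 Ry(double a) { return {{{std::cos(a),0,std::sin(a)},{0,1,0},{-std::sin(a),0,std::cos(a)}}}; }
  Mat3 Rz(double a) { return {{{std::cos(a),-std::sin(a),0},{std::sin(a),std::cos(a),0},{0,0,1}}}; }

  // R = Rz(-thetaZ) * Rx(-thetaX) * Ry(-thetaY)   (roll, pitch, yaw)
  Mat3 eulerToR(double thetaX, double thetaY, double thetaZ) {
      return multiply(Rz(-thetaZ), multiply(Rx(-thetaX), Ry(-thetaY)));
  }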

Gyro Integration aka Dead Reckoning

• from gyro measurements to orientation – use Taylor expansion

  θ(t + Δt) ≈ θ(t) + (∂θ/∂t)(t) ⋅ Δt + ε,        ε ∼ O(Δt²)

  have: the angle θ(t) at the last time step and the gyro measurement (angular velocity) ω = ∂θ/∂t
  want: the angle θ(t + Δt) at the current time step; ε is the approximation error

Orientation Tracking in Flatland

• problem: track 1 angle in 2D space
• sensors: 1 gyro, 2-axis accelerometer (VRduino)
• sensor fusion with a complementary filter, i.e. linear interpolation:

  θ^(t) = α ⋅ (θ^(t−1) + ω̃ Δt) + (1 − α) ⋅ atan2(ã_x, ã_y)

• no drift, no noise!
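A minimal sketch of this flatland complementary filter in C++ (not part of the slides; the function name complementaryFilter2D is an illustrative assumption, and the gyro and accelerometer readings are assumed to be in rad/s and in the sensor frame):

  #include <cmath>

  // One filter update for a single 2D angle (radians).
  // theta_prev : angle estimate from the previous time step
  // omega      : gyro reading (angular velocity, rad/s)
  // ax, ay     : 2-axis accelerometer reading (gravity direction in sensor frame)
  // dt         : time step in seconds
  // alpha      : blend factor, 0 <= alpha <= 1 (close to 1 trusts the gyro more)
  double complementaryFilter2D(double theta_prev, double omega,
                               double ax, double ay,
                               double dt, double alpha) {
      double theta_gyro  = theta_prev + omega * dt;   // dead reckoning (gyro integration)
      double theta_accel = std::atan2(ax, ay);        // absolute tilt from the accelerometer
      return alpha * theta_gyro + (1.0 - alpha) * theta_accel;
  }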

Tilt from Accelerometer

• assuming acceleration points up (i.e. no external forces), we can compute the tilt (i.e. pitch and roll) from a 3-axis accelerometer

  â = ã / ‖ã‖ = R ⋅ (0, 1, 0)^T = R_z(−θ_z) ⋅ R_x(−θ_x) ⋅ R_y(−θ_y) ⋅ (0, 1, 0)^T
    = ( −cos(−θ_x) sin(−θ_z),  cos(−θ_x) cos(−θ_z),  sin(−θ_x) )^T

  θ_x = −atan2( â_z, sign(â_y) ⋅ √(â_x² + â_y²) )
  θ_z = −atan2( −â_x, â_y )        (both in rad)
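A hedged C++ sketch of these two formulas (the name tiltFromAccel is an illustrative assumption; the accelerometer vector is assumed to be normalized before the call):

  #include <cmath>

  // Pitch (thetaX) and roll (thetaZ) in radians from a normalized
  // 3-axis accelerometer reading (ax, ay, az), assuming it measures gravity only.
  void tiltFromAccel(double ax, double ay, double az,
                     double& thetaX, double& thetaZ) {
      double s = (ay >= 0.0) ? 1.0 : -1.0;                        // sign(ay)
      thetaX = -std::atan2(az, s * std::sqrt(ax * ax + ay * ay)); // pitch
      thetaZ = -std::atan2(-ax, ay);                              // roll
  }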

Euler Angles and Gimbal Lock

• so far we have represented head rotations with Euler angles: 3 rotation angles around the coordinate axes, applied in a specific sequence
• problematic when interpolating between rotations in keyframes (in computer animation) or when integrating → singularities

Gimbal Lock

The Guerrilla CG Project, The Euler (gimbal lock) Explained – see: https://www.youtube.com/watch?v=zc8b2Jo7mno

Rotations with Axis-Angle Representation and Quaternions

Rotations with Axis and Angle Representation

• solution to gimbal lock: use axis and angle representation for rotation!
• simultaneous rotation around a normalized vector v by angle θ
• no “order” of rotation, all at once around that vector

Quaternions

• think about quaternions as an extension of complex numbers to having 3 (different) imaginary numbers or fundamental quaternion units i, j, k

  q = q_w + i⋅q_x + j⋅q_y + k⋅q_z

  i² = j² = k² = ijk = −1,   i ≠ j ≠ k
  ij = −ji = k,   jk = −kj = i,   ki = −ik = j

• quaternion algebra is well-defined and gives us a powerful tool to work with rotations in axis-angle representation in practice

Quaternions

• axis-angle to quaternion (need normalized axis v)

  q(θ, v) = cos(θ/2) + i⋅v_x sin(θ/2) + j⋅v_y sin(θ/2) + k⋅v_z sin(θ/2)
          =   q_w    +     i⋅q_x      +     j⋅q_y      +     k⋅q_z

• valid rotation quaternions have unit length

  ‖q‖ = √(q_w² + q_x² + q_y² + q_z²) = 1
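A small C++ sketch of this conversion (illustrative only; the Quat struct and the function names are assumptions, and the axis is expected to be normalized):

  #include <cmath>

  // Quaternion q = w + i*x + j*y + k*z
  struct Quat {
      double w, x, y, z;
  };

  // Axis-angle (normalized axis v, angle theta in radians) to rotation quaternion.
  Quat fromAxisAngle(double theta, double vx, double vy, double vz) {
      double s = std::sin(0.5 * theta);
      return { std::cos(0.5 * theta), vx * s, vy * s, vz * s };
  }

  // Length of a quaternion; valid rotation quaternions have norm 1.
  double norm(const Quat& q) {
      return std::sqrt(q.w * q.w + q.x * q.x + q.y * q.y + q.z * q.z);
  }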

Two Types of Quaternions

• vector quaternions represent 3D points or vectors u = (u_x, u_y, u_z) and can have arbitrary length

  q_u = 0 + i⋅u_x + j⋅u_y + k⋅u_z

• valid rotation quaternions have unit length

  ‖q‖ = √(q_w² + q_x² + q_y² + q_z²) = 1

Quaternion Algebra

• quaternion addition:

  q + p = (q_w + p_w) + i(q_x + p_x) + j(q_y + p_y) + k(q_z + p_z)

• quaternion multiplication:

  q⋅p = (q_w + i⋅q_x + j⋅q_y + k⋅q_z)(p_w + i⋅p_x + j⋅p_y + k⋅p_z)
      = (q_w p_w − q_x p_x − q_y p_y − q_z p_z)
      + i (q_w p_x + q_x p_w + q_y p_z − q_z p_y)
      + j (q_w p_y − q_x p_z + q_y p_w + q_z p_x)
      + k (q_w p_z + q_x p_y − q_y p_x + q_z p_w)

Quaternion Algebra

• quaternion conjugate:  q* = q_w − i⋅q_x − j⋅q_y − k⋅q_z
• quaternion inverse:  q⁻¹ = q* / ‖q‖²
• rotation of vector quaternion q_u by q:  q′_u = q q_u q⁻¹
• inverse rotation:  q_u = q⁻¹ q′_u q
• successive rotations by q_1 then q_2:  q′_u = q_2 q_1 q_u q_1⁻¹ q_2⁻¹
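A minimal sketch of these operations in C++ (illustrative only; the Quat struct mirrors the one above, and rotate() implements q′_u = q q_u q⁻¹):

  #include <cmath>

  struct Quat {
      double w, x, y, z;   // q = w + i*x + j*y + k*z
  };

  // Hamilton product q*p, following the expansion above.
  Quat multiply(const Quat& q, const Quat& p) {
      return {
          q.w * p.w - q.x * p.x - q.y * p.y - q.z * p.z,
          q.w * p.x + q.x * p.w + q.y * p.z - q.z * p.y,
          q.w * p.y - q.x * p.z + q.y * p.w + q.z * p.x,
          q.w * p.z + q.x * p.y - q.y * p.x + q.z * p.w
      };
  }

  // Conjugate q* and inverse q^-1 = q* / |q|^2.
  Quat conjugate(const Quat& q) { return { q.w, -q.x, -q.y, -q.z }; }

  Quat inverse(const Quat& q) {
      double n2 = q.w * q.w + q.x * q.x + q.y * q.y + q.z * q.z;
      return { q.w / n2, -q.x / n2, -q.y / n2, -q.z / n2 };
  }

  // Rotate a vector quaternion q_u = (0, u) by q:  q'_u = q q_u q^-1.
  Quat rotate(const Quat& q, const Quat& qu) {
      return multiply(multiply(q, qu), inverse(q));
  }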

Quaternion Algebra

• detailed derivations and a reference of general quaternion algebra and of rotations with quaternions are in the course notes
• please read the course notes for more details!

Quaternion-based 6-DOF Orientation Tracking

Quaternion-based Orientation Tracking

1. 3-axis gyro integration

2. computing the tilt correction quaternion

3. applying a complementary filter

Gyro Integration with Quaternions

• start with the initial quaternion:  q^(0) = 1 + i⋅0 + j⋅0 + k⋅0

• convert the 3-axis gyro measurement ω̃ = (ω̃_x, ω̃_y, ω̃_z) to an instantaneous rotation quaternion as

  q_Δ = q( ‖ω̃‖⋅Δt, ω̃/‖ω̃‖ )        (angle, rotation axis – avoid division by 0!)

• integrate as  q_ω^(t+Δt) = q^(t) ⋅ q_Δ

Gyro Integration with Quaternions

• the integrated gyro rotation quaternion q_ω^(t+Δt) represents the rotation from body to world frame, i.e.

  q_u^(world) = q_ω^(t+Δt) ⋅ q_u^(body) ⋅ (q_ω^(t+Δt))⁻¹

• the last estimate q^(t) is either from gyro-only integration (for dead reckoning) or from the last complementary filter step
• integrate as  q_ω^(t+Δt) = q^(t) ⋅ q_Δ
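A hedged C++ sketch of this quaternion gyro integration step (the Quat type and multiply() mirror the earlier sketches; gyroIntegrate and the epsilon guard against division by zero are illustrative assumptions):

  #include <cmath>

  struct Quat { double w, x, y, z; };

  // Hamilton product, as defined above.
  Quat multiply(const Quat& q, const Quat& p) {
      return { q.w*p.w - q.x*p.x - q.y*p.y - q.z*p.z,
               q.w*p.x + q.x*p.w + q.y*p.z - q.z*p.y,
               q.w*p.y - q.x*p.z + q.y*p.w + q.z*p.x,
               q.w*p.z + q.x*p.y - q.y*p.x + q.z*p.w };
  }

  // One integration step: q^(t+dt) = q^(t) * q_delta, where q_delta is the
  // instantaneous rotation by |omega|*dt around the normalized axis omega/|omega|.
  Quat gyroIntegrate(const Quat& q_prev, double wx, double wy, double wz, double dt) {
      double len = std::sqrt(wx*wx + wy*wy + wz*wz);
      if (len < 1e-8) return q_prev;            // avoid division by 0 for tiny rotations
      double angle = len * dt;
      double s = std::sin(0.5 * angle) / len;   // sin(angle/2) * (omega / |omega|)
      Quat q_delta = { std::cos(0.5 * angle), wx * s, wy * s, wz * s };
      return multiply(q_prev, q_delta);
  }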

Tilt Correction with Quaternions

• assume the accelerometer measures the gravity vector in body (sensor) coordinates ã = (ã_x, ã_y, ã_z)

  q_a^(body) = 0 + i⋅ã_x + j⋅ã_y + k⋅ã_z

• transform the vector quaternion of ã into the current estimate of world space as

  q_a^(world) = q_ω^(t+Δt) ⋅ q_a^(body) ⋅ (q_ω^(t+Δt))⁻¹

• if the gyro quaternion is correct, then the accelerometer world vector points up, i.e.

  q_a^(world) = 0 + i⋅0 + j⋅9.81 + k⋅0

• but: the gyro quaternion likely includes drift, and accelerometer measurements are noisy and also include forces other than gravity, so it’s unlikely that the accelerometer world vector actually points up

Tilt Correction with Quaternions

• solution: compute the tilt correction quaternion that would rotate q_a^(world) into the up direction
• how? get the normalized vector part of the vector quaternion q_a^(world)

  v = ( q_a,x, q_a,y, q_a,z )^(world) / ‖( q_a,x, q_a,y, q_a,z )^(world)‖

• rotation angle φ and rotation axis n:

  (v_x, v_y, v_z) ⋅ (0, 1, 0)^T = cos(φ)   ⇒   φ = cos⁻¹(v_y)

  n = (v_x, v_y, v_z)^T × (0, 1, 0)^T = (−v_z, 0, v_x)^T

• tilt correction quaternion:  q_t = q( φ, n/‖n‖ )
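A hedged sketch of computing this tilt correction quaternion in C++ (tiltCorrection and the small epsilon guards are illustrative assumptions; the input is the vector part of q_a^(world)):

  #include <cmath>

  struct Quat { double w, x, y, z; };

  // Tilt correction quaternion q_t = q(phi, n/|n|) that would rotate the
  // world-space accelerometer direction (ax, ay, az) into the up direction (0,1,0).
  Quat tiltCorrection(double ax, double ay, double az) {
      double len = std::sqrt(ax*ax + ay*ay + az*az);
      if (len < 1e-8) return {1.0, 0.0, 0.0, 0.0};   // no usable measurement
      double vx = ax / len, vy = ay / len, vz = az / len;

      double phi = std::acos(vy);                    // angle to (0,1,0)
      double nx = -vz, ny = 0.0, nz = vx;            // axis n = v x (0,1,0)
      double nlen = std::sqrt(nx*nx + nz*nz);
      if (nlen < 1e-8) return {1.0, 0.0, 0.0, 0.0};  // v already (anti)parallel to up

      double s = std::sin(0.5 * phi) / nlen;         // sin(phi/2) * normalized axis
      return { std::cos(0.5 * phi), nx * s, ny * s, nz * s };
  }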

Complementary Filter with Quaternions

• complementary filter: rotate into gyro world space first, then rotate “a bit” into the direction of the tilt correction quaternion

  q_c^(t+Δt) = q( (1−α)⋅φ, n/‖n‖ ) ⋅ q_ω^(t+Δt),        0 ≤ α ≤ 1

• rotation of any vector quaternion is then  q_u^(world) = q_c^(t+Δt) ⋅ q_u^(body) ⋅ (q_c^(t+Δt))⁻¹
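Putting the pieces together, a hedged C++ sketch of one full quaternion complementary-filter update (filterUpdate and its argument names are assumptions; the helper functions from the earlier sketches are repeated so the snippet stands alone):

  #include <cmath>

  struct Quat { double w, x, y, z; };

  Quat multiply(const Quat& q, const Quat& p) {
      return { q.w*p.w - q.x*p.x - q.y*p.y - q.z*p.z,
               q.w*p.x + q.x*p.w + q.y*p.z - q.z*p.y,
               q.w*p.y - q.x*p.z + q.y*p.w + q.z*p.x,
               q.w*p.z + q.x*p.y - q.y*p.x + q.z*p.w };
  }

  Quat inverse(const Quat& q) {
      double n2 = q.w*q.w + q.x*q.x + q.y*q.y + q.z*q.z;
      return { q.w/n2, -q.x/n2, -q.y/n2, -q.z/n2 };
  }

  // q(angle, axis) for a normalized axis.
  Quat fromAxisAngle(double angle, double vx, double vy, double vz) {
      double s = std::sin(0.5 * angle);
      return { std::cos(0.5 * angle), vx*s, vy*s, vz*s };
  }

  // One complementary-filter step:
  //   q_prev : previous orientation estimate q^(t)
  //   w*     : gyro reading (rad/s), a* : accelerometer reading (body frame)
  //   dt     : time step in seconds, alpha : 0..1, close to 1 trusts the gyro
  Quat filterUpdate(Quat q_prev,
                    double wx, double wy, double wz,
                    double ax, double ay, double az,
                    double dt, double alpha) {
      // 1. gyro integration: q_omega = q_prev * q_delta
      Quat q_omega = q_prev;
      double wlen = std::sqrt(wx*wx + wy*wy + wz*wz);
      if (wlen > 1e-8)
          q_omega = multiply(q_prev, fromAxisAngle(wlen * dt, wx/wlen, wy/wlen, wz/wlen));

      // 2. rotate the accelerometer vector quaternion into world space
      Quat qa_body  = { 0.0, ax, ay, az };
      Quat qa_world = multiply(multiply(q_omega, qa_body), inverse(q_omega));

      // 3. tilt correction: rotate (1-alpha) of the way toward "up" = (0,1,0)
      double alen = std::sqrt(qa_world.x*qa_world.x + qa_world.y*qa_world.y + qa_world.z*qa_world.z);
      if (alen < 1e-8) return q_omega;                      // no usable accelerometer data
      double vx = qa_world.x/alen, vy = qa_world.y/alen, vz = qa_world.z/alen;
      double phi = std::acos(vy);                           // angle between v and up
      double nx = -vz, nz = vx;                             // axis n = v x (0,1,0)
      double nlen = std::sqrt(nx*nx + nz*nz);
      if (nlen < 1e-8) return q_omega;                      // v already (anti)parallel to up
      Quat q_tilt = fromAxisAngle((1.0 - alpha) * phi, nx/nlen, 0.0, nz/nlen);

      return multiply(q_tilt, q_omega);                     // q_c = q_tilt * q_omega
  }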

Integration into Graphics Pipeline

• compute q_c^(t+Δt) via the quaternion complementary filter first
• stream it from the microcontroller to the PC
• convert it to a 4x4 rotation matrix (see course notes):  q_c^(t+Δt) ⇒ R_c
• set the view matrix to M_view = R_c⁻¹ to rotate the world in front of the virtual camera
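One way this conversion could look in C++ (a sketch under the assumption of a unit quaternion and a column-major 4x4 matrix as in OpenGL; the function names are illustrative, and for a pure rotation the inverse is simply the transpose):

  #include <array>

  struct Quat { double w, x, y, z; };

  // Column-major 4x4 matrix, OpenGL-style.
  using Mat4 = std::array<double, 16>;

  // Standard conversion of a unit rotation quaternion to a 4x4 rotation matrix.
  Mat4 quatToMat4(const Quat& q) {
      double w = q.w, x = q.x, y = q.y, z = q.z;
      return {
          1 - 2*(y*y + z*z),  2*(x*y + w*z),      2*(x*z - w*y),      0,   // column 0
          2*(x*y - w*z),      1 - 2*(x*x + z*z),  2*(y*z + w*x),      0,   // column 1
          2*(x*z + w*y),      2*(y*z - w*x),      1 - 2*(x*x + y*y),  0,   // column 2
          0,                  0,                  0,                  1 }; // column 3
  }

  // For a pure rotation, the inverse is the transpose; use this as the view matrix.
  Mat4 transpose(const Mat4& m) {
      Mat4 t{};
      for (int c = 0; c < 4; ++c)
          for (int r = 0; r < 4; ++r)
              t[c*4 + r] = m[r*4 + c];
      return t;
  }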

Head and Neck Model

  (diagram: IMU on the head; neck length l_n, head length l_h; pitch and roll are around the base of the neck)

• why? there is not always positional tracking! this gives some motion parallax
• can be extended to the torso and to other kinematic constraints

• integrate into pipeline as

  M_view = T(0, −l_n, −l_h) ⋅ R ⋅ T(0, l_n, l_h) ⋅ T(−eye)

Must read: course notes on IMUs!
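Finally, a hedged sketch (not from the slides) of composing this head-and-neck view matrix in C++; Mat4, translate, the column-major convention, and headNeckViewMatrix are assumptions, and R is expected to be the 4x4 rotation from the tracked orientation:

  #include <array>

  // Column-major 4x4 matrix, OpenGL-style.
  using Mat4 = std::array<double, 16>;

  // Identity matrix with a translation (tx, ty, tz) in the last column.
  Mat4 translate(double tx, double ty, double tz) {
      Mat4 m = { 1,0,0,0,  0,1,0,0,  0,0,1,0,  tx,ty,tz,1 };
      return m;
  }

  Mat4 multiply(const Mat4& a, const Mat4& b) {
      Mat4 c{};
      for (int col = 0; col < 4; ++col)
          for (int row = 0; row < 4; ++row)
              for (int k = 0; k < 4; ++k)
                  c[col*4 + row] += a[k*4 + row] * b[col*4 + k];
      return c;
  }

  // M_view = T(0,-ln,-lh) * R * T(0,ln,lh) * T(-eye)
  Mat4 headNeckViewMatrix(const Mat4& R, double ln, double lh,
                          double eyeX, double eyeY, double eyeZ) {
      Mat4 m = multiply(translate(0, -ln, -lh), R);
      m = multiply(m, translate(0, ln, lh));
      m = multiply(m, translate(-eyeX, -eyeY, -eyeZ));
      return m;
  }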