1269  Sensor Fusion Practice Exercises

Learning Objectives

After completing these exercises, you will be able to:

  • Implement Kalman filters for position tracking
  • Design complementary filters for IMU orientation
  • Build multi-sensor data quality assessments
  • Create hierarchical sensor fusion architectures

1269.1 Practice Exercises

Exercise 1: 1D Kalman Filter for Position Tracking

Objective: Implement a 1D Kalman filter to fuse noisy GPS position measurements with accelerometer-based velocity estimates.

Tasks:

  1. Generate synthetic data: true position following constant velocity motion, GPS measurements with sigma = 5 m noise, accelerometer with sigma = 0.5 m/s^2 noise
  2. Implement Kalman filter with state [position, velocity]: prediction step (x_k = F x_{k-1}, P_k = F P_{k-1} F^T + Q), update step (K = P H^T (H P H^T + R)^-1, x = x + K (z - H x))
  3. Tune parameters: process noise Q (model uncertainty), measurement noise R (sensor variance), initial state x_0 and covariance P_0
  4. Compare performance: plot true position, noisy GPS, Kalman estimate; calculate RMS error for each

Expected Outcome: Kalman filter achieves 2-3 m RMS error (40-50% better than GPS alone at 5 m). Understand that the filter smooths noise while tracking motion trends. Learn parameter tuning: too-small Q makes the filter over-trust its motion model and lag behind real motion, while too-large R makes it discount measurements and respond sluggishly.
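The loop above can be sketched in Python with NumPy. This is a minimal version that fuses GPS with a constant-velocity process model only (the accelerometer input is folded into the model rather than used as a second measurement); the 1 s sample interval, 2 m/s true velocity, and Q/P_0 values are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
dt = 1.0                                  # assumed GPS sample interval (s)
F = np.array([[1.0, dt], [0.0, 1.0]])     # constant-velocity state transition
H = np.array([[1.0, 0.0]])                # GPS observes position only
Q = np.diag([0.01, 0.01])                 # process noise (tuning knob)
R = np.array([[25.0]])                    # GPS variance: sigma = 5 m

# Step 1: synthetic truth (constant 2 m/s) and noisy GPS
n = 100
true_pos = 2.0 * dt * np.arange(n)
gps = true_pos + rng.normal(0.0, 5.0, n)

# Steps 2-3: filter loop with chosen x_0, P_0
x = np.array([0.0, 0.0])                  # initial state [position, velocity]
P = np.diag([100.0, 10.0])                # initial covariance
est = []
for z in gps:
    x = F @ x                             # predict: x_k = F x_{k-1}
    P = F @ P @ F.T + Q                   # predict: P_k = F P F^T + Q
    S = H @ P @ H.T + R                   # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
    x = x + K @ (np.array([z]) - H @ x)   # update with measurement residual
    P = (np.eye(2) - K @ H) @ P
    est.append(x[0])

# Step 4: RMS error comparison
rms_gps = float(np.sqrt(np.mean((gps - true_pos) ** 2)))
rms_kf = float(np.sqrt(np.mean((np.array(est) - true_pos) ** 2)))
print(f"GPS RMS: {rms_gps:.2f} m, Kalman RMS: {rms_kf:.2f} m")
```

Because the motion model is exactly right here, a small Q lets the filter smooth aggressively; with real maneuvering targets, Q must be larger so the filter keeps weighting new measurements.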

Exercise 2: Complementary Filter for IMU Orientation

Objective: Implement a complementary filter to fuse gyroscope and accelerometer data for drift-free orientation estimation.

Tasks:

  1. Collect IMU data from sensor (MPU6050 or phone): gyroscope (angular velocity w) and accelerometer (gravity direction) at 100 Hz
  2. Implement gyroscope-only integration: theta_k = theta_{k-1} + w * dt; observe drift over 60 seconds
  3. Implement accelerometer-only: theta = atan2(accel_y, accel_z); observe noise from vibrations
  4. Implement complementary filter: theta = alpha * (theta + w * dt) + (1-alpha) * theta_accel with alpha=0.98

Expected Outcome: Complementary filter achieves <1 deg steady-state error with rapid response to rotations. Understand frequency separation: gyroscope (high-pass) + accelerometer (low-pass). Learn alpha tuning: higher alpha = trust gyro more, lower alpha = trust accel more.
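Tasks 2-4 can be sketched as follows, with a simulated stationary IMU standing in for real MPU6050 data; the 0.01 rad/s gyro bias is an assumed value chosen to make the drift visible:

```python
import math

ALPHA = 0.98          # blend factor: high alpha = trust gyro short-term
DT = 0.01             # 100 Hz sample period (s)

def complementary_step(theta, gyro_rate, accel_y, accel_z):
    """One update: integrate gyro (high-pass path), correct with accel tilt (low-pass path)."""
    theta_gyro = theta + gyro_rate * DT            # task 2: gyro integration
    theta_accel = math.atan2(accel_y, accel_z)     # task 3: tilt from gravity (rad)
    return ALPHA * theta_gyro + (1.0 - ALPHA) * theta_accel

# Simulated stationary IMU: gyro has a constant bias, accel sees gravity on z.
theta_gyro_only = 0.0
theta_comp = 0.5      # deliberately wrong initial angle to show convergence
for _ in range(6000):                              # 60 s of samples
    theta_gyro_only += 0.01 * DT                   # raw integration drifts unbounded
    theta_comp = complementary_step(theta_comp, 0.01, 0.0, 9.81)

print(f"gyro-only drift: {theta_gyro_only:.3f} rad, complementary: {theta_comp:.4f} rad")
```

After 60 s the gyro-only estimate has drifted by 0.6 rad, while the complementary filter settles near zero: the accelerometer term continuously bleeds off the accumulated bias.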

Exercise 3: Outlier Detection and Sensor Health Monitoring

Objective: Implement outlier detection and sensor health monitoring to ensure fusion reliability.

Tasks:

  1. Simulate 3 temperature sensors: Sensor A (sigma=2C), Sensor B (sigma=1C), Sensor C (sigma=0.5C, but occasional outliers)
  2. Implement inverse variance weighting: w_i = (1/sigma_i^2) / sum(1/sigma_j^2), fused = sum(w_i * measurement_i)
  3. Add Mahalanobis distance outlier detection: d^2 = (z - mu)^2 / sigma^2; reject if d^2 > 3.841 (chi-square 95% threshold for one degree of freedom)
  4. Inject outliers (Sensor C reads 50C when truth is 22C) and verify filter rejects outliers

Expected Outcome: Fusion weights Sensor C highest under normal conditions. When outliers occur, Mahalanobis detector rejects them, fusion uses only A and B. Understand robustness: system continues even if sensors fail.
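The weighting and gating logic can be sketched as below. Two assumptions to note: the reference mu is taken as the ensemble median rather than the mean, so a single large outlier cannot drag the gating center toward itself, and the 3.841 cut-off is the chi-square 95% critical value for one degree of freedom:

```python
import numpy as np

SIGMAS = np.array([2.0, 1.0, 0.5])         # sensors A, B, C (C is most precise)

def fuse(measurements, sigmas=SIGMAS, chi2_95_1dof=3.841):
    """Inverse-variance fusion with Mahalanobis gating around the ensemble median."""
    m = np.asarray(measurements, dtype=float)
    mu = np.median(m)                      # robust center (assumption: median, not mean)
    d2 = (m - mu) ** 2 / sigmas ** 2       # squared Mahalanobis distance per sensor
    keep = d2 <= chi2_95_1dof              # gate out improbable readings
    w = 1.0 / sigmas[keep] ** 2            # inverse-variance weights
    w /= w.sum()
    return float(np.sum(w * m[keep])), keep

# Normal case: all three sensors near the 22 C truth
est, keep = fuse([22.3, 21.8, 22.1])
# Fault case: Sensor C reads 50 C
est_fault, keep_fault = fuse([22.3, 21.8, 50.0])
print(f"normal: {est:.2f} C (kept {keep}), fault: {est_fault:.2f} C (kept {keep_fault})")
```

In the normal case Sensor C carries the largest weight (1/0.25 before normalization); in the fault case the gate rejects C and the estimate falls back to Sensors A and B alone.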

Exercise 4: Hierarchical Multi-Level Fusion

Objective: Design and implement a multi-level fusion system combining low-level, feature-level, and decision-level fusion.

Tasks:

  1. Low-level fusion: Combine 3 redundant temperature sensors using Kalman filter -> single fused estimate
  2. Feature-level fusion: Extract features from accelerometer and gyroscope -> concatenate into feature vector [accel_mean, accel_std, gyro_mean, gyro_std]
  3. Decision-level fusion: Train classifiers on each sensor independently -> combine decisions using majority voting
  4. Compare performance: measure accuracy, latency, and computational cost for each level

Expected Outcome: Low-level fusion provides optimal accuracy but high computation. Feature-level enables ML classification with 80-90% accuracy. Decision-level is most flexible but suboptimal accuracy. Understand trade-offs: choose level based on application requirements.
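Feature-level concatenation (task 2) and decision-level majority voting (task 3) can be sketched as follows; the random sensor windows and the class labels are illustrative placeholders, not real classifier outputs:

```python
import numpy as np
from collections import Counter

def extract_features(accel, gyro):
    """Feature-level fusion: concatenate per-sensor statistics into one vector."""
    return np.array([accel.mean(), accel.std(), gyro.mean(), gyro.std()])

def majority_vote(decisions):
    """Decision-level fusion: each per-sensor classifier votes; the majority wins."""
    return Counter(decisions).most_common(1)[0][0]

rng = np.random.default_rng(1)
accel = rng.normal(0.0, 1.0, 100)    # hypothetical 1 s accelerometer window
gyro = rng.normal(0.0, 0.2, 100)     # hypothetical 1 s gyroscope window

fv = extract_features(accel, gyro)   # [accel_mean, accel_std, gyro_mean, gyro_std]
label = majority_vote(["walking", "walking", "standing"])
print(fv.shape, label)
```

The feature vector would feed a single classifier trained on all sensors jointly, whereas majority voting lets each sensor's classifier fail or drop out independently, which is why decision-level fusion is the most flexible but not the most accurate.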

1269.2 Videos

This video explains the intuition behind Kalman filters for sensor fusion, including the predict-update cycle and optimal weighting.

Learn how particle filters handle non-linear, non-Gaussian systems for indoor localization and tracking.

This video covers machine learning approaches for sensor signal processing, including feature extraction and classification.

1269.3 Resources

1269.3.1 Papers

  • Madgwick, S. (2010). “An efficient orientation filter for IMU and MARG sensor arrays”
  • Welch, G. & Bishop, G. (2006). “An Introduction to the Kalman Filter”
  • Hall, D. L. & Llinas, J. (2001). “Handbook of Multisensor Data Fusion”

1269.3.2 Books

  • “Kalman Filtering: Theory and Practice Using MATLAB” by Grewal & Andrews
  • “Multisensor Data Fusion” by Martin Liggins et al.