```mermaid
%%{init: {'theme': 'base', 'themeVariables': { 'primaryColor': '#2C3E50', 'primaryTextColor': '#fff', 'primaryBorderColor': '#16A085', 'lineColor': '#16A085', 'secondaryColor': '#E67E22', 'tertiaryColor': '#ecf0f1'}}}%%
flowchart TB
subgraph Centralized["Centralized Fusion"]
CS1[Sensor 1] --> CF[Central<br/>Fusion Node]
CS2[Sensor 2] --> CF
CS3[Sensor 3] --> CF
CF --> CO[Optimal<br/>Result]
end
subgraph Distributed["Distributed Fusion"]
DS1[Sensor 1] --> DF1[Local<br/>Fusion 1]
DS2[Sensor 2] --> DF1
DS3[Sensor 3] --> DF2[Local<br/>Fusion 2]
DS4[Sensor 4] --> DF2
DF1 & DF2 --> DG[Global<br/>Fusion]
DG --> DO[Scalable<br/>Result]
end
subgraph Hierarchical["Hierarchical Fusion"]
HS1[Sensor 1] --> HL1[Level 1<br/>Fusion]
HS2[Sensor 2] --> HL1
HS3[Sensor 3] --> HL2[Level 2<br/>Fusion]
HS4[Sensor 4] --> HL2
HL1 & HL2 --> HL3[Level 3<br/>Global]
HL3 --> HO[Balanced<br/>Result]
end
style CF fill:#E67E22,stroke:#2C3E50,color:#fff
style DF1 fill:#16A085,stroke:#2C3E50,color:#fff
style DF2 fill:#16A085,stroke:#2C3E50,color:#fff
style DG fill:#2C3E50,stroke:#16A085,color:#fff
style HL1 fill:#16A085,stroke:#2C3E50,color:#fff
style HL2 fill:#16A085,stroke:#2C3E50,color:#fff
style HL3 fill:#2C3E50,stroke:#16A085,color:#fff
```
1266 Sensor Fusion Architectures
Learning Objectives
After completing this chapter, you will be able to:
- Design centralized, distributed, and hierarchical fusion architectures
- Apply the Dasarathy taxonomy for fusion classification
- Choose appropriate fusion architectures for IoT applications
- Understand trade-offs between architecture options
1266.1 Fusion Architecture Types
1266.1.1 Architecture Comparison
| Architecture | Pros | Cons | Use Case |
|---|---|---|---|
| Centralized | Optimal fusion, simple | Single point of failure, high bandwidth | Small-scale systems |
| Distributed | Scalable, fault-tolerant, low bandwidth | Sub-optimal, complex | Large-scale IoT networks |
| Hierarchical | Balanced, modular | Moderate complexity | Smart buildings, smart cities |
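The difference between the architectures can be seen in a toy numeric sketch. Both functions below are illustrative (not from any library) and assume plain averaging as the fusion rule:

```python
# Centralized vs. hierarchical fusion of redundant scalar readings,
# using a plain mean as the (assumed) fusion rule.

def centralized_fusion(readings):
    """All raw readings travel to one node, which averages them."""
    return sum(readings) / len(readings)

def hierarchical_fusion(groups):
    """Each local node averages its own sensors; a global node then
    averages the local estimates (one value per group)."""
    local = [sum(g) / len(g) for g in groups]
    return sum(local) / len(local)

# Four sensors observing the same temperature:
readings = [24.0, 24.4, 23.8, 24.2]
central = centralized_fusion(readings)                       # ~24.1
tiered = hierarchical_fusion([[24.0, 24.4], [23.8, 24.2]])   # ~24.1
```

With equal-size groups the two agree; with unequal group sizes the tiered estimate drifts from the centralized optimum, which is the sub-optimality the table attributes to non-centralized designs.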
1266.2 Data Fusion Taxonomy (Dasarathy Classification)
The Dasarathy taxonomy provides a formal framework for classifying sensor fusion systems based on their input and output abstraction levels.
```mermaid
%%{init: {'theme': 'base', 'themeVariables': { 'primaryColor': '#2C3E50', 'primaryTextColor': '#fff', 'primaryBorderColor': '#16A085', 'lineColor': '#16A085', 'secondaryColor': '#E67E22', 'tertiaryColor': '#ecf0f1'}}}%%
flowchart TB
subgraph DAI_DAO["DAI-DAO: Data In -> Data Out"]
D1[Sensor 1 Raw] --> Fuse1[Weighted<br/>Average]
D2[Sensor 2 Raw] --> Fuse1
Fuse1 --> Out1[Fused Raw Data]
end
subgraph FEI_FEO["FEI-FEO: Feature In -> Feature Out"]
F1[Feature Set 1] --> Fuse2[Feature<br/>Concatenation]
F2[Feature Set 2] --> Fuse2
Fuse2 --> Out2[Combined Features]
end
subgraph DEI_DEO["DEI-DEO: Decision In -> Decision Out"]
Dec1[Decision 1<br/>90% pedestrian] --> Vote[Voting/<br/>Bayesian]
Dec2[Decision 2<br/>85% obstacle] --> Vote
Vote --> Out3[Final Decision<br/>BRAKE!]
end
style D1 fill:#2C3E50,stroke:#16A085,color:#fff
style D2 fill:#2C3E50,stroke:#16A085,color:#fff
style F1 fill:#16A085,stroke:#2C3E50,color:#fff
style F2 fill:#16A085,stroke:#2C3E50,color:#fff
style Dec1 fill:#E67E22,stroke:#2C3E50,color:#fff
style Dec2 fill:#E67E22,stroke:#2C3E50,color:#fff
style Out1 fill:#27AE60,stroke:#2C3E50,color:#fff
style Out2 fill:#27AE60,stroke:#2C3E50,color:#fff
style Out3 fill:#27AE60,stroke:#2C3E50,color:#fff
```
1266.2.1 The Five Dasarathy Categories
| Category | Input -> Output | Description | IoT Example |
|---|---|---|---|
| DAI-DAO | Data -> Data | Raw values combined | Two temperature sensors averaged |
| DAI-FEO | Data -> Feature | Raw data to features | Accel + gyro raw -> motion features |
| FEI-FEO | Feature -> Feature | Features merged | Wi-Fi RSSI + BLE features combined |
| FEI-DEO | Feature -> Decision | Features classified | HR + motion -> “User sleeping” |
| DEI-DEO | Decision -> Decision | Decisions fused | Camera + Radar -> BRAKE! |
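As a concrete DAI-DAO instance, the "two temperature sensors averaged" row is commonly implemented as an inverse-variance weighted average rather than a plain mean, so the less noisy sensor counts for more. A minimal sketch, with the sensor noise variances assumed known:

```python
def inverse_variance_fusion(values, variances):
    """DAI-DAO: fuse redundant raw measurements, weighting each
    reading by the inverse of its sensor's noise variance."""
    weights = [1.0 / v for v in variances]
    fused = sum(w * x for w, x in zip(weights, values)) / sum(weights)
    fused_var = 1.0 / sum(weights)     # fused estimate is less noisy
    return fused, fused_var

# A precise sensor (var 0.1) reads 24.0; a noisy one (var 0.4) reads 25.0.
value, var = inverse_variance_fusion([24.0, 25.0], [0.1, 0.4])
# The fused estimate (~24.2) sits closer to the trusted sensor, and
# its variance (0.08) is lower than either input's.
```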
1266.2.2 When to Use Each Category
- DAI-DAO: Need single “best” raw measurement from redundant sensors
- DAI-FEO: Raw data needs transformation into meaningful metrics
- FEI-FEO: Combining complementary features from different sensor types
- FEI-DEO: Features need classification into categorical outputs
- DEI-DEO: Multiple independent classifiers vote on final decision
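For the DEI-DEO case, one common fusion rule is a naive-Bayes product of the classifiers' confidence scores; the sketch below assumes the classifiers are independent and share a binary hypothesis:

```python
def bayesian_decision_fusion(probs, prior=0.5):
    """DEI-DEO: combine independent classifiers' P(hazard) estimates
    with a naive-Bayes product rule in odds form."""
    odds = prior / (1 - prior)
    for p in probs:
        p = min(max(p, 1e-6), 1 - 1e-6)   # guard degenerate 0/1 inputs
        odds *= p / (1 - p)
    return odds / (1 + odds)

# Camera reports 0.90 pedestrian, radar reports 0.85 obstacle:
fused = bayesian_decision_fusion([0.90, 0.85])   # ~0.98
brake = fused > 0.5
```

Two moderately confident, agreeing classifiers yield a fused confidence higher than either input, which is why decision-level voting works well with redundant detectors.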
1266.3 Smart Home Multi-Sensor Example
```mermaid
%%{init: {'theme': 'base', 'themeVariables': { 'primaryColor': '#2C3E50', 'primaryTextColor': '#fff', 'primaryBorderColor': '#16A085', 'lineColor': '#16A085', 'secondaryColor': '#E67E22', 'tertiaryColor': '#ecf0f1'}}}%%
flowchart LR
subgraph Sensors["Sensors (Raw Data)"]
WiFi[Wi-Fi RSSI<br/>-65 dBm]
ZB[Zigbee Motion<br/>Active]
Temp[Temperature<br/>24°C]
Light[Light Level<br/>200 lux]
end
subgraph DAI["DAI-FEO Stage"]
WiFi --> Loc[Location<br/>Feature]
ZB --> Occ[Occupancy<br/>Feature]
end
subgraph FEI["FEI-FEO Stage"]
Loc --> Combined[Combined<br/>Context]
Occ --> Combined
Temp --> Combined
Light --> Combined
end
subgraph DEO["FEI-DEO Stage"]
Combined --> Decision{Room<br/>Occupied?<br/>Comfort<br/>Optimal?}
end
subgraph Action["Actuation"]
Decision --> HVAC[Adjust HVAC]
Decision --> Lights[Dim Lights]
end
style WiFi fill:#2C3E50,stroke:#16A085,color:#fff
style ZB fill:#16A085,stroke:#2C3E50,color:#fff
style Temp fill:#E67E22,stroke:#2C3E50,color:#fff
style Decision fill:#27AE60,stroke:#2C3E50,color:#fff
```
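The staged pipeline above can be sketched as one small function per fusion stage. All names, thresholds, and comfort bands here are illustrative assumptions, not a real smart-home API:

```python
def rssi_to_zone(rssi_dbm):
    """DAI-FEO: map raw Wi-Fi RSSI to a coarse location feature."""
    return "near_ap" if rssi_dbm > -70 else "far_from_ap"

def build_context(zone, motion_active, temp_c, lux):
    """FEI-FEO: concatenate features into one context record."""
    return {"zone": zone, "motion": motion_active,
            "temp_c": temp_c, "lux": lux}

def decide(ctx):
    """FEI-DEO: classify the context into actuation decisions
    (comfort band 21-25 C and 300 lux cap are assumed)."""
    occupied = ctx["motion"] or ctx["zone"] == "near_ap"
    return {"hvac_on": occupied and not 21 <= ctx["temp_c"] <= 25,
            "dim_lights": occupied and ctx["lux"] > 300}

# Sensor values from the diagram: -65 dBm, motion active, 24 C, 200 lux.
ctx = build_context(rssi_to_zone(-65), True, 24, 200)
actions = decide(ctx)   # occupied, comfort already optimal -> no action
```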
1266.4 Architecture Trade-offs
Trade-off: Early Fusion vs Late Fusion
| Factor | Early Fusion (Data-Level) | Late Fusion (Decision-Level) |
|---|---|---|
| Information preservation | Maximum | Minimal |
| Computational cost | Very high | Low |
| Sensor heterogeneity | Difficult | Easy |
| Timing requirements | Strict sync needed | Relaxed (async OK) |
| Failure handling | Harder | Easier |
Choose Early Fusion when:
- Sensors measure complementary aspects (RGB + depth)
- Cross-sensor correlations are important
- Maximum accuracy is required
Choose Late Fusion when:
- Sensors are heterogeneous with different rates
- Graceful degradation is required
- Computational resources are limited
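The contrast in the table can be made concrete with a toy two-sensor occupancy detector; the 0.5 thresholds and signal values are illustrative:

```python
# Early fusion classifies once on the joint raw data; late fusion lets
# each sensor decide alone and combines only the decisions.

def early_fusion(motion_raw, sound_raw):
    """Data-level: pool synced raw samples, classify the joint signal.
    Preserves cross-sensor information but needs strict sync."""
    joint = motion_raw + sound_raw
    return sum(joint) / len(joint) > 0.5

def late_fusion(motion_raw, sound_raw):
    """Decision-level: per-sensor votes OR-combined. Tolerates async
    or missing sensors at the cost of discarded information."""
    motion_vote = sum(motion_raw) / len(motion_raw) > 0.5
    sound_vote = sum(sound_raw) / len(sound_raw) > 0.5
    return motion_vote or sound_vote

motion, sound = [0.9, 0.8], [0.0, 0.1]
early = early_fusion(motion, sound)   # joint mean ~0.45 -> no detection
late = late_fusion(motion, sound)     # motion alone votes yes -> detection
```

The two rules can disagree on the same inputs, which is exactly why the choice between them is an architectural decision rather than a tuning detail.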
Trade-off: Sensor-Level vs Central Fusion
| Factor | Sensor-Level Fusion | Central Fusion |
|---|---|---|
| Latency | Lower - local decisions | Higher - network traversal |
| Bandwidth | Lower - compressed data | Higher - raw data |
| Accuracy | May be limited locally | Full dataset access |
| Reliability | Graceful degradation | Single point of failure |
Choose Sensor-Level Fusion when:
- Network bandwidth is constrained
- Local latency requirements are strict
- Privacy requires data to stay on-device
Choose Central Fusion when:
- Sensors are heterogeneous and require complex correlation
- Historical context is needed
- Computational resources at edge are limited
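A back-of-envelope calculation makes the bandwidth row concrete; every rate and size below is an assumed example, not a figure from the text:

```python
# Bandwidth per sensor: streaming raw samples to a central node
# vs. sending one local fused decision per window.

RAW_RATE_HZ = 100       # assumed samples per second per sensor
SAMPLE_BYTES = 4        # one float32 reading
WINDOW_S = 1.0          # local fusion window
DECISION_BYTES = 1      # e.g. occupied / not occupied

central_bw = RAW_RATE_HZ * SAMPLE_BYTES   # 400 B/s up to the center
local_bw = DECISION_BYTES / WINDOW_S      # 1 B/s after local fusion
reduction = central_bw / local_bw         # 400x less network traffic
```

The price of that 400x saving is the accuracy row: the central node never sees the raw samples, so cross-sensor correlations are lost.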
1266.5 Worked Example: Autonomous Forklift Safety System
Scenario: Design fusion architecture for autonomous forklift pedestrian detection.
Requirements:
- <200ms latency from detection to brake
- 99.9% pedestrian recall
- <1 false alarm per hour
Sensors:
- 2D LIDAR (10 Hz, 360 deg, 30m range)
- Stereo camera (30 Hz, 90 deg FOV)
- Ultrasonic array (20 Hz, 180 deg, 5m range)
Architecture Decision: Cascaded fusion
- Stage 1 (LIDAR, 40ms): Fast point cloud clustering, flag all potential pedestrians
- Stage 2 (Camera, 80ms): Run CNN only on LIDAR-flagged regions
- Stage 3 (Ultrasonic, 20ms): Confirm close-range detections
Fusion Logic:
```
alert if (LIDAR AND Camera)
      OR (Ultrasonic AND Camera)
      OR (LIDAR confidence > 0.95)
```
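The alert rule can be rendered directly as a predicate (argument names are illustrative):

```python
def should_brake(lidar_hit, camera_hit, ultrasonic_hit, lidar_conf):
    """Fusion logic from the text: two agreeing sensors trigger a
    brake, plus a high-confidence LIDAR override for cases where
    the camera is blind (e.g. darkness or glare)."""
    return ((lidar_hit and camera_hit)
            or (ultrasonic_hit and camera_hit)
            or lidar_conf > 0.95)
```

A single camera hit with nothing else never brakes, which is what keeps the false-alarm rate low; a very confident LIDAR detection brakes alone, which is what protects recall.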
Result: 99.4% recall, 0.7 false alarms/hr, 155ms latency
Key Insight: Cascaded fusion leverages each sensor's strengths - LIDAR for fast, wide-area candidate detection, the camera for accurate classification, and ultrasonic for close-range confirmation.
1266.6 Summary
Sensor fusion architectures determine system performance:
- Centralized: Optimal but single point of failure
- Distributed: Scalable and fault-tolerant
- Hierarchical: Balanced approach for complex systems
- Dasarathy taxonomy: Classifies fusion by input/output abstraction levels
- Trade-offs: Early vs late fusion, sensor-level vs central
1266.7 What’s Next
- Real-World Applications - Smartphone rotation, activity recognition
- Best Practices - Common pitfalls to avoid
- Exercises - Design fusion architectures