84  WSN Routing Labs and Exercises

In 60 Seconds

Comparing WSN routing protocols in simulation reveals stark trade-offs: LEACH achieves 8x longer network lifetime than direct transmission by rotating cluster heads, but drops to 2x improvement when clusters are unevenly sized. AODV provides 95%+ delivery rate but consumes 3-5x more energy than DSR due to periodic route maintenance. The critical lab insight is that no single protocol wins on all metrics – energy, latency, and delivery rate form an impossible triangle.

84.1 Learning Objectives

By the end of this chapter, you will be able to:

  • Experiment with Routing Protocols: Compare AODV, DSR, and LEACH in simulation environments
  • Measure Energy Consumption: Quantify and optimize routing for network lifetime
  • Implement Route Discovery: Practice route request/reply mechanisms in code
  • Evaluate Protocol Trade-offs: Assess delivery rate, latency, and overhead trade-offs

This chapter tests your knowledge through a 10-question quiz covering all WSN routing topics, then provides a hands-on coding lab where you can modify real routing algorithm code. If you have not yet read the earlier chapters in this series, start with WSN Routing Fundamentals. The quiz questions are drawn from all six topic areas: routing challenges, directed diffusion, data aggregation, link quality, Trickle, and protocol comparisons.

84.2 Prerequisites

Before diving into this chapter, you should be familiar with the minimum viable understanding summarized below:

MVU: Minimum Viable Understanding

Core concept: WSN routing protocol selection depends on network characteristics – mobile vs static nodes, dense vs sparse deployment, event-driven vs continuous monitoring, and latency-critical vs best-effort delivery.

Why it matters: Choosing AODV for a 1000-node static farm (should use LEACH) or LEACH for a 10-node mobile scenario (should use AODV) wastes energy, reduces lifetime, and degrades performance.

Key takeaway: Compare protocols by running them under realistic conditions and measuring delivery rate, latency, and energy consumption simultaneously. A protocol that excels in one metric may fail badly in another.

Simulated protocol comparison for 200-node static WSN with 10 packets/minute reporting rate:

AODV (reactive, address-based):

  • Route discovery: 5% of transmissions fail → trigger new RREQ flood (200 nodes × 40 bytes = 8 KB network-wide)
  • Overhead: 5% failure rate × 10 pkts/min × 60 min = 30 route discoveries/hr × 8,000 bytes = 240 KB/hour control traffic
  • Data delivery: 95% (5% loss triggers rediscovery)
  • Average latency: 120 ms (includes route discovery delays)
  • Data transmissions per hour: 200 nodes × 10 pkt/min × 60 min × 4 hops = 480,000 transmissions
  • Energy: 480,000 × 50 µJ = 24,000,000 µJ = 24 J/hour data + 240,000 bytes × 0.4 µJ/byte = 96 mJ control ≈ 24.1 J/hour total

LEACH (hierarchical, cluster-based with p=0.1):

  • Cluster formation: 20 CHs per round (20-second rounds = 180 rounds/hour)
  • Overhead: 20 CH advertisements × 20 bytes × 180 rounds/hour = 72 KB/hour
  • Data delivery: 98% (cluster heads aggregate and send redundant summaries)
  • Average latency: up to 20,000 ms (nodes must wait for CH time slot in TDMA schedule with 20-second rounds)
  • Per round: 180 members × 1 hop × 50 µJ + 20 CHs × 3 hops × 50 µJ = 9,000 + 3,000 = 12,000 µJ = 12 mJ/round
  • Energy: 12 mJ × 180 rounds/hour = 2.16 J/hour

Counterintuitive result: In this continuous high-rate scenario, LEACH actually uses 11× LESS energy than AODV (2.16 vs 24.1 J/hour) because cluster-based aggregation dramatically reduces the number of multi-hop transmissions. However, LEACH’s latency is unacceptably high (~20 seconds per TDMA slot) for real-time applications. Lesson: LEACH wins on energy for dense high-rate deployments, but its latency makes it unsuitable for real-time alerts. AODV provides 95% delivery with lower latency but pays a high energy cost.
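The back-of-envelope arithmetic above can be reproduced in a few lines. All constants (50 µJ per transmission, 0.4 µJ per control byte, 4-hop average path, 5% loss rate) are the assumptions stated in the bullet lists, not measured values:

```python
# Reproduce the back-of-envelope comparison above (all constants from the text)
NODES, PKT_PER_MIN, HOPS = 200, 10, 4
E_TX = 50e-6       # 50 uJ per packet transmission
E_CTRL = 0.4e-6    # 0.4 uJ per control byte

# AODV: every reading travels ~4 hops; 5% losses trigger 8 KB RREQ floods
aodv_data = NODES * PKT_PER_MIN * 60 * HOPS * E_TX
aodv_ctrl = 0.05 * PKT_PER_MIN * 60 * 8000 * E_CTRL
print(f"AODV:  {aodv_data + aodv_ctrl:.2f} J/hour")   # ~24.10

# LEACH: 180 members send 1 hop to their CH; 20 CHs forward 3 hops; 180 rounds/hour
leach_per_round = (180 * 1 + 20 * 3) * E_TX
print(f"LEACH: {leach_per_round * 180:.2f} J/hour")   # ~2.16
```

Changing any one assumption (reporting rate, hop count, round length) shifts the crossover point, which is exactly why the lab asks you to measure rather than assume.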

The Sensor Squad is taking their final exam! Time to show everything they have learned about routing!

84.2.1 The Sensor Squad Adventure: The Championship Quiz

It was the end of the semester at Sensor School, and the squad was ready for the big quiz!

“Question one!” said Teacher Router. “If Sammy is far away and needs to send a message to the farmhouse, should he shout really loud (direct) or pass it through friends (multi-hop)?”

“Multi-hop!” yelled Max. “Shouting uses way too much battery! Short passes are much cheaper!”

“Question two! If there are 100 sensors, should EVERY sensor send its own message?”

“No way!” said Lila. “That is 100 messages! We should use a group leader to combine them into just a few messages. That is data aggregation!”

“Final question! How do you pick the BEST path?”

Bella thought carefully. “NOT the shortest path! The BEST path has good signal quality, goes through sensors with plenty of battery, and avoids the busy area near the farmhouse!”

“A+ for the Sensor Squad!” cheered Teacher Router.

84.2.2 Key Words for Kids

  • AODV: A protocol that finds routes only when it needs them (like asking for directions each trip)
  • DSR: A protocol where the message carries its full route with it (like a treasure map)
  • LEACH: A protocol that organizes sensors into teams with rotating leaders
  • ETX: A score that tells you how many tries it takes to send a message successfully

84.3 Comprehensive Review Quiz

Test your understanding of all WSN routing concepts covered in this series.

Question 1: A temperature monitoring WSN has 100 sensors in a 100 m × 100 m area. Without aggregation, all 100 sensors send readings to the sink every minute (100 transmissions). With tree-based aggregation using 5 intermediate aggregation points, how many total transmissions occur?

  1. 100 transmissions - same as without aggregation since all data must reach the sink
  2. 20 transmissions - dramatic reduction through aggressive compression
  3. 50 transmissions - approximately half due to aggregation benefits
  4. ~105 transmissions - leaves to aggregators (~100) + aggregators to sink (5)
Answer

D) ~105 transmissions - leaves to aggregators (~100) + aggregators to sink (5)

Tree-based aggregation: (1) Leaf nodes to aggregators: 100 sensors organize into 5 subtrees of ~20 sensors each. Each sensor transmits to its aggregator = 100 transmissions. (2) Aggregators to sink: Each of 5 aggregators transmits aggregated result (min/max/avg) to sink = 5 transmissions. Total: 100 + 5 = 105 transmissions. Energy savings: Even though transmission count is similar, aggregation eliminates 95 long-distance transmissions (expensive!), replacing them with short-distance local transmissions (cheap). Real network: 70% energy savings despite similar transmission count.
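The trade-off in this answer (a similar transmission count but very different energy) can be sketched numerically. The per-packet costs below are hypothetical placeholders chosen only to illustrate cheap local sends versus expensive long-haul sends; just the 100-sensor/5-aggregator split comes from the question:

```python
def flat_vs_tree(sensors=100, aggregators=5, e_long=10, e_short=1):
    """Transmission counts and (illustrative) energy for flat vs tree collection.
    e_long/e_short are hypothetical per-packet costs: long-haul vs local link."""
    flat_tx, flat_e = sensors, sensors * e_long          # everyone sends to sink
    tree_tx = sensors + aggregators                      # leaves up + aggregates out
    tree_e = sensors * e_short + aggregators * e_long    # mostly cheap local sends
    return flat_tx, flat_e, tree_tx, tree_e

print(flat_vs_tree())  # (100, 1000, 105, 150)
```

With these illustrative costs the tree spends 85% less energy despite sending five more packets; the 70% figure quoted above comes from a real deployment with its own cost ratio.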

Question 2: Directed Diffusion uses interest propagation and gradients for data-centric routing. What happens during the gradient establishment phase?

  1. Sink broadcasts data requests and sources send data directly to the sink
  2. Interest floods the network; each node establishes a gradient (direction and rate) pointing toward sinks
  3. Sources advertise available data and sinks subscribe to specific sensors
  4. Network builds shortest-path tree from sink to all nodes using hop count
Answer

B) Interest floods the network; each node establishes a gradient (direction and rate) pointing toward sinks

Directed Diffusion operation: (1) Interest propagation: Sink floods network with interest describing desired data (e.g., “type=temperature, interval=1s, region=area1”). Each node caches interest. (2) Gradient establishment: As interest propagates, each node records from which neighbors it received the interest. This creates gradients - directional state pointing toward interested sinks with associated data rates. Multiple gradients possible (multiple paths to sink). (3) Data propagation: When sources detect matching events, they send data along gradients toward sinks.
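A minimal sketch of the gradient state each node keeps during interest flooding might look like the following. The dictionary layout and field names (`name`, `interval`) are hypothetical, chosen to mirror the interest example in the answer:

```python
# Hypothetical per-node gradient cache built during interest flooding
gradients = {}  # interest name -> list of (upstream neighbor, requested rate)

def on_interest(interest, from_neighbor):
    """Cache the interest and record a gradient toward whoever forwarded it."""
    entry = gradients.setdefault(interest["name"], [])
    entry.append((from_neighbor, interest["interval"]))

# The same interest arriving via two neighbors yields two gradients (two paths)
on_interest({"name": "temperature/area1", "interval": 1.0}, "node7")
on_interest({"name": "temperature/area1", "interval": 1.0}, "node9")
print(gradients["temperature/area1"])  # [('node7', 1.0), ('node9', 1.0)]
```

Keeping several gradients per interest is what later lets the sink reinforce one path (raise its rate) while the others expire.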

Question 3: Link quality estimation uses WMEWMA (Window Mean with EWMA). A link has recent packet delivery: [100%, 100%, 100%, 0%, 0%]. What will WMEWMA estimate compared to pure EWMA?

  1. WMEWMA will respond faster to link failure, dropping estimate quickly when 0% packets arrive
  2. WMEWMA will be slower to respond, maintaining high estimate due to historical data
  3. Both will have identical estimates since they use the same input data
  4. WMEWMA is less accurate - should only use EWMA for link estimation
Answer

A) WMEWMA will respond faster to link failure, dropping estimate quickly when 0% packets arrive

WMEWMA combines two estimators: (1) Window Mean (WM): Average of last N packets (e.g., N=5). Responds quickly to changes - when recent packets fail, estimate drops immediately. WM for [100,100,100,0,0] = (100+100+100+0+0)/5 = 60%. (2) EWMA: Weighted average with parameter a (e.g., a=0.9). Smooths long-term trends but slow to respond. EWMA updates: E1=100%, E2=100%, E3=100%, E4=0.9x100+0.1x0=90%, E5=0.9x90+0.1x0=81%. (3) WMEWMA: Takes minimum of WM and EWMA = min(60%, 81%) = 60%. Why this matters: WM drops quickly when link degrades, avoiding bad routes faster than pure EWMA.
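The worked numbers can be checked with a short sketch. This follows the answer's formulation exactly (minimum of a 5-sample window mean and an EWMA with a = 0.9); production WMEWMA implementations differ in detail:

```python
from collections import deque

def wmewma(samples, window=5, alpha=0.9):
    """Return min(window mean, EWMA) of delivery percentages, as in the answer."""
    win = deque(maxlen=window)   # keeps only the last `window` samples
    ewma = None
    for s in samples:
        win.append(s)
        ewma = s if ewma is None else alpha * ewma + (1 - alpha) * s
    return min(sum(win) / len(win), ewma)

print(wmewma([100, 100, 100, 0, 0]))  # 60.0 -> the window mean wins (60 < 81)
```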

Question 4: Why does the MIN-T metric consider both forward and backward link quality?

  1. To balance network load
  2. Because acknowledgments travel on the backward link
  3. To prevent routing loops
  4. For geographical routing
Answer

B) Because acknowledgments travel on the backward link

MIN-T accounts for both forward (data) and backward (ACK) link quality because a poor backward link will cause ACK losses, triggering retransmissions even if the forward link is good. The formula is: Cost = 1/(P_forward x P_backward).
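Expressed as code the cost formula is a one-liner. The function name `etx` is ours, and the 88%/64% PRR pair is just an example of an asymmetric link:

```python
def etx(p_forward, p_backward=1.0):
    """Expected transmissions when both data (forward) and ACK (backward) must succeed."""
    return 1.0 / (p_forward * p_backward)

print(round(etx(0.88), 2))        # 1.14 -- forward-only view of an asymmetric link
print(round(etx(0.88, 0.64), 2))  # 1.78 -- true cost once ACK losses are counted
```

Ignoring the backward link here understates the cost by more than half a transmission per packet, which compounds over every hop of a route.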

Question 5: What is the purpose of the suppression mechanism in the Trickle algorithm?

  1. To prevent malicious code propagation
  2. To reduce unnecessary transmissions when neighbors have same code version
  3. To compress the code before transmission
  4. To prioritize critical updates
Answer

B) To reduce unnecessary transmissions when neighbors have same code version

Trickle’s suppression mechanism (based on threshold k) prevents nodes from broadcasting metadata if they’ve heard enough neighbors already broadcasting the same version. This achieves zero maintenance cost when the network is consistent.

Question 6: In Directed Diffusion, what happens during gradient reinforcement?

  1. Physical signal strength is increased
  2. Sinks increase the data rate on preferred paths
  3. New nodes join the network
  4. Energy levels are balanced
Answer

B) Sinks increase the data rate on preferred paths

Gradient reinforcement allows sinks to request higher data rates on paths that deliver data successfully with low latency. This focuses traffic on good paths while letting poor paths expire, adapting the routing to network conditions.

Question 7: WMEWMA combines which two link quality estimation techniques?

  1. RSSI and LQI
  2. Window Mean and Exponentially Weighted Moving Average
  3. Packet delivery rate and hop count
  4. Forward and backward link quality
Answer

B) Window Mean and Exponentially Weighted Moving Average

WMEWMA combines a Window Mean (short-term packet reception count) with EWMA (long-term smoothed average) to balance responsiveness to recent changes with stability against short-term variations.

Question 8: What triggers a Trickle interval reset to tau_min?

  1. Network congestion
  2. Hearing metadata with a newer version
  3. Reaching maximum interval length
  4. Low battery level
Answer

B) Hearing metadata with a newer version

When a node hears metadata with a newer code version than it has, it immediately resets its Trickle interval to tau_min (minimum). This enables rapid propagation of new code through the network.
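The suppression and reset behaviors from the last two answers fit in one small state machine. This is a teaching sketch, not the full Trickle specification (listen-only periods and exact timer handling are simplified):

```python
import random

class Trickle:
    """Teaching sketch of Trickle: suppression threshold k, doubling intervals,
    and a reset to tau_min when a newer version is heard."""
    def __init__(self, tau_min=1.0, tau_max=60.0, k=2):
        self.tau_min, self.tau_max, self.k = tau_min, tau_max, k
        self.tau = tau_min
        self._new_interval()

    def _new_interval(self):
        self.counter = 0
        self.t = random.uniform(self.tau / 2, self.tau)  # broadcast time, 2nd half

    def hear_consistent(self):
        self.counter += 1            # a neighbor advertised the same version

    def hear_inconsistent(self):
        self.tau = self.tau_min      # newer version heard: reset for fast spread
        self._new_interval()

    def interval_expired(self):
        should_tx = self.counter < self.k           # suppress if k peers spoke first
        self.tau = min(2 * self.tau, self.tau_max)  # otherwise back off politely
        self._new_interval()
        return should_tx

node = Trickle()
node.hear_consistent(); node.hear_consistent()
print(node.interval_expired())  # False: two consistent neighbors -> suppressed
node.hear_inconsistent()
print(node.tau)                 # 1.0: interval snapped back to tau_min
```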

Question 9: Which metric would hop-count routing consider superior: a 2-hop path with 50% link quality or a 3-hop path with 95% link quality?

  1. 2-hop path (fewer hops)
  2. 3-hop path (better links)
  3. Both are equal
  4. Cannot determine
Answer

A) 2-hop path (fewer hops)

Hop-count routing only considers the number of hops, ignoring link quality. It would choose the 2-hop path even though the 3-hop path would require fewer total transmissions (including retransmissions). This is why hop-count is suboptimal in lossy WSNs.

Question 10: In data aggregation, what does the “completeness” metric measure?

  1. How accurate the aggregated value is
  2. The percentage of sensor readings included in the aggregate
  3. The time taken to complete aggregation
  4. The compression ratio achieved
Answer

B) The percentage of sensor readings included in the aggregate

Completeness measures what fraction of sensor readings successfully contributed to the aggregated result at the sink. It’s calculated as: Completeness = Readings_Included / Total_Readings. High completeness means the aggregate represents most of the network.

84.4 WSN Routing Lab: Multi-Hop Routing Simulation

~45 min | Advanced | Hands-On Lab

Key Concepts

  • Routing Protocol: Algorithm determining the path a packet takes through the multi-hop WSN to reach the sink
  • Convergecast: N-to-1 routing pattern where all sensor data flows toward a single sink along a tree structure
  • Routing Table: Per-node data structure mapping destination addresses to next-hop neighbors
  • Energy-Aware Routing: Protocol selecting paths based on node residual energy to balance consumption and maximize lifetime
  • Link Quality Indicator (LQI): Metric quantifying the reliability of a wireless link — higher LQI means more reliable packet delivery
  • Routing Tree: Spanning tree structure rooted at the sink used by hierarchical routing protocols
  • Multi-path Routing: Maintaining multiple disjoint paths to improve reliability and enable load balancing

84.4.1 Lab Overview

This hands-on lab simulates a Wireless Sensor Network with multiple ESP32 nodes demonstrating different routing protocols. You’ll experiment with routing decisions, energy-aware path selection, and compare the performance of AODV, DSR, and LEACH protocols in real-time.

What You’ll Learn:

  • How multi-hop routing works in wireless sensor networks
  • Differences between reactive (AODV, DSR) and proactive (LEACH) routing
  • Energy-aware routing decisions and their impact on network lifetime
  • Route discovery, maintenance, and recovery mechanisms
  • Cluster-based routing and aggregation strategies

84.4.2 Lab Setup

The simulation creates a 9-node WSN topology with:

  • 3 Sensor Nodes (S1, S2, S3) - Generate temperature/humidity data
  • 4 Intermediate Nodes (R1, R2, R3, R4) - Forward packets and aggregate data
  • 1 Cluster Head (CH) - Coordinates LEACH protocol
  • 1 Sink Node (SINK) - Destination for all data

Each node has a simulated battery level that depletes with transmission/reception, demonstrating energy-aware routing.

84.4.3 Network Topology

┌─────────────────────────────────────┐
│ Network Topology                    │
├─────────────────────────────────────┤
│                                     │
│   S1 ──┬── R1 ──┬── CH ── SINK     │
│        │        │    │              │
│   S2 ──┼── R2 ──┤    │              │
│        │        │    │              │
│   S3 ──┴── R3 ──┴── R4              │
│                                     │
│ Links: S1-R1, S1-R2, S2-R2, S2-R3  │
│        S3-R3, S3-R4, R1-R2, R2-R3  │
│        R3-R4, R1-CH, R2-CH, R4-CH  │
│        CH-SINK                      │
└─────────────────────────────────────┘
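For experimenting outside the simulator, the topology above can be captured as an adjacency list; a breadth-first search then yields the minimum-hop route any hop-count protocol would pick. The `bfs_route` helper below is our sketch, not part of the lab code:

```python
from collections import deque

# Adjacency list for the 9-node lab topology above (links are bidirectional)
LINKS = [("S1","R1"),("S1","R2"),("S2","R2"),("S2","R3"),("S3","R3"),("S3","R4"),
         ("R1","R2"),("R2","R3"),("R3","R4"),("R1","CH"),("R2","CH"),("R4","CH"),
         ("CH","SINK")]
TOPO = {}
for a, b in LINKS:
    TOPO.setdefault(a, set()).add(b)
    TOPO.setdefault(b, set()).add(a)

def bfs_route(src, dst="SINK"):
    """Minimum-hop route via breadth-first search (ties broken alphabetically)."""
    parent, frontier = {src: None}, deque([src])
    while frontier:
        node = frontier.popleft()
        if node == dst:
            path = []
            while node is not None:
                path.append(node)
                node = parent[node]
            return path[::-1]
        for nb in sorted(TOPO[node]):
            if nb not in parent:
                parent[nb] = node
                frontier.append(nb)
    return None

print(bfs_route("S1"))  # ['S1', 'R1', 'CH', 'SINK']
```

Note how every minimum-hop route funnels through CH, which is exactly the relay-depletion problem the energy-aware challenges below address.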

84.4.4 Challenge Exercises

Hands-On Challenges

Try these modifications to deepen your understanding:

  1. Energy-Aware Path Selection
    • Modify getNextHop() to prioritize high-energy nodes
    • Implement adaptive energy thresholds based on network-wide energy levels
    • Compare network lifetime with and without energy awareness
  2. Route Quality Metrics
    • Add link quality estimation based on signal strength (RSSI)
    • Implement Expected Transmission Count (ETX) metric
    • Update calculateLinkQuality() to factor in packet loss
  3. LEACH Cluster Rotation
    • Implement probabilistic cluster head election (5% probability)
    • Rotate cluster heads based on residual energy
    • Compare energy consumption across nodes
  4. Multipath Routing
    • Modify DSR to maintain multiple routes
    • Implement route splitting for load balancing
    • Measure improvement in delivery rate
  5. Route Recovery
    • Add RERR (Route Error) packet handling for broken links
    • Implement local route repair before triggering new discovery
    • Measure reduction in route discovery overhead
  6. Protocol Comparison
    • Run each protocol for 2 minutes and record statistics
    • Compare delivery rate, latency, and energy consumption
    • Analyze which protocol performs best under different network densities
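As a starting point for challenge 1, here is one possible shape for an energy-aware next-hop choice. It is a Python sketch of the idea, not the lab's actual `getNextHop()`; the blend weight `alpha` and the `min_energy` cutoff are assumptions for you to tune:

```python
def get_next_hop(neighbors, energy, link_quality, alpha=0.5, min_energy=0.2):
    """Blend link quality and residual energy (both normalized to [0, 1]).
    alpha weights quality vs energy; nodes under min_energy are avoided."""
    candidates = [n for n in neighbors if energy[n] >= min_energy]
    if not candidates:
        candidates = list(neighbors)   # fall back rather than drop the packet
    return max(candidates,
               key=lambda n: alpha * link_quality[n] + (1 - alpha) * energy[n])

# R2 has the better link, but R1's healthy battery wins the blended score
print(get_next_hop(["R1", "R2"],
                   energy={"R1": 0.9, "R2": 0.3},
                   link_quality={"R1": 0.7, "R2": 0.95}))  # R1
```

Sweeping `alpha` from 0 (pure energy balancing) to 1 (pure link quality) is a quick way to generate the lifetime-vs-delivery curves the comparison challenge asks for.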

84.4.5 Expected Learning Outcomes

After completing this lab, you should be able to:

  • Explain the differences between reactive (AODV, DSR) and hierarchical (LEACH) routing
  • Identify when energy-aware routing improves network lifetime
  • Analyze the trade-offs between hop count and energy consumption
  • Implement route discovery and maintenance mechanisms
  • Evaluate protocol performance under different network conditions
  • Design hierarchical routing strategies for large-scale WSNs

84.4.6 Key Observations

  • AODV: on-demand route discovery (flooded RREQ); moderate energy efficiency; good scalability; best for mobile, dynamic networks
  • DSR: on-demand route discovery (source routing); low overhead; limited scalability; best for small networks with stable topology
  • LEACH: cluster-based, periodic route setup; high energy efficiency (aggregation); excellent scalability; best for dense, static sensor deployments

Real-World Applications:
  • AODV: Smart city IoT with mobile nodes (vehicles, wearables)
  • DSR: Indoor sensor networks with stable topology
  • LEACH: Agricultural monitoring with thousands of static sensors
  • Energy-Aware Routing: Battery-powered environmental monitoring

84.4.7 Further Exploration

To extend this lab:

  • Add geographic routing using GPS coordinates
  • Implement Quality of Service (QoS) routing for priority traffic
  • Simulate network partitioning and reconnection
  • Add data aggregation functions (min, max, median) for LEACH
  • Implement duty cycling where nodes sleep to save energy

84.5 Interactive: ETX Path Comparison Calculator

Compare hop-count routing vs ETX-based routing. Adjust link quality for two paths to see which metric makes better decisions.
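If the interactive widget is unavailable, the same comparison takes a few lines. Using the link qualities from quiz question 9, the "longer" 3-hop path actually needs fewer expected transmissions:

```python
def path_etx(link_prrs):
    """Total expected transmissions along a path (per-link ETX = 1/PRR)."""
    return sum(1.0 / p for p in link_prrs)

short = [0.50, 0.50]          # 2 hops, 50% links: hop count prefers this
long_ = [0.95, 0.95, 0.95]    # 3 hops, 95% links: ETX prefers this
print(round(path_etx(short), 2))  # 4.0 expected transmissions
print(round(path_etx(long_), 2))  # 3.16 expected transmissions
```

Try degrading one link of the 3-hop path to see where the two metrics start agreeing again.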


84.7 Chapter Summary

This series explored specialized routing approaches for Wireless Sensor Networks:

Key Takeaways:

  1. WSN Routing is Different: Data-centric, energy-aware, application-specific vs traditional address-centric routing

  2. Directed Diffusion: Interest propagation -> gradient establishment -> data delivery -> reinforcement creates efficient data-driven paths

  3. Data Aggregation: Combining data from multiple sensors reduces transmissions, saving energy while maintaining acceptable accuracy

  4. Link Quality Matters: Hop count is insufficient; MIN-T metric accounts for expected retransmissions on lossy links

  5. WMEWMA: Effective link estimation balances short-term responsiveness (Window Mean) with long-term stability (EWMA)

  6. Trickle Algorithm: Polite gossip achieves zero maintenance cost when consistent, rapid propagation when updated

  7. Trade-offs: Message overhead vs latency vs energy consumption requires protocol selection based on application requirements

WSN routing protocols must navigate unique constraints - extreme energy limits, dense deployment, unreliable links - to achieve application objectives efficiently.


Common Pitfalls

Shortest-hop routing concentrates relay load on nodes near the sink, depleting them 10-100× faster than edge nodes. Always incorporate residual energy into route metric (e.g., ETX × energy factor) to balance consumption and prevent premature network partitioning.

WSN topology changes as nodes die or move — routing tables become stale within hours in dynamic deployments. Implement periodic route discovery with a timeout proportional to expected node lifetime, and use link-quality metrics that decay when no recent transmissions are observed.

Flooding generates O(n²) messages in a 100-node network — a single data collection round produces 10,000 transmissions. Use directed diffusion or tree-based convergecast to reduce collection overhead to O(n) messages.

84.8 Further Reading

  1. Intanagonwiwat, C., et al. (2003). “Directed diffusion for wireless sensor networking.” IEEE/ACM Transactions on Networking, 11(1), 2-16.

  2. Woo, A., Tong, T., & Culler, D. (2003). “Taming the underlying challenges of reliable multihop routing in sensor networks.” ACM SenSys, 14-27.

  3. Levis, P., et al. (2004). “Trickle: A self-regulating algorithm for code propagation and maintenance in wireless sensor networks.” USENIX NSDI, 15-28.

  4. Fasolo, E., et al. (2007). “In-network aggregation techniques for wireless sensor networks: A survey.” IEEE Wireless Communications, 14(2), 70-87.

  5. Gnawali, O., et al. (2009). “Collection tree protocol.” ACM SenSys, 1-14.


84.10 Concept Relationships

  • LEACH vs Direct Transmission (network lifetime): LEACH reduces per-node energy ~3× through hierarchical aggregation; members transmit 50 m instead of 180 m, exploiting the quadratic distance energy model
  • ETX vs Hop Count (routing metric): ETX-based routing saves 23% energy and provides 4.9× better reliability despite longer paths
  • WMEWMA vs Pure EWMA (link estimation): WMEWMA detects link degradation quickly (12% estimate) vs EWMA's stale 85% estimate after 5 consecutive losses
  • Asymmetric Links (bidirectional PRR): forward PRR = 88%, reverse PRR = 64% → effective ETX ≈ 1.78 (not 1.14); forward-only measurement underestimates link cost by ~1.6×
  • AODV vs LEACH (protocol selection): AODV: 95% delivery, 64 mJ/node; LEACH: 87% delivery, 18 mJ/node; the 3.5× energy savings justifies an 8% delivery reduction
  • Tree-Based Aggregation (energy efficiency): 105 transmissions (100 short + 5 long) vs 100 long-distance transmissions saves 70% energy despite a similar count
  • Trickle Suppression (overhead reduction): zero maintenance transmissions when all nodes share the same version; polite gossip achieves efficiency

84.11 What’s Next?

  • WSN Coverage Fundamentals: How to ensure sensor deployments cover the target area without gaps
  • WSN Routing Overview: Series index for all WSN routing chapters
  • Routing Challenges: Why traditional routing fails in sensor networks
  • Directed Diffusion: Data-centric routing with interest propagation
  • Data Aggregation: In-network processing to reduce transmissions
  • Link Quality Routing: ETX and link quality metrics for better path selection
  • Trickle Algorithm: Efficient network reprogramming via polite gossip