64  MWSN Types and Mobile Entities

In 60 Seconds

This chapter surveys where mobile wireless sensor networks operate – underwater, on land, and in the air – and which mobile entities can carry sensors, from smartphones and buses to robots, drones, and AUVs. Understanding these platforms and their trade-offs enables informed design decisions that balance performance, energy efficiency, and scalability in real-world deployments.

Minimum Viable Understanding
  • Mobile WSNs operate in three environments: underwater (acoustic communication, AUV-based), terrestrial (vehicles, robots, animals), and aerial (UAV/drone-based with limited battery life).
  • Everyday entities like smartphones, buses, and cars can serve as mobile sensor platforms – buses are especially valuable because their predictable routes enable consistent temporal sampling.
  • Humans as sensors (participatory sensing via smartphones) provide the most ubiquitous coverage but with unpredictable mobility patterns and privacy concerns.

Sammy the Sensor discovered that sensors don’t just live on circuit boards – they’re EVERYWHERE in daily life!

“Look at your phone!” said Max the Microcontroller. “It has a GPS, accelerometer, camera, microphone, light sensor, and more. You’re basically a walking sensor node!”

Lila the LED was amazed. “So when people walk around with their phones, they’re collecting data about noise levels, traffic, and air quality without even knowing it?”

“That’s called participatory sensing,” said Max. “And there are sensors on buses too! A bus driving its route every day is like a sensor on wheels – it measures the same streets at the same times, which is super useful.”

Bella the Battery added: “Don’t forget underwater sensors! They can’t use regular radio signals because water blocks them. Instead they use sound waves to communicate, just like whales and dolphins do. It’s slower, but it works underwater where radio can’t!”

“And drones are flying sensors!” said Sammy. “They can see a huge area from above, but their batteries only last about 30 minutes. So they have to work fast!”

“Wow,” said Lila. “Sensors are in the ocean, on the ground, in the air, and in our pockets. The whole world is a sensor network!”

64.1 Learning Objectives

By the end of this chapter, you will be able to:

  • Compare MWSN Environments: Differentiate underwater, terrestrial, and aerial mobile sensor networks
  • Select Appropriate Platforms: Match mobility platforms to application requirements
  • Design Human-Centric Sensing: Leverage smartphones and wearables for participatory sensing
  • Plan Vehicle-Based Networks: Utilize cars, buses, and public transit for urban sensing
  • Deploy Robotic Sensing: Design autonomous robot networks for hazardous or precision applications

Key Concepts

  • Core Concept: The mobility platform (human, vehicle, robot, AUV, or UAV) determines a network’s coverage pattern, controllability, and energy budget
  • Key Metric: Spatial coverage per dollar; mobile platforms sample many locations intermittently, while fixed stations sample few locations continuously
  • Trade-off: Controllability vs. coverage: robots and drones measure precisely but cover small areas, while buses, people, and animals cover widely but cannot be directed on demand
  • Protocol/Algorithm: Store-carry-forward (data MULE) collection, in which a mobile node physically carries data from isolated sensors to an upload point, trading latency for delivery reliability
  • Deployment Consideration: Communication medium constraints; water blocks RF and forces slow acoustic links, while aerial platforms are limited by 20–40 minute battery life
  • Common Pattern: Two-tier sensing, where uncontrolled platforms provide cheap baseline coverage and controlled platforms are dispatched to investigate detected anomalies
  • Performance Benchmark: Acoustic links deliver 1–10 kbps with multi-second latency, versus hundreds of kbps and millisecond latency for terrestrial RF

64.2 Prerequisites

Before diving into this chapter, you should be familiar with basic WSN concepts: sensor nodes, sinks and gateways, and wireless communication links.

As a quick orientation, mobile sensor networks operate in three main environments:

Underwater - Think submarines and ocean monitoring

  • Sound travels farther than radio underwater (acoustic communication)
  • Sensors drift with currents or swim (AUVs)
  • Applications: Ocean temperature, marine life tracking, oil spill detection

On Land (Terrestrial) - Think cars, robots, and animals

  • Standard radio communication (Wi-Fi, cellular, Bluetooth)
  • Sensors on wheels, legs, or carried by animals/humans
  • Applications: Traffic monitoring, wildlife tracking, agriculture

In the Air (Aerial) - Think drones and weather balloons

  • Line-of-sight radio communication
  • Powered flight (drones) or passive drift (balloons)
  • Applications: Disaster response, crop surveillance, search and rescue

Daily Life Entities as Sensors:

| Entity | Sensors Available | Coverage Pattern |
|---|---|---|
| Smartphones | GPS, accelerometer, camera, microphone | Unpredictable (human movement) |
| Cars | GPS, cameras, accelerometers, OBD-II | Road networks |
| Buses | GPS, passenger counters | Fixed routes, predictable |
| Robots | Customizable sensor suite | Programmable paths |

64.3 Types of Mobile WSNs

Figure 64.1: Three types of mobile wireless sensor networks and their characteristics

64.4 Underwater Mobile WSNs (UW-MWSNs)

Underwater mobile WSN showing acoustic communication between static sensors and drifting nodes affected by water currents, AUV mobile sink collecting data from underwater sensors via acoustic links, AUV surfacing to upload data via satellite, and 3D deployment space in aquatic environment
Figure 64.2: Underwater Mobile WSN - AUVs and drifting sensors forming mobile networks for aquatic monitoring

Underwater sensor networks monitor aquatic environments where radio frequency signals cannot propagate effectively. Because water absorbs RF energy within metres, these networks rely on acoustic communication – sound waves that travel kilometres through water but at roughly 1,500 m/s (compared to 300,000,000 m/s for RF in air). This five-orders-of-magnitude speed difference creates propagation delays measured in seconds rather than microseconds, fundamentally changing protocol design.

Acoustic vs. RF propagation delay comparison: An underwater sensor 3 km from the base station transmits a 100-byte message.

Acoustic communication (underwater):

  • Speed of sound in water: 1,500 m/s
  • Propagation delay: \(\frac{3{,}000 \text{ m}}{1{,}500 \text{ m/s}} = 2.0 \text{ seconds}\)
  • Acoustic modem data rate: 10 kbps
  • Transmission time: \(\frac{100 \times 8 \text{ bits}}{10{,}000 \text{ bps}} = 0.08 \text{ s} = 80 \text{ ms}\)
  • Total latency: \(2.0 + 0.08 = 2.08 \text{ seconds}\)

RF communication (if it worked in water, which it doesn’t):

  • Speed of light: \(3 \times 10^8\) m/s
  • Propagation delay: \(\frac{3{,}000}{3 \times 10^8} = 10 \text{ µs}\)
  • Typical RF data rate: 250 kbps (802.15.4)
  • Transmission time: \(\frac{800}{250{,}000} = 3.2 \text{ ms}\)
  • Total latency: \(0.01 + 3.2 = 3.21 \text{ ms}\)

Ratio: Acoustic is \(\frac{2{,}080}{3.21} = 648\) times slower than RF. This means underwater protocols must tolerate multi-second round-trip times (RTT = 4+ seconds), making real-time control infeasible and requiring store-and-forward approaches.
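The arithmetic above can be verified with a short script; the distances, speeds, and data rates are taken directly from the worked example (10 kbps acoustic modem, 250 kbps 802.15.4 radio):

```python
# Latency comparison for a 100-byte message over 3 km, using the
# values from the worked example above.

def total_latency_s(distance_m, speed_m_s, payload_bytes, rate_bps):
    """Propagation delay plus transmission (serialization) time, in seconds."""
    propagation = distance_m / speed_m_s
    transmission = payload_bytes * 8 / rate_bps
    return propagation + transmission

acoustic = total_latency_s(3_000, 1_500, 100, 10_000)   # underwater acoustic modem
rf = total_latency_s(3_000, 3e8, 100, 250_000)          # 802.15.4 RF (hypothetical in water)

print(f"acoustic: {acoustic:.2f} s")          # 2.08 s
print(f"RF:       {rf * 1_000:.2f} ms")       # 3.21 ms
print(f"ratio:    {acoustic / rf:.0f}x")      # ~648x slower
```

Note that transmission time, not propagation, dominates the RF case, while the reverse holds underwater; this is why acoustic MAC protocols are designed around propagation delay.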

The deployment space is inherently three-dimensional: sensors may float at different depths, sit on the seabed, or drift with currents. This current-induced mobility means that a node deployed at one location may drift hundreds of metres over days, requiring the network to tolerate shifting topology without manual intervention. Bandwidth is also severely constrained – typical acoustic modems achieve 1–10 kbps compared to hundreds of kbps for terrestrial RF links.

Despite these constraints, underwater MWSNs enable capabilities that would be impossible with fixed shore-based instruments. Ocean monitoring systems track temperature gradients, salinity, and pollution plumes across three-dimensional volumes. Marine biologists use acoustic sensor networks to track whale migration corridors and map fish population density. Energy companies monitor offshore oil and gas infrastructure for structural fatigue and leak detection, while seismology networks detect early tsunami indicators from ocean-floor pressure changes.

64.4.1 Integration with Autonomous Underwater Vehicles (AUVs)

AUVs address a fundamental challenge in underwater networks: getting data from the ocean to the surface. Because acoustic links are slow and unreliable over long distances, AUVs serve as mobile data collectors (data MULEs). A typical mission involves an AUV traversing a pre-planned route, collecting stored readings from stationary seabed sensors via short-range acoustic links, then surfacing to upload the aggregated data via satellite. This store-carry-forward approach trades latency for reliability – data may be hours old when it reaches the cloud, but delivery rates exceed 99% compared to 60–70% for multi-hop acoustic forwarding.
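The store-carry-forward mission can be sketched as a simple collect-then-upload loop; the node identifiers, buffers, and mission structure below are illustrative assumptions, not a real AUV API:

```python
# Minimal sketch of an AUV "data MULE" mission: visit each seabed sensor,
# drain its buffer over a short-range acoustic link, then surface and
# upload everything in one satellite session. Node IDs are hypothetical.

def mule_mission(waypoints, sensor_buffers):
    """Collect buffered readings from each node on the route, in order.

    Short-range acoustic transfers are reliable because the AUV hovers
    close to each node; delivery is deferred until the AUV surfaces
    (store-carry-forward), trading hours of latency for >99% delivery.
    """
    collected = []
    for node_id in waypoints:
        collected.extend(sensor_buffers.pop(node_id, []))  # drain node buffer
    return {"uploaded": collected, "count": len(collected)}

buffers = {"S1": [21.4, 21.5], "S2": [20.9], "S3": []}
result = mule_mission(["S1", "S2", "S3"], buffers)
print(result["count"])  # 3
```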

Figure 64.3: Underwater MWSN architecture with AUV data collection and satellite uplink

64.5 Terrestrial Mobile WSNs

Ground-based mobile sensor networks operate on land surfaces using standard RF communication (Wi-Fi, cellular, Bluetooth, or Zigbee). Unlike their underwater counterparts, terrestrial MWSNs benefit from high-bandwidth, low-latency radio links – but they face their own challenges including terrain obstacles, GPS accuracy in forests and urban canyons, and the energy cost of physical locomotion.

Mobility platforms span a wide range. Wheeled robots offer the best energy efficiency on flat surfaces but are limited to roads and paths. Tracked vehicles handle rough terrain (rubble, mud, slopes) at higher energy cost. Animal-borne sensors – GPS collars on wolves, accelerometers on sea turtles – provide coverage of areas no robot could access, though mobility patterns are unpredictable. Human-carried smartphones represent the most ubiquitous platform, with billions of devices already deployed worldwide.

The application space is correspondingly broad. Wildlife conservationists attach sensor collars to track endangered species across national parks, generating migration maps that inform habitat protection policies. Precision agriculture uses autonomous tractors with soil sensors to create sub-metre-resolution fertility maps, enabling variable-rate fertiliser application that reduces chemical runoff by 20–30%. Search and rescue teams deploy sensor-equipped robots into collapsed buildings where human entry would be too dangerous, while environmental agencies use mobile sensor platforms to map radiation levels around nuclear facilities or track pollution plumes from industrial accidents.

64.5.1 Integration with Unmanned Aerial Vehicles (UAVs)

UAVs complement terrestrial networks by providing an aerial perspective that ground-level sensors cannot achieve. A drone flying at 50 metres altitude can survey a hectare in minutes, detecting hotspots (wildfire, crop stress, flood extent) that ground sensors might miss entirely. UAVs also serve as mobile relay nodes: when a terrestrial sensor network becomes partitioned – perhaps because a gateway failed or terrain blocks line-of-sight – a UAV can fly between fragments, collecting data from isolated clusters and relaying it to functioning base stations. This temporary connectivity restores network function without requiring permanent infrastructure.

64.6 Aerial Mobile WSNs

Aerial sensor networks deploy sensor nodes on flying platforms – primarily multirotor drones and fixed-wing UAVs. These networks offer unmatched flexibility: a drone can be repositioned anywhere in three-dimensional space within minutes, covering areas that would take ground vehicles hours to traverse.

The advantages come with significant constraints. Battery life is the dominant limitation – a typical quadcopter carries 20–40 minutes of flight time, and payload sensors reduce this further. Weather dependence is another factor: wind speeds above 10 m/s ground most consumer drones, and rain or fog degrades both flight safety and sensor quality. However, when airborne, drones enjoy line-of-sight communication over kilometres, enabling high-bandwidth links that terrestrial networks in cluttered environments cannot match.

Aerial MWSNs excel in time-critical, wide-area applications. After a natural disaster, a fleet of drones can survey damage across an entire city in hours rather than the days required for ground teams. Traffic management systems use overhead drones to monitor congestion patterns across highway networks. Precision agriculture employs multispectral camera drones to detect crop stress, pest damage, or irrigation failures across thousands of hectares per day. Wildlife census operations use thermal-imaging drones to count animal populations without the disturbance caused by ground-based surveys.

When multiple UAVs coordinate, they form Flying Ad-hoc Networks (FANETs). Unlike ground-based ad-hoc networks, FANETs experience rapid topology changes as drones move at 10–20 m/s, requiring routing protocols that can reconverge in seconds. Coordinated missions – such as systematic grid search patterns for search-and-rescue – demand distributed planning algorithms that balance coverage completeness against each drone’s remaining battery life.

64.7 Mobile Entities in Daily Life

Mobile entities from our daily lives can serve as opportunistic sensor platforms and data collectors. Each entity type has unique characteristics that influence its suitability for different WSN applications.

Figure 64.4: Mobile entities in daily life serving as sensor platforms

64.8 Humans as Mobile Sensors

Humans carrying smartphones or wearable devices represent the most ubiquitous mobile sensor platform on Earth. A modern smartphone contains a GPS receiver, tri-axis accelerometer, gyroscope, magnetometer, barometer, ambient light sensor, camera, and microphone – more sensing capability than a dedicated weather station from a decade ago, carried by billions of people every day.

What makes human-centric sensing unique is the combination of rich sensor diversity with inherently unpredictable mobility. People follow social mobility patterns (home to work to shops to social gatherings) that provide excellent coverage of urban areas during daytime but leave rural and nighttime coverage sparse. Unlike robotic platforms, human carriers can also annotate data – a pedestrian can photograph a pothole while their phone logs the GPS coordinates, creating richer data than any autonomous system.

Data collection operates in four modes, each with different trade-offs. Continuous background sensing runs always-on monitoring such as step counting or location tracking, providing dense temporal data but consuming battery. Event-triggered sensing activates only when thresholds are crossed – a fall detection algorithm that captures accelerometer data only during sudden impacts, or a noise monitor that records audio only above 85 dB. User-initiated sensing relies on active participation, such as citizens photographing pollution sources or completing environmental surveys. Opportunistic sensing uploads stored data when the device passes near infrastructure (a Wi-Fi access point or Bluetooth beacon), deferring transmission to save cellular data costs.
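Event-triggered sensing is the easiest of these modes to sketch in code. The fragment below keeps the radio and storage idle until an accelerometer sample crosses a threshold; the 2.5 g trigger level and the sample stream are illustrative assumptions:

```python
# Event-triggered sensing sketch (fall-detection style): record
# accelerometer samples only when total acceleration exceeds a
# threshold. Threshold and samples below are illustrative.

import math

FALL_THRESHOLD_G = 2.5  # hypothetical trigger level, in g

def magnitude(ax, ay, az):
    """Total acceleration magnitude from a tri-axis sample."""
    return math.sqrt(ax**2 + ay**2 + az**2)

def triggered_events(samples, threshold=FALL_THRESHOLD_G):
    """Keep only samples that cross the threshold, so storage and the
    radio stay idle during normal activity."""
    return [s for s in samples if magnitude(*s) > threshold]

stream = [(0.0, 0.0, 1.0),   # at rest: ~1 g
          (0.1, 0.2, 1.1),   # walking
          (1.8, 2.1, 2.4)]   # sudden impact: ~3.7 g
print(len(triggered_events(stream)))  # 1
```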

These modes enable diverse urban sensing applications. City-wide noise maps aggregate microphone readings from thousands of commuters to identify noise pollution hotspots with block-level resolution. Air quality networks use wearable pollution sensors to track personal exposure along different commute routes, revealing that cyclists on back streets may experience 40% lower pollution than those on main roads. Traffic monitoring extracts congestion patterns from anonymised GPS traces, while health systems use accelerometer data to track activity levels and sleep patterns across populations.

Real-World Example: Aclima/Google Street View Air Quality

Google Street View cars equipped with air quality sensors measured pollution across San Francisco. Covering the city with stationary sensors would require 10,000+ nodes ($5M+). Mobile sensors on 15 Street View cars achieved citywide coverage for <$500K, demonstrating how mobile sensing trades temporal resolution for spatial coverage.

64.9 Vehicles as Mobile Sensors

Vehicles equipped with sensors provide wide spatial coverage along road networks, combining the sensing density of smartphones with the reliability of fixed infrastructure. A modern connected car already carries GPS, forward-facing cameras, accelerometers, tyre pressure monitors, and OBD-II diagnostic data – all of which can serve double duty as urban sensing infrastructure.

This sensor suite enables multiple simultaneous applications. GPS traces aggregated from thousands of vehicles reveal real-time traffic congestion patterns. Cameras detect available parking spaces, road surface conditions, and traffic signal timing. Accelerometer data identifies potholes and speed bumps – a sudden vertical acceleration spike at a known GPS coordinate, reported by multiple vehicles, pinpoints road damage with sub-metre accuracy. Environmental sensors mounted on fleet vehicles map air quality gradients across cities, showing how pollution levels vary block by block.
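The pothole case can be sketched as crowd confirmation: a single spike might be a loose load, but the same coordinate reported by several vehicles is almost certainly road damage. The 2 g trigger, the three-report quorum, and the coordinate-rounding grid below are illustrative assumptions:

```python
# Sketch of crowd-confirmed pothole detection: a vertical-acceleration
# spike at a GPS coordinate becomes a confirmed pothole only after
# several distinct vehicles report it. All thresholds are illustrative.

from collections import Counter

SPIKE_G = 2.0        # assumed vertical-acceleration trigger, in g
MIN_REPORTS = 3      # independent reports needed for confirmation

def confirmed_potholes(reports, min_reports=MIN_REPORTS):
    """reports: list of (vehicle_id, lat, lon, vertical_g) tuples."""
    hits = Counter()
    for vehicle, lat, lon, g in reports:
        if g > SPIKE_G:
            # round coordinates to ~11 m grid cells so nearby reports merge
            hits[(round(lat, 4), round(lon, 4))] += 1
    return [loc for loc, n in hits.items() if n >= min_reports]

reports = [
    ("car1", -37.81361, 144.96321, 2.8),
    ("car2", -37.81362, 144.96318, 3.1),
    ("car3", -37.81360, 144.96319, 2.4),
    ("car4", -37.90000, 145.00000, 2.9),  # single report: not confirmed
]
print(confirmed_potholes(reports))  # one confirmed location
```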

Comparing crowdsourced bus-based air quality monitoring vs. stationary sensor networks: Melbourne deployed air quality sensors on 30 city buses vs. installing 100 fixed stations.

Stationary network (100 stations):

  • Coverage: 100 fixed points across 300 km² city = 1 sensor per 3 km²
  • Temporal resolution: Continuous (1 reading/minute)
  • Hardware cost: 100 × $2,500 per reference-grade station = $250,000
  • Installation: 100 sites × $500 labor = $50,000
  • Annual maintenance: 100 × $200 = $20,000
  • Total 5-year cost: $250k + $50k + ($20k × 5) = $400,000

Mobile network (30 buses with sensors):

  • Coverage: Buses visit ~600 unique locations per day (routes overlap, but cover 80% of city streets)
  • Temporal resolution: Intermittent (each location sampled 2-6 times/day as buses pass)
  • Hardware cost: 30 × $800 per low-cost particulate sensor = $24,000
  • Installation: 30 buses × $150 mounting = $4,500
  • Annual maintenance: 30 × $100 = $3,000
  • Total 5-year cost: $24k + $4.5k + ($3k × 5) = $43,500

Result: Mobile network achieves 6× better spatial coverage (80% of streets vs. 100 fixed points) at 9× lower cost ($43,500 vs. $400,000), trading continuous temporal sampling (stationary) for intermittent “drive-by” measurements (mobile). For air quality, daily averages are more important than minute-by-minute variations, making mobile sensing highly effective.
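The 5-year totals above follow from a simple cost model, reproduced here with the exact figures from the comparison:

```python
# Reproducing the 5-year total cost of ownership from the Melbourne
# comparison above (all unit costs taken from the example).

def five_year_cost(units, hardware_per_unit, install_per_unit,
                   annual_maintenance_per_unit, years=5):
    """Capital cost (hardware + installation) plus maintenance over `years`."""
    capex = units * (hardware_per_unit + install_per_unit)
    opex = units * annual_maintenance_per_unit * years
    return capex + opex

stationary = five_year_cost(100, 2_500, 500, 200)   # 100 fixed stations
mobile = five_year_cost(30, 800, 150, 100)          # 30 bus-mounted sensors

print(stationary)                      # 400000
print(mobile)                          # 43500
print(round(stationary / mobile, 1))   # 9.2 (the "9x lower cost")
```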

Vehicles can transmit collected data through several channels: cellular upload (highest bandwidth, available everywhere with coverage), Dedicated Short-Range Communication (DSRC) to roadside units at intersections, or vehicle-to-vehicle ad-hoc networks (VANETs) that relay data through nearby cars toward the nearest upload point. The choice depends on latency requirements – safety-critical alerts need cellular or DSRC, while environmental data can tolerate store-and-forward delays.

Public transit vehicles (buses, trams, trains) are particularly valuable as mobile sensing platforms because their routes are fixed and schedules are known. A bus running Route 42 every 15 minutes provides consistent temporal sampling of every street along its path – the same sensors measure the same locations at the same times every day, enabling reliable trend detection. This predictability also simplifies data fusion: when a bus reports a temperature of 32 °C at a given intersection at 2:00 PM, you know exactly when the next measurement will arrive and can interpolate with confidence. Instrumenting a city’s bus fleet is also cost-effective: equipping 200 buses covers most urban corridors, whereas achieving equivalent coverage with personal vehicles would require thousands of voluntary participants.
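Interpolating between scheduled passes is straightforward because the headway is known. In the sketch below, the second reading (33 °C at 2:15 PM) is an illustrative assumption added to complete the example:

```python
# Sketch of estimating a reading between two scheduled bus passes.
# Times are minutes past 2:00 PM; the 2:15 PM reading is hypothetical.

def interpolate(t, t0, v0, t1, v1):
    """Linear interpolation between two drive-by measurements."""
    return v0 + (v1 - v0) * (t - t0) / (t1 - t0)

# Bus passes the intersection at 2:00 PM (32 °C) and 2:15 PM (33 °C);
# estimate the temperature at 2:05 PM.
estimate = interpolate(5, 0, 32.0, 15, 33.0)
print(round(estimate, 2))  # 32.33
```

With irregular platforms (crowdsourced phones), the gap between samples varies wildly, which is exactly why such interpolation cannot be trusted there.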

64.10 Mobile Robots

Autonomous or semi-autonomous robots offer something no other mobile platform provides: fully controllable, programmable mobility. While human carriers follow unpredictable social patterns and vehicles are constrained to road networks, a robot can be commanded to visit specific coordinates in a specific sequence, executing sensing missions with centimetre-level precision.

This controllability translates to several practical advantages. Mission planning algorithms can compute optimal coverage paths that guarantee every square metre of an area is sensed, eliminating the coverage gaps inherent in opportunistic human or vehicle sensing. Robots can operate in environments too dangerous for humans – inspecting nuclear reactor containment vessels, mapping chemical spill boundaries, or searching collapsed buildings after earthquakes. Multi-robot coordination enables parallel coverage: a fleet of 10 robots can survey a warehouse in one-tenth the time of a single unit, with each robot responsible for a non-overlapping zone.

The application domains reflect these strengths. In industrial settings, warehouse robots perform daily inventory counts using RFID readers, covering thousands of pallet locations in hours. Agricultural robots equipped with multispectral cameras and soil probes create high-resolution crop health maps, detecting disease patches before they spread. Hazardous environment monitoring uses radiation-hardened robots to inspect nuclear facilities, chemical plants, and mine shafts. Search and rescue operations deploy tracked robots into rubble where human rescuers cannot safely enter. And on the most extreme frontier, planetary exploration rovers like NASA’s Perseverance demonstrate multi-year autonomous sensing missions where human intervention requires 20-minute light-speed communication delays.

Robot Types:

| Type | Mobility | Environment | Example Application |
|---|---|---|---|
| Wheeled | Fast, efficient | Flat surfaces | Warehouse inventory |
| Tracked | Rough terrain | Outdoor/rubble | Search and rescue |
| Legged | Stairs, obstacles | Indoor/complex | Building inspection |
| Aerial (drone) | 3D movement | Any | Crop surveillance |
| Aquatic | Water | Lakes/ocean | Environmental monitoring |


64.12 Mobile Sensing Platform Comparison: Cost, Coverage, and Controllability

Choosing between mobile sensing platforms involves trade-offs that are not immediately obvious. A robot that costs 100x more than a bus-mounted sensor may cover 10x less area but provides controllable, repeatable measurements.

| Platform | Upfront Cost | Coverage/Day | Control | Best Fit |
|---|---|---|---|---|
| Smartphone (crowdsourced) | $0 (users own devices) | City-wide, user-density dependent | None | Air quality, noise mapping, traffic |
| City bus (instrumented) | $500–$2,000 per bus | 80–150 km fixed routes | Low | Urban environmental monitoring |
| Delivery vehicle | $300–$1,000 per vehicle | 50–200 km, route-dependent | Low | Last-mile coverage, road conditions |
| Ground robot (Turtlebot-style) | $5,000–$20,000 | 2–5 km, battery-limited | Full | Indoor mapping, hazmat inspection |
| Consumer drone (DJI Mavic) | $1,500–$3,000 | 10–30 km per mission | Full | Agriculture, infrastructure inspection |
| Industrial drone (DJI Matrice) | $6,000–$15,000 | 30–80 km per mission | Full | Precision agriculture, surveying |
| AUV (BlueROV2) | $4,000–$8,000 | 5–15 km underwater | Full | Coral reef monitoring, pipeline inspection |
| Animal-borne collar | $200–$1,500 per collar | Animal-driven (15–30 km/day typical) | None | Wildlife ecology, migration tracking |

Quality and maintenance notes:

  • Smartphones: Variable data quality because sensors and mounting positions differ; essentially no hardware maintenance.
  • City buses and delivery vehicles: Consistent, calibrated sensors with low annual maintenance.
  • Ground robots and drones: Highest controllability and measurement precision, but moderate to high maintenance and tighter battery limits.
  • AUVs: High-quality sonar and water sensing, but underwater operations raise maintenance cost.
  • Animal-borne collars: Moderate GPS quality with extremely low maintenance after deployment, but no path control.

Key Insight – Controllability vs Coverage Trade-off: Platforms with full controllability (robots, drones) provide precise, repeatable measurements but cover small areas due to battery constraints. Platforms with no controllability (buses, animals, humans) cover large areas cheaply but cannot be directed to specific locations on demand. The optimal solution for most monitoring applications is a two-tier approach: uncontrolled platforms for baseline coverage (buses for 80% of urban area) supplemented by controlled platforms for targeted investigation (drone dispatched to investigate anomalies detected by bus sensors).
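The two-tier pattern reduces to a simple dispatch rule: let the uncontrolled tier flag anomalies and send the controlled tier only where flagged. The baseline value, anomaly factor, and readings below are illustrative assumptions:

```python
# Two-tier sensing sketch: bus-mounted sensors provide baseline PM2.5
# coverage; a controllable drone is dispatched only where a reading is
# anomalous. Baseline, factor, and readings are illustrative.

BASELINE_PM25 = 12.0   # assumed city-wide average, ug/m^3
ANOMALY_FACTOR = 2.0   # dispatch when a reading exceeds 2x baseline

def dispatch_targets(bus_readings, baseline=BASELINE_PM25,
                     factor=ANOMALY_FACTOR):
    """bus_readings: list of (location, pm25). Returns locations that
    warrant a targeted drone survey."""
    return [loc for loc, pm25 in bus_readings if pm25 > factor * baseline]

readings = [("Collins St", 11.2), ("Docklands", 31.5), ("Carlton", 14.8)]
print(dispatch_targets(readings))  # ['Docklands']
```

This keeps the expensive, battery-limited tier idle most of the time while guaranteeing that no flagged anomaly goes uninvestigated.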

64.13 Worked Example: Bus Fleet vs Crowdsourced Smartphones for City Air Quality Mapping

Worked Example: Melbourne Air Quality – Which Platform Delivers Better Data?

Scenario: The City of Melbourne wants to map PM2.5 air pollution at block-level resolution across 227 km². Two approaches are proposed: (A) instrumenting the city bus fleet, and (B) a crowdsourced smartphone app with portable sensors. Budget: $150,000 for Year 1.

Option A – Bus Fleet (210 buses on 60 routes):

| Factor | Value |
|---|---|
| Sensors per bus | 1 calibrated PM2.5 sensor ($450 each) |
| Installation cost | $200 per bus (bracket, wiring, GPS integration) |
| Routes covered | 60 routes, 1,200 km of road network |
| Temporal sampling | Same streets every 12–18 minutes (bus frequency) |
| Data quality | Factory-calibrated, identical sensors, consistent mounting height (2.5 m) |
| Year 1 cost | 210 × $650 + $8,000 software + $5,000 cellular = $149,500 |
| Coverage gaps | No coverage outside bus routes (residential streets, parks) |

Option B – Crowdsourced Smartphones (target: 5,000 participants):

| Factor | Value |
|---|---|
| Portable sensor per user | $25 clip-on PM2.5 sensor paired via Bluetooth |
| App development | $60,000 (iOS + Android) |
| Recruitment cost | $5 incentive per participant × 5,000 = $25,000 |
| Routes covered | Unpredictable – depends on user movement patterns |
| Temporal sampling | Highly variable (some streets sampled 100×/day, others 0×/week) |
| Data quality | Uncalibrated sensors, varying heights (pocket, backpack, hand), user compliance unknown |
| Year 1 cost | $60,000 app + $125,000 sensors + $25,000 incentives = $210,000 (over budget) |
| Coverage gaps | Excellent in CBD, poor in suburbs with low smartphone density |

Head-to-head comparison:

| Metric | Bus Fleet | Crowdsourced | Winner |
|---|---|---|---|
| Budget compliance | $149,500 (under) | $210,000 (over by 40%) | Bus |
| Spatial coverage (road km) | 1,200 km (predictable) | 800–2,500 km (unpredictable) | Depends on participation |
| Temporal consistency | Same measurement every 15 min | Random, often none at night | Bus |
| Data calibration | Factory-calibrated, comparable | Uncalibrated, 30% sensor-to-sensor variance | Bus |
| Residential street coverage | None | Some (if users live there) | Crowdsourced |
| Data availability (hours/day) | 18 hours (5 AM to 11 PM bus service) | 24 hours (but sparse overnight) | Tie |
| Maintenance burden | Low (bus depot services sensors monthly) | High (users lose/break sensors, stop participating) | Bus |
| Year 2+ cost | $15,000 (cellular + calibration) | $50,000 (replacement sensors + new incentives) | Bus |

Melbourne’s decision: Bus fleet for baseline monitoring (reliable, calibrated, reproducible year-over-year), supplemented by 200 crowdsourced volunteers for residential street spot-checks. Total Year 1: $149,500 (bus) + $12,000 (200 volunteers) = $161,500.

Key Insight: Buses win on data quality and cost because predictable routes produce consistent, calibrated measurements that are directly comparable across time. Crowdsourcing wins on coverage breadth but produces noisy, unreliable data that is difficult to use for regulatory compliance. The optimal strategy uses buses for the 80% of urban area they cover and targeted crowdsourcing for the remaining 20%.

Common Pitfalls

Relying on theoretical models without profiling actual behavior leads to designs that miss performance targets by 2-10×. Always measure the dominant bottleneck in your specific deployment: acoustic propagation delays, low-cost sensor variance, and real drone flight times routinely differ from textbook assumptions.

Optimizing one parameter in isolation (latency, coverage, energy) without considering its impact on the others creates systems that excel on benchmarks but fail in production. A fully controllable robot fleet maximizes measurement precision at the expense of coverage, just as free crowdsourced coverage sacrifices data quality; document the central trade-offs before finalizing any design decision and verify them with realistic workloads.

Most field failures come from edge cases that work in the lab: intermittent connectivity, partial node failure, clock drift, and buffer overflow under peak load. Mobile networks add their own, such as nodes drifting out of acoustic range, participants abandoning a crowdsensing app, and coverage gaps when transit service stops overnight. Explicitly design and test failure handling before deployment; retrofitting error recovery after deployment costs 5-10× more than building it in.

64.14 Summary

This chapter covered the types of mobile wireless sensor networks and mobile entities:

  • Underwater MWSNs: Acoustic communication, 3D deployment, current-induced mobility, AUV integration for ocean and marine monitoring
  • Terrestrial MWSNs: Ground-based platforms (robots, vehicles, animals) for wildlife tracking, agriculture, and smart cities
  • Aerial MWSNs: UAV-based networks with wide coverage but limited battery life for disaster response and surveillance
  • Human-Centric Sensing: Smartphones and wearables providing unpredictable but ubiquitous coverage with rich sensor diversity
  • Vehicle-Based Sensing: Cars and buses covering road networks with predictable (transit) or opportunistic (personal) patterns
  • Robotic Sensing: Controllable autonomous platforms for hazardous environments and precision applications

64.15 What’s Next

| Topic | Chapter | Description |
|---|---|---|
| Overview | WSN Stationary vs Mobile | Complete summary of stationary and mobile architectures |
| Human-Centric Sensing | WSN Human-Centric Networks and DTN | Participatory sensing and delay-tolerant networking |
| Production Deployment | WSN Production Deployment | Real-world deployment frameworks and cost analysis |