Catalogue the 20+ sensors available in modern smartphones and explain their roles in IoT sensing applications
Access smartphone sensors using both Web APIs and native platform SDKs for data collection
Design participatory sensing applications that balance data quality with user privacy and battery optimization
Implement mobile sensor data processing including sensor fusion, activity recognition, and geofencing
In 60 Seconds
Your smartphone packs 20+ sensors (accelerometer, gyroscope, GPS, camera, microphone, barometer, and more) into a pocket-sized platform. This chapter covers how to access them via Web and Native APIs, design participatory sensing applications with privacy protection, and build hands-on mobile sensing projects.
IMU in Smartphones: The 6- or 9-axis inertial measurement unit in smartphones enables activity recognition (steps, runs, falls), orientation tracking, gesture detection, and dead-reckoning navigation when GPS is unavailable
GPS + Sensor Fusion: Combining GPS (accurate but slow, power-hungry) with accelerometer and gyroscope (fast, low-power) through a Kalman filter gives smooth, continuous position and orientation even through GPS outages in tunnels or indoors
Barometric Altitude: Smartphone barometers measure absolute air pressure to roughly ±1 hPa, but resolve relative changes of about 0.1 hPa (≈1 m of altitude change near sea level), enabling floor-level altitude detection for indoor navigation and elevator detection
Camera as a Sensor: Smartphone cameras enable visual sensing: QR code scanning, computer vision object detection, document scanning, color measurement, and as a building block for augmented reality applications
Microphone for Environmental Sensing: Smartphone microphones enable sound level measurement (noise pollution mapping), speech-to-text, audio fingerprinting (identifying locations by acoustic signature), and earthquake detection through low-frequency vibration
Participatory Sensing Platform: Using millions of smartphones as a distributed sensor network; applications include noise pollution mapping, pothole detection from accelerometer data, weather observation crowdsourcing, and traffic monitoring from GPS traces
Sensor Fusion Algorithms on Mobile: On-device fusion algorithms (complementary filter, Madgwick filter, Mahony filter) combine accelerometer, gyroscope, and magnetometer to produce stable orientation quaternions used for AR, navigation, and gaming
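The fusion idea in the last point can be sketched with a one-axis complementary filter: integrate the gyroscope for fast response, then pull the estimate toward the accelerometer's gravity-derived tilt to cancel drift. This is a minimal illustration, not any platform's API; the function names and the 0.98 blend factor are illustrative choices.

```python
import math

def accel_pitch(ax, ay, az):
    """Absolute pitch angle (rad) inferred from the gravity direction
    in accelerometer readings (only valid when the device is near rest)."""
    return math.atan2(-ax, math.sqrt(ay * ay + az * az))

def complementary_filter(pitch, gyro_rate, ax, ay, az, dt, alpha=0.98):
    """One update step: blend fast-but-drifting gyro integration with
    slow-but-stable accelerometer tilt. alpha near 1 trusts the gyro
    on short timescales; (1 - alpha) slowly corrects the drift."""
    gyro_est = pitch + gyro_rate * dt      # integrate angular rate
    accel_est = accel_pitch(ax, ay, az)    # drift-free tilt from gravity
    return alpha * gyro_est + (1 - alpha) * accel_est
```

With a stationary, level device (gravity on the z-axis, zero gyro rate), repeated updates pull any initial pitch error toward zero, which is exactly the drift-correction behavior the production filters (Madgwick, Mahony) generalize to full 3D quaternions.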
For Beginners: Your Phone as a Sensor
Your smartphone is actually a powerful IoT device disguised as a phone. It packs 20+ sensors – an accelerometer (detects motion), a gyroscope (detects rotation), GPS (tracks location), a barometer (measures air pressure), cameras, and microphones – all connected to the internet. This chapter shows you how to tap into those sensors for data collection, turning the phone in your pocket into a sophisticated sensing platform.
Overview
Modern smartphones are the most sensor-rich consumer devices ever created, integrating 20+ sensors into a pocket-sized platform. This comprehensive chapter explores how to leverage smartphones as powerful IoT sensor nodes for participatory sensing, crowdsourcing, and mobile data collection applications.
What You’ll Learn
This chapter is divided into four parts covering the complete mobile sensing ecosystem:
Sensor capabilities and architecture - Understanding the 20+ sensors in modern smartphones
Web and Native APIs - Accessing sensors through browser APIs and platform-specific SDKs
Participatory sensing and privacy - Building crowdsourcing applications with privacy protection
Hands-on labs and assessment - Practical exercises and knowledge checks
Start with Part 1 to understand sensor capabilities and architecture
Continue to Part 2 to learn API access methods
Study Part 3 for application design patterns and best practices
Complete Part 4 labs to gain hands-on experience
For quick reference: Each part is self-contained and can be read independently.
Quick Start
Choose Your Path
Beginner path (70-90 minutes):
1. Read Part 1 introduction and beginner sections
2. Skim Part 2 Web APIs examples
3. Complete Lab 1 (accelerometer data collection)

Intermediate path (120-150 minutes):
1. Read Parts 1-3 completely
2. Complete Labs 1-3
3. Take the knowledge check quiz

Advanced path (180-240 minutes):
1. Study all four parts in depth
2. Complete all labs including advanced variants
3. Build a custom participatory sensing application
Quantify fall detection reliability across platforms. A fall produces peak acceleration \(a_{peak} > 2g\) followed by low motion (acceleration magnitude staying within 0.2 g of the resting 1 g gravity baseline for >3 seconds). Detection requires sampling at \(f_s \geq 50\) Hz to capture the impact transient.
Phone-based detection (60% accuracy): User carries phone in pocket only 12 hours/day (50% coverage). Phone orientation varies, requiring 3-axis magnitude: \(a_{mag} = \sqrt{a_x^2 + a_y^2 + a_z^2}\). False negatives: phone on desk during 50% of waking hours.
Wrist-worn smartwatch (90% accuracy): Always on body. Power budget: 200mAh battery, accelerometer at 50Hz draws 150µA. Daily consumption: \(150µA \times 24h = 3.6mAh\). Display + BLE: 50mAh/day. Battery life: \(200mAh / 53.6mAh = 3.7\) days.
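The smartwatch battery budget above can be checked with a few lines of arithmetic (the helper names are hypothetical; the numbers come from the figures in this paragraph):

```python
def daily_mah(current_ua: float, hours: float = 24.0) -> float:
    """Convert a continuous current draw in microamps to mAh consumed per day."""
    return current_ua / 1000.0 * hours

accel_mah = daily_mah(150)      # accelerometer at 50 Hz: 3.6 mAh/day
total_mah = accel_mah + 50      # plus the display + BLE budget (50 mAh/day)
life_days = 200 / total_mah     # 200 mAh battery -> about 3.7 days
```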
Medical pendant (98% accuracy): Neck-worn guarantees body contact. Sampling at 100Hz with adaptive thresholds: \(a_{threshold} = a_{baseline} + 1.5g\) where \(a_{baseline}\) tracks user’s typical daily motion. False alarm rate: <1 per 6 months (\(p_{false} < 0.0055\)).
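The detection logic described above (orientation-independent magnitude, a >2 g impact, then several seconds of stillness) can be sketched as follows. This is a simplified threshold detector, not a medical-grade algorithm: the function names are illustrative, and the "low motion" condition is interpreted here as the magnitude staying close to the resting 1 g gravity baseline.

```python
import math

def magnitude(ax, ay, az):
    """Orientation-independent acceleration magnitude (inputs in g)."""
    return math.sqrt(ax * ax + ay * ay + az * az)

def detect_fall(samples, fs=50, impact_g=2.0, still_g=0.2, still_s=3.0):
    """samples: list of (ax, ay, az) tuples in g, sampled at fs Hz.
    Returns True when an impact exceeding impact_g is followed by at
    least still_s seconds during which the magnitude stays within
    still_g of the 1 g resting baseline (device at rest)."""
    still_n = int(still_s * fs)
    mags = [magnitude(*s) for s in samples]
    for i, m in enumerate(mags):
        if m > impact_g:
            window = mags[i + 1 : i + 1 + still_n]
            if len(window) == still_n and all(abs(w - 1.0) < still_g for w in window):
                return True
    return False
```

A production pendant would add the adaptive baseline tracking mentioned above (raising the threshold with the user's typical daily motion) rather than using fixed constants.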
| Criterion | Phone | Smartwatch | Medical pendant |
|---|---|---|---|
| False alarm rate | High (phone dropped ≠ fall) | Medium (hand gestures can trigger) | Low (medical-grade algorithms) |
| User burden | Low (already owned) | Medium (extra device to charge) | Medium (medical device stigma) |
| Cost | $0 (app only) | $150-400 (smartwatch) | $200-800 (medical pendant) |
| Regulatory | No certification | No medical cert (consumer) | FDA/CE certified |
| Response time | 2-5 sec (if on body) | 1-3 sec | <1 sec (cellular built-in) |
Decision matrix:
Choose Phone if:
- Pilot/research study (validate concept before hardware investment)
- Budget constrained (<$10K)
- Short-term monitoring (weeks, not years)
- User population tech-savvy and always carries phone

Choose Dedicated Medical if:
- Life-critical application (high liability)
- Regulatory compliance required
- User population includes dementia/cognitive impairment (may not use phone)
- False negative rate must be <5%
Real deployment example - Elderly Care Facility (200 residents):
Bottom line: Phones excel at pilot studies and opportunistic sensing. For continuous, safety-critical monitoring, purpose-built wearables or medical devices are essential. Always prototype with phones first to validate demand before investing in dedicated hardware.
13.2 Knowledge Checks
Quick Check: Smartphone Sensor Count
Approximately how many sensors are integrated in a modern flagship smartphone?
3-5 sensors
20+ sensors
50-100 sensors
Only 1 (the camera)
Answer
B) 20+ sensors. Modern smartphones include accelerometers, gyroscopes, magnetometers, GPS/GNSS receivers, barometers, ambient light sensors, proximity sensors, multiple cameras, microphone arrays, fingerprint sensors, face recognition sensors, temperature sensors, and more. This makes them the most sensor-rich consumer devices ever created.
Quick Check: Participatory Sensing
What makes participatory sensing fundamentally different from traditional sensor networks?
It uses more expensive sensors
It relies on volunteer smartphone users rather than dedicated deployed hardware
It only works indoors
It requires wired connections
Answer
B) It relies on volunteer smartphone users rather than dedicated deployed hardware. Participatory sensing leverages the billions of smartphones already carried by people worldwide, turning everyday users into data contributors. This enables massive geographic coverage at minimal infrastructure cost, but introduces challenges around data quality, user incentivization, and privacy protection.
Matching Quiz: Smartphone Sensors and Their Applications
Ordering Quiz: Mobile Participatory Sensing Design
Common Pitfalls
1. Confusing Phone Accelerometer Axes with World Coordinates
Smartphone accelerometer axes are fixed to the phone body, not the world. Rotating the phone changes the gravity vector across axes. Always apply a rotation to map phone-frame readings to world-frame coordinates using the device orientation before interpreting acceleration as motion.
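The phone-to-world mapping can be sketched with a roll-pitch-yaw rotation matrix. Conventions differ across platforms (Android and iOS each define their own reference frames), so this assumes Z-Y-X Euler angles and a world frame with +Z pointing up; function names are illustrative.

```python
import math

def rotation_matrix(roll, pitch, yaw):
    """Z-Y-X (yaw-pitch-roll) rotation matrix mapping phone-frame
    vectors into the assumed world frame."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]

def to_world_frame(accel_phone, roll, pitch, yaw):
    """Rotate a phone-frame acceleration vector into world coordinates."""
    R = rotation_matrix(roll, pitch, yaw)
    return [sum(R[i][j] * accel_phone[j] for j in range(3)) for i in range(3)]

def linear_acceleration(accel_world, g=9.81):
    """Subtract gravity (world +Z up) to leave motion-only acceleration."""
    return [accel_world[0], accel_world[1], accel_world[2] - g]
```

Only after this rotation does a large x-axis reading mean "the phone moved sideways" rather than "the phone is tilted"; a device lying flat and at rest should yield near-zero linear acceleration on all three world axes.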
2. Relying on Smartphone GPS in Urban Canyons
Tall buildings in urban environments reflect GPS signals (multipath), causing position errors of 10-100 m. Smartphone GPS is not suitable for applications requiring meter-level accuracy in dense urban environments. Use Wi-Fi positioning (Google Fused Location Provider) or cellular positioning as supplementary sources.
3. Insufficient Battery Budget for Continuous Sensing
Running GPS at 1 Hz update rate continuously drains a smartphone battery in approximately 4-6 hours. Running the camera continuously reduces this to 2-3 hours. Design sensing applications with duty cycling — collect bursts of sensor data, pause to allow battery recovery, and sync data when charging.
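The payoff of duty cycling can be estimated with a simple average-current model. The current figures below are hypothetical placeholders chosen to roughly reproduce the 4-6 hour continuous-GPS figure above; real draws vary widely by device.

```python
def battery_hours(capacity_mah, active_ma, idle_ma, duty_cycle):
    """Estimated runtime in hours when sensing is duty-cycled.
    duty_cycle is the fraction of time spent actively sampling (0..1);
    the battery drains at the time-weighted average current."""
    avg_ma = duty_cycle * active_ma + (1 - duty_cycle) * idle_ma
    return capacity_mah / avg_ma

continuous = battery_hours(3000, 600, 20, 1.0)   # ~5 h, matching the text
duty_cycled = battery_hours(3000, 600, 20, 0.1)  # 10% duty cycle: ~38 h
```

Even a crude 10% duty cycle (e.g., a 6-second GPS burst per minute) stretches the runtime several-fold, which is why burst-collect-pause scheduling is the default pattern for continuous mobile sensing.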
4. Assuming Sensor Quality is Uniform Across Devices
Smartphone sensor quality varies enormously between manufacturers and models. A budget Android phone may have ±3% RH humidity sensor accuracy while a flagship model achieves ±1%. Calibration and capability testing on the specific target device model is essential before deploying a sensing application at scale.
13.3 What’s Next
If you want to…
Read this
Learn about mobile phone sensor APIs and how to access them