20  Mobile Sensors Assessment

In 60 Seconds

This assessment covers 10 knowledge check questions and a 10-question comprehensive review quiz on smartphone sensor types, Web APIs (Generic Sensor, Geolocation, DeviceOrientation), participatory sensing, privacy protection (differential privacy, k-anonymity), and battery optimization strategies for mobile IoT sensing.

Key Concepts
  • Accelerometer: a MEMS sensor that measures proper acceleration along one or more axes; used in smartphones for screen rotation, step counting, and activity classification
  • Gyroscope: a sensor measuring angular velocity (rotation rate) around one or more axes; combined with accelerometer data enables orientation and gesture recognition
  • Sensor Fusion: combining data from multiple sensors (e.g., accelerometer + gyroscope + magnetometer) using algorithms like Kalman filters to produce more accurate and stable estimates than any single sensor
  • Activity Recognition: using machine learning classifiers applied to sensor time-series data to infer user activities (walking, running, driving, stationary) from a smartphone
  • Sampling Rate: the frequency at which sensor readings are captured; higher rates capture rapid motion but consume more power and storage; must exceed twice the highest frequency of interest (Nyquist)
  • Context Sensing: inferring higher-level user context (location type, physical activity, social situation) from low-level sensor streams to enable adaptive app behaviour
  • MEMS: Micro-Electro-Mechanical Systems — miniaturised mechanical and electro-mechanical elements fabricated on semiconductor substrates; the technology behind modern smartphone sensors
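
The sensor-fusion entry above can be illustrated with the simplest fusion scheme, a complementary filter: trust the gyroscope over short timescales and let the accelerometer's gravity-derived tilt estimate correct long-term drift. A minimal sketch (the function name and constants are illustrative, not from any particular library):

```javascript
// Complementary filter for one tilt axis (sketch).
// gyroRateDegS: angular velocity from the gyroscope (deg/s)
// accelAngleDeg: tilt angle derived from the accelerometer's gravity vector (deg)
// alpha near 1 favours the low-noise short-term gyro; (1 - alpha) lets the
// noisy-but-unbiased accelerometer pull the estimate back over time.
function complementaryFilter(prevAngleDeg, gyroRateDegS, accelAngleDeg, dt, alpha = 0.98) {
    return alpha * (prevAngleDeg + gyroRateDegS * dt) + (1 - alpha) * accelAngleDeg;
}

// With a stationary device (gyro reads 0) the estimate converges to the
// accelerometer's tilt reading.
let angle = 0;
for (let i = 0; i < 500; i++) {
    angle = complementaryFilter(angle, 0, 10, 0.01);
}
console.log(angle.toFixed(2)); // converges toward 10
```

Kalman filters achieve the same goal with statistically optimal weighting, but the complementary filter captures the core idea in one line of arithmetic.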

20.1 Learning Objectives

This assessment will test your ability to:

  • Distinguish smartphone sensor types and explain their applications
  • Compare Web APIs for sensor access and their browser compatibility
  • Assess participatory sensing principles and design trade-offs
  • Evaluate privacy protection techniques including differential privacy and k-anonymity
  • Apply battery optimization strategies to extend mobile sensing duration
  • Analyze sensor fusion techniques for indoor and outdoor navigation

Mobile sensing battery life is governed by duty cycle:

\[ P_{\text{avg}} = P_{\text{idle}} + d \cdot (P_{\text{active}} - P_{\text{idle}}), \qquad t_{\text{life}} = \frac{E_{\text{battery}}}{P_{\text{avg}}} \]

with duty cycle \(d = t_{\text{active}}/t_{\text{period}}\).

Worked example: If a phone has \(E_{\text{battery}} = 12.95\) Wh, idle power is 250 mW, and sensing bursts use 1.8 W for 8s every 60s:

\[ d = 8/60 = 0.133,\quad P_{\text{avg}} = 0.25 + 0.133(1.8-0.25)=0.456 \text{ W} \]

Battery life is \(12.95/0.456 \approx 28.4\) hours. If sampling is relaxed to every 5 minutes (\(d=8/300\)), \(P_{\text{avg}}\) drops to about 0.291 W and battery life rises to about 44.5 hours.
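
The duty-cycle arithmetic above can be packaged as a small helper (a sketch; the function names are illustrative):

```javascript
// Duty-cycled average power and battery life (sketch).
// P_avg = P_idle + d * (P_active - P_idle); t_life = E_battery / P_avg.
function avgPowerW(pIdleW, pActiveW, dutyCycle) {
    return pIdleW + dutyCycle * (pActiveW - pIdleW);
}

function batteryLifeHours(batteryWh, avgW) {
    return batteryWh / avgW;
}

// Worked example from the text: 1.8 W bursts of 8 s every 60 s, 250 mW idle.
const d = 8 / 60;
const pAvg = avgPowerW(0.25, 1.8, d);        // ≈ 0.457 W
const life = batteryLifeHours(12.95, pAvg);  // ≈ 28.4 h
console.log(pAvg.toFixed(3), life.toFixed(1));
```

Re-running with d = 8/300 reproduces the relaxed-sampling figures quoted above.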

Worked scenario: optimizing a mobile sensing power budget

Scenario: A city-wide noise monitoring app collects sound levels every 10 seconds using the microphone, plus GPS location. Initial version drains phone battery from 100% to 0% in 4 hours.

Step 1: Measure baseline power consumption

Component           Power Draw   Duty Cycle   Avg Power
GPS (continuous)    450 mW       100%         450 mW
Microphone          150 mW       100%         150 mW
Audio processing    300 mW       50%          150 mW
Network (cellular)  800 mW       10%          80 mW
Screen (off)        50 mW        100%         50 mW
----------------------------------------------------------
Total average power:                          880 mW

Battery calculation:

  • Phone battery: 3500 mAh × 3.7V = 12,950 mWh = 12.95 Wh
  • Theoretical battery life = 12,950 mWh / 880 mW = 14.7 hours at 880 mW average

But measured life is only 4 hours! Why? The table's duty-cycle figures are optimistic: GPS and cellular radios linger in high-power "tail" states after each activation rather than sleeping immediately, so real average consumption is closer to 3,240 mW (12,950 mWh / 4 hours).

Step 2: Optimization Strategy 1 - Adaptive GPS sampling

// Original: continuous GPS polling (every second)
// Optimized: significant-change updates with coarse positioning
if ('geolocation' in navigator) {
    navigator.geolocation.watchPosition(
        callback,   // discard updates that moved <100 m inside the callback —
        error,      // the Geolocation API has no built-in distance filter
        { enableHighAccuracy: false, // use cell/Wi-Fi positioning instead of GPS
          maximumAge: 60000,         // accept a cached position up to 1 minute old
          timeout: 30000 }
    );
}
// Power reduction: 450 mW → 50 mW (9× reduction)

Step 3: Optimization Strategy 2 - Audio batching

// Original: record 10-second clips continuously
// Optimized: record 10s every 5 minutes
setInterval(() => {
    recordAudio(10000); // 10-second sample
    processAndUpload();
}, 300000); // Every 5 minutes

// Audio duty cycle: 10s / 300s = 3.3%
// Power: (150 + 150) mW × 0.033 = 10 mW average
// vs. continuous 300 mW (30× reduction)

Step 4: Optimization Strategy 3 - Data batching

// Original: upload after each 10s sample
// Optimized: batch 1 hour of data, upload once
let dataBuffer = [];
setInterval(() => {
    dataBuffer.push(currentSample);
    if (dataBuffer.length >= 12) { // 12 × 5min = 1 hour
        uploadBatch(dataBuffer);
        dataBuffer = [];
    }
}, 300000);

// Network duty cycle: 5s / 3600s = 0.14%
// Power: 800 mW × 0.0014 = 1.1 mW average
// vs. continuous 80 mW (73× reduction)

Optimized power budget:

Component             Power Draw   Duty Cycle   Avg Power
GPS (location change) 450 mW       11%          50 mW
Microphone            150 mW       3.3%         5 mW
Audio processing      300 mW       3.3%         10 mW
Network (batched)     800 mW       0.14%        1.1 mW
Screen (off)          50 mW        100%         50 mW
----------------------------------------------------------
Total average power:                            116.1 mW

Result:

  • Battery life = 12,950 mWh / 116.1 mW = 111.6 hours (4.7 days)
  • 28× battery life improvement (4 hours → 111 hours)
  • Data quality maintained: 12 samples/hour still provides city-wide noise coverage

Key lessons:

  1. GPS is the biggest battery drain - use significant change triggers, not continuous polling
  2. Batch uploads save 10-100× power - network radio wake-up costs dominate transmission time
  3. Reduce sampling frequency - 10-second clips every 5 minutes still capture noise patterns
  4. Use accelerometer as wake trigger - only sample audio when phone detects motion (user is active)
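
Lesson 4 can be sketched as a simple motion gate: compute the variance of the accelerometer magnitude over a short window and only wake the expensive sensors when it exceeds a threshold (a sketch; the threshold and window length are assumptions, not measured values):

```javascript
// Motion gate: decide whether to wake power-hungry sensors (sketch).
// magnitudes: recent accelerometer magnitudes in m/s^2.
function isMoving(magnitudes, varianceThreshold = 0.5) {
    const mean = magnitudes.reduce((a, b) => a + b, 0) / magnitudes.length;
    const variance = magnitudes.reduce((a, b) => a + (b - mean) ** 2, 0) / magnitudes.length;
    return variance > varianceThreshold;
}

// Stationary: magnitude hovers around gravity (~9.81 m/s^2).
console.log(isMoving([9.80, 9.81, 9.82, 9.79, 9.81])); // false
// Walking: magnitude swings with each step.
console.log(isMoving([8.5, 11.2, 9.1, 12.4, 7.9]));    // true
```

Because the accelerometer itself draws only a few milliwatts, running this check continuously costs far less than leaving the microphone or GPS on.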

Production optimization: Add battery-aware adaptation:

let samplingInterval = 60000; // default: every 1 minute

navigator.getBattery().then(battery => {
    const adapt = () => {
        if (battery.level < 0.2) {        // <20% battery
            samplingInterval = 600000;     // every 10 minutes
        } else if (battery.level < 0.5) { // <50% battery
            samplingInterval = 300000;     // every 5 minutes
        } else {
            samplingInterval = 60000;      // every 1 minute (high battery)
        }
    };
    adapt();                                        // set the initial interval
    battery.addEventListener('levelchange', adapt); // re-adapt as the battery drains
});

This demonstrates how systematic power analysis and adaptive strategies transform an impractical 4-hour battery life into a usable 4+ day deployment.

20.2 Knowledge Check

Test your knowledge of mobile phone sensing and its applications in IoT systems with these 10 questions.

Question 1

Which sensor is typically used for step counting in fitness trackers?

  1. Gyroscope
  2. Accelerometer
  3. Magnetometer
  4. Barometer

B) Accelerometer

The accelerometer is the primary sensor for step counting because it detects linear acceleration in 3 axes (x, y, z). Step detection algorithms:

  1. Calculate magnitude: √(x² + y² + z²)
  2. Detect peaks: When magnitude exceeds threshold (typically 10.8-12.7 m/s², or 1.1-1.3g)
  3. Apply timing filter: Minimum interval between steps (300-500 ms)
  4. Count valid peaks: Each peak = one step
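
The four steps above can be combined into a small peak-detection counter (a sketch; the threshold and refractory interval follow the ranges quoted above):

```javascript
// Threshold-plus-refractory-period step counter (sketch).
// samples: [{t: ms, x, y, z}] accelerometer readings in m/s^2.
function countSteps(samples, threshold = 11.0, minIntervalMs = 300) {
    let steps = 0;
    let lastStepT = -Infinity;
    for (const { t, x, y, z } of samples) {
        const magnitude = Math.sqrt(x * x + y * y + z * z); // step 1: magnitude
        if (magnitude > threshold && t - lastStepT >= minIntervalMs) { // steps 2-3
            steps += 1; // step 4: count valid peak
            lastStepT = t;
        }
    }
    return steps;
}

// Synthetic trace: three spikes above threshold, the second too soon after the first.
const trace = [
    { t: 0,   x: 0, y: 0, z: 9.8 },
    { t: 100, x: 0, y: 0, z: 12.0 }, // step
    { t: 250, x: 0, y: 0, z: 12.5 }, // within 300 ms of previous: ignored
    { t: 600, x: 0, y: 0, z: 12.1 }, // step
];
console.log(countSteps(trace)); // 2
```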

Why not other sensors?

  • Gyroscope: Measures rotation, not linear movement
  • Magnetometer: Detects magnetic fields (used for compass)
  • Barometer: Measures air pressure (used for altitude/floors)

Modern fitness trackers often combine accelerometer + gyroscope for better accuracy in detecting different activities (walking, running, cycling).

Try It: Step Detection Simulator

Explore how accelerometer magnitude data is used to count steps. Adjust the walking speed and detection threshold to see how peak detection algorithms identify valid steps.

Question 2

What is the typical GPS accuracy on smartphones in open outdoor areas?

  1. 1-2 meters
  2. 5-10 meters
  3. 50-100 meters
  4. 500-1000 meters

B) 5-10 meters

GPS accuracy on smartphones:

Environment    Typical Accuracy
Open outdoor   5-10 meters
Urban areas    10-50 meters
Indoor         Not available or >100 m
With A-GPS     3-5 meters

Factors affecting GPS accuracy:

  • Satellite visibility: Need 4+ satellites for 3D fix
  • Atmospheric conditions: Ionospheric delay, signal refraction
  • Multipath effects: Signal reflections in urban canyons
  • Device hardware: Quality of GPS chip and antenna

Improvements:

  • A-GPS (Assisted GPS): Uses cellular network to speed up satellite acquisition
  • GLONASS, Galileo, BeiDou: Additional satellite systems improve accuracy
  • Wi-Fi positioning: Indoor location using Wi-Fi access point triangulation
  • Sensor fusion: Combine GPS with accelerometer and gyroscope for smoothing

For IoT applications requiring high precision (e.g., autonomous vehicles), RTK-GPS can achieve centimeter-level accuracy but requires additional infrastructure.

Question 3

Which Web API allows access to smartphone sensors without requiring a native app?

  1. MQTT API
  2. Generic Sensor API
  3. WebSocket API
  4. REST API

B) Generic Sensor API

The Generic Sensor API is a W3C standard that provides a unified interface for accessing smartphone sensors via web browsers:

Supported sensors:

  • Accelerometer: Linear acceleration (3-axis)
  • Gyroscope: Angular velocity (3-axis)
  • Magnetometer: Magnetic field (3-axis)
  • AbsoluteOrientationSensor: Device orientation in 3D space
  • AmbientLightSensor: Illuminance level

Example:

const accelerometer = new Accelerometer({ frequency: 60 });
accelerometer.addEventListener('reading', () => {
    console.log(`X: ${accelerometer.x}, Y: ${accelerometer.y}, Z: ${accelerometer.z}`);
});
accelerometer.start();

Additional Web APIs for sensors:

  • Geolocation API: GPS location (navigator.geolocation)
  • DeviceOrientation API: Gyroscope/magnetometer orientation
  • Web Audio API: Microphone access
  • MediaDevices API: Camera access (getUserMedia)

Advantages of Web APIs:

  • No app installation required
  • Cross-platform compatibility
  • Instant updates (no app store approval)
  • Lower development cost

Limitations:

  • Requires HTTPS for security
  • Limited background execution
  • May have lower sampling rates than native apps

Question 4

What is the primary advantage of participatory sensing compared to fixed sensor networks?

  1. Higher sensor accuracy
  2. Lower cost and greater spatial coverage
  3. Faster data transmission
  4. Better battery life

B) Lower cost and greater spatial coverage

Participatory sensing leverages smartphones carried by users to collect data, offering several advantages:

Advantages:

  1. Massive spatial coverage: Billions of smartphones worldwide provide data from areas that would be impractical to cover with fixed sensors
  2. Lower infrastructure cost: No need to deploy and maintain dedicated sensor networks
  3. Temporal coverage: Mobile users provide data at different times and locations
  4. Rapid deployment: No hardware installation required—just release an app

Example applications:

  • Traffic monitoring: Waze uses crowdsourced location data to detect congestion
  • Air quality mapping: Multiple apps aggregate pollution readings from phone sensors
  • Noise pollution: Community noise maps created from smartphone microphones
  • Pothole detection: Accelerometer data detects road conditions

Challenges:

  • Data quality variation: Different devices, sensor calibration, user behavior
  • Privacy concerns: Location tracking and data anonymization
  • Incentivization: Need to motivate users to participate
  • Battery drain: Continuous sensing impacts battery life

Comparison with fixed networks:

  • Fixed sensors: Higher accuracy, controlled placement, 24/7 operation
  • Mobile sensors: Broader coverage, lower cost, flexibility

Hybrid approaches combining both are often most effective.

Question 5

Which technique is commonly used to reduce battery drain during continuous GPS tracking?

  1. Increase sampling frequency
  2. Use A-GPS for faster acquisition
  3. Adaptive sampling based on movement
  4. Disable Wi-Fi and Bluetooth

C) Adaptive sampling based on movement

Adaptive sampling adjusts GPS update frequency based on device movement and context, significantly reducing battery consumption:

Strategy:

IF stationary:
    GPS update every 60 seconds (or disable)
ELIF walking:
    GPS update every 10-20 seconds
ELIF driving:
    GPS update every 5-10 seconds
ELIF high-speed movement:
    GPS update every 1-3 seconds
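
The strategy above maps directly onto a lookup function (a sketch; the activity labels and intervals mirror the pseudocode, and the function name is illustrative):

```javascript
// Activity-dependent GPS update interval in seconds (sketch).
function gpsIntervalSeconds(activity) {
    switch (activity) {
        case 'stationary': return 60;  // or disable GPS entirely
        case 'walking':    return 15;
        case 'driving':    return 7;
        case 'high-speed': return 2;
        default:           return 30;  // unknown activity: conservative middle ground
    }
}

console.log(gpsIntervalSeconds('stationary')); // 60
console.log(gpsIntervalSeconds('driving'));    // 7
```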

Implementation approaches:

  1. Movement-based: Use accelerometer to detect motion
    • If accelerometer magnitude is stable → reduce GPS rate
    • If significant movement detected → increase GPS rate
  2. Geofencing-based: Update rate depends on proximity to points of interest
    • Far from POIs → low update rate
    • Near POI → high update rate
  3. Battery-aware: Adjust based on remaining battery
    • High battery (>80%) → normal rate
    • Low battery (<20%) → minimal rate
  4. Context-aware: Use activity recognition
    • Stationary → GPS off
    • Walking → 10-second intervals
    • Driving → 5-second intervals

Power consumption comparison:

  • Continuous GPS (1 Hz): ~450 mW (4-6 hours battery life)
  • Adaptive sampling: ~50-100 mW (20-40 hours battery life)

Additional battery optimization techniques:

  • Batching: Group GPS updates and process together
  • Sensor fusion: Use accelerometer+gyroscope between GPS updates
  • Wi-Fi positioning: Use Wi-Fi when indoors (lower power than GPS)
  • Geofence-based wake-up: Only enable GPS when entering/exiting areas

Note: While A-GPS (option B) speeds up initial satellite acquisition, it doesn’t significantly reduce ongoing power consumption during tracking.

Try It: Adaptive GPS Sampling Visualizer

Compare battery life under different GPS sampling strategies. Select an activity context and see how adaptive sampling adjusts the update interval to balance accuracy and power.

Question 6

What is the primary privacy concern with mobile sensing applications?

  1. High data storage requirements
  2. Excessive battery consumption
  3. Location tracking and user identification
  4. Slow data transmission

C) Location tracking and user identification

Mobile sensing applications pose significant privacy risks because they can reveal sensitive information about users:

Privacy concerns:

  1. Location tracking: GPS data reveals:
    • Home and work addresses
    • Daily routines and patterns
    • Visited locations (hospitals, religious sites, political events)
    • Social relationships (who you meet and where)
  2. Activity inference: Sensor data can infer:
    • Health conditions (gait analysis from accelerometer)
    • Lifestyle habits (sleep patterns, exercise)
    • Transportation modes (walking, driving, public transit)
  3. User identification: Even “anonymized” data can be de-anonymized:
    • Unique movement patterns act as fingerprints
    • 4 spatio-temporal points can identify 95% of users
    • Combining datasets enables re-identification

Privacy protection techniques:

  1. Data minimization: Only collect necessary data
  2. Anonymization: Remove personally identifiable information
  3. Differential privacy: Add statistical noise to protect individuals
  4. K-anonymity: Ensure each record is indistinguishable from k-1 others
  5. Location obfuscation: Report grid-based regions instead of precise coordinates
  6. On-device processing: Process data locally, only send aggregated results
  7. User consent: Clear opt-in with transparent data usage policies
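
Technique 3 (differential privacy) is most often realised with the Laplace mechanism: add noise drawn from a Laplace distribution with scale sensitivity/ε to each released value. A minimal sketch (the function names are illustrative; real deployments should use vetted privacy libraries, not hand-rolled samplers):

```javascript
// Laplace mechanism for differential privacy (sketch, not production code).
// Inverse-CDF sampling: u in (0,1) maps to a Laplace(0, scale) draw.
function laplaceSample(scale, u = Math.random()) {
    const p = u - 0.5;
    return -scale * Math.sign(p) * Math.log(1 - 2 * Math.abs(p));
}

// Release a count with epsilon-differential privacy; counting queries have
// sensitivity 1 (one user changes the count by at most 1).
function privatizedCount(trueCount, epsilon, sensitivity = 1) {
    return trueCount + laplaceSample(sensitivity / epsilon);
}

// Deterministic check: u = 0.75 yields scale * ln 2 of noise.
console.log(laplaceSample(2, 0.75).toFixed(3)); // 1.386
console.log(privatizedCount(100, 0.5)); // 100 plus Laplace(scale = 2) noise
```

Smaller ε means stronger privacy but noisier released values, which is exactly the utility trade-off participatory sensing campaigns must tune.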

Regulatory frameworks:

  • GDPR (Europe): Requires explicit consent, data minimization
  • CCPA (California): User rights to data access and deletion
  • HIPAA (USA): Health data protection requirements

Best practices for developers:

  • Implement “privacy by design”
  • Provide granular permissions (e.g., approximate location only)
  • Allow users to review and delete their data
  • Encrypt data in transit and at rest
  • Regular privacy audits and impact assessments

Question 7

Which sensor combination is typically used for dead reckoning navigation when GPS is unavailable?

  1. Camera + Microphone
  2. Accelerometer + Gyroscope
  3. Barometer + Magnetometer
  4. Proximity + Light sensor

B) Accelerometer + Gyroscope

Dead reckoning (also called inertial navigation) estimates position by tracking movement from a known starting point without external references.

How it works:

  1. Accelerometer: Measures linear acceleration

    • Integrate once → velocity
    • Integrate twice → displacement
  2. Gyroscope: Measures angular velocity

    • Integrate → orientation (heading)
  3. Combined approach:

    Initial position: (x₀, y₀)
    Measure: acceleration (ax, ay, az) and rotation (ωx, ωy, ωz)
    Calculate: velocity and orientation
    Estimate: new position (x₁, y₁)

Typical implementation:

# Simplified dead reckoning (read_accelerometer/read_gyroscope are hardware stubs)
from math import cos, sin

position = [0.0, 0.0]   # start position (m)
velocity = [0.0, 0.0]   # global-frame velocity (m/s)
orientation = 0.0       # heading in radians

while True:
    accel = read_accelerometer()  # gravity-compensated linear acceleration, m/s^2
    gyro = read_gyroscope()       # angular velocity, rad/s
    dt = 0.01                     # 100 Hz sampling

    # Update orientation from gyroscope
    orientation += gyro.z * dt

    # Rotate acceleration from the device frame into the global frame
    ax_global = accel.x * cos(orientation) - accel.y * sin(orientation)
    ay_global = accel.x * sin(orientation) + accel.y * cos(orientation)

    # Integrate once for velocity, twice for position
    velocity[0] += ax_global * dt
    velocity[1] += ay_global * dt
    position[0] += velocity[0] * dt
    position[1] += velocity[1] * dt

Challenges:

  • Drift error: Small sensor errors accumulate rapidly
  • Noise: Sensor noise causes position uncertainty
  • Gravity compensation: Must separate gravity from motion

Solutions:

  • Sensor fusion: Combine with magnetometer (compass) for heading correction
  • Periodic GPS updates: Reset position when GPS available
  • Kalman filtering: Optimal estimation combining multiple sensors
  • Zero-velocity updates: Reset errors during stationary periods

Use cases:

  • Indoor navigation (shopping malls, airports)
  • GPS-denied environments (tunnels, urban canyons)
  • Pedestrian dead reckoning (PDR) for step-by-step tracking
  • Augmented reality positioning

Additional sensors that help:

  • Magnetometer: Provides absolute heading (compass)
  • Barometer: Detects floor changes in buildings
  • Wi-Fi/Bluetooth beacons: Periodic position resets

Try It: Dead Reckoning Path Simulator

Visualize how dead reckoning estimates position by integrating step length and heading. Adjust step count, stride length, and turn angle to see how small heading errors accumulate into drift over distance.
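
The drift effect can be quantified in a few lines: integrate position step by step while a small per-step heading bias accumulates (a sketch with assumed values; `pdrEndpoint` is an illustrative name):

```javascript
// Pedestrian dead reckoning endpoint with a constant heading bias (sketch).
function pdrEndpoint(steps, strideM, headingBiasDegPerStep) {
    let x = 0, y = 0, headingRad = 0;
    for (let i = 0; i < steps; i++) {
        headingRad += (headingBiasDegPerStep * Math.PI) / 180; // bias accumulates
        x += strideM * Math.cos(headingRad);
        y += strideM * Math.sin(headingRad);
    }
    return { x, y };
}

// Walking "straight" for 100 steps of 0.7 m with only 0.5 deg/step of gyro bias:
const end = pdrEndpoint(100, 0.7, 0.5);
const drift = Math.hypot(end.x - 70, end.y);
console.log(drift.toFixed(1)); // roughly 30 m of drift over a 70 m walk
```

This is why PDR systems need periodic absolute fixes (GPS, Wi-Fi, beacons) to reset accumulated error.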

Question 8

What is the purpose of a Progressive Web App (PWA) for mobile sensing?

  1. To increase GPS accuracy
  2. To provide installable, offline-capable web applications
  3. To reduce sensor sampling rates
  4. To improve battery life

B) To provide installable, offline-capable web applications

Progressive Web Apps (PWAs) combine the best of web and native apps for mobile sensing:

Key features:

  1. Installable: Add to home screen like native apps

    • No app store submission required
    • Instant updates without user approval
  2. Offline capability: Service Workers cache resources

    // Service Worker caching
    self.addEventListener('install', event => {
        event.waitUntil(
            caches.open('sensor-cache-v1').then(cache => {
                return cache.addAll(['/index.html', '/app.js', '/styles.css']);
            })
        );
    });
  3. Background sync: Queue data uploads when offline

    • Data collected offline is uploaded when connection restored
    • Prevents data loss in poor connectivity areas
  4. Push notifications: Receive alerts even when app closed

  5. Responsive: Works on any device size

Advantages for mobile sensing:

Feature           PWA                  Native App           Mobile Web
Installation      Optional             Required             No
Updates           Automatic            Manual (app store)   Automatic
Offline support   Yes                  Yes                  No
Sensor access     Yes (via Web APIs)   Yes (full access)    Yes
Distribution      URL/QR code          App stores           URL
Development cost  Lower                Higher               Lower

PWA manifest.json example:

{
  "name": "IoT Sensor App",
  "short_name": "Sensors",
  "start_url": "/",
  "display": "standalone",
  "background_color": "#ffffff",
  "theme_color": "#007bff",
  "icons": [{"src": "/icon-192.png", "sizes": "192x192", "type": "image/png"}]
}

Use cases for IoT sensing:

  • Environmental monitoring (air quality, noise)
  • Health tracking (activity, vitals)
  • Participatory sensing campaigns
  • Field data collection (surveys, inspections)

Limitations:

  • Reduced background processing compared to native apps
  • Some sensors may have limited access (varies by browser)
  • iOS has more restrictions than Android

PWAs are ideal for rapid deployment of IoT sensing applications without the overhead of native app development and distribution.

Question 9

Which sampling rate is typical for smartphone accelerometers used in activity recognition?

  1. 1-5 Hz
  2. 10-20 Hz
  3. 50-100 Hz
  4. 500-1000 Hz

C) 50-100 Hz

Accelerometer sampling rates for different IoT applications:

Application            Sampling Rate   Rationale
Activity recognition   50-100 Hz       Captures human movement patterns (walking ~2 Hz, running ~3 Hz); Nyquist requires 2× the highest frequency, so 50 Hz is sufficient
Step counting          20-50 Hz        Detect peaks in acceleration magnitude (steps occur at ~2-3 Hz)
Fall detection         50-100 Hz       Rapid acceleration changes during falls require higher sampling
Gesture recognition    100-200 Hz      Fine-grained hand movements need higher resolution
Vehicle tracking       10-20 Hz        Slower dynamics; lower rate saves battery
Screen rotation        10-20 Hz        Smooth UI updates, not time-critical
High-precision IMU     200-1000 Hz     Robotics, drones, navigation systems

Why 50-100 Hz for activity recognition?

  1. Human movement frequency: Most human activities have frequency components below 20 Hz
    • Walking: ~2 Hz (120 steps/min)
    • Running: ~2.5 Hz (150 steps/min)
    • Hand gestures: <10 Hz
  2. Nyquist sampling theorem: Must sample at ≥2× the highest frequency component
    • To capture 20 Hz signals → need ≥40 Hz sampling (Nyquist minimum)
    • 50-100 Hz provides safety margin for accurate signal reconstruction and accounts for higher-frequency transients
  3. Battery vs. accuracy trade-off:
    • Higher rates improve accuracy but drain battery
    • 50-100 Hz balances both for mobile devices
  4. Processing requirements:
    • 100 Hz × 3 axes = 300 samples/second
    • Manageable on smartphones without excessive CPU load
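
The Nyquist folding described in point 2 can be checked numerically: a sinusoid sampled below twice its frequency appears at a folded "alias" frequency (a sketch; `aliasedFrequency` is an illustrative helper):

```javascript
// Apparent frequency of a sinusoid of frequency f (Hz) sampled at fs (Hz).
// Sampling folds f onto the nearest multiple of fs: f_alias = |f - fs * round(f / fs)|.
function aliasedFrequency(f, fs) {
    return Math.abs(f - fs * Math.round(f / fs));
}

console.log(aliasedFrequency(2, 50));  // 2  -- walking at 2 Hz is captured faithfully
console.log(aliasedFrequency(30, 40)); // 10 -- a 30 Hz component masquerades as 10 Hz
```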

Typical smartphone accelerometer specs:

  • Maximum rate: 200-400 Hz (hardware dependent)
  • Typical use: 50-100 Hz
  • Low-power mode: 10-20 Hz
  • Sensor fusion: 100 Hz (combining accel + gyro + mag)

Example configuration:

// Web API
const accelerometer = new Accelerometer({ frequency: 60 }); // 60 Hz

// Android (Java)
sensorManager.registerListener(this, accelerometer,
    SensorManager.SENSOR_DELAY_GAME); // ~50-100 Hz

Power consumption:

  • 10 Hz: ~0.5 mW
  • 50 Hz: ~2 mW
  • 100 Hz: ~4 mW

For battery-constrained IoT applications, adaptive sampling (adjusting rate based on activity) is common practice.

Try It: Nyquist Sampling Rate Explorer

See the Nyquist theorem in action. Adjust the signal frequency and sampling rate to observe when the sampled signal accurately reconstructs the original, and when aliasing occurs due to under-sampling.

Question 10

What is the primary challenge of using smartphone camera as a sensor for IoT applications?

  1. Low image resolution
  2. High power consumption and processing requirements
  3. Lack of software support
  4. Limited field of view

B) High power consumption and processing requirements

Using the camera as a sensor in IoT applications faces significant challenges:

Power consumption:

  • Camera sensor: 200-500 mW
  • Image processing: 500-2000 mW (depends on resolution and algorithms)
  • Comparison: Accelerometer uses ~2 mW (100-1000× less power)

Processing requirements:

  • Image capture: 1920×1080 pixels = 2.1 million pixels
  • Frame rate: 30 fps = 63 million pixels/second
  • Processing: Object detection, recognition, tracking requires significant CPU/GPU
  • Latency: Real-time processing challenging on mobile devices

Specific challenges:

  1. Battery drain:
    • Continuous camera use can drain battery in 1-2 hours
    • Background camera use may be restricted by OS
  2. Computational load:
    • Image processing algorithms (edge detection, feature extraction) are CPU-intensive
    • Machine learning models (object detection) require GPU acceleration
  3. Storage requirements:
    • Video/images consume significant storage
    • Streaming to cloud requires bandwidth
  4. Privacy concerns:
    • Camera captures sensitive information
    • Requires user consent and careful data handling
    • May violate privacy laws in public spaces
  5. Environmental factors:
    • Lighting conditions affect image quality
    • Motion blur during device movement
    • Occlusions and obstructions

Solutions and optimizations:

  1. On-device processing: Process images locally, send only results

    Camera → Image Processing → Object Detection → Send "car detected" (NOT raw image)
  2. Trigger-based capture: Only activate camera when needed

    Motion sensor detects movement → Capture image → Process → Sleep
  3. Low-resolution processing: Use lower resolution for detection, high-res for confirmation

  4. Edge AI accelerators: Use dedicated hardware (e.g., Google Edge TPU, Apple Neural Engine)

  5. Frame skipping: Process every Nth frame instead of all frames

  6. Region of interest: Only process relevant parts of image

IoT camera applications:

Application            Challenge              Solution
QR code scanning       Continuous camera on   Activate only when user opens scanner
Object detection       High CPU usage         Use lightweight models (MobileNet, TinyYOLO)
Surveillance           Privacy + power        Motion-triggered capture, edge processing
AR applications        Real-time tracking     Use AR frameworks (ARCore, ARKit) with optimized pipelines
Plant identification   Network bandwidth      On-device ML models, compress images before upload

Power consumption comparison (typical smartphone):

  • Accelerometer: 0.002 W
  • GPS: 0.05 W
  • Display: 0.3-1.0 W
  • Camera + processing: 0.7-2.5 W (largest power consumer besides the display)

For continuous IoT sensing, camera use is often limited to intermittent capture or trigger-based activation to manage power consumption.


20.3 Comprehensive Review Quiz


20.4 Chapter Summary

Smartphones are ubiquitous multi-sensor platforms that extend IoT capabilities to billions of users worldwide, offering 10+ sensors (motion, position, environmental, multimedia) combined with powerful processors, always-on connectivity, and rich user interfaces. This unique combination enables participatory sensing applications where volunteer users contribute data for environmental monitoring, traffic analysis, and public health tracking.

Web-based sensing through standardized APIs (Generic Sensor API, Geolocation API, DeviceOrientation, Progressive Web Apps) enables cross-platform sensor access without requiring native app development. These browser-based approaches reduce deployment barriers and enable rapid prototyping of mobile IoT applications. Native frameworks like React Native provide deeper sensor access and better performance for production applications.

Privacy and battery management are critical considerations for mobile sensing applications. Privacy protection techniques include data anonymization, differential privacy, k-anonymity, and informed consent mechanisms. Battery optimization strategies involve adaptive sampling rates, sensor batching, duty cycling, and intelligent use of sensor fusion to reduce redundant measurements while maintaining data quality.

Participatory sensing transforms smartphones into crowdsourced sensor networks for applications ranging from air quality monitoring to traffic flow analysis and noise pollution mapping. The combination of location awareness, user context, and multi-sensor capabilities makes smartphones powerful tools for understanding urban environments and enabling smart city applications.

Key Takeaway

Mobile phone sensing assessment boils down to three pillars: understanding what each sensor measures and its limitations (GPS accuracy degrades indoors, accelerometers drift over time), knowing how to access sensors through Web or Native APIs, and designing systems that protect user privacy (differential privacy, k-anonymity) while conserving battery (adaptive sampling, duty cycling).

20.5 Academic Resources


Wearable neck-mounted sensor device shown in four views: (a) top view showing circular sensor with yellow piezoelectric element, (b) bottom view with electronics and battery compartment, (c) flexible neck band form factor, and (d) device worn on a person’s neck. This design enables continuous physiological and activity monitoring.

Source: Carnegie Mellon University - Building User-Focused Sensing Systems

Wearable sensors complement smartphone sensing:

  • Form factor: Neck-mounted devices capture throat vibrations, swallowing, and vocalization
  • Piezoelectric sensing: Converts mechanical vibration to electrical signal for eating/drinking detection
  • Continuous monitoring: Unlike phone sensors, wearables provide always-on physiological data
  • Fusion opportunity: Combine with smartphone accelerometer and audio for robust activity recognition


Smart glasses prototype with labeled sensor positions: (A) wide-angle camera at bridge, (B) proximity/gesture sensor, (C) environmental light sensor, (D) bone conduction speaker, (E) IMU (accelerometer/gyroscope). Right panel shows glasses worn by user demonstrating unobtrusive design.

Source: Carnegie Mellon University - Building User-Focused Sensing Systems

Smart glasses extend mobile sensing capabilities:

  • First-person vision: Camera captures what user sees, enabling visual context awareness
  • Proximity/gesture: Hand gesture detection near face without touch
  • Head motion tracking: IMU measures head orientation and movement patterns
  • Bone conduction: Audio feedback without blocking ears, enabling ambient awareness
  • Integration with phone: Glasses sensors complement smartphone for richer activity context

20.7 Concept Relationships

Core Concept            Builds On                          Enables                                  Related To
Smartphone Sensors      MEMS technology, ADC               Participatory sensing, mobile IoT        Sensor fusion, power optimization
Generic Sensor API      Browser standards, JavaScript      Web-based sensing apps                   PWA, Service Workers
Participatory Sensing   Crowdsourcing, location services   City-wide monitoring, air quality maps   Differential privacy, k-anonymity
Battery Optimization    Power analysis, duty cycling       Multi-year deployments                   Adaptive sampling, sensor fusion
Differential Privacy    Statistical noise, anonymization   Privacy-preserving data collection       GDPR compliance, k-anonymity

Key insight: Mobile sensing combines hardware capabilities (sensors), software frameworks (Web APIs), privacy techniques (differential privacy), and power management (adaptive sampling) into complete participatory sensing systems.
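The differential privacy technique in the table above works by adding calibrated statistical noise to aggregates before release. A minimal sketch of the standard Laplace mechanism follows; the function names are illustrative, not from a specific library, and the uniform draw is injected so the sampler stays deterministic for testing.

```javascript
// Inverse-CDF sampling of Laplace(0, scale). `u` is a uniform draw in (0, 1),
// passed in explicitly so the function is deterministic and testable.
function laplaceNoise(scale, u) {
  const p = u - 0.5;
  return -scale * Math.sign(p) * Math.log(1 - 2 * Math.abs(p));
}

// Privatize a count query: a count has sensitivity 1 (one person changes the
// result by at most 1), so the noise scale is 1 / epsilon.
function privatizedCount(trueCount, epsilon, u = Math.random()) {
  return trueCount + laplaceNoise(1 / epsilon, u);
}

// Smaller epsilon means more noise and stronger privacy; u = 0.5 happens to
// add zero noise, so this call returns the true count unchanged.
console.log(privatizedCount(120, 0.1, 0.5)); // 120
```

In a participatory sensing deployment the noise would be added server-side to each released aggregate (e.g., "readings per grid cell"), never to identities themselves.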

20.8 See Also

Web APIs and Standards:

Privacy and Compliance:

Participatory Sensing:

  • Waze - Crowdsourced traffic and navigation
  • PurpleAir - Participatory air quality monitoring
  • Noise Planet - Community noise mapping

Related Chapters:

Common Pitfalls

Different smartphone manufacturers use different sensor chips with varying sensitivity, noise levels, and axis orientations. Test your application on multiple device models rather than a single reference device.

Requesting the highest sensor sampling rate drains battery rapidly and generates data volumes that overwhelm processing. Profile the minimum rate needed for your use case and request the nearest available batch mode.
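Profiling the minimum rate can be reduced to the Nyquist rule stated in the key concepts: sample at more than twice the highest signal frequency, then request the closest supported rate above that. A sketch, with an illustrative helper name and rate list:

```javascript
// Pick the lowest supported sensor rate that still satisfies the Nyquist
// criterion for the signal of interest, minimizing battery drain.
function pickSamplingRate(supportedHz, highestSignalHz) {
  const minimumHz = 2 * highestSignalHz; // Nyquist: > 2x the top frequency
  const candidates = supportedHz.filter((hz) => hz >= minimumHz);
  // Fall back to the fastest available rate if none is high enough.
  return candidates.length > 0
    ? Math.min(...candidates)
    : Math.max(...supportedHz);
}

// Walking has most of its energy below ~3 Hz, so ~6 Hz suffices: the 10 Hz
// mode is selected instead of a battery-hungry 100 Hz default.
console.log(pickSamplingRate([5, 10, 50, 100], 3)); // 10
```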

Sensor axes are defined relative to the physical device, but users hold phones in landscape, portrait, or tilted orientations. Normalise to world frame using gravity vector and magnetic north before activity classification.
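One common normalisation step is to project device-frame acceleration onto the gravity direction (obtained from a low-pass filter or the platform's gravity sensor), yielding the world-frame vertical component regardless of how the phone is held. A minimal sketch, with illustrative names:

```javascript
function dot(a, b) { return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]; }
function norm(v) { return Math.sqrt(dot(v, v)); }

// Split out the vertical part of a device-frame acceleration reading using
// the measured gravity vector as the "down" reference.
function verticalComponent(accel, gravity) {
  const g = norm(gravity);
  const down = gravity.map((c) => c / g); // unit vector along gravity
  return dot(accel, down); // signed magnitude along the world vertical axis
}

// Phone flat on a table: gravity lies entirely on the z axis, so a z-axis
// bounce maps straight to the vertical component.
console.log(verticalComponent([0, 0, 9.81], [0, 0, 9.81])); // 9.81
```

The same projection gives the correct vertical component when the phone is in a pocket at an angle, which is exactly the case where raw z-axis readings mislead a classifier.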

Activity recognition models trained in controlled lab conditions often fail in real-world deployment due to placement variation (pocket vs. hand vs. bag) and demographic differences. Collect diverse training data or use transfer learning.

20.9 What’s Next

| If you want to… | Read this |
|---|---|
| Understand the raw sensor types used in smartphones | Sensor Types: Introduction |
| Learn about sensor fusion algorithms | Signal Processing for IoT Sensors |
| Explore mobile phone APIs for sensor access | Mobile Phone Sensor APIs |
| Apply sensor data in IoT application pipelines | Sensor Applications Overview |

Now that you understand sensors and actuators in both dedicated IoT devices and smartphones, you’re ready to dive into the electrical foundations that power these systems. The next section covers fundamental electricity concepts essential for understanding power requirements, circuits, and energy management in IoT deployments.

Continue to Electricity

Related Chapters and Resources

Phone Sensor Technologies:

Product Examples Using Phone Sensors:

  • Fitbit - Phone app integration

User Experience:

The Sensor Squad just finished their biggest test ever!

Sammy the Sensor stretched. “Wow, that quiz covered everything – from how accelerometers count steps to how GPS finds your location!”

Lila the LED was glowing green. “I got the step counting question right! The accelerometer feels each bounce when you walk, and it counts the peaks. Like counting how many times you jump on a trampoline!”

Max the Microcontroller nodded. “The tricky part was privacy. When phones collect data from lots of people, we have to scramble the information so nobody can figure out exactly who is who. It is like mixing up everyone’s answers in a suggestion box – you can see what people think, but you cannot tell who wrote what.”

Bella the Battery looked tired. “And the battery question was about ME! The smartest way to save energy is adaptive sampling – only check the GPS when you are actually moving. If you are sitting still, why keep asking ‘where am I?’ every second? That is just wasteful!”

“The coolest thing I learned,” Sammy said, “is dead reckoning. When GPS does not work – like inside a building – you can use the accelerometer and gyroscope together to guess where you are by counting your steps and tracking which way you turn. It is like being a detective following footprints!”

The Sensor Squad Lesson: Understanding phone sensors means knowing what each sensor does, how to access them, how to protect people’s privacy, and how to save battery. If you got all the quiz questions right, you are officially a Sensor Squad expert!
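The peak-counting idea Lila describes can be sketched as a simple threshold-crossing detector over the acceleration magnitude. The threshold and names here are illustrative; real pedometers calibrate the threshold per user and add timing constraints to reject jitter.

```javascript
// Count steps as peaks in the accelerometer magnitude signal: a step is
// registered when the magnitude rises above `threshold` after having
// dropped below it, so each bounce is counted exactly once.
function countSteps(magnitudes, threshold) {
  let steps = 0;
  let aboveThreshold = false;
  for (const m of magnitudes) {
    if (!aboveThreshold && m > threshold) {
      steps += 1;             // rising edge: a new peak begins
      aboveThreshold = true;
    } else if (m < threshold) {
      aboveThreshold = false; // fell back below: ready for the next peak
    }
  }
  return steps;
}

// Three bounces above an ~11 m/s^2 threshold produce three steps.
const trace = [9.8, 12.1, 9.5, 11.8, 9.6, 12.4, 9.7];
console.log(countSteps(trace, 11)); // 3
```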

20.10 Resources

20.10.1 Web APIs

20.10.2 Mobile Frameworks

20.10.3 Research Papers

  • “Mobile Phone Sensing Systems: A Survey” (IEEE Communications Surveys)
  • “Participatory Sensing: Applications and Architecture” (IEEE Internet Computing)
  • “Privacy in Mobile Sensing” (IEEE Pervasive Computing)

20.10.4 Privacy Guidelines