19  Mobile Sensors Labs

19.1 Learning Objectives

  • Build web-based sensor applications using the Generic Sensor API for accelerometer activity recognition and GPS geofencing
  • Create installable Progressive Web Apps with offline capability using Service Workers for mobile sensing
  • Implement real-time audio processing with the Web Audio API for participatory noise monitoring
  • Validate understanding of smartphone sensing concepts through self-assessment and identify areas for further study

In 60 Seconds

This section provides four hands-on mobile sensing labs – accelerometer activity recognition, GPS geofencing, Progressive Web Apps with offline support, and participatory noise monitoring – plus comprehensive knowledge assessments to validate your understanding of smartphone sensing for IoT.

These hands-on labs let you build real mobile sensing applications using just your web browser and phone. You will create an activity tracker that detects walking versus running, a location-aware app with virtual boundaries (geofencing), and even a noise monitoring tool. No app store downloads needed – everything runs as a web page, making it easy to experiment and learn.

Use a simple planning equation to scope completion time:

\[ T_{\text{path}} = \sum_{i=1}^{n} T_{\text{lab},i} + T_{\text{quiz}} \]

Worked example: A focused path with Lab 1 (30 min), Lab 4 (30 min), and a 15-minute quiz run takes:

\[ T_{\text{path}} = 30 + 30 + 15 = 75 \text{ minutes} \]

For full coverage, a 4-lab route at 35 minutes average plus 30 minutes assessment is: \(4 \times 35 + 30 = 170\) minutes (about 2.8 hours). This makes it easier to choose a realistic session plan instead of stopping mid-lab.

Overview

This section provides hands-on experience building mobile sensing applications and assesses your understanding of smartphone sensor concepts. Through practical labs, you’ll create real applications that demonstrate the sensing capabilities covered in the previous chapters.

What You’ll Build

Complete four hands-on labs covering the full spectrum of mobile sensing:

  1. Activity recognition using accelerometer data
  2. GPS tracking with geofencing for location-aware applications
  3. Progressive Web Apps with offline capability
  4. Noise monitoring for participatory sensing

Plus comprehensive knowledge checks to validate your understanding.

Chapter Structure

19.1.1 Part 4A: Web-Based Sensor Applications

Labs covered:

  • Lab 1: Accelerometer Activity Recognition - Build a web app that classifies walking, running, and stationary states from motion patterns
  • Lab 2: GPS Geofencing Tracker - Create location tracking with entry/exit notifications using the Haversine formula
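
Lab 2's entry/exit detection rests on the Haversine great-circle distance. The sketch below shows the core computation; the helper names are ours, and the lab may structure its solution differently:

```javascript
// Great-circle distance between two lat/lon points (Haversine formula).
const EARTH_RADIUS_M = 6371000; // mean Earth radius in metres

function haversineMeters(lat1, lon1, lat2, lon2) {
  const toRad = d => (d * Math.PI) / 180;
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * EARTH_RADIUS_M * Math.asin(Math.sqrt(a));
}

// A point is "inside" a circular geofence when its distance to the
// fence centre is at most the fence radius.
function insideGeofence(pos, fence) {
  return haversineMeters(pos.lat, pos.lon, fence.lat, fence.lon) <= fence.radiusM;
}
```

An entry notification fires when `insideGeofence` flips from false to true between successive GPS fixes; an exit notification fires on the opposite transition.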

Learning outcomes:

  • Access smartphone sensors via Generic Sensor API
  • Implement activity classification using statistical analysis
  • Build geofencing with persistent storage
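
The statistical classification in Lab 1 can be sketched as a threshold on the standard deviation of the acceleration magnitude over a sliding window. The thresholds and function names below are illustrative assumptions, not the lab's reference solution, and would need tuning on a real device:

```javascript
// Classify a window of accelerometer readings by the standard deviation
// of the acceleration magnitude. Thresholds (in m/s^2) are illustrative.
function classifyActivity(samples) {
  // samples: array of {x, y, z} accelerations in m/s^2
  const mags = samples.map(s => Math.hypot(s.x, s.y, s.z));
  const mean = mags.reduce((a, b) => a + b, 0) / mags.length;
  const variance = mags.reduce((a, m) => a + (m - mean) ** 2, 0) / mags.length;
  const std = Math.sqrt(variance);
  if (std < 0.5) return "stationary"; // little variation around gravity
  if (std < 3.0) return "walking";    // moderate, rhythmic variation
  return "running";                   // large swings in magnitude
}

// Example: a phone at rest reads roughly gravity on one axis.
const still = Array.from({ length: 50 }, () => ({ x: 0, y: 0, z: 9.81 }));
console.log(classifyActivity(still)); // prints stationary
```

In the lab itself the window would be filled from sensor events rather than a fixed array, but the classification step is the same.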

Estimated time: 45-60 minutes


19.1.2 Part 4B: PWA and Audio Sensing

Labs covered:

  • Lab 3: Multi-Sensor PWA - Create an installable Progressive Web App that works offline
  • Lab 4: Noise Level Monitor - Build a participatory sensing app using the Web Audio API
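
Lab 4's core computation, turning raw audio samples into a decibel reading, can be sketched independently of the Web Audio capture code. The result is in dBFS (decibels relative to full scale); mapping it to approximate dB SPL requires a per-device calibration offset, which the sketch omits:

```javascript
// Convert a buffer of normalized audio samples (-1..1), such as one
// filled by a Web Audio AnalyserNode, into a dBFS level via RMS.
function rmsToDbfs(samples) {
  const sumSq = samples.reduce((acc, s) => acc + s * s, 0);
  const rms = Math.sqrt(sumSq / samples.length);
  // Guard against log(0) for a perfectly silent buffer.
  return rms > 0 ? 20 * Math.log10(rms) : -Infinity;
}

// Full-scale sine wave: RMS = 1/sqrt(2), so about -3 dBFS.
const sine = Array.from({ length: 1024 }, (_, i) => Math.sin((2 * Math.PI * i) / 64));
console.log(rmsToDbfs(sine).toFixed(1)); // ≈ -3.0
```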

Learning outcomes:

  • Create offline-capable PWAs with Service Workers
  • Process audio data in real-time
  • Implement crowdsourced data collection with location
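
The offline-first pattern behind Labs 3 and 4 is: queue measurements locally, then flush them when connectivity returns. A minimal sketch, with storage and uploader injected so it runs outside a browser; in the PWA lab the storage would be something like IndexedDB or localStorage and the uploader a fetch() call:

```javascript
// Offline-first buffering: record() persists measurements locally,
// flush() uploads everything queued and clears the store.
function createMeasurementQueue(storage, upload) {
  return {
    record(measurement) {
      const queue = JSON.parse(storage.getItem("queue") || "[]");
      queue.push(measurement);
      storage.setItem("queue", JSON.stringify(queue));
    },
    flush() {
      const queue = JSON.parse(storage.getItem("queue") || "[]");
      queue.forEach(upload);
      storage.setItem("queue", "[]");
      return queue.length; // number of measurements uploaded
    },
  };
}

// In-memory stand-in for localStorage, just for demonstration:
const mem = new Map();
const storage = {
  getItem: k => (mem.has(k) ? mem.get(k) : null),
  setItem: (k, v) => mem.set(k, v),
};
```

In the browser, flush() would typically be triggered from an `online` event listener or a Service Worker sync handler.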

Estimated time: 45-60 minutes


19.1.3 Part 4C: Assessment and Review

Content:

  • 10 Knowledge Check Questions - Test understanding of sensor types, APIs, and best practices
  • 10 Comprehensive Review Questions - In-depth assessment covering participatory sensing, privacy, and battery optimization
  • Visual Reference Gallery - AI-generated figures for key concepts

Learning outcomes:

  • Validate understanding through self-assessment
  • Diagnose knowledge gaps for further study

Estimated time: 30-40 minutes


Prerequisites

Before starting the labs, ensure you’ve completed the preceding chapters on smartphone sensing concepts and mobile phone APIs.

Learning Path

Recommended Sequence

Complete beginner (90 minutes):

  1. Start with Lab 1 (accelerometer) - easiest entry point
  2. Try the first 5 knowledge check questions
  3. Experiment with Lab 4 (noise monitoring) - visually engaging

Intermediate (2-3 hours):

  1. Complete Labs 1-3 in order
  2. Take the full knowledge check
  3. Review explanations for any incorrect answers

Advanced (4+ hours):

  1. Complete all four labs
  2. Implement the extension exercises
  3. Take both the knowledge check and the comprehensive quiz
  4. Build a custom application combining multiple sensors

Materials Required

  • Smartphone with sensors (accelerometer, GPS, microphone)
  • Modern web browser (Chrome or Edge recommended for full API support)
  • Text editor for HTML/JavaScript
  • Local web server (Python’s http.server or similar)
  • Optional: Internet connection for uploading data

19.2 Knowledge Checks

Question 1: Lab Selection

A student wants to build a mobile app that monitors environmental noise levels across their campus and creates a noise map. Which lab combination is most relevant?

  A) Lab 1 (Activity Recognition) + Lab 2 (GPS Geofencing)
  B) Lab 3 (PWA) + Lab 4 (Noise Monitoring)
  C) Lab 1 (Activity Recognition) only
  D) Lab 2 (GPS Geofencing) only

B) Lab 3 (PWA) + Lab 4 (Noise Monitoring). A campus noise mapping app needs audio sensing (Lab 4 teaches Web Audio API noise level measurement with location tagging) and ideally should work as an installable, offline-capable PWA (Lab 3 teaches Service Workers and offline caching). Lab 4 directly implements participatory noise monitoring with GPS, and Lab 3 ensures the app works reliably even with spotty campus Wi-Fi.

Question 2: Technical Requirements

Why do the Web API labs recommend Chrome or Edge browser instead of Safari?

  A) Safari has a faster JavaScript engine
  B) Chrome and Edge have broader support for the Generic Sensor API and related Web APIs
  C) Safari cannot access GPS
  D) Chrome is the only browser that supports HTML

B) Chrome and Edge have broader support for the Generic Sensor API and related Web APIs. The Generic Sensor API (for accelerometer, gyroscope, etc.) has the best support in Chromium-based browsers. Safari on iOS has more restrictive sensor API support and requires explicit user permission for motion events (DeviceMotionEvent.requestPermission). All modern browsers support basic Geolocation and Web Audio, but Chromium offers the most complete sensor API ecosystem.

Key Takeaway

The best way to learn mobile sensing is by building. These four labs progress from basic sensor access (accelerometer, GPS) to advanced patterns (PWAs with offline support, participatory audio sensing). Complete them in order for the smoothest learning path, or jump directly to the lab that matches your project needs.

The Sensor Squad was ready for their lab day!

Sammy the Sensor put on safety goggles. “Today we are building four real projects with phone sensors!”

“What kind of projects?” asked Lila the LED eagerly.

“First, we will teach a phone to know if you are walking, running, or sitting still – just by feeling the bouncing with the accelerometer!” Sammy explained. “Then we will build invisible fences on a map. When you walk into a fenced area, the phone vibrates!”

Max the Microcontroller grinned. “I love the third project – a special website that works even without internet! It saves everything on the phone and uploads later.”

Bella the Battery added, “And the fourth project is my favorite. You use the phone’s microphone to measure how loud things are around you. Then you share the data with other people’s phones to make a noise map of your whole neighborhood!”

“So we are building things that actually work?” Lila asked.

“Absolutely!” Sammy said. “That is the best part about labs – you do not just read about sensors, you USE them. By the end, you will have four working apps you built yourself!”

The Sensor Squad Lesson: Reading about sensors is great, but building with them is even better. These labs let you create real working apps using your phone’s sensors – no special equipment needed, just your phone and a web browser!


Begin Labs

Ready to start building? Begin with web-based sensor applications:

Start Lab 1: Accelerometer Activity Recognition

Common Pitfalls

Mobile browser sensor API support varies between Chrome, Firefox, Safari, and Samsung Internet. A PWA sensor app that works perfectly in Chrome on Android may fail in Safari on iOS due to missing Generic Sensor API support. Always test on the specific browser and OS version your users are expected to have before submitting a lab assessment.
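
One defensive pattern is to feature-detect before constructing any sensor object. A sketch, written so the detection logic can be exercised with a mock scope; the property names match the real browser globals, but the helper itself is ours:

```javascript
// Report which sensing APIs a given global scope exposes, so the app
// can fall back gracefully instead of failing on unsupported browsers.
// Passing the scope in as a parameter keeps the helper testable.
function detectSensorSupport(scope) {
  return {
    genericSensor: typeof scope.Accelerometer === "function",
    deviceMotion: typeof scope.DeviceMotionEvent !== "undefined",
    geolocation: !!(scope.navigator && scope.navigator.geolocation),
    webAudio: typeof scope.AudioContext === "function" ||
              typeof scope.webkitAudioContext === "function",
  };
}

// In a browser you would call detectSensorSupport(window); here a mock
// scope stands in for Safari-like support (no Generic Sensor API).
const safariLike = {
  DeviceMotionEvent: class {},
  navigator: { geolocation: {} },
  webkitAudioContext: class {},
};
console.log(detectSensorSupport(safariLike));
```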

Different mobile devices expose sensor data at different maximum rates. Requesting accelerometer data at 100 Hz may return data at 50 Hz on one device and 25 Hz on another. Implement adaptive rate detection: measure actual received event timestamps and report the effective rate rather than assuming the requested rate was granted.
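
The adaptive rate check described above amounts to dividing the number of inter-event intervals by the elapsed time. A sketch, assuming event timestamps in milliseconds:

```javascript
// Estimate the effective sensor sample rate from received event
// timestamps instead of trusting the requested frequency.
function effectiveRateHz(timestampsMs) {
  if (timestampsMs.length < 2) return 0;
  const spanMs = timestampsMs[timestampsMs.length - 1] - timestampsMs[0];
  return ((timestampsMs.length - 1) * 1000) / spanMs;
}

// 100 Hz was requested, but events arrive every 20 ms: 50 Hz granted.
const stamps = Array.from({ length: 51 }, (_, i) => i * 20);
console.log(effectiveRateHz(stamps)); // 50
```

In a sensor `reading` handler you would append `event.timeStamp` (or `Date.now()`) to a ring buffer and report this estimate in the UI.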

Assessment rubrics typically include error handling as a graded criterion. If the required sensor is not available on the test device, the application should display an informative message explaining which sensor is missing, rather than crashing, showing a blank screen, or silently displaying NaN values.

The DeviceMotion API and Generic Sensor API require HTTPS connections — they silently return no data on HTTP pages. When testing locally, use localhost (which is treated as a secure origin) rather than accessing via IP address (which is not). For deployed apps, verify your SSL certificate is valid on the target domain.
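
A pre-flight check for this pitfall might classify the page's origin before attempting sensor access. This is a deliberately simplified version of the browser's secure-context rules (which also cover file:// pages, `*.localhost` subdomains, and more); the helper name is ours:

```javascript
// Decide whether a page origin will be treated as secure enough for
// sensor APIs: https, plus the localhost origins used in development.
// Simplified: the full rules live in the W3C Secure Contexts spec.
function isSensorFriendlyOrigin(urlString) {
  const url = new URL(urlString);
  if (url.protocol === "https:") return true;
  return url.hostname === "localhost" || url.hostname === "127.0.0.1";
}

console.log(isSensorFriendlyOrigin("http://localhost:8000/lab1.html"));     // true
console.log(isSensorFriendlyOrigin("http://192.168.1.20:8000/lab1.html")); // false
```

Showing a warning when this returns false saves students from debugging "sensors return nothing" when the real problem is an insecure origin.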

19.3 What’s Next

If you want to…                                              Read this
Review mobile sensor API implementation                      Mobile Phone APIs
Practice PWA and audio sensor labs                           Mobile Phone Labs: PWA and Audio
Practice web sensor API integration                          Mobile Phone Labs: Web APIs
Assess your understanding of the full mobile sensing module  Mobile Phone Assessment