Build two web-based mobile sensing applications from scratch: an accelerometer-powered activity recognition app that detects walking, running, and stationary states using statistical analysis and step counting, plus a GPS geofencing tracker that calculates distances with the Haversine formula and triggers alerts on zone entry/exit.
Key Concepts
Accelerometer API: Web API providing the device’s linear acceleration in m/s² along three axes at up to 60 Hz through the browser; accessed via new Accelerometer() from the Generic Sensor API (Chrome Android only) or via DeviceMotionEvent (broader support)
DeviceOrientationEvent: Browser event providing Euler angles (alpha: z-axis rotation 0-360, beta: x-axis tilt -180 to 180, gamma: y-axis tilt -90 to 90) derived from fusion of accelerometer and gyroscope by the device OS
Geolocation.watchPosition(): Continuously tracks device position, calling a callback with updated coordinates as the device moves; provides accuracy, altitude, heading, and speed when available; stop tracking with clearWatch() to prevent battery drain
Web Bluetooth GATT: Allows web pages to discover, connect to, and read/write characteristics of BLE 4.0+ peripherals; requires HTTPS and user gesture; enables browser-based dashboards for custom BLE sensor hardware
Generic Sensor API: A unified W3C API surface for browser sensor access (Accelerometer, Gyroscope, Magnetometer, RelativeOrientationSensor, AbsoluteOrientationSensor); better structured than legacy DeviceMotionEvent but only available in Chrome/Chromium-based browsers
Sensor Reading Timestamp: Each sensor reading includes a timestamp from performance.now() (milliseconds since navigation start); useful for computing sensor sampling intervals and detecting dropped readings, though not synchronized to wall clock time
Browser DevTools Sensor Override: Chrome DevTools allows simulating geolocation and device orientation on desktop; useful for testing location-based and orientation-based features without a physical mobile device
Event Throttling and Batching: Mobile browsers throttle sensor events when the page is in the background or the screen is off; sensor applications must handle irregular event timing and not assume a fixed sampling rate in data processing
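Because support varies across these APIs (Generic Sensor API in Chromium, DeviceMotionEvent elsewhere, permission prompts on iOS), it helps to isolate the feature-detection logic. Below is a minimal sketch; `pickMotionApi` is an illustrative helper name, not part of any Web API, and it inspects whatever global-like object it is given so the logic can be exercised outside a browser:

```javascript
// Decide which motion API to use, given a global-like object.
// Returns 'generic-sensor', 'devicemotion', or 'none'.
// pickMotionApi is an illustrative helper, not a Web API.
function pickMotionApi(global) {
  if (typeof global.Accelerometer === 'function') {
    return 'generic-sensor';   // Chrome/Edge on Android
  }
  if (typeof global.DeviceMotionEvent !== 'undefined') {
    return 'devicemotion';     // broader support, incl. iOS Safari
  }
  return 'none';               // e.g. desktop browsers without sensors
}
```

In a real page you would call `pickMotionApi(window)`; in tests you can pass plain objects that mimic different browsers.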
17.1 Learning Objectives
By completing these labs, you will be able to:
Access smartphone accelerometer data using Web APIs
Implement activity recognition algorithms based on motion patterns
Detect steps using peak detection techniques
Build GPS tracking applications with geofencing capabilities
Objective: Create a web application that uses the smartphone accelerometer to recognize user activities.
Materials:
Smartphone with accelerometer
Web browser (Chrome/Edge recommended)
Text editor for HTML/JavaScript
For Beginners: What is Activity Recognition?
Activity recognition is like teaching your phone to understand what you’re doing - walking, running, or sitting still. Your phone has a tiny sensor called an accelerometer that feels every movement, just like how you feel when you’re in a car that speeds up or slows down.
When you walk, your phone bounces up and down in a pattern. When you run, it bounces faster and harder. When you’re sitting still, it barely moves at all. By measuring these patterns, we can figure out what activity you’re doing!
The Sensor Squad says: “Think of the accelerometer like a tiny ball on a spring inside your phone. When you move, the ball wiggles - and we measure how much it wiggles to guess what you’re doing!”
17.2.1 Complete Application Code
```html
<!DOCTYPE html>
<html>
<head>
<title>Activity Recognition</title>
<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  body { font-family: Arial, sans-serif; padding: 20px; max-width: 600px; margin: 0 auto; }
  .activity-display { font-size: 48px; text-align: center; padding: 40px; margin: 20px 0;
    background: linear-gradient(135deg, #667eea 0%, #764ba2 100%); color: white;
    border-radius: 15px; box-shadow: 0 10px 30px rgba(0,0,0,0.2); }
  .sensor-data { background: #f5f5f5; padding: 15px; border-radius: 10px; margin: 10px 0; }
  button { width: 100%; padding: 15px; font-size: 18px; margin: 10px 0;
    border: none; border-radius: 8px; cursor: pointer; }
  .start-btn { background: #4CAF50; color: white; }
  .stop-btn { background: #f44336; color: white; }
  .stats { display: grid; grid-template-columns: 1fr 1fr; gap: 10px; margin: 20px 0; }
  .stat-card { background: white; border: 2px solid #e0e0e0; border-radius: 10px;
    padding: 15px; text-align: center; }
  .stat-value { font-size: 32px; font-weight: bold; color: #667eea; }
  .stat-label { font-size: 14px; color: #666; margin-top: 5px; }
</style>
</head>
<body>
<h1>Activity Recognition</h1>
<button class="start-btn" onclick="startTracking()">Start Tracking</button>
<button class="stop-btn" onclick="stopTracking()">Stop Tracking</button>
<div class="activity-display" id="activity">Stationary</div>
<div class="stats">
  <div class="stat-card">
    <div class="stat-value" id="step-count">0</div>
    <div class="stat-label">Steps</div>
  </div>
  <div class="stat-card">
    <div class="stat-value" id="distance">0.0</div>
    <div class="stat-label">Distance (m)</div>
  </div>
</div>
<div class="sensor-data">
  <h3>Accelerometer Data</h3>
  <div id="accel-data">Waiting for sensor...</div>
</div>
<div class="sensor-data">
  <h3>Activity Log</h3>
  <div id="activity-log"></div>
</div>
<script>
let accelerometer = null;
let readings = [];
let stepCount = 0;
let lastPeakTime = 0;
let currentActivity = 'Stationary';

function startTracking() {
  if ('Accelerometer' in window) {
    try {
      accelerometer = new Accelerometer({ frequency: 50 });
      accelerometer.addEventListener('reading', processReading);
      accelerometer.addEventListener('error', e => {
        console.error('Accelerometer error:', e);
        alert('Accelerometer error: ' + e.error.message);
      });
      accelerometer.start();
      console.log('Accelerometer started');
      logActivity('Tracking started');
    } catch (error) {
      alert('Failed to start accelerometer: ' + error.message);
    }
  } else {
    alert('Accelerometer not supported by your browser.\n' +
          'Try Chrome or Edge on Android.');
  }
}

function stopTracking() {
  if (accelerometer) {
    accelerometer.stop();
    accelerometer = null;
    logActivity('Tracking stopped');
  }
}

function processReading() {
  const x = accelerometer.x;
  const y = accelerometer.y;
  const z = accelerometer.z;
  const magnitude = Math.sqrt(x * x + y * y + z * z);

  // Update display
  document.getElementById('accel-data').innerHTML = `
    X: ${x.toFixed(2)} m/s²<br>
    Y: ${y.toFixed(2)} m/s²<br>
    Z: ${z.toFixed(2)} m/s²<br>
    <strong>Magnitude: ${magnitude.toFixed(2)} m/s²</strong>`;

  // Store reading
  readings.push({ x, y, z, magnitude, timestamp: Date.now() });

  // Keep the last 50 readings (1 second at 50 Hz)
  if (readings.length > 50) {
    readings.shift();
  }

  // Classify activity once the 1-second window is full
  if (readings.length === 50) {
    detectActivity();
  }

  // Detect steps
  detectSteps(magnitude);
}

function detectActivity() {
  const magnitudes = readings.map(r => r.magnitude);
  const mean = magnitudes.reduce((a, b) => a + b) / magnitudes.length;
  const variance = magnitudes.reduce((sum, m) => sum + Math.pow(m - mean, 2), 0) / magnitudes.length;
  const stdDev = Math.sqrt(variance);

  let newActivity;
  if (stdDev < 0.5) {
    newActivity = 'Stationary';
  } else if (stdDev < 2.5) {
    newActivity = 'Walking';
  } else if (stdDev < 4.0) {
    newActivity = 'Running';
  } else {
    newActivity = 'Intense Activity';
  }

  if (newActivity !== currentActivity) {
    currentActivity = newActivity;
    document.getElementById('activity').textContent = currentActivity;
    logActivity(`Activity changed: ${currentActivity}`);
  }
}

function detectSteps(magnitude) {
  const now = Date.now();
  const minStepInterval = 300; // 300 ms between steps

  // Check if it's a peak (> 10.5 m/s²) and enough time has passed
  if (magnitude > 10.5 && now - lastPeakTime > minStepInterval) {
    stepCount++;
    lastPeakTime = now;
    document.getElementById('step-count').textContent = stepCount;

    // Estimate distance (0.7 m per step)
    const distance = (stepCount * 0.7).toFixed(1);
    document.getElementById('distance').textContent = distance;
  }
}

function logActivity(message) {
  const log = document.getElementById('activity-log');
  const time = new Date().toLocaleTimeString();
  log.innerHTML = `[${time}] ${message}<br>` + log.innerHTML;

  // Keep only the last 10 messages
  const messages = log.innerHTML.split('<br>');
  if (messages.length > 10) {
    log.innerHTML = messages.slice(0, 10).join('<br>');
  }
}

// Request permission on iOS 13+
if (typeof DeviceMotionEvent !== 'undefined' &&
    typeof DeviceMotionEvent.requestPermission === 'function') {
  document.querySelector('.start-btn').addEventListener('click', () => {
    DeviceMotionEvent.requestPermission().then(permissionState => {
      if (permissionState === 'granted') {
        startTracking();
      } else {
        alert('Permission denied for motion sensors');
      }
    }).catch(console.error);
  });
}
</script>
</body>
</html>
```
17.2.2 How the Algorithm Works
The activity recognition algorithm uses statistical analysis of accelerometer readings:
Figure 17.1
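At its core, the classification reduces to computing the mean and standard deviation over a one-second window of acceleration magnitudes and comparing the standard deviation against thresholds. A stripped-down sketch of that logic (same thresholds as the lab code), runnable on synthetic data outside the browser:

```javascript
// Classify activity from a window of acceleration magnitudes (m/s²)
// using the standard-deviation thresholds from the lab code.
function classifyActivity(magnitudes) {
  const mean = magnitudes.reduce((a, b) => a + b, 0) / magnitudes.length;
  const variance = magnitudes.reduce((s, m) => s + (m - mean) ** 2, 0) / magnitudes.length;
  const stdDev = Math.sqrt(variance);
  if (stdDev < 0.5) return 'Stationary';
  if (stdDev < 2.5) return 'Walking';
  if (stdDev < 4.0) return 'Running';
  return 'Intense Activity';
}

// Synthetic windows: a phone at rest reads ~9.81 m/s² (gravity only);
// walking adds a periodic bounce of a few m/s², running a larger one.
const still   = Array.from({ length: 50 }, () => 9.81);
const walking = Array.from({ length: 50 }, (_, i) => 9.81 + 2.0 * Math.sin(i / 3));
const running = Array.from({ length: 50 }, (_, i) => 9.81 + 4.5 * Math.sin(i / 3));
```

Feeding these windows in produces 'Stationary', 'Walking', and 'Running' respectively, which is exactly the behavior the interactive calculator lets you explore.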
Try It: Activity Recognition Calculator
Experiment with the activity classification algorithm by adjusting the standard deviation:
Add gyroscope data for better activity classification (use Gyroscope API)
Implement calorie estimation based on activity type and duration
Store activity history in localStorage for later analysis
Add data visualization with charts using Chart.js library
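For the calorie-estimation exercise, one common starting point is the MET (metabolic equivalent) model, kcal ≈ MET × body mass (kg) × hours. A rough sketch is below; the MET values and the `estimateCalories` helper name are illustrative assumptions, not part of the lab code:

```javascript
// Approximate MET values per activity (illustrative only).
const MET = { Stationary: 1.0, Walking: 3.5, Running: 8.0 };

// kcal burned ≈ MET × body mass (kg) × duration (hours).
function estimateCalories(activity, weightKg, hours) {
  const met = MET[activity] ?? 1.0; // default to resting rate
  return met * weightKg * hours;
}
```

For example, a 70 kg user walking for half an hour would be estimated at 3.5 × 70 × 0.5 ≈ 122.5 kcal.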
Quick Check: Accelerometer Activity Recognition
17.3 Lab 2: GPS Location Tracker with Geofencing Alerts
Objective: Build a web-based GPS tracker with geofencing notifications.
Materials:
Smartphone with GPS
Web browser
Internet connection
For Beginners: What is Geofencing?
Geofencing is like drawing an invisible circle around a place on a map. When you enter or leave that circle, your phone can do something - like send a notification or play a sound.
Imagine putting an invisible fence around your home. When you leave the fence area, your phone could remind you to grab your keys! When you return, it could turn on the lights automatically.
The Sensor Squad explains: “Think of geofencing like a magic trip wire. Step over it, and something happens! It’s super useful for reminders, security, and smart home automation.”
The Haversine formula calculates great-circle distance on a sphere: \(a = \sin^2\left(\frac{\Delta\phi}{2}\right) + \cos(\phi_1) \times \cos(\phi_2) \times \sin^2\left(\frac{\Delta\lambda}{2}\right)\) then \(c = 2 \times \text{atan2}(\sqrt{a}, \sqrt{1-a})\) and \(d = R \times c\) where R = 6,371 km. Worked example: London (51.5074°N, 0.1278°W) to a point 0.01° west: \(\Delta\lambda = 0.01°\), giving distance \(\approx 692\text{ m}\). The incorrect Euclidean formula \(\sqrt{(\Delta\text{lat})^2 + (\Delta\text{lon})^2} \times 111\text{ km}\) gives 1110 m, a roughly 60% overestimate at this latitude.
Distance calculations on a sphere require the Haversine formula:
Figure 17.2
Where R = 6,371 km is Earth’s radius.
Try It: Haversine Distance Calculator
Calculate the great-circle distance between two GPS coordinates:
Default example: London (51.5074°N, 0.1278°W) to a point 0.01° west shows the ~60% overestimate from the Euclidean calculation at this latitude.
17.3.3 Expected Learning Outcomes
After completing this lab, you will be able to:
Use Geolocation API for continuous tracking
Implement geofencing with entry/exit detection
Calculate distances using the Haversine formula
Persist data using localStorage
Trigger notifications on geofence events
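The entry/exit detection above reduces to comparing the Haversine distance to the fence radius on each position fix and reporting only the transitions. A minimal, browser-free sketch of that state machine (the `fence` object shape and `checkGeofence` name are illustrative):

```javascript
// Haversine distance in meters between two lat/lon points.
function haversine(lat1, lon1, lat2, lon2) {
  const R = 6371000, rad = Math.PI / 180;
  const dPhi = (lat2 - lat1) * rad;
  const dLam = (lon2 - lon1) * rad;
  const a = Math.sin(dPhi / 2) ** 2 +
            Math.cos(lat1 * rad) * Math.cos(lat2 * rad) * Math.sin(dLam / 2) ** 2;
  return R * 2 * Math.atan2(Math.sqrt(a), Math.sqrt(1 - a));
}

// Return 'enter' or 'exit' when the device crosses the fence
// boundary, or null when nothing changed since the last fix.
function checkGeofence(fence, lat, lon) {
  const inside = haversine(fence.lat, fence.lon, lat, lon) <= fence.radius;
  const event = inside === fence.inside ? null : (inside ? 'enter' : 'exit');
  fence.inside = inside; // remember state for the next fix
  return event;
}

// A 500 m fence around central London, initially not occupied.
const fence = { lat: 51.5074, lon: -0.1278, radius: 500, inside: false };
```

In the lab app, the `watchPosition()` callback would feed each fix into `checkGeofence` and fire a notification or vibration whenever it returns a non-null event.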
17.3.4 Exercises
Add polygon geofences - Support non-circular boundaries using point-in-polygon algorithms
Implement route playback - Visualize stored location history on a map
Send geofence events to IoT backend - Use MQTT or HTTP to report events
Add battery drain estimation - Track GPS usage time and estimate power consumption
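For the polygon-geofence exercise, the standard test is ray casting: count how many polygon edges a ray from the point crosses; an odd count means the point is inside. A sketch under the simplifying assumption that small fences can be treated as locally flat (`pointInPolygon` is an illustrative helper name):

```javascript
// Ray-casting point-in-polygon test. `polygon` is an array of
// [lat, lon] vertices; returns true if [lat, lon] lies inside.
// Treats coordinates as planar, which is fine for small fences.
function pointInPolygon(lat, lon, polygon) {
  let inside = false;
  for (let i = 0, j = polygon.length - 1; i < polygon.length; j = i++) {
    const [lat1, lon1] = polygon[i];
    const [lat2, lon2] = polygon[j];
    // Does the edge straddle the point's longitude, and is the
    // point below the edge at that longitude?
    const crosses = (lon1 > lon) !== (lon2 > lon) &&
      lat < (lat2 - lat1) * (lon - lon1) / (lon2 - lon1) + lat1;
    if (crosses) inside = !inside;
  }
  return inside;
}

// A unit "square" fence for testing the logic.
const square = [[0, 0], [0, 1], [1, 1], [1, 0]];
```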
Label the Diagram
💻 Code Challenge
Order the Steps
Match the Concepts
17.4 Summary
In these two labs, you built foundational mobile sensing applications:
Key Accomplishments
Lab 1 - Activity Recognition:
Accessed accelerometer data at 50 Hz using the Generic Sensor API
Implemented statistical activity classification using standard deviation
Built step detection using peak detection with debouncing
Calculated derived metrics (distance walked)
Lab 2 - GPS Geofencing:
Implemented continuous location tracking with the Geolocation API
Built geofence management with localStorage persistence
Used the Haversine formula for accurate distance calculations
Triggered notifications and vibrations on geofence events
Common Mistake: Using Euclidean Distance for GPS Coordinates
The mistake: Calculating distance between GPS coordinates using the Pythagorean theorem (√(Δlat² + Δlon²)) instead of the Haversine formula, causing significant errors.
Real scenario: A delivery app calculated distances to determine if a driver entered a 500-meter delivery radius. The developer used simple Euclidean distance:

```javascript
// WRONG: Euclidean distance applied to raw lat/lon degrees
const dLat = 51.5074 - 51.5074;       // 0
const dLon = -0.1278 - (-0.1378);     // 0.01
const distance = Math.sqrt(dLat * dLat + dLon * dLon) * 111000;
// ERROR: reports 1110 m when the actual distance is ~692 m (60% too high!)
```
Correct calculation (Haversine):
```javascript
function haversineDistance(lat1, lon1, lat2, lon2) {
  const R = 6371000; // Earth radius in meters
  const φ1 = lat1 * Math.PI / 180;
  const φ2 = lat2 * Math.PI / 180;
  const Δφ = (lat2 - lat1) * Math.PI / 180;
  const Δλ = (lon2 - lon1) * Math.PI / 180;

  const a = Math.sin(Δφ / 2) * Math.sin(Δφ / 2) +
            Math.cos(φ1) * Math.cos(φ2) *
            Math.sin(Δλ / 2) * Math.sin(Δλ / 2);
  const c = 2 * Math.atan2(Math.sqrt(a), Math.sqrt(1 - a));
  return R * c; // distance in meters
}

const correct = haversineDistance(51.5074, -0.1278, 51.5074, -0.1378);
// Result: ≈ 692 meters (correct!)
```
Error magnitude by latitude:

| Latitude | Location | Euclidean Error for 1 km East–West |
|----------|----------|------------------------------------|
| 0° | Equator | 0% (Euclidean works) |
| 30° | Cairo | 15% too high |
| 45° | Milan | 41% too high |
| 51° | London | 59% too high |
| 60° | Helsinki | 100% too high |
| 80° | High Arctic | 476% too high |
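These factors follow directly from the geometry: one degree of longitude spans cos(φ) times as much ground as one degree of latitude, so the naive formula overestimates a pure east–west distance by a factor of 1/cos(φ). The error column can be reproduced in one line:

```javascript
// Percent by which Euclidean distance overestimates a pure
// east–west distance at latitude latDeg: (1 / cos(φ) − 1) × 100.
function euclideanErrorPercent(latDeg) {
  return (1 / Math.cos(latDeg * Math.PI / 180) - 1) * 100;
}
```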
Try It: Latitude Error Calculator
See how Euclidean distance error grows with latitude:
When the Euclidean approximation is acceptable:
Visualizations on projected maps (not real-world distances)
When Haversine is required:
Geofencing (must be accurate)
Navigation (routing, ETA calculations)
Any distance >1 km
Any latitude above 30° (north or south)
Real impact on geofencing:
Delivery radius: 500 meters. Location: London (51°N).
With Euclidean distance:
App thinks its 500 m radius covers an area of 785,000 m²
Actual coverage using Haversine: only 493,000 m² (37% smaller!)
Result: Drivers marked “out of range” when they’re actually inside the delivery zone
Bottom line: GPS coordinates are latitude/longitude on a sphere. Always use Haversine formula for accurate distance calculations. The performance difference is negligible (microseconds), but the accuracy improvement is critical.
Key Takeaway
Activity recognition and geofencing are two foundational mobile sensing patterns: one uses motion sensors (accelerometer) with statistical analysis to understand what the user is doing, and the other uses position sensors (GPS) with geometric calculations to understand where the user is. Combining both enables context-aware IoT applications that respond to both activity and location.
For Kids: Meet the Sensor Squad!
Sammy the Sensor was bouncing with excitement – literally! “Watch this! Every time I bounce, the accelerometer inside the phone feels it!”
Max the Microcontroller was watching the numbers carefully. “When Sammy bounces a little, the numbers wiggle a little. When Sammy bounces a LOT, the numbers go wild! That is how we tell the difference between walking and running.”
“How?” asked Lila the LED.
“We use something called standard deviation,” Max explained. “It is a fancy way of asking: how much do the numbers jump around? Sitting still means barely any jumping. Walking means medium jumping. Running means BIG jumping!”
Bella the Battery chimed in about the second project. “The geofencing lab is like drawing invisible circles on a map. When you step inside the circle, BUZZ! Your phone vibrates to tell you!”
“How does the phone know you walked into the circle?” Lila asked.
“GPS satellites in space tell the phone exactly where it is,” Sammy explained. “Then Max uses a special math formula called the Haversine formula to calculate how far you are from the center of the circle. If the distance is less than the circle’s size – you are inside!”
“I love the name Haversine,” Lila giggled. “It sounds like a wizard spell!”
“It kind of is!” Max laughed. “It is the spell that lets phones measure distances on a round Earth instead of a flat map.”
The Sensor Squad Lesson: Activity recognition uses the accelerometer to figure out how you are moving (walking, running, sitting), while geofencing uses GPS to figure out where you are. Both are just clever math applied to sensor data!
17.5 Knowledge Checks
Quiz: Mobile Web Sensor Labs
Interactive Quiz: Match Web Sensor Lab Concepts
Interactive Quiz: Sequence the Steps
Common Pitfalls
1. Using window.addEventListener('devicemotion') Without Checking Support
DeviceMotionEvent is supported on most mobile browsers but is undefined on desktop. On iOS 13+, it also requires explicit permission via DeviceMotionEvent.requestPermission(). Always check if DeviceMotionEvent is defined and, on iOS, request permission before attaching event listeners.
2. Not Unregistering Sensor Listeners on Page Navigate
Active sensor listeners (DeviceMotion event handlers, Accelerometer.start(), Geolocation.watchPosition()) continue running after the user navigates away from the page, draining battery. Always stop sensors and remove event listeners in the page’s beforeunload or pagehide event handler.
3. Computing Velocity from Accelerometer by Integration
Integrating smartphone accelerometer data to compute velocity or position accumulates error rapidly due to sensor bias, quantization, and gravity contamination. Without careful bias removal and complementary GPS fusion, dead-reckoning position errors grow to tens of meters within seconds. Limit integration to short intervals (< 1 second) or use the OS fusion outputs instead.
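The quadratic growth is easy to quantify: a constant bias b, integrated twice, becomes a position error of roughly ½·b·t². A synthetic illustration (the bias and rate below are made-up demonstration numbers, not measured sensor specs):

```javascript
// Dead-reckon position from a biased accelerometer by double
// integration with simple Euler steps.
function integratePositionError(biasMs2, seconds, rateHz) {
  const dt = 1 / rateHz;
  let v = 0, x = 0;
  for (let i = 0; i < seconds * rateHz; i++) {
    v += biasMs2 * dt; // velocity error grows linearly
    x += v * dt;       // position error grows quadratically
  }
  return x; // meters of drift after `seconds`
}

// A 0.1 m/s² bias — small for a phone — drifts about 5 m in 10 s,
// matching ½ · 0.1 · 10² = 5 m; a larger bias drifts tens of meters.
```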
4. Ignoring Coordinate Frame Transformations Between Devices
The Geolocation API returns WGS84 coordinates (latitude, longitude, altitude) while DeviceOrientation returns device-relative Euler angles. Combining these to determine which direction the user is facing requires converting between coordinate frames. Always understand and document which coordinate system each API uses before writing fusion algorithms.
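As one concrete example of bridging the two frames: to compare the device's compass heading with the direction toward a target, you need the initial great-circle bearing between two WGS84 fixes, given by the standard forward-azimuth formula θ = atan2(sin Δλ · cos φ₂, cos φ₁ · sin φ₂ − sin φ₁ · cos φ₂ · cos Δλ). A sketch (`initialBearing` is an illustrative helper name):

```javascript
// Initial bearing (forward azimuth) in degrees from point 1 to
// point 2, measured clockwise from true north.
function initialBearing(lat1, lon1, lat2, lon2) {
  const rad = Math.PI / 180;
  const phi1 = lat1 * rad, phi2 = lat2 * rad;
  const dLam = (lon2 - lon1) * rad;
  const y = Math.sin(dLam) * Math.cos(phi2);
  const x = Math.cos(phi1) * Math.sin(phi2) -
            Math.sin(phi1) * Math.cos(phi2) * Math.cos(dLam);
  // Normalize atan2's (-180, 180] result into [0, 360).
  return (Math.atan2(y, x) * 180 / Math.PI + 360) % 360;
}
```

Subtracting this bearing from the compass heading (DeviceOrientation's alpha, after accounting for its own reference frame) tells you how far the user must turn to face the target.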
17.6 What’s Next
If you want to…
Read this
Understand the theory behind mobile phone sensor capabilities