18  Mobile PWA & Audio Labs

In 60 Seconds

Build two advanced mobile sensing applications: a Progressive Web App (PWA) with Service Workers for offline multi-sensor data collection, and a participatory noise monitoring system using the Web Audio API that measures decibel levels and tags readings with GPS coordinates for crowdsourced noise mapping.

Key Concepts
  • Progressive Web App (PWA): A web application that uses modern APIs to deliver app-like experiences; key features: service worker for offline caching, Web App Manifest for home screen installation, and HTTPS for secure sensor API access
  • Service Worker: A JavaScript file running in a background thread separate from the web page; intercepts network requests for offline caching, enables background sync for sensor data collected without connectivity, and can receive push notifications
  • Web Audio API: A JavaScript API providing low-level audio processing: capturing microphone input (getUserMedia), analyzing frequency content (AnalyserNode), computing sound level (dB SPL approximation from FFT magnitude), and visualizing waveforms
  • AudioContext / AnalyserNode: AudioContext manages the Web Audio processing graph; AnalyserNode computes real-time FFT spectrums and time-domain waveforms from microphone input; provides getByteFrequencyData() and getByteTimeDomainData() for visualization
  • Sound Level Meter Implementation: Approximating dB SPL from AnalyserNode data: compute the RMS of the time-domain samples, then convert to dB: level_dB = 20 * log10(rms); note this gives relative dB, not calibrated SPL, unless a reference calibration point is applied
  • Offline-First Architecture: Designing the application to function without network connectivity by pre-caching sensor UI assets and buffering collected data in IndexedDB or localStorage for later sync; essential for mobile sensing in areas with unreliable connectivity
  • Web App Manifest: A JSON file (manifest.json) linked from the HTML that describes the app name, icon, theme color, and display mode; enables ‘Add to Home Screen’ on Android and iOS, giving the PWA a native-like launch experience
  • getUserMedia API: Browser API requesting access to microphone (and camera) streams; requires user permission; returns a MediaStream object that can be connected to Web Audio API nodes or recorded as audio blobs via MediaRecorder
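The sound-level computation described above can be sketched as a pure function. This is a minimal sketch under one assumption: samples are floats in [-1, 1], as `getFloatTimeDomainData()` returns them.

```javascript
// Relative sound level (dBFS) from time-domain samples in [-1, 1].
// rms = sqrt(mean(x_i^2)); level = 20 * log10(rms).
function relativeDb(samples) {
    let sum = 0;
    for (const s of samples) sum += s * s;
    const rms = Math.sqrt(sum / samples.length);
    return 20 * Math.log10(rms); // silence -> -Infinity
}
```

A full-scale sine wave has RMS 1/√2, so the function returns about -3 dB for it; the values are relative (dBFS), not calibrated SPL.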

18.1 Learning Objectives

By completing these labs, you will be able to:

  • Create Progressive Web Apps (PWAs) with offline capability
  • Implement Service Workers for resource caching
  • Build installable mobile applications without app stores
  • Capture smartphone microphone input using the Web Audio API
  • Process audio data in real-time for noise level measurement
  • Implement participatory sensing with location-tagged data

18.2 Lab 3: Progressive Web App for Multi-Sensor Data Collection

Objective: Create an installable PWA that collects data from multiple smartphone sensors and works offline.

Materials:

  • Smartphone
  • Web server (can use Python’s http.server)
  • Text editor

A Progressive Web App (PWA) is like a website that pretends to be a real app! You can add it to your home screen, use it offline, and it looks just like any other app on your phone - but you don’t need to download it from an app store.

Think of it like a magic website that can:

  • Work without internet (it saves stuff on your phone)
  • Send you notifications
  • Look and feel like a “real” app

The Sensor Squad says: “PWAs are super cool because they’re easier to make than regular apps, but they can do almost everything a regular app can do!”

18.2.1 Step 1: Create the Main Application (index.html)

<!DOCTYPE html>
<html>
<head>
    <title>IoT Multi-Sensor App</title>
    <meta name="viewport" content="width=device-width, initial-scale=1">
    <link rel="manifest" href="manifest.json">
    <meta name="theme-color" content="#007bff">
    <style>
        body {
            font-family: Arial, sans-serif;
            padding: 20px;
            max-width: 600px;
            margin: 0 auto;
            background: #f5f5f5;
        }
        .sensor-card {
            background: white;
            border-radius: 10px;
            padding: 15px;
            margin: 15px 0;
            box-shadow: 0 2px 10px rgba(0,0,0,0.1);
        }
        .sensor-title {
            font-size: 18px;
            font-weight: bold;
            color: #007bff;
            margin-bottom: 10px;
        }
        .sensor-value {
            font-size: 24px;
            font-weight: bold;
            color: #333;
        }
        .sensor-unit {
            font-size: 14px;
            color: #666;
        }
        button {
            width: 100%;
            padding: 15px;
            font-size: 18px;
            border: none;
            border-radius: 8px;
            cursor: pointer;
            margin: 5px 0;
        }
        .btn-primary { background: #007bff; color: white; }
        .btn-success { background: #28a745; color: white; }
        .btn-danger { background: #dc3545; color: white; }
        .offline-indicator {
            position: fixed;
            top: 0;
            left: 0;
            right: 0;
            background: #dc3545;
            color: white;
            text-align: center;
            padding: 10px;
            display: none;
        }
        .offline-indicator.show { display: block; }
        .data-count {
            text-align: center;
            padding: 10px;
            background: #e9ecef;
            border-radius: 5px;
            margin: 10px 0;
        }
    </style>
</head>
<body>
    <div class="offline-indicator" id="offline-indicator">
        Offline - Data will sync when connected
    </div>

    <h1>IoT Multi-Sensor</h1>

    <button class="btn-primary" onclick="startAllSensors()">Start All Sensors</button>
    <button class="btn-danger" onclick="stopAllSensors()">Stop All Sensors</button>

    <div class="sensor-card">
        <div class="sensor-title">Accelerometer</div>
        <div class="sensor-value" id="accel-value">--</div>
        <div class="sensor-unit">m/s² (magnitude)</div>
    </div>

    <div class="sensor-card">
        <div class="sensor-title">Gyroscope</div>
        <div class="sensor-value" id="gyro-value">--</div>
        <div class="sensor-unit">rad/s (rotation rate)</div>
    </div>

    <div class="sensor-card">
        <div class="sensor-title">Location</div>
        <div class="sensor-value" id="location-value">--</div>
        <div class="sensor-unit">latitude, longitude</div>
    </div>

    <div class="sensor-card">
        <div class="sensor-title">Light Level</div>
        <div class="sensor-value" id="light-value">--</div>
        <div class="sensor-unit">lux</div>
    </div>

    <div class="data-count">
        <strong>Collected Readings:</strong> <span id="reading-count">0</span>
    </div>

    <button class="btn-success" onclick="syncData()">Sync Data to Cloud</button>

    <script>
        let sensors = {};
        let readings = [];
        let readingCount = 0;

        // Register Service Worker for offline support
        if ('serviceWorker' in navigator) {
            navigator.serviceWorker.register('sw.js')
                .then(reg => console.log('Service Worker registered'))
                .catch(err => console.error('SW registration failed:', err));
        }

        // Offline detection
        window.addEventListener('online', () => {
            document.getElementById('offline-indicator').classList.remove('show');
            syncData(); // Auto-sync when back online
        });

        window.addEventListener('offline', () => {
            document.getElementById('offline-indicator').classList.add('show');
        });

        function startAllSensors() {
            // Accelerometer
            if ('Accelerometer' in window) {
                sensors.accel = new Accelerometer({ frequency: 10 });
                sensors.accel.addEventListener('reading', () => {
                    const mag = Math.sqrt(
                        sensors.accel.x ** 2 +
                        sensors.accel.y ** 2 +
                        sensors.accel.z ** 2
                    );
                    document.getElementById('accel-value').textContent = mag.toFixed(2);
                    storeReading('accelerometer', { x: sensors.accel.x, y: sensors.accel.y, z: sensors.accel.z });
                });
                sensors.accel.start();
            }

            // Gyroscope
            if ('Gyroscope' in window) {
                sensors.gyro = new Gyroscope({ frequency: 10 });
                sensors.gyro.addEventListener('reading', () => {
                    const rate = Math.sqrt(
                        sensors.gyro.x ** 2 +
                        sensors.gyro.y ** 2 +
                        sensors.gyro.z ** 2
                    );
                    document.getElementById('gyro-value').textContent = rate.toFixed(3);
                    storeReading('gyroscope', { x: sensors.gyro.x, y: sensors.gyro.y, z: sensors.gyro.z });
                });
                sensors.gyro.start();
            }

            // Geolocation
            if ('geolocation' in navigator) {
                sensors.geoWatch = navigator.geolocation.watchPosition(
                    pos => {
                        const lat = pos.coords.latitude.toFixed(5);
                        const lon = pos.coords.longitude.toFixed(5);
                        document.getElementById('location-value').textContent = `${lat}, ${lon}`;
                        storeReading('location', {
                            lat: pos.coords.latitude,
                            lon: pos.coords.longitude,
                            accuracy: pos.coords.accuracy
                        });
                    },
                    err => console.error('Geolocation error:', err),
                    { enableHighAccuracy: true }
                );
            }

            // Ambient Light Sensor
            if ('AmbientLightSensor' in window) {
                sensors.light = new AmbientLightSensor();
                sensors.light.addEventListener('reading', () => {
                    document.getElementById('light-value').textContent = sensors.light.illuminance.toFixed(0);
                    storeReading('light', { illuminance: sensors.light.illuminance });
                });
                sensors.light.start();
            }
        }

        function stopAllSensors() {
            if (sensors.accel) sensors.accel.stop();
            if (sensors.gyro) sensors.gyro.stop();
            if (sensors.geoWatch) navigator.geolocation.clearWatch(sensors.geoWatch);
            if (sensors.light) sensors.light.stop();
        }

        function storeReading(type, data) {
            const reading = {
                type,
                data,
                timestamp: Date.now()
            };

            readings.push(reading);
            readingCount++;
            document.getElementById('reading-count').textContent = readingCount;

            // Trim first so localStorage stays bounded too
            if (readings.length > 1000) {
                readings = readings.slice(-1000);
            }

            // Store in localStorage for offline persistence
            localStorage.setItem('sensor-readings', JSON.stringify(readings));
        }

        async function syncData() {
            const stored = localStorage.getItem('sensor-readings');
            if (!stored) {
                alert('No data to sync');
                return;
            }

            const data = JSON.parse(stored);

            try {
                const response = await fetch('https://iot-backend.example.com/api/sensor-data', {
                    method: 'POST',
                    headers: { 'Content-Type': 'application/json' },
                    body: JSON.stringify({
                        // Reuse a stable per-device ID (generated and persisted on first sync)
                        deviceId: localStorage.getItem('device-id') || (() => {
                            const id = 'pwa-' + Math.random().toString(36).slice(2, 11);
                            localStorage.setItem('device-id', id);
                            return id;
                        })(),
                        readings: data
                    })
                });

                if (response.ok) {
                    localStorage.removeItem('sensor-readings');
                    readings = [];
                    readingCount = 0;
                    document.getElementById('reading-count').textContent = '0';
                    alert('Data synced successfully!');
                }
            } catch (error) {
                console.error('Sync failed:', error);
                alert('Sync failed. Data saved locally for later.');
            }
        }

        // Load saved readings on startup
        const savedReadings = localStorage.getItem('sensor-readings');
        if (savedReadings) {
            readings = JSON.parse(savedReadings);
            readingCount = readings.length;
            document.getElementById('reading-count').textContent = readingCount;
        }
    </script>
</body>
</html>
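The `syncData()` function above posts a `{ deviceId, readings }` object to a placeholder backend URL. A server receiving that payload would want to validate its shape before storing it; here is a minimal, illustrative sketch (the function name and rules are assumptions, not part of the lab):

```javascript
// Validate the JSON body produced by syncData() in index.html:
// { deviceId: string, readings: [{ type, data, timestamp }, ...] }
function isValidSensorPayload(body) {
    if (!body || typeof body.deviceId !== 'string') return false;
    if (!Array.isArray(body.readings)) return false;
    return body.readings.every(r =>
        r && typeof r.type === 'string' &&
        typeof r.timestamp === 'number' &&
        typeof r.data === 'object' && r.data !== null);
}
```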

18.2.2 Step 2: Create the Web App Manifest (manifest.json)

{
  "name": "IoT Multi-Sensor App",
  "short_name": "IoT Sensors",
  "description": "Collect data from smartphone sensors for IoT applications",
  "start_url": "/",
  "display": "standalone",
  "background_color": "#ffffff",
  "theme_color": "#007bff",
  "icons": [
    {
      "src": "/icon-192.png",
      "sizes": "192x192",
      "type": "image/png"
    },
    {
      "src": "/icon-512.png",
      "sizes": "512x512",
      "type": "image/png"
    }
  ]
}

18.2.3 Step 3: Create the Service Worker (sw.js)

const CACHE_NAME = 'iot-sensors-v1';
const urlsToCache = [
  '/',
  '/index.html',
  '/manifest.json'
];

// Install service worker and cache resources
self.addEventListener('install', event => {
  event.waitUntil(
    caches.open(CACHE_NAME)
      .then(cache => cache.addAll(urlsToCache))
  );
});

// Serve cached content when offline
self.addEventListener('fetch', event => {
  event.respondWith(
    caches.match(event.request)
      .then(response => response || fetch(event.request))
  );
});

// Clean up old caches on activation
self.addEventListener('activate', event => {
  event.waitUntil(
    caches.keys().then(cacheNames => {
      return Promise.all(
        cacheNames.filter(name => name !== CACHE_NAME)
          .map(name => caches.delete(name))
      );
    })
  );
});

18.2.4 Step 4: Serve and Install the PWA

# Start a local web server
python3 -m http.server 8000

# Access on smartphone: http://[your-computer-ip]:8000
# Note: browsers only expose the sensor and microphone APIs in a secure
# context, so over the network you need HTTPS (e.g. an HTTPS tunnel);
# plain HTTP works only on localhost.
# Use browser's "Add to Home Screen" option to install

18.2.5 PWA Architecture

[Figure: Mobile sensing applications at scale, showing the progression from individual activity tracking to group crowd sensing to urban city analytics]

18.2.6 Expected Learning Outcomes

After completing this lab, you will be able to:

  • Create Progressive Web Apps with offline support
  • Use Service Workers for caching
  • Implement installable mobile applications
  • Access multiple sensors simultaneously

18.2.7 Exercises

  1. Add background sync for uploading data when connectivity returns, using the Background Sync API (registration.sync.register() on the Service Worker registration)
  2. Implement push notifications for sensor alerts
  3. Add camera integration for QR code scanning
  4. Create data visualization dashboard with Chart.js

18.3 Lab 4: Participatory Noise Monitoring Application

Objective: Build a crowdsourced noise level monitoring system using smartphone microphones.

Materials:

  • Smartphone with microphone
  • Web browser
  • Optional: Location services enabled

Sound is vibrations in the air, and we measure how loud sounds are using decibels (dB). Your phone’s microphone can “hear” these vibrations and tell you how loud things are around you!

Here’s what different noise levels feel like:

  • 0-30 dB: Very quiet, like a library
  • 30-60 dB: Normal talking volume
  • 60-80 dB: Busy traffic
  • 80-100 dB: Lawnmower - getting uncomfortably loud!
  • 100+ dB: Concert or fireworks - can hurt your ears!

The Sensor Squad explains: “When lots of people use an app like this and share their noise readings, we can make a map of where it’s noisy and where it’s quiet in a whole city! That’s called participatory sensing - everyone participates to help!”
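The everyday ranges above can be written as a small helper function (the boundaries follow the list; the function name and wording are illustrative):

```javascript
// Map a decibel value to the everyday categories listed above.
function describeNoise(db) {
    if (db < 30) return 'Very quiet (library)';
    if (db < 60) return 'Normal talking volume';
    if (db < 80) return 'Busy traffic';
    if (db < 100) return 'Lawnmower';
    return 'Concert or fireworks';
}
```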

18.3.1 Complete Application Code

<!DOCTYPE html>
<html>
<head>
    <title>Noise Level Monitor</title>
    <meta name="viewport" content="width=device-width, initial-scale=1">
    <style>
        body {
            font-family: Arial, sans-serif;
            padding: 20px;
            max-width: 600px;
            margin: 0 auto;
            background: linear-gradient(135deg, #667eea 0%, #764ba2 100%);
            min-height: 100vh;
        }
        .container {
            background: white;
            border-radius: 15px;
            padding: 20px;
            box-shadow: 0 10px 40px rgba(0,0,0,0.3);
        }
        h1 {
            color: #667eea;
            text-align: center;
        }
        .noise-meter {
            background: linear-gradient(to right, green, yellow, orange, red);
            height: 50px;
            border-radius: 25px;
            position: relative;
            margin: 30px 0;
        }
        .noise-indicator {
            position: absolute;
            width: 10px;
            height: 60px;
            background: #333;
            border-radius: 5px;
            top: -5px;
            transition: left 0.1s;
        }
        .db-display {
            text-align: center;
            font-size: 72px;
            font-weight: bold;
            color: #667eea;
            margin: 20px 0;
        }
        .db-label {
            text-align: center;
            color: #666;
            font-size: 18px;
        }
        button {
            width: 100%;
            padding: 15px;
            font-size: 18px;
            border: none;
            border-radius: 8px;
            cursor: pointer;
            margin: 10px 0;
        }
        .start-btn {
            background: #4CAF50;
            color: white;
        }
        .stop-btn {
            background: #f44336;
            color: white;
        }
        .upload-btn {
            background: #2196F3;
            color: white;
        }
        .noise-level {
            padding: 15px;
            margin: 15px 0;
            border-radius: 8px;
            text-align: center;
            font-weight: bold;
        }
        .quiet { background: #d4edda; color: #155724; }
        .moderate { background: #fff3cd; color: #856404; }
        .loud { background: #f8d7da; color: #721c24; }
        .stats {
            background: #f5f5f5;
            padding: 15px;
            border-radius: 8px;
            margin: 15px 0;
        }
    </style>
</head>
<body>
    <div class="container">
        <h1>Noise Level Monitor</h1>

        <button class="start-btn" onclick="startMonitoring()">Start Monitoring</button>
        <button class="stop-btn" onclick="stopMonitoring()">Stop Monitoring</button>

        <div class="db-display" id="db-display">--</div>
        <div class="db-label">Decibels (dB)</div>

        <div class="noise-meter">
            <div class="noise-indicator" id="indicator"></div>
        </div>

        <div class="noise-level" id="noise-level">Quiet</div>

        <div class="stats" id="stats">
            Click "Start Monitoring" to begin
        </div>

        <button class="upload-btn" onclick="uploadData()">Upload to Server</button>

        <div style="margin-top: 20px; padding: 10px; background: #e3f2fd; border-radius: 5px; font-size: 14px;">
            <strong>About Noise Levels:</strong><br>
            - 0-30 dB: Very Quiet (library)<br>
            - 30-60 dB: Quiet (conversation)<br>
            - 60-80 dB: Moderate (traffic)<br>
            - 80-100 dB: Loud (lawnmower)<br>
            - 100+ dB: Very Loud (concert, harmful)
        </div>
    </div>

    <script>
        let audioContext = null;
        let microphone = null;
        let analyser = null;
        let dataArray = null;
        let animationId = null;

        let measurements = [];
        let startTime = null;

        async function startMonitoring() {
            try {
                const stream = await navigator.mediaDevices.getUserMedia({ audio: true });

                audioContext = new (window.AudioContext || window.webkitAudioContext)();
                microphone = audioContext.createMediaStreamSource(stream);
                analyser = audioContext.createAnalyser();

                analyser.fftSize = 2048;
                analyser.smoothingTimeConstant = 0.8;

                microphone.connect(analyser);

                dataArray = new Uint8Array(analyser.frequencyBinCount);

                startTime = Date.now();
                measureNoise();

            } catch (error) {
                alert('Error accessing microphone: ' + error.message);
            }
        }

        function stopMonitoring() {
            if (animationId) {
                cancelAnimationFrame(animationId);
            }

            if (microphone) {
                // Stop the underlying MediaStream tracks so the browser
                // actually releases the microphone (disconnect alone leaves it live)
                microphone.mediaStream.getTracks().forEach(track => track.stop());
                microphone.disconnect();
            }

            if (audioContext) {
                audioContext.close();
            }

            updateStats();
        }

        function measureNoise() {
            analyser.getByteFrequencyData(dataArray);

            // Calculate RMS of the FFT magnitude bins
            // (a simple loudness proxy, not a true time-domain RMS)
            let sum = 0;
            for (let i = 0; i < dataArray.length; i++) {
                sum += dataArray[i] * dataArray[i];
            }
            const rms = Math.sqrt(sum / dataArray.length);

            // Convert to decibels (rough approximation)
            // This is a simplified conversion - real dB meters require calibration
            const db = Math.round(20 * Math.log10(rms + 1));

            // Store measurement
            measurements.push({
                db: db,
                timestamp: Date.now()
            });

            // Update display
            document.getElementById('db-display').textContent = db;

            // Update indicator position (0-100dB range)
            const percent = Math.min(db, 100); // map 0-100 dB onto 0-100%
            document.getElementById('indicator').style.left = percent + '%';

            // Update noise level category
            let category, className;
            if (db < 40) {
                category = 'Quiet';
                className = 'quiet';
            } else if (db < 70) {
                category = 'Moderate';
                className = 'moderate';
            } else {
                category = 'Loud';
                className = 'loud';
            }

            const levelDiv = document.getElementById('noise-level');
            levelDiv.textContent = category + ` (${db} dB)`;
            levelDiv.className = 'noise-level ' + className;

            // Update stats
            if (measurements.length % 10 === 0) {
                updateStats();
            }

            // Continue measuring
            animationId = requestAnimationFrame(measureNoise);
        }

        function updateStats() {
            if (measurements.length === 0) {
                return;
            }

            const dbs = measurements.map(m => m.db);
            const avg = dbs.reduce((a, b) => a + b) / dbs.length;
            const max = Math.max(...dbs);
            const min = Math.min(...dbs);

            const duration = (Date.now() - startTime) / 1000;

            document.getElementById('stats').innerHTML = `
                <strong>Statistics:</strong><br>
                Average: ${avg.toFixed(1)} dB<br>
                Maximum: ${max} dB<br>
                Minimum: ${min} dB<br>
                Measurements: ${measurements.length}<br>
                Duration: ${duration.toFixed(1)} seconds
            `;
        }

        async function uploadData() {
            if (measurements.length === 0) {
                alert('No measurements to upload');
                return;
            }

            // Get current location
            if ('geolocation' in navigator) {
                navigator.geolocation.getCurrentPosition(async (position) => {
                    const data = {
                        measurements: measurements,
                        location: {
                            latitude: position.coords.latitude,
                            longitude: position.coords.longitude
                        },
                        deviceId: 'web-' + Math.random().toString(36).substr(2, 9),
                        timestamp: new Date().toISOString()
                    };

                    try {
                        const response = await fetch('https://iot-backend.example.com/api/noise-data', {
                            method: 'POST',
                            headers: {
                                'Content-Type': 'application/json'
                            },
                            body: JSON.stringify(data)
                        });

                        if (response.ok) {
                            alert('Data uploaded successfully!');
                            measurements = []; // Clear measurements
                        } else {
                            alert('Upload failed: ' + response.statusText);
                        }
                    } catch (error) {
                        // If server not available, store locally
                        localStorage.setItem('noise-data', JSON.stringify(data));
                        alert('Server unavailable. Data saved locally.');
                    }
                }, error => {
                    alert('Location unavailable. Upload skipped; readings kept locally.');
                });
            } else {
                alert('Geolocation not supported');
            }
        }
    </script>
</body>
</html>

18.3.2 Audio Processing Pipeline

[Figure: FFT spectrum analysis, showing the transformation of time-domain audio signals into frequency components for noise level measurement]

18.3.3 Expected Learning Outcomes

After completing this lab, you will be able to:

  • Access smartphone microphone via Web Audio API
  • Process audio data in real-time
  • Calculate noise levels from audio signals
  • Implement participatory sensing with location
  • Manage offline data storage

18.3.4 Exercises

  1. Implement frequency analysis to identify noise sources (traffic vs. construction vs. voices)
  2. Create noise pollution heatmap from crowdsourced data using mapping libraries
  3. Add time-based analysis showing noise patterns throughout the day
  4. Implement privacy-preserving location obfuscation using grid-based reporting (snap to 100m grid)
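Exercise 4 can be started with a simple grid snap. This sketch assumes a cell size of 0.0009°, since one degree of latitude is roughly 111 km (so 0.0009° ≈ 100 m); it deliberately ignores the fact that longitude degrees shrink toward the poles.

```javascript
// Snap coordinates to a ~100 m grid so exact positions are never reported.
function snapToGrid(lat, lon, cellDeg = 0.0009) {
    return {
        lat: Math.round(lat / cellDeg) * cellDeg,
        lon: Math.round(lon / cellDeg) * cellDeg
    };
}
```

Any two readings taken within the same cell report identical coordinates, which is what makes the scheme privacy-preserving.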

18.4 Summary

In these two labs, you built advanced mobile sensing applications:

Key Accomplishments

Lab 3 - Progressive Web App:

  • Created an installable PWA with manifest.json
  • Implemented Service Worker for offline caching
  • Built multi-sensor data collection (accelerometer, gyroscope, GPS, light)
  • Added automatic sync when connectivity restored

Lab 4 - Noise Monitoring:

  • Accessed microphone using Web Audio API
  • Processed audio with FFT analysis
  • Calculated dB levels from RMS values
  • Implemented participatory sensing with location tagging

18.4.1 Deep Dive: From Raw Audio to Decibels

Objective: Convert raw audio samples from the microphone into meaningful decibel (dB) measurements for noise monitoring.

The challenge: The Web Audio API provides raw audio samples (0-255 byte values), but we need dB SPL (Sound Pressure Level) for noise pollution monitoring. How do we convert?

Step 1: Capture audio data

// `stream` comes from navigator.mediaDevices.getUserMedia({ audio: true })
const audioContext = new AudioContext();
const microphone = audioContext.createMediaStreamSource(stream);
const analyser = audioContext.createAnalyser();
analyser.fftSize = 2048; // 2048-sample FFT window
microphone.connect(analyser);

const dataArray = new Uint8Array(analyser.frequencyBinCount); // 1024 bins
analyser.getByteFrequencyData(dataArray); // Values: 0-255

Step 2: Calculate RMS (Root Mean Square)

RMS measures signal energy: \(\text{RMS} = \sqrt{\frac{1}{N}\sum_{i=1}^{N} x_i^2}\) where \(x_i\) are audio samples (0-255). Worked example: With samples [100, 120, 110], RMS = \(\sqrt{\frac{100^2 + 120^2 + 110^2}{3}} = \sqrt{\frac{10000 + 14400 + 12100}{3}} = \sqrt{12167} = 110.3\). Convert to dB: \(\text{dB} = 20 \times \log_{10}(\text{RMS} + 1) = 20 \times \log_{10}(111.3) \approx 41\text{ dB}\) (uncalibrated relative value).

// RMS = sqrt(sum of squares / count)
function calculateRMS(dataArray) {
    let sum = 0;
    for (let i = 0; i < dataArray.length; i++) {
        sum += dataArray[i] * dataArray[i];
    }
    return Math.sqrt(sum / dataArray.length);
}

const rms = calculateRMS(dataArray); // Result: 0-255

Step 3: Convert RMS to dB (simplified)

// Decibel formula: dB = 20 × log10(RMS / reference)
// For Web Audio API, we use a heuristic conversion
function rmsToDecibels(rms) {
    // Add 1 to avoid log10(0) = -Infinity
    // Scale factor chosen empirically
    return Math.round(20 * Math.log10(rms + 1));
}

const db = rmsToDecibels(rms);
console.log(`Noise level: ${db} dB`);

Why this is an approximation:

Professional sound level meters:

  1. Calibrated microphone: Known sensitivity (e.g., 50 mV/Pa)
  2. Reference SPL: 20 µPa (threshold of human hearing)
  3. Frequency weighting: A-weighting filter mimics human ear sensitivity
  4. Time weighting: Fast (125ms) or Slow (1s) averaging

Formula: dB SPL = 20 × log10(P / P₀), where P is the measured sound pressure in pascals and P₀ = 20 µPa is the reference pressure.
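The reference formula is easy to check in code: a 1 Pa sound pressure works out to about 94 dB SPL, which is exactly the level a standard 1 kHz calibrator produces.

```javascript
// dB SPL from sound pressure in pascals, relative to P0 = 20 µPa.
const P0 = 20e-6;
function dbSpl(pressurePa) {
    return 20 * Math.log10(pressurePa / P0);
}
```

Evaluating `dbSpl(1)` gives 20 × log10(1 / 20e-6) ≈ 94 dB SPL, which is why 94 dB is the usual calibration reference.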

Web Audio API limitations:

  • ✗ Microphone sensitivity unknown (varies by phone model)
  • ✗ No calibration against reference tone (94 dB @ 1 kHz)
  • ✗ No A-weighting filter built-in
  • ✗ Automatic Gain Control (AGC) adjusts levels dynamically
  • Result: Relative dB values, not absolute SPL
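The missing A-weighting can be approximated in software with the standard IEC 61672 curve. A sketch follows; note that actually applying it would require per-bin frequencies from the FFT, which this lab's pipeline does not compute.

```javascript
// A-weighting attenuation in dB for a frequency f in Hz (IEC 61672 formula).
function aWeightingDb(f) {
    const f2 = f * f;
    const ra = (12194 ** 2 * f2 * f2) /
        ((f2 + 20.6 ** 2) *
         Math.sqrt((f2 + 107.7 ** 2) * (f2 + 737.9 ** 2)) *
         (f2 + 12194 ** 2));
    return 20 * Math.log10(ra) + 2.0; // normalized so A(1 kHz) = 0 dB
}
```

The curve attenuates low frequencies heavily (about -19 dB at 100 Hz), mimicking the human ear's reduced sensitivity to bass.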

Real-world calibration:

To get absolute dB SPL values, calibrate against a known sound source:

// Step 1: Measure 94 dB calibrator tone
const calibratorRMS = measureRMS(); // Get RMS of 94 dB tone
const REFERENCE_DB = 94;

// Step 2: Calculate calibration offset
const measuredDB = rmsToDecibels(calibratorRMS);
const offset = REFERENCE_DB - measuredDB;

// Step 3: Apply offset to all measurements
function calibratedDB(rms) {
    return rmsToDecibels(rms) + offset;
}

Example measurements (relative values, not calibrated):

| Environment      | RMS (Web Audio) | Calculated dB | Actual dB SPL |
|------------------|-----------------|---------------|---------------|
| Library (silent) | 5               | 16 dB         | ~30 dB        |
| Office (quiet)   | 15              | 24 dB         | ~50 dB        |
| Street traffic   | 60              | 36 dB         | ~75 dB        |
| Concert          | 180             | 45 dB         | ~105 dB       |

Notice: Web Audio dB values come out roughly 15-60 dB lower than the actual SPL because the microphone sensitivity and AGC compression are unknown.
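The "Calculated dB" column follows from applying the Step 3 conversion to the RMS column and rounding to the nearest integer:

```javascript
// Apply the Step 3 conversion to the table's RMS values.
function rmsToDecibels(rms) {
    return Math.round(20 * Math.log10(rms + 1));
}

const tableRms = [5, 15, 60, 180]; // library, office, street, concert
const calculated = tableRms.map(rmsToDecibels);
```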

Practical approach for participatory sensing:

Instead of absolute dB values, use relative noise categories:

function categorizeNoise(rms) {
    const db = rmsToDecibels(rms);

    if (db < 25) return { level: 'Quiet', color: 'green' };
    else if (db < 35) return { level: 'Moderate', color: 'yellow' };
    else if (db < 45) return { level: 'Loud', color: 'orange' };
    else return { level: 'Very Loud', color: 'red' };
}

This works because relative comparisons are valid even without calibration. A reading of 40 dB on one phone and 42 dB on another phone at the same location still indicates “both phones measured loud noise.”

For research/regulatory use: Use professional calibrated sound level meters (Class 1: ±1 dB, Class 2: ±2 dB) that cost $200-2000. Smartphone apps are useful for crowdsourcing relative noise patterns but not for legal compliance measurements.

Key insight: Web Audio API provides uncalibrated relative measurements. Useful for identifying noisy vs. quiet areas (heatmaps), detecting anomalies (sudden loud event), and comparing trends over time - but not for regulatory compliance or medical diagnosis without proper calibration.

18.5 Knowledge Checks

Question 1: PWA Architecture

What is the primary role of a Service Worker in a Progressive Web App?

  A) To increase the phone’s sensor sampling rate
  B) To intercept network requests and serve cached resources when offline
  C) To connect directly to Bluetooth sensors
  D) To replace the phone’s operating system

Answer: B) To intercept network requests and serve cached resources when offline. Service Workers act as a programmable proxy between the web app and the network. During installation, they cache essential resources (HTML, CSS, JavaScript). When the device goes offline, the Service Worker intercepts fetch requests and serves the cached versions, allowing the app to function without connectivity – critical for field data collection in areas with poor coverage.
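The intercept-and-serve behavior can be factored into a standalone function. This is a sketch, not the lab's actual sw.js: `cacheStore` stands in for the Cache object returned by `caches.open()` and `networkFetch` for `fetch()`, injected here so the decision logic is visible (and testable) on its own:

```javascript
// Cache-first strategy: serve the cached response if present,
// otherwise fall through to the network.
async function cacheFirst(request, cacheStore, networkFetch) {
    const cached = await cacheStore.match(request);
    if (cached !== undefined) return cached; // offline-safe path
    return networkFetch(request);
}

// In a real service worker the wiring would look like (browser-only):
// self.addEventListener('fetch', (event) => {
//   event.respondWith(
//     caches.open('sensor-app-v1').then((c) => cacheFirst(event.request, c, fetch))
//   );
// });
```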

Question 2: Audio Sensing

In the noise monitoring lab, why is the dB calculation described as a “rough approximation” rather than a precise measurement?

  A) The Web Audio API does not support microphones
  B) Smartphone microphones are not calibrated against a reference sound pressure level
  C) JavaScript cannot perform mathematical calculations accurately
  D) The FFT algorithm is too slow for real-time processing

Answer: B) Smartphone microphones are not calibrated against a reference sound pressure level. Professional sound level meters are calibrated against a known 94 dB or 114 dB reference tone, with frequency weighting (A-weighting) applied. Smartphone microphones vary in sensitivity between models and have automatic gain control that adjusts input levels. The dB values from the Web Audio API are relative, not absolute, so they are useful for comparing levels but not for regulatory compliance measurements.

Key Takeaway

PWAs with Service Workers solve the fundamental offline challenge of mobile IoT sensing – data collection must not stop when connectivity does. Combined with the Web Audio API for sound sensing and Geolocation for spatial tagging, you can build a complete participatory sensing platform using only web technologies, deployable to any device with a modern browser.

Sammy the Sensor was proud of what the team built today. “We made a magic website that works even without internet!”

“How is that possible?” Lila the LED asked, puzzled. “Websites need the internet!”

Max the Microcontroller explained: “It is called a Progressive Web App. When you first visit the website, a helper called a Service Worker saves a copy of everything onto your phone. So the next time you open it – even in an airplane or a cave – it works perfectly!”

“That is amazing!” Lila blinked. “What about the noise measuring app?”

Bella the Battery jumped in. “That one uses the phone’s microphone to listen to how loud things are. It measures sound in decibels – that is like a ruler for noise. A library is about 30 decibels. A rock concert is over 100 decibels – ouch!”

Sammy added, “And the coolest part is that the app also records WHERE you measured the noise using GPS. So if thousands of kids all around a city use the app, we can make a giant map showing quiet neighborhoods and noisy ones!”

“Like a treasure map, but for sound!” Lila exclaimed.

“Exactly! And since the data saves on your phone when there is no internet, you never lose a single measurement,” Max added. “The Service Worker stores it safely until you are back online.”

The Sensor Squad Lesson: PWAs are websites with superpowers – they work offline and can be installed like real apps. Combined with the microphone and GPS, you can build apps that measure sound levels anywhere and help create noise maps of entire cities!

18.6 Common Pitfalls

Once a service worker is installed, it serves cached assets even when you update the files. During development, hard-refresh (Ctrl+Shift+R) or unregister the service worker in DevTools to force a fresh load. In production, version your cache keys so updated deployments invalidate old cached files.
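One way to implement versioned cache keys is to delete every cache whose name differs from the current one during the service worker's activate event. A sketch, with the pure helper separated out (cache names are assumptions):

```javascript
// Given all existing cache names, return the stale ones to delete.
function staleCacheKeys(allKeys, currentKey) {
    return allKeys.filter((key) => key !== currentKey);
}

// Browser-only wiring inside sw.js:
// const CACHE_NAME = 'sensor-app-v2'; // bump on every deployment
// self.addEventListener('activate', (event) => {
//   event.waitUntil(
//     caches.keys().then((keys) =>
//       Promise.all(staleCacheKeys(keys, CACHE_NAME).map((k) => caches.delete(k)))
//     )
//   );
// });
```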

Modern browsers prevent AudioContext from starting until a user interaction (click, tap). If you create AudioContext on page load, it starts in ‘suspended’ state and microphone capture fails silently. Always call audioContext.resume() inside a user gesture event handler before starting audio capture.

Web Audio API provides relative amplitude values, not calibrated sound pressure level. The dB values computed from AnalyserNode are device and microphone dependent — a 70 dB reading on one smartphone may correspond to 65 dB on another. To get calibrated SPL, measure against a reference sound level meter and apply a device-specific offset correction.

The browser will only display the ‘Add to Home Screen’ prompt if specific criteria are met: served over HTTPS, has a valid manifest with icons, and a registered service worker. Missing any one of these silently prevents the install prompt. Verify all three in Chrome DevTools > Application > Manifest before expecting the install prompt to appear.
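A minimal manifest satisfying the icon requirement might look like the following. The name, colors, and icon paths are illustrative; Chrome's install criteria are commonly stated as requiring at least a 192x192 and a 512x512 PNG icon:

```json
{
  "name": "City Noise Map",
  "short_name": "NoiseMap",
  "start_url": "/",
  "display": "standalone",
  "theme_color": "#2e7d32",
  "background_color": "#ffffff",
  "icons": [
    { "src": "icons/icon-192.png", "sizes": "192x192", "type": "image/png" },
    { "src": "icons/icon-512.png", "sizes": "512x512", "type": "image/png" }
  ]
}
```

Link it from the page head with `<link rel="manifest" href="/manifest.json">`.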

18.7 What’s Next

If you want to…                                                Read this
Explore mobile sensor Web APIs for motion and location         Mobile Phone Labs: Web APIs
Understand the full scope of smartphone sensing capabilities   Mobile Phone as a Sensor
Learn about participatory sensing with mobile devices          Mobile Phone: Participatory Sensing
Assess your mobile sensing PWA and audio API knowledge         Mobile Phone Labs Assessment

18.8 Resources

18.8.1 Web APIs Used

18.8.2 PWA Resources

18.8.3 Participatory Sensing