14  Mobile Data & Permissions

14.1 Learning Objectives

By the end of this chapter, you will be able to:

  • Identify Mobile Data Types: Classify data that mobile devices collect through sensors, location services, and network connections
  • Explain Android Permission Model: Distinguish the three permission tiers and their privacy implications
  • Assess Permission Risks: Evaluate which permission combinations pose the highest privacy risks
  • Recognize Data Flow Risks: Map how mobile data flows through IoT ecosystems and where privacy risks emerge

14.2 Prerequisites

Before diving into this chapter, you should be familiar with:

  • Introduction to Privacy: Establishes fundamental privacy concepts, regulations (GDPR, CCPA), and privacy principles that apply specifically to mobile devices and their role in IoT ecosystems
  • Security and Privacy Overview: Provides understanding of security threats and privacy risks that contextualizes mobile-specific vulnerabilities and attack vectors
  • Networking Basics: Understanding network protocols (Wi-Fi, cellular, Bluetooth) helps comprehend how mobile devices communicate with IoT devices and what data is exposed during transmission

In 60 Seconds

Mobile IoT devices collect location, accelerometer, Wi-Fi scan, and biometric data continuously. Each data type creates specific privacy risks — location reveals home and work addresses, accelerometer data enables activity recognition, and Wi-Fi probe history enables passive tracking. Privacy engineering for mobile IoT requires explicit privacy threat modeling for each collected data stream.

Key Concepts

  • Mobile Privacy: Privacy considerations specific to smartphones, wearables, and mobile IoT devices that move with users and collect context-rich personal data.
  • Location Data: GPS coordinates, cell tower positioning, and Wi-Fi-based location data enabling precise tracking of individual movements; high privacy sensitivity.
  • Accelerometer Privacy: Motion sensor data revealing gait patterns, activities, keystrokes, and behavioral biometrics; often underappreciated privacy risk.
  • Sensor Fusion: Combining multiple mobile sensor streams (GPS, accelerometer, gyroscope, barometer) to create richer context at the cost of increased re-identification risk.
  • Background Data Collection: Mobile apps and devices collecting data when not in active use; users often unaware of passive background sensing.
  • Granularity Reduction: Privacy technique reducing location precision (neighborhood level vs. exact GPS), time resolution (hourly vs. second-level), or sensor sampling rate to minimize privacy risk while retaining utility.
  • Mobile Permissions Model: OS-level access control for sensors on Android and iOS requiring explicit user permission; inadequate if permission requests are too broad or infrequent.
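Granularity reduction is easy to see in code. The sketch below coarsens a single location sample to neighborhood-level precision and hour-level time resolution; the function name and bucket sizes are illustrative, not a standard API:

```python
def reduce_granularity(lat, lon, timestamp_s, decimals=2, time_bucket_s=3600):
    """Coarsen one location sample: round coordinates, bucket the timestamp.

    Two decimal places keep roughly 1.1 km precision at the equator
    (neighborhood level); an hour-sized bucket removes second-level timing.
    """
    return (round(lat, decimals), round(lon, decimals),
            (timestamp_s // time_bucket_s) * time_bucket_s)

# Exact fix near Madison Square Park, with a second-precision Unix timestamp
print(reduce_granularity(40.741895, -73.989308, 1700000842))
# (40.74, -73.99, 1699999200)
```

The coarsened sample still supports utility such as neighborhood-level analytics, but no longer pinpoints a building or an exact minute.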

Your phone knows more about you than your best friend.

It knows where you sleep, where you work, who you call, what apps you use, and even how you walk (from accelerometer data). When your phone connects to IoT devices, all this information can flow to third parties.

What data does your phone collect?

| Data Type | What It Reveals | IoT Connection |
|---|---|---|
| Location | Where you live, work, travel | Smart home knows when you’re away |
| Wi-Fi scans | Places you’ve visited | Devices you’ve connected to |
| Bluetooth | Nearby devices, headphones | All your IoT devices |
| Contacts | Your social network | Shared device access |
| App usage | Your habits and interests | What automations you use |

Why mobile + IoT = bigger privacy risks:

Layered diagram showing how a phone acts as a gateway hub for IoT devices, aggregating data from multiple sources including health, location, and activity to create comprehensive user profiles
Figure 14.1: Mobile-IoT Privacy Escalation: Phone as Gateway Hub for Multi-Source Data Aggregation

Key insight: Your phone is the “hub” connecting all your IoT devices. If an app on your phone is leaky, it can expose data from ALL your connected devices—not just the phone itself.

Quick protection tips:

  1. Review app permissions regularly
  2. Use “Only while using” for location when possible
  3. Disable Bluetooth/Wi-Fi when not needed
  4. Check what data IoT apps upload to the cloud

“Did you know that many apps ask for WAY more permissions than they need?” Sammy the Sensor said. “A simple calculator app might request access to your camera, contacts, and location. Why? Often to sell your data to advertisers!”

Max the Microcontroller warned, “When you grant an IoT app permissions on your phone, you are not just giving it access to your phone – you are giving it access to all the IoT data that flows through your phone. A smart home app with location permission knows exactly when you leave and arrive home.”

“Third-party data sharing is the hidden danger,” Lila the LED explained. “Your smart thermostat app might share data with analytics companies, advertisers, and even data brokers. A study found that the average IoT app shares data with over five third-party services! Each one is another company that has information about your daily life.”

“Protect yourself by reviewing permissions regularly,” Bella the Battery advised. “Use ‘only while using’ for location. Deny unnecessary permissions. And read privacy policies – yes, they are long and boring, but they tell you exactly what happens with your data. An informed user is a protected user!”

14.3 Introduction

Mobile devices generate vast amounts of sensitive user data through sensors, location services, Wi-Fi connections, and cellular networks. Understanding how this data is collected, shared, and potentially leaked is crucial for protecting user privacy in IoT ecosystems where mobile phones often serve as gateways.

Myth #1: “If I’m not doing anything wrong, privacy doesn’t matter”

This misconception ignores that privacy protects far more than just illegal activity. Your mobile phone’s data reveals:

  • Health conditions: 85% accuracy detecting diabetes from search patterns, hospital visit patterns reveal diagnoses
  • Financial status: Shopping locations and app usage reveal income level (to within roughly ±$10K)
  • Social relationships: Wi-Fi probe requests expose who you meet, where, and when
  • Political views: Location traces to rallies, campaign offices, places of worship
  • Personal vulnerabilities: Mental health tracking (gym cancellations + mood app data), relationship problems (dating app usage patterns + location data)

Real-world harm examples:

  1. Insurance discrimination: Health insurance companies purchase location data showing gym visits, fast-food frequency. Premiums adjusted without consumer knowledge.
  2. Employment screening: Prospective employers purchase “anonymized” location datasets, correlate with home addresses from applications, filter candidates visiting addiction centers or union halls.
  3. Stalking enablement: Domestic abusers purchase phone location data from data brokers for $500, tracking victims despite restraining orders.

Myth #2: “Permission systems protect me”

Reality: Android/iOS permissions have critical gaps:

  • Coarse granularity: “Location permission” allows app to use GPS for any purpose—weather functionality + ad network tracking + analytics profiling
  • No destination control: Permission grants access to sensor, not control over data recipients. App can send to unlimited third parties.
  • Third-party libraries: Apps contain 15-30 SDKs on average, and each SDK inherits the app’s permissions. You consent to the app, but unknowingly grant data access to Facebook, Google, and Chinese ad networks embedded inside it.
  • Background collection: 70% of apps continue data collection after closure. Permissions remain active until explicitly revoked.

Quantified leak statistics:

  • 73% of apps send data to third-party tracking companies (Exodus Privacy audit, 100K apps)
  • Average app shares data with 10 third parties (data brokerage study)
  • Sensor data access: Motion sensors (accelerometer/gyroscope) require zero permissions, enabling keystroke inference (70-80% accuracy) and activity tracking
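To see why zero-permission motion sensors matter, consider a toy sketch (synthetic data, not a real inference attack): even the variance of the accelerometer magnitude separates a stationary phone from a walking user, and published gait- and keystroke-inference attacks build on exactly this kind of signal:

```python
def magnitude_variance(samples):
    """Variance of accelerometer magnitude over a window of (x, y, z) tuples."""
    mags = [(x * x + y * y + z * z) ** 0.5 for x, y, z in samples]
    mean = sum(mags) / len(mags)
    return sum((m - mean) ** 2 for m in mags) / len(mags)

def classify_activity(samples, threshold=0.5):
    """Crude still/walking classifier; the threshold is illustrative."""
    return "walking" if magnitude_variance(samples) > threshold else "still"

still = [(0.0, 0.0, 9.81)] * 50                                   # phone at rest
walking = [(0.0, 0.0, 9.81 + (i % 4) - 1.5) for i in range(50)]   # periodic bounce
print(classify_activity(still), classify_activity(walking))       # still walking
```

No permission dialog ever appears before an app reads these samples, which is why motion sensors are an underappreciated privacy channel.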

14.4 Mobile Phone Data Collection

Data flow analysis diagram showing privacy-sensitive sources (GPS location, contacts, device ID, camera, microphone) flowing through application processing to data sinks (network transmission, SMS, file storage, broadcast intents)
Figure 14.2: Information flow sources and sinks

Mobile phones are sophisticated sensing platforms that continuously collect:

  • Location data: GPS, cell tower triangulation, Wi-Fi positioning
  • Sensor data: Accelerometer, gyroscope, magnetometer, proximity
  • Network data: Wi-Fi SSIDs, Bluetooth devices, cellular towers
  • Usage data: App activity, screen time, touch patterns
  • Communication data: Calls, messages, contacts
Layered flow diagram showing how a phone collects sensor and app data, shares it with analytics providers and third-party SDKs, and distributes it to advertising networks
Figure 14.3: Mobile Phone Data Collection and Third-Party Distribution Flow

Key Privacy Concern: Apps often collect far more data than needed for their functionality, primarily for advertising and analytics purposes.

Transparency Problem: Operating systems don’t clearly indicate where collected data ultimately goes—users grant permissions to the app, but don’t know all the third parties receiving the data.

14.5 Android Permission Model

Android permission model diagram showing three tiers: Normal permissions auto-granted for basic features, Dangerous permissions requiring user consent for sensitive data access, and Special permissions requiring Settings configuration for system-level access
Figure 14.4: Android permissions system
Hierarchical diagram showing three Android permission tiers: Normal permissions granted automatically, Dangerous permissions requiring user approval, and Special permissions needing system settings configuration, with runtime permissions since Android 6.0
Figure 14.5: Android Permission Model: Normal, Dangerous, and Special Permissions Hierarchy

This diagram shows how mobile data flows through the IoT ecosystem and where privacy risks emerge at each stage:

Scenario analysis flow showing how an app requests location permission, user grants while-using access, an embedded SDK collects location in the background, and the data is sold to a data broker

Privacy risks compound at each stage. Even “anonymized” data can be re-identified when combined across multiple sources.

Use this matrix to evaluate which app permissions pose the highest privacy risks:

Permission risk assessment matrix showing data flow from user interaction through app data collection and third-party SDK transmission to analytics processing and profiling

Decision Guide:

  • Red Zone (High Risk): Deny unless absolutely essential for core functionality
  • Orange Zone (Medium Risk): Grant only to trusted apps, review periodically
  • Yellow Zone (Moderate): Evaluate based on app purpose and developer reputation
  • Green Zone (Low Risk): Generally safe to allow

Problem: Even with the permission model, users don’t know:

  • How often a permission is used
  • What specific data is collected
  • Where data is sent
  • Who has access to the data

| Permission | Android | iOS | Why It Matters |
|---|---|---|---|
| Location | “Allow always” or “Only while using” | Similar + “Precise” toggle | Can track your daily routine |
| Bluetooth | Required for IoT | Near-field permission | Reveals nearby devices |
| Camera/Mic | Per-app consent | Indicator lights | Can spy on you |

14.6 Permission Combination Risks

Individual permissions pose privacy risks, but certain combinations of permissions are far more dangerous than the sum of their parts. When an app holds location, contacts, and camera permissions simultaneously, it can build a comprehensive profile that links where you go, who you know, and what you see – enabling surveillance capabilities that no single permission would allow.
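A minimal sketch of this idea checks whether a granted permission set contains any dangerous subset; the combinations and labels here are illustrative, not an authoritative list:

```python
# Illustrative dangerous combinations: joint risk exceeds the sum of the parts
DANGEROUS_COMBOS = {
    frozenset({"ACCESS_FINE_LOCATION", "READ_CONTACTS", "CAMERA"}):
        "where you go + who you know + what you see",
    frozenset({"CAMERA", "RECORD_AUDIO", "INTERNET"}):
        "remote audio/video surveillance capability",
}

def flag_combos(granted):
    """Return labels of every dangerous combination present in `granted`."""
    granted = set(granted)
    return [label for combo, label in DANGEROUS_COMBOS.items()
            if combo <= granted]

print(flag_combos({"ACCESS_FINE_LOCATION", "READ_CONTACTS", "CAMERA", "INTERNET"}))
```

The subset test (`combo <= granted`) is the key operation: an auditor only needs the granted set, not runtime behavior, to flag these profiles.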

14.7 Worked Example: Privacy Audit of a Smart Home Companion App

Scenario: A consumer organization audits a popular smart home app (5 million downloads) that controls smart lights, thermostat, and door locks. The audit examines what permissions the app requests, what data it collects, and where that data flows. Calculate the privacy risk score and recommend improvements.

Step 1: Permission Analysis

The app requests 14 Android permissions at install:

| Permission | Category | Justifiable? | Risk Level |
|---|---|---|---|
| INTERNET | Normal | Yes – cloud connectivity | Low |
| BLUETOOTH | Normal | Yes – device pairing | Low |
| BLUETOOTH_ADMIN | Normal | Yes – device discovery | Low |
| ACCESS_FINE_LOCATION | Dangerous | Partial – needed for BLE scanning, not GPS tracking | High |
| ACCESS_BACKGROUND_LOCATION | Special | No – lights/locks don’t need location when app is closed | Critical |
| CAMERA | Dangerous | No – no camera features in app | Critical |
| RECORD_AUDIO | Dangerous | Partial – voice commands, but could listen continuously | High |
| READ_CONTACTS | Dangerous | Marginal – “share access” feature | High |
| READ_PHONE_STATE | Dangerous | No – device ID harvesting | Critical |
| WRITE_EXTERNAL_STORAGE | Dangerous | Marginal – log exports | Medium |
| RECEIVE_BOOT_COMPLETED | Normal | Yes – background service | Low |
| FOREGROUND_SERVICE | Normal | Yes – BLE connection maintenance | Low |
| ACCESS_WIFI_STATE | Normal | Yes – local device discovery | Low |
| ACCESS_NETWORK_STATE | Normal | Yes – connectivity check | Low |

Unnecessary permissions: 3 of 14 (CAMERA, READ_PHONE_STATE, ACCESS_BACKGROUND_LOCATION) have no legitimate justification. 2 more (READ_CONTACTS, RECORD_AUDIO) are partially justified but over-broad.

Step 2: Third-Party SDK Analysis

Static analysis of the app’s APK reveals 18 embedded third-party SDKs:

| SDK | Purpose | Data Access | Privacy Risk |
|---|---|---|---|
| Google Firebase | Analytics, crash reporting | Device ID, usage events | Medium |
| Facebook SDK | Attribution, social login | Device ID, contacts, usage | High |
| Adjust | Install attribution | Device ID, IP, location | High |
| Braze | Push notifications, CRM | User profile, device ID | Medium |
| Sentry | Error tracking | Device state, stack traces | Low |
| OneSignal | Push notifications | Device ID, location | Medium |
| AppsFlyer | Marketing attribution | Device ID, IP, install source | High |
| Mixpanel | Product analytics | All app events, location | High |
| Branch | Deep linking | Device ID, IP, referrer | Medium |
| Amazon AWS SDK | Backend services | None (transport only) | Low |
| Google Maps SDK | Geofencing | Location | Medium |
| Stripe | In-app purchases | Payment data (tokenized) | Low |
| Intercom | Customer support | User profile, chat history | Medium |
| Amplitude | Behavioral analytics | All app events, device ID | High |
| Twitter MoPub | Advertising | Location, device ID, demographics | Critical |
| Google AdMob | Advertising | Location, device ID, demographics | Critical |
| Chartboost | Advertising | Location, device ID, interests | Critical |
| Unity Ads | Advertising | Location, device ID | Critical |

Key finding: 18 SDKs – 4 ad networks, 4 marketing attribution services, 3 analytics platforms. All 18 inherit the app’s dangerous permissions, including CAMERA and READ_CONTACTS.

Step 3: Data Flow Mapping

Network traffic analysis over 48 hours reveals:

Data destinations (unique domains contacted):
  Smart home cloud (manufacturer): 3 domains
  Analytics/attribution: 8 domains
  Advertising networks: 6 domains
  Social media (Facebook Graph API): 1 domain
  Total: 18 domains receiving user data

Data transmitted during 48-hour test:
  Location updates: 847 (every ~3.4 minutes, even in background)
  Device scans (BLE + Wi-Fi): 2,340 scan results
  Contact list: uploaded once (312 contacts)
  Usage events: 1,456 app interaction events
  Ad requests with location: 234

Step 4: Calculate Privacy Risk Score

Use a weighted scoring framework:

| Factor | Weight | Score (0-10) | Weighted |
|---|---|---|---|
| Unnecessary permissions | 20% | 8 (3 unjustified) | 1.6 |
| Third-party SDK count | 15% | 9 (18 SDKs, 4 ad networks) | 1.35 |
| Background data collection | 20% | 9 (847 location updates in 48h) | 1.8 |
| Data destinations | 15% | 8 (18 domains) | 1.2 |
| Permission-to-function ratio | 15% | 7 (14 perms for 4 core features) | 1.05 |
| Transparency (privacy policy) | 15% | 4 (generic, no SDK-specific disclosure) | 0.6 |
| Total Risk Score | 100% | | 7.6 / 10 |

A score of 7.6 classifies this app as high privacy risk.
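The weighted score can be reproduced in a few lines; the weights and factor scores below are taken directly from the Step 4 table:

```python
# (weight, factor score 0-10) pairs from the audit's scoring table
FACTORS = {
    "unnecessary_permissions":   (0.20, 8),
    "third_party_sdk_count":     (0.15, 9),
    "background_collection":     (0.20, 9),
    "data_destinations":         (0.15, 8),
    "permission_function_ratio": (0.15, 7),
    "transparency":              (0.15, 4),
}

def weighted_risk(factors):
    """Weighted average of factor scores; weights must sum to 1.0."""
    assert abs(sum(w for w, _ in factors.values()) - 1.0) < 1e-9
    return sum(w * s for w, s in factors.values())

print(f"Risk score: {weighted_risk(FACTORS):.1f}/10")  # Risk score: 7.6/10
```

Because the weights sum to 1.0, the result stays on the same 0-10 scale as the individual factor scores.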

Step 5: Recommendations

For the app developer:

  1. Remove unnecessary permissions: Drop CAMERA, READ_PHONE_STATE, ACCESS_BACKGROUND_LOCATION (saves 3 critical risk factors)
  2. Replace fine location with BLE-only scanning: Android 12+ allows BLUETOOTH_SCAN without ACCESS_FINE_LOCATION
  3. Reduce SDK count from 18 to 8: Remove 3 of 4 ad networks, consolidate to 1 analytics platform, remove unused attribution SDKs
  4. Implement location sampling limits: Replace continuous 3.4-minute polling with event-driven location (only when user opens app or triggers geofence)

Projected improvement: These changes would reduce the risk score from 7.6 to approximately 3.8 (medium-low risk), requiring 2-3 weeks of engineering effort and reducing monetization by approximately 15% (fewer ad impressions).

For users: Deny CAMERA, READ_CONTACTS, and background location. Use “Only while using” for location. Check Settings > Privacy > Permission Manager quarterly.

14.8 Worked Example: Automated Permission Audit Script

Scenario: A security researcher needs to audit 20 IoT companion apps to identify which request excessive permissions. Use Android’s aapt tool output to score apps programmatically.

# Permission risk classifications (excerpt -- full list includes 20+ permissions)
PERMISSION_RISK = {
    "ACCESS_FINE_LOCATION": 10, "ACCESS_BACKGROUND_LOCATION": 10,
    "CAMERA": 9, "RECORD_AUDIO": 9, "READ_CONTACTS": 8,
    "BLUETOOTH": 3, "ACCESS_WIFI_STATE": 2,
    "FOREGROUND_SERVICE": 1, "INTERNET": 1,
}

# Dangerous permission combinations (multiplicative risk)
DANGEROUS_COMBOS = [
    ({"ACCESS_FINE_LOCATION", "READ_CONTACTS", "CAMERA"},
     "Location + Contacts + Camera = comprehensive profiling"),
    ({"CAMERA", "RECORD_AUDIO", "INTERNET"},
     "Audio/video surveillance capability"),
]

def audit_app(app_name, permissions, core_features):
    """Score an IoT app's permission requests (unknown permissions default to 5)."""
    total_score = sum(PERMISSION_RISK.get(p, 5) for p in permissions)
    unnecessary = [p for p in permissions
                   if PERMISSION_RISK.get(p, 0) >= 8 and p not in core_features]

    # Check dangerous combinations
    combo_penalties = sum(15 for combo, _ in DANGEROUS_COMBOS
                          if combo.issubset(set(permissions)))

    normalized = min((total_score + combo_penalties) / (len(permissions) * 10) * 10, 10)
    verdict = ("HIGH RISK" if normalized >= 7 else
               "MODERATE RISK" if normalized >= 4 else "LOW RISK")
    print(f"{app_name}: {normalized:.1f}/10 - {verdict}")
    if unnecessary:
        print(f"  Unjustified high-risk permissions: {', '.join(unnecessary)}")
    return normalized

# Audit IoT apps with different risk profiles
audit_app("SmartLight Pro",
    ["INTERNET", "BLUETOOTH", "ACCESS_FINE_LOCATION", "FOREGROUND_SERVICE"],
    core_features={"INTERNET", "BLUETOOTH", "ACCESS_FINE_LOCATION"})

audit_app("HomeGuard Camera",
    ["INTERNET", "CAMERA", "RECORD_AUDIO", "ACCESS_FINE_LOCATION",
     "ACCESS_BACKGROUND_LOCATION", "READ_CONTACTS", "WRITE_EXTERNAL_STORAGE"],
    core_features={"INTERNET", "CAMERA", "FOREGROUND_SERVICE"})

audit_app("ThermoSense",
    ["INTERNET", "BLUETOOTH"],
    core_features={"INTERNET", "BLUETOOTH"})

Expected output: SmartLight Pro scores 3.8 (low risk – its location access is justified by BLE scanning, so no unnecessary-permission warning). HomeGuard Camera hits the 10.0 cap (high risk – background location, contacts, and audio have no camera-app justification, and both dangerous combinations add penalties). ThermoSense scores 2.0 (low risk – minimal permissions matching functionality).

Case study: the “anonymized” location dataset

What happened: The New York Times obtained a dataset of 50 billion location pings from 12 million American phones over several months, sold by a data broker. The data was “anonymized” – no names attached.

Re-identification in practice:

| Step | Technique | Success Rate |
|---|---|---|
| 1. Find a phone that “sleeps” at a specific address | Filter pings between 11 PM - 6 AM | 100% of phones have a “home” |
| 2. Cross-reference address with property records | Public property tax databases | ~85% match |
| 3. Confirm identity with work location | Filter 9 AM - 5 PM pings | ~95% confirmation |
| 4. Map daily routine | Aggregate all pings | Full behavioral profile |

What reporters found: They tracked a Microsoft employee to a job interview at Amazon, followed a Department of Defense official through the Pentagon, and traced a Secret Service agent’s home address – all from “anonymous” location data.
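Step 1 of the table can be sketched as follows, assuming a simple list of (timestamp, latitude, longitude) pings; the data and rounding precision are illustrative:

```python
from collections import Counter
from datetime import datetime, timezone

def infer_home(pings):
    """Most frequent coarsened location among pings between 11 PM and 6 AM UTC."""
    night = Counter()
    for ts, lat, lon in pings:
        hour = datetime.fromtimestamp(ts, tz=timezone.utc).hour
        if hour >= 23 or hour < 6:
            night[(round(lat, 3), round(lon, 3))] += 1
    return night.most_common(1)[0][0] if night else None

# Synthetic trace: 10 night-time pings near one address, 20 daytime pings
# elsewhere -- the daytime majority does not matter, only the night-time mode.
night_ts = int(datetime(2024, 1, 1, 2, 0, tzinfo=timezone.utc).timestamp())
day_ts = int(datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc).timestamp())
pings = ([(night_ts + i * 60, 40.7419, -73.9893) for i in range(10)] +
         [(day_ts + i * 60, 40.7580, -73.9855) for i in range(20)])
print(infer_home(pings))  # (40.742, -73.989)
```

A real pipeline would use local time zones and multiple nights, but the core operation is this trivial: a time filter plus a mode.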

Quantified re-identification rates:

  • 4 spatio-temporal points are enough to uniquely identify 95% of people in a dataset of 1.5 million (de Montjoye et al., Nature 2013)
  • IoT amplification: A smart home app sending location every 3.4 minutes generates 423 points per day – far exceeding the 4-point threshold
  • Cost: Location datasets sell for $0.50-2.00 per device per month on data broker markets

Lesson for IoT: Every app with “Allow always” location permission generates enough data to uniquely identify and track users, regardless of anonymization claims. The permission model does not protect against re-identification because it cannot limit data granularity or retention.

14.9 The Economics of Mobile Data: Why Over-Collection Persists

Understanding why apps collect excessive data requires examining the economic incentives. Privacy violations are not bugs – they are business models.

14.9.1 Revenue Per User from Data Categories

Data brokers and ad networks assign different values to different data types. The following table reflects approximate market rates as of 2023-2024:

| Data Type | Revenue Per User Per Year | Collection Method | User Awareness |
|---|---|---|---|
| Basic demographics (age, gender) | $0.50 - $2.00 | Registration form | High |
| Location history (continuous) | $12.00 - $24.00 | Background GPS | Low |
| Purchase intent signals | $5.00 - $15.00 | App usage + browsing | Very low |
| Health-related behavior | $15.00 - $75.00 | Fitness app + location (gym, pharmacy) | Very low |
| Contact graph (social network) | $3.00 - $8.00 | READ_CONTACTS permission | Low |
| Cross-device identity | $8.00 - $20.00 | Wi-Fi + Bluetooth scanning | Very low |

Example calculation: A smart home app with 5 million users collecting location ($12/user), contact graph ($3/user), and cross-device identity ($8/user) at the low end generates approximately $115 million per year in data revenue ($23/user x 5M) – often exceeding the revenue from the hardware or subscription itself. This explains why “free” IoT apps aggressively request permissions: the data IS the product.

14.9.1.1 Data Revenue Estimator
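A back-of-envelope estimator for the calculation above; the per-user rates are the low-end figures from the table (approximate market figures, not authoritative prices), and the function name is illustrative:

```python
# Low-end annual rates (USD per user) from the data-category table above
LOW_END_RATES = {
    "location_history": 12.00,
    "contact_graph": 3.00,
    "cross_device_identity": 8.00,
}

def estimate_data_revenue(users, collected):
    """Return (per-user, total) annual data revenue for the collected categories."""
    per_user = sum(LOW_END_RATES[c] for c in collected)
    return per_user, per_user * users

per_user, total = estimate_data_revenue(
    5_000_000, ["location_history", "contact_graph", "cross_device_identity"])
print(f"${per_user:.0f}/user/year -> ${total / 1e6:.0f}M/year")  # $23/user -> $115M/year
```

Swapping in the table's high-end rates shows why aggressive collectors chase the more sensitive categories: the same user base can be worth several times more.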

14.9.2 The Regulatory Tipping Point

GDPR fines have begun to shift this calculus. In 2023, Meta was fined 1.2 billion euros for transferring EU user data to the US without adequate protection. For smaller IoT companies, the math is simpler:

  • Data revenue: $12/user/year from location tracking for a 100,000-user app = $1.2 million/year
  • GDPR fine risk: Up to 4% of global revenue or 20 million euros (whichever is higher)
  • Expected fine: 20 million euros is more than 16x the example’s annual data revenue, so even a modest annual enforcement probability (above about 6%) makes the expected fine exceed the data revenue for a company of this size

Companies above $100 million in revenue still profit from aggressive data collection despite regulatory risk, which is why large platforms remain the primary privacy offenders. Smaller IoT companies increasingly find that privacy-respecting designs are not just ethical – they are financially rational.
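One way to sanity-check this trade-off is the break-even enforcement probability: the probability at which the expected fine equals the data revenue. This simplified sketch assumes the statutory-maximum fine applies and treats euros and dollars as comparable:

```python
def break_even_enforcement_probability(annual_data_revenue, max_fine):
    """Enforcement probability at which expected fine equals data revenue."""
    return annual_data_revenue / max_fine

# Figures from the example above: $1.2M/year in location-data revenue vs.
# the 20-million-euro GDPR statutory maximum (currencies treated as comparable)
p_star = break_even_enforcement_probability(1_200_000, 20_000_000)
print(f"Break-even enforcement probability: {p_star:.0%}")  # 6%
```

If a company believes regulators will act more often than this break-even rate, data collection is a losing bet; below it, the over-collection incentive persists.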

Permission risk quantifies the privacy exposure from granting mobile app permissions.

Risk Score Formula: \[R = \frac{S \times P_{\text{re-id}} \times M}{C}\]

where:

  • \(S\) = Sensitivity (1-10 scale)
  • \(P_{\text{re-id}}\) = Re-identification probability (0-1)
  • \(M\) = Regulatory multiplier (1.0-3.0)
  • \(C\) = Consent quality (0.2-1.0)

Working through an example:

Given: Smart home app requesting 5 dangerous permissions

Permission 1: ACCESS_FINE_LOCATION (always)

Step 1: Assign sensitivity score \[S = 10 \text{ (critical - reveals daily routine)}\]

Step 2: Re-identification probability - Location traces with 4+ points → 95% unique \[P_{\text{re-id}} = 0.95\]

Step 3: Regulatory multiplier - GDPR Article 5(1)(c) data minimization; location can reveal Article 9 special categories (health, religion, politics) \[M = 3.0\]

Step 4: Consent quality - “Always” permission with no granular purpose disclosure \[C = 0.3 \text{ (poor consent)}\]

Final risk: \[R_{\text{location}} = \frac{10 \times 0.95 \times 3.0}{0.3} = 95.0 \text{ (CRITICAL)}\]

Permission 2: CAMERA

\[S = 9, \quad P_{\text{re-id}} = 0.1, \quad M = 1.5, \quad C = 0.8\] \[R_{\text{camera}} = \frac{9 \times 0.1 \times 1.5}{0.8} = 1.69 \text{ (LOW)}\]

Permission 3: READ_CONTACTS

\[S = 8, \quad P_{\text{re-id}} = 0.8, \quad M = 2.0, \quad C = 0.4\] \[R_{\text{contacts}} = \frac{8 \times 0.8 \times 2.0}{0.4} = 32.0 \text{ (HIGH)}\]

Aggregate App Risk: \[R_{\text{total}} = \sum_{i=1}^{n} R_i = 95.0 + 1.69 + 32.0 + ... = 142.5\]

Risk Classification:

  • \(R < 5\): Low risk (acceptable)
  • \(5 \leq R < 20\): Moderate risk (review periodically)
  • \(20 \leq R < 50\): High risk (deny unless essential)
  • \(R \geq 50\): Critical risk (strong evidence of privacy violation)

Result: Smart home app with always-on location (R=95) is critical risk and likely violates GDPR data minimization (Article 5(1)(c)). Changing to “only while using” reduces risk to R=15.8 (moderate).

In practice: Mathematical risk scoring provides objective criteria for permission audits. Apps with aggregate risk scores >50 should trigger privacy impact assessments. Users can prioritize which permissions to deny based on quantified risk, not just intuition.

14.9.3 Permission Risk Calculator

Use the risk formula above to explore how different permission parameters affect privacy risk scores: vary sensitivity, re-identification probability, regulatory multiplier, and consent quality, and observe how the four factors interact.
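The formula and thresholds are easy to run directly; the sketch below reproduces the section's three worked examples and the risk bands listed above:

```python
def permission_risk(S, P_reid, M, C):
    """R = (S * P_reid * M) / C, with parameter ranges as defined above."""
    assert 1 <= S <= 10 and 0.0 <= P_reid <= 1.0
    assert 1.0 <= M <= 3.0 and 0.2 <= C <= 1.0
    return (S * P_reid * M) / C

def classify(R):
    """Map a risk score to the chapter's four risk bands."""
    if R < 5:
        return "LOW"
    if R < 20:
        return "MODERATE"
    if R < 50:
        return "HIGH"
    return "CRITICAL"

# The three worked examples from this section
examples = [
    ("ACCESS_FINE_LOCATION (always)", (10, 0.95, 3.0, 0.3)),
    ("CAMERA", (9, 0.1, 1.5, 0.8)),
    ("READ_CONTACTS", (8, 0.8, 2.0, 0.4)),
]
for name, args in examples:
    R = permission_risk(*args)
    print(f"{name}: R = {R:.2f} ({classify(R)})")
```

Dividing by consent quality \(C\) is the notable design choice: weaker consent (lower \(C\)) inflates the score, so the same sensor access scores far worse when the purpose disclosure is vague.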

14.10 Summary

Mobile data collection presents significant privacy challenges:

Data Types Collected:

  • Continuous sensor data (location, accelerometer, microphone)
  • Network information (Wi-Fi, Bluetooth, cellular)
  • Usage patterns and app activity

Android Permission Tiers:

  • Normal permissions: Auto-granted, low privacy risk
  • Dangerous permissions: User consent required, high privacy risk
  • Special permissions: Settings configuration, system-level access

Key Risks:

  • Permission granularity too coarse (all-or-nothing)
  • No control over data destinations
  • Third-party SDKs inherit app permissions
  • Background collection continues after app closure

Key Takeaway: Permission grants give apps access to sensors, but provide no control over where that data goes or how it’s used.

Common Pitfalls

Mobile apps routinely request “location always on,” “microphone,” and “contacts” permissions regardless of whether these are needed. Requesting more permissions than necessary violates data minimization and creates unnecessary privacy risk. Request only the minimum sensor access needed for each specific feature.

Collecting location “while in use” vs. “always” has dramatically different privacy implications. Many apps collect background location without users understanding the continuous tracking this enables. Limit background collection to the minimum needed and clearly explain background collection to users.

Accelerometer data is commonly handled as non-personal technical sensor data. But accelerometer time series can identify individuals by gait, recognize daily routines, and reveal health conditions. Apply privacy controls to accelerometer data collection appropriate to its actual sensitivity.

Raw sensor data may be innocuous, but derived features (activity classification, location clustering, presence detection) are often highly sensitive. Privacy controls applied to raw sensor data may not adequately protect derived features generated from that data. Assess privacy risks for both raw and derived data.

14.11 What’s Next

Now that you understand how mobile devices collect data and the limitations of permission models, the next chapter explores Privacy Leak Detection where you’ll learn to detect unauthorized data flows using data flow analysis, TaintDroid, and static analysis techniques.
