Identify Mobile Data Types: Classify data that mobile devices collect through sensors, location services, and network connections
Explain Android Permission Model: Distinguish the three permission tiers and their privacy implications
Assess Permission Risks: Evaluate which permission combinations pose the highest privacy risks
Recognize Data Flow Risks: Map how mobile data flows through IoT ecosystems and where privacy risks emerge
14.2 Prerequisites
Before diving into this chapter, you should be familiar with:
Introduction to Privacy: Establishes fundamental privacy concepts, regulations (GDPR, CCPA), and privacy principles that apply specifically to mobile devices and their role in IoT ecosystems
Security and Privacy Overview: Provides understanding of security threats and privacy risks that contextualizes mobile-specific vulnerabilities and attack vectors
Networking Basics: Understanding network protocols (Wi-Fi, cellular, Bluetooth) helps comprehend how mobile devices communicate with IoT devices and what data is exposed during transmission
In 60 Seconds
Mobile IoT devices collect location, accelerometer, Wi-Fi scan, and biometric data continuously. Each data type creates specific privacy risks — location reveals home and work addresses, accelerometer data enables activity recognition, and Wi-Fi probe history enables passive tracking. Privacy engineering for mobile IoT requires explicit privacy threat modeling for each collected data stream.
Key Concepts
Mobile Privacy: Privacy considerations specific to smartphones, wearables, and mobile IoT devices that move with users and collect context-rich personal data.
Location Data: GPS coordinates, cell tower positioning, and Wi-Fi-based location data enabling precise tracking of individual movements; high privacy sensitivity.
Accelerometer Privacy: Motion sensor data revealing gait patterns, activities, keystrokes, and behavioral biometrics; often underappreciated privacy risk.
Sensor Fusion: Combining multiple mobile sensor streams (GPS, accelerometer, gyroscope, barometer) to create richer context at the cost of increased re-identification risk.
Background Data Collection: Mobile apps and devices collecting data when not in active use; users often unaware of passive background sensing.
Granularity Reduction: Privacy technique reducing location precision (neighborhood level vs. exact GPS), time resolution (hourly vs. second-level), or sensor sampling rate to minimize privacy risk while retaining utility.
Mobile Permissions Model: OS-level access control for sensors on Android and iOS requiring explicit user permission; inadequate if permission requests are too broad or infrequent.
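The granularity-reduction concept above can be sketched in a few lines of Python. This is a minimal illustration, not a library API: the function name, the 2-decimal rounding, and the hourly bucketing are illustrative choices.

```python
from datetime import datetime

def reduce_granularity(lat, lon, ts, decimals=2, bucket_minutes=60):
    """Coarsen one location sample for privacy.

    Rounding coordinates to 2 decimal places keeps roughly
    neighborhood-level precision (~1.1 km of latitude); bucketing the
    timestamp to the hour removes second-level movement detail.
    """
    total_min = ts.hour * 60 + ts.minute
    start = (total_min // bucket_minutes) * bucket_minutes
    coarse_ts = ts.replace(hour=start // 60, minute=start % 60,
                           second=0, microsecond=0)
    return round(lat, decimals), round(lon, decimals), coarse_ts

# A precise fix in Lower Manhattan becomes a neighborhood-level sample
print(reduce_granularity(40.712776, -74.005974, datetime(2024, 3, 1, 14, 37, 22)))
# -> (40.71, -74.01, datetime.datetime(2024, 3, 1, 14, 0))
```

The utility trade-off is tunable: fewer decimals or larger time buckets mean less re-identification risk but coarser analytics.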
For Beginners: Why Does Mobile Privacy Matter for IoT?
Your phone knows more about you than your best friend.
It knows where you sleep, where you work, who you call, what apps you use, and even how you walk (from accelerometer data). When your phone connects to IoT devices, all this information can flow to third parties.
What data does your phone collect?
| Data Type | What It Reveals | IoT Connection |
| --- | --- | --- |
| Location | Where you live, work, travel | Smart home knows when you’re away |
| Wi-Fi scans | Places you’ve visited | Devices you’ve connected to |
| Bluetooth | Nearby devices, headphones | All your IoT devices |
| Contacts | Your social network | Shared device access |
| App usage | Your habits and interests | What automations you use |
Why mobile + IoT = bigger privacy risks:
Figure 14.1: Mobile-IoT Privacy Escalation: Phone as Gateway Hub for Multi-Source Data Aggregation
Key insight: Your phone is the “hub” connecting all your IoT devices. If an app on your phone is leaky, it can expose data from ALL your connected devices—not just the phone itself.
Quick protection tips:
Review app permissions regularly
Use “Only while using” for location when possible
Disable Bluetooth/Wi-Fi when not needed
Check what data IoT apps upload to the cloud
Sensor Squad: The App Permission Detective!
“Did you know that many apps ask for WAY more permissions than they need?” Sammy the Sensor said. “A simple calculator app might request access to your camera, contacts, and location. Why? Often to sell your data to advertisers!”
Max the Microcontroller warned, “When you grant an IoT app permissions on your phone, you are not just giving it access to your phone – you are giving it access to all the IoT data that flows through your phone. A smart home app with location permission knows exactly when you leave and arrive home.”
“Third-party data sharing is the hidden danger,” Lila the LED explained. “Your smart thermostat app might share data with analytics companies, advertisers, and even data brokers. A study found that the average IoT app shares data with over five third-party services! Each one is another company that has information about your daily life.”
“Protect yourself by reviewing permissions regularly,” Bella the Battery advised. “Use ‘only while using’ for location. Deny unnecessary permissions. And read privacy policies – yes, they are long and boring, but they tell you exactly what happens with your data. An informed user is a protected user!”
14.3 Introduction
Mobile devices generate vast amounts of sensitive user data through sensors, location services, Wi-Fi connections, and cellular networks. Understanding how this data is collected, shared, and potentially leaked is crucial for protecting user privacy in IoT ecosystems where mobile phones often serve as gateways.
Common Misconception: “I Have Nothing to Hide” and Anonymization Myths
This misconception ignores that privacy protects far more than just illegal activity. Your mobile phone’s data reveals:
Health conditions: Search patterns can detect diabetes with ~85% accuracy; hospital visit patterns reveal diagnoses
Financial status: Shopping locations and app usage reveal income level to within roughly ±$10K
Social relationships: Wi-Fi probe requests expose who you meet, where, and when
Political views: Location traces to rallies, campaign offices, places of worship
Personal vulnerabilities: Mental health tracking (gym cancellations + mood app data), relationship problems (dating app usage patterns + location data)
Real-world harm examples:
Insurance discrimination: Health insurance companies purchase location data showing gym visits, fast-food frequency. Premiums adjusted without consumer knowledge.
Employment screening: Prospective employers purchase “anonymized” location datasets, correlate with home addresses from applications, filter candidates visiting addiction centers or union halls.
Stalking enablement: Domestic abusers purchase phone location data from data brokers for $500, tracking victims despite restraining orders.
Myth #2: “Permission systems protect me”
Reality: Android/iOS permissions have critical gaps:
Coarse granularity: “Location permission” allows app to use GPS for any purpose—weather functionality + ad network tracking + analytics profiling
No destination control: Permission grants access to sensor, not control over data recipients. App can send to unlimited third parties.
Third-party libraries: Apps contain 15-30 SDKs on average, and each SDK inherits the app’s permissions. You consent to the app, but unknowingly grant the same access to Facebook, Google, and Chinese ad networks embedded inside it.
Background collection: 70% of apps continue data collection after closure. Permissions remain active until explicitly revoked.
Quantified leak statistics:
73% of apps send data to third-party tracking companies (Exodus Privacy audit, 100K apps)
Average app shares data with 10 third parties (data brokerage study)
Sensor data access: Motion sensors (accelerometer/gyroscope) require zero permissions, enabling keystroke inference (70-80% accuracy) and activity tracking
14.4 Mobile Phone Data Collection
Figure 14.2: Information flow sources and sinks
Mobile phones are sophisticated sensing platforms that continuously collect:
Network data: Wi-Fi SSIDs, Bluetooth devices, cellular towers
Usage data: App activity, screen time, touch patterns
Communication data: Calls, messages, contacts
Figure 14.3: Mobile Phone Data Collection and Third-Party Distribution Flow
Key Privacy Concern: Apps often collect far more data than needed for their functionality, primarily for advertising and analytics purposes.
Transparency Problem: Operating systems don’t clearly indicate where collected data ultimately goes—users grant permissions to the app, but don’t know all the third parties receiving the data.
14.5 Android Permission Model
Figure 14.4: Android permissions system
Figure 14.5: Android Permission Model: Normal, Dangerous, and Special Permissions Hierarchy
Alternative View: Mobile-IoT Data Flow Risk Pipeline
This diagram shows how mobile data flows through the IoT ecosystem and where privacy risks emerge at each stage:
Privacy risks compound at each stage. Even “anonymized” data can be re-identified when combined across multiple sources.
Alternative View: Permission Risk Assessment Matrix
Use this matrix to evaluate which app permissions pose the highest privacy risks:
Decision Guide:
Red Zone (High Risk): Deny unless absolutely essential for core functionality
Orange Zone (Medium Risk): Grant only to trusted apps, review periodically
Yellow Zone (Moderate): Evaluate based on app purpose and developer reputation
Green Zone (Low Risk): Generally safe to allow
Problem: Even with the permission model, users don’t know:
How often a permission is used
What specific data is collected
Where data is sent
Who has access to the data
| Permission | Android | iOS | Why It Matters |
| --- | --- | --- | --- |
| Location | “Allow always” or “Only while using” | Similar + “Precise” toggle | Can track your daily routine |
| Bluetooth | Required for IoT | Near-field permission | Reveals nearby devices |
| Camera/Mic | Per-app consent | Indicator lights | Can spy on you |
14.6 Permission Combination Risks
Individual permissions pose privacy risks, but certain combinations of permissions are far more dangerous than the sum of their parts. When an app holds location, contacts, and camera permissions simultaneously, it can build a comprehensive profile that links where you go, who you know, and what you see – enabling surveillance capabilities that no single permission would allow.
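One way to operationalize this idea is to check granted permissions against known dangerous combinations. The sketch below is a minimal illustration of the technique, not a complete checker; the two combinations and their rationales come from this section.

```python
# Flag permission sets whose combination enables profiling beyond
# what any single permission would allow
DANGEROUS_COMBOS = {
    frozenset({"ACCESS_FINE_LOCATION", "READ_CONTACTS", "CAMERA"}):
        "links where you go, who you know, and what you see",
    frozenset({"CAMERA", "RECORD_AUDIO", "INTERNET"}):
        "remote audio/video surveillance capability",
}

def flag_combos(granted):
    """Return the rationale for every dangerous combination present."""
    granted = set(granted)
    return [why for combo, why in DANGEROUS_COMBOS.items() if combo <= granted]

print(flag_combos(["ACCESS_FINE_LOCATION", "READ_CONTACTS", "CAMERA", "INTERNET"]))
# -> ['links where you go, who you know, and what you see']
```

Note that the second combination is not flagged here: without RECORD_AUDIO, camera plus internet alone does not meet the surveillance pattern.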
Knowledge Check: Permission Combinations
14.7 Worked Example: Privacy Audit of a Smart Home Companion App
Scenario: A consumer organization audits a popular smart home app (5 million downloads) that controls smart lights, thermostat, and door locks. The audit examines what permissions the app requests, what data it collects, and where that data flows. Calculate the privacy risk score and recommend improvements.
Step 1: Permission Analysis
The app requests 14 Android permissions at install:
| Permission | Category | Justifiable? | Risk Level |
| --- | --- | --- | --- |
| INTERNET | Normal | Yes – cloud connectivity | Low |
| BLUETOOTH | Normal | Yes – device pairing | Low |
| BLUETOOTH_ADMIN | Normal | Yes – device discovery | Low |
| ACCESS_FINE_LOCATION | Dangerous | Partial – needed for BLE scanning, not GPS tracking | High |
| ACCESS_BACKGROUND_LOCATION | Special | No – lights/locks don’t need location when app is closed | Critical |
| CAMERA | Dangerous | No – no camera features in app | Critical |
| RECORD_AUDIO | Dangerous | Partial – voice commands, but could listen continuously | High |
| READ_CONTACTS | Dangerous | Marginal – “share access” feature | High |
| READ_PHONE_STATE | Dangerous | No – device ID harvesting | Critical |
| WRITE_EXTERNAL_STORAGE | Dangerous | Marginal – log exports | Medium |
| RECEIVE_BOOT_COMPLETED | Normal | Yes – background service | Low |
| FOREGROUND_SERVICE | Normal | Yes – BLE connection maintenance | Low |
| ACCESS_WIFI_STATE | Normal | Yes – local device discovery | Low |
| ACCESS_NETWORK_STATE | Normal | Yes – connectivity check | Low |
Unnecessary permissions: 3 of 14 (CAMERA, READ_PHONE_STATE, ACCESS_BACKGROUND_LOCATION) have no legitimate justification. 2 more (READ_CONTACTS, RECORD_AUDIO) are partially justified but over-broad.
Step 2: Third-Party SDK Analysis
Static analysis of the app’s APK reveals 18 embedded third-party SDKs:
| SDK | Purpose | Data Access | Privacy Risk |
| --- | --- | --- | --- |
| Google Firebase | Analytics, crash reporting | Device ID, usage events | Medium |
| Facebook SDK | Attribution, social login | Device ID, contacts, usage | High |
| Adjust | Install attribution | Device ID, IP, location | High |
| Braze | Push notifications, CRM | User profile, device ID | Medium |
| Sentry | Error tracking | Device state, stack traces | Low |
| OneSignal | Push notifications | Device ID, location | Medium |
| AppsFlyer | Marketing attribution | Device ID, IP, install source | High |
| Mixpanel | Product analytics | All app events, location | High |
| Branch | Deep linking | Device ID, IP, referrer | Medium |
| Amazon AWS SDK | Backend services | None (transport only) | Low |
| Google Maps SDK | Geofencing | Location | Medium |
| Stripe | In-app purchases | Payment data (tokenized) | Low |
| Intercom | Customer support | User profile, chat history | Medium |
| Amplitude | Behavioral analytics | All app events, device ID | High |
| Twitter MoPub | Advertising | Location, device ID, demographics | Critical |
| Google AdMob | Advertising | Location, device ID, demographics | Critical |
| Chartboost | Advertising | Location, device ID, interests | Critical |
| Unity Ads | Advertising | Location, device ID | Critical |
Key finding: 18 SDKs – 4 ad networks, 4 marketing attribution services, 3 analytics platforms. All 18 inherit the app’s dangerous permissions, including CAMERA and READ_CONTACTS.
Step 3: Data Flow Mapping
Network traffic analysis over 48 hours reveals:
Data destinations (unique domains contacted):
Smart home cloud (manufacturer): 3 domains
Analytics/attribution: 8 domains
Advertising networks: 6 domains
Social media (Facebook Graph API): 1 domain
Total: 18 domains receiving user data
Data transmitted during 48-hour test:
Location updates: 847 (every ~3.4 minutes, even in background)
Device scans (BLE + Wi-Fi): 2,340 scan results
Contact list: uploaded once (312 contacts)
Usage events: 1,456 app interaction events
Ad requests with location: 234
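The sampling interval and daily rate implied by these numbers are easy to verify with a couple of lines:

```python
updates, hours = 847, 48  # location updates observed over the 48-hour test

interval_min = hours * 60 / updates   # minutes between location updates
per_day = updates / (hours / 24)      # location points generated per day

print(f"One update every {interval_min:.1f} min, about {per_day} points/day")
# -> One update every 3.4 min, about 423.5 points/day
```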
Step 4: Calculate Privacy Risk Score
Use a weighted scoring framework:
| Factor | Weight | Score (0-10) | Weighted |
| --- | --- | --- | --- |
| Unnecessary permissions | 20% | 8 (3 unjustified) | 1.6 |
| Third-party SDK count | 15% | 9 (18 SDKs, 4 ad networks) | 1.35 |
| Background data collection | 20% | 9 (847 location updates in 48h) | 1.8 |
| Data destinations | 15% | 8 (18 domains) | 1.2 |
| Permission-to-function ratio | 15% | 7 (14 perms for 4 core features) | 1.05 |
| Transparency (privacy policy) | 15% | 4 (generic, no SDK-specific disclosure) | 0.6 |
| **Total Risk Score** | | | **7.6 / 10** |
A score of 7.6 classifies this app as high privacy risk.
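The weighted total can be reproduced directly from the table with a short Python check (factor names, weights, and scores taken from the audit above):

```python
# Weighted privacy risk score from Step 4
FACTORS = [
    # (factor, weight, score 0-10)
    ("Unnecessary permissions",       0.20, 8),
    ("Third-party SDK count",         0.15, 9),
    ("Background data collection",    0.20, 9),
    ("Data destinations",             0.15, 8),
    ("Permission-to-function ratio",  0.15, 7),
    ("Transparency (privacy policy)", 0.15, 4),
]

def risk_score(factors):
    """Weighted sum of factor scores; weights must total 1.0."""
    assert abs(sum(w for _, w, _ in factors) - 1.0) < 1e-9
    return sum(w * s for _, w, s in factors)

print(f"Total risk score: {risk_score(FACTORS):.1f} / 10")
# -> Total risk score: 7.6 / 10
```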
Step 5: Recommend Improvements
Replace fine location with BLE-only scanning: Android 12+ allows BLUETOOTH_SCAN without ACCESS_FINE_LOCATION
Reduce SDK count from 18 to 8: Remove 3 of 4 ad networks, consolidate to 1 analytics platform, remove unused attribution SDKs
Implement location sampling limits: Replace continuous 3.4-minute polling with event-driven location (only when the user opens the app or triggers a geofence)
Projected improvement: These changes would reduce the risk score from 7.6 to approximately 3.8 (medium-low risk), requiring 2-3 weeks of engineering effort and reducing monetization by approximately 15% (fewer ad impressions).
For users: Deny CAMERA, READ_CONTACTS, and background location. Use “Only while using” for location. Check Settings > Privacy > Permission Manager quarterly.
14.8 Worked Example: Automated Permission Audit Script
Scenario: A security researcher needs to audit 20 IoT companion apps to identify which request excessive permissions. Use Android’s aapt tool output to score apps programmatically.
```python
# Permission risk classifications (excerpt -- full list includes 20+ permissions)
PERMISSION_RISK = {
    "ACCESS_FINE_LOCATION": 10, "ACCESS_BACKGROUND_LOCATION": 10,
    "CAMERA": 9, "RECORD_AUDIO": 9, "READ_CONTACTS": 8,
    "BLUETOOTH": 3, "ACCESS_WIFI_STATE": 2, "INTERNET": 1,
}

# Dangerous permission combinations (multiplicative risk)
DANGEROUS_COMBOS = [
    ({"ACCESS_FINE_LOCATION", "READ_CONTACTS", "CAMERA"},
     "Location + Contacts + Camera = comprehensive profiling"),
    ({"CAMERA", "RECORD_AUDIO", "INTERNET"},
     "Audio/video surveillance capability"),
]

def audit_app(app_name, permissions, core_features):
    """Score an IoT app's permission requests."""
    total_score = sum(PERMISSION_RISK.get(p, 5) for p in permissions)
    # High-risk permissions not tied to a core feature (kept for reporting)
    unnecessary = [(p, PERMISSION_RISK[p]) for p in permissions
                   if PERMISSION_RISK.get(p, 0) >= 8 and p not in core_features]
    # Flat penalty for each dangerous combination present
    combo_penalties = sum(15 for combo, _ in DANGEROUS_COMBOS
                          if combo.issubset(set(permissions)))
    normalized = min((total_score + combo_penalties) / (len(permissions) * 10) * 10, 10)
    verdict = ("HIGH RISK" if normalized >= 7 else
               "MODERATE RISK" if normalized >= 4 else "LOW RISK")
    print(f"{app_name}: {normalized:.1f}/10 - {verdict}")
    return normalized

# Audit IoT apps with different risk profiles
audit_app("SmartLight Pro",
          ["INTERNET", "BLUETOOTH", "ACCESS_FINE_LOCATION", "FOREGROUND_SERVICE"],
          core_features={"INTERNET", "BLUETOOTH", "ACCESS_FINE_LOCATION"})
audit_app("HomeGuard Camera",
          ["INTERNET", "CAMERA", "RECORD_AUDIO", "ACCESS_FINE_LOCATION",
           "ACCESS_BACKGROUND_LOCATION", "READ_CONTACTS", "WRITE_EXTERNAL_STORAGE"],
          core_features={"INTERNET", "CAMERA", "FOREGROUND_SERVICE"})
```
Expected output (for the two calls shown): SmartLight Pro scores 4.8/10 (moderate risk – fine location is justified for BLE scanning but still contributes weight). HomeGuard Camera scores 10.0/10 (high risk, capped – background location, contacts, and audio have no camera-app justification and trigger both dangerous-combination penalties). A minimal-permission app such as a simple thermostat sensor would score well under 4 (low risk).
Real-World Case Study: Location Data Re-identification (New York Times, 2018)
What happened: The New York Times obtained a dataset of 50 billion location pings from 12 million American phones over several months, sold by a data broker. The data was “anonymized” – no names attached.
Re-identification in practice:
| Step | Technique | Success Rate |
| --- | --- | --- |
| 1. Find a phone that “sleeps” at a specific address | Filter pings between 11 PM - 6 AM | 100% of phones have a “home” |
| 2. Cross-reference address with property records | Public property tax databases | ~85% match |
| 3. Confirm identity with work location | Filter 9 AM - 5 PM pings | ~95% confirmation |
| 4. Map daily routine | Aggregate all pings | Full behavioral profile |
What reporters found: They tracked a Microsoft employee to a job interview at Amazon, followed a Department of Defense official through the Pentagon, and traced a Secret Service agent’s home address – all from “anonymous” location data.
Quantified re-identification rates:
4 spatio-temporal points are enough to uniquely identify 95% of people in a dataset of 1.5 million (de Montjoye et al., Scientific Reports, 2013)
IoT amplification: A smart home app sending location every 3.4 minutes generates 423 points per day – far exceeding the 4-point threshold
Cost: Location datasets sell for $0.50-2.00 per device per month on data broker markets
Lesson for IoT: Every app with “Allow always” location permission generates enough data to uniquely identify and track users, regardless of anonymization claims. The permission model does not protect against re-identification because it cannot limit data granularity or retention.
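The first step of the reporters’ technique — finding where a phone “sleeps” — can be sketched in a few lines of Python. The data format and function name here are illustrative; real broker datasets are far larger but structurally similar.

```python
from collections import Counter
from datetime import datetime

def infer_home(pings, night_start=23, night_end=6, decimals=3):
    """Guess a device's home: the most common coarse position
    among pings recorded between 11 PM and 6 AM."""
    night_positions = Counter()
    for lat, lon, ts in pings:
        if ts.hour >= night_start or ts.hour < night_end:
            night_positions[(round(lat, decimals), round(lon, decimals))] += 1
    if not night_positions:
        return None
    return night_positions.most_common(1)[0][0]

pings = [
    (40.7128, -74.0060, datetime(2024, 3, 1, 2, 15)),   # night -> home
    (40.7129, -74.0061, datetime(2024, 3, 2, 23, 40)),  # night -> home
    (40.7580, -73.9855, datetime(2024, 3, 1, 14, 0)),   # daytime -> work
]
print(infer_home(pings))
# -> (40.713, -74.006)
```

The same filter with daytime hours yields the work location (step 3), and the two together are usually enough to cross-reference a name from public records.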
14.9 The Economics of Mobile Data: Why Over-Collection Persists
Understanding why apps collect excessive data requires examining the economic incentives. Privacy violations are not bugs – they are business models.
14.9.1 Revenue Per User from Data Categories
Data brokers and ad networks assign different values to different data types. The following table reflects approximate market rates as of 2023-2024:
| Data Type | Revenue Per User Per Year | Collection Method | User Awareness |
| --- | --- | --- | --- |
| Basic demographics (age, gender) | $0.50 - $2.00 | Registration form | High |
| Location history (continuous) | $12.00 - $24.00 | Background GPS | Low |
| Purchase intent signals | $5.00 - $15.00 | App usage + browsing | Very low |
| Health-related behavior | $15.00 - $75.00 | Fitness app + location (gym, pharmacy) | Very low |
| Contact graph (social network) | $3.00 - $8.00 | READ_CONTACTS permission | Low |
| Cross-device identity | $8.00 - $20.00 | Wi-Fi + Bluetooth scanning | Very low |
Example calculation: A smart home app with 5 million users collecting location ($12/user), contact graph ($3/user), and cross-device identity ($8/user) at the low end generates approximately $115 million per year in data revenue ($23/user x 5M) – often exceeding the revenue from the hardware or subscription itself. This explains why “free” IoT apps aggressively request permissions: the data IS the product.
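The back-of-envelope figure can be checked in a couple of lines, using the low-end rates from the table above:

```python
# Low-end per-user annual rates ($) from the data-revenue table
rates = {"location": 12.0, "contacts": 3.0, "cross_device": 8.0}
users = 5_000_000

per_user = sum(rates.values())        # $23 per user per year
annual_revenue = per_user * users
print(f"${annual_revenue:,.0f} per year")
# -> $115,000,000 per year
```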
14.9.2 The Regulatory Tipping Point
GDPR fines have begun to shift this calculus. In 2023, Meta was fined 1.2 billion euros for transferring EU user data to the US without adequate protection. For smaller IoT companies, the math is simpler:
Data revenue: $12/user/year from location tracking for a 100,000-user app = $1.2 million/year
GDPR fine risk: Up to 4% of global revenue or 20 million euros (whichever is higher)
Expected fine: Even at 1% probability of enforcement, the expected cost exceeds the revenue for companies under $30 million in annual revenue
Companies above $100 million in revenue still profit from aggressive data collection despite regulatory risk, which is why large platforms remain the primary privacy offenders. Smaller IoT companies increasingly find that privacy-respecting designs are not just ethical – they are financially rational.
Putting Numbers to It: Permission Risk Scoring and Data Flow Analysis
Permission risk quantifies the privacy exposure from granting a mobile app permission. A simple multiplicative model scores each permission as

\(R = \dfrac{S \times P \times M}{C}\)

where \(S\) is data sensitivity (0-10), \(P\) is the re-identification probability (0-1), \(M\) is a regulatory-context multiplier (higher where enforcement is weak), and \(C\) is consent quality (0-1, with 1 representing fully informed consent). Interpret the result as:
\(R < 5\): Low risk (generally acceptable)
\(5 \leq R < 20\): Moderate risk (review periodically)
\(20 \leq R < 50\): High risk (deny unless essential)
\(R \geq 50\): Critical risk (strong evidence of privacy violation)
Result: Smart home app with always-on location (R=95) is critical risk and likely violates GDPR data minimization (Article 5(1)(c)). Changing to “only while using” reduces risk to R=15.8 (moderate).
In practice: Mathematical risk scoring provides objective criteria for permission audits. Apps with aggregate risk scores >50 should trigger privacy impact assessments. Users can prioritize which permissions to deny based on quantified risk, not just intuition.
14.9.3 Interactive Permission Risk Calculator
The risk score combines four parameters: data sensitivity \(S\) (0-10), re-identification probability \(P\) (0-1), a regulatory-context multiplier \(M\), and consent quality \(C\) (0-1). Because \(C\) appears in the denominator, poor consent quality inflates the score, while fully informed consent (\(C = 1\)) reduces it. Comparing “Allow always” fine location (\(S = 10\), \(P = 0.95\), \(M = 3.0\), \(C = 0.3\)) against “Only while using” (\(S = 7\), \(P = 0.5\), \(M = 2.0\), \(C = 0.7\)) shows how both reduced collection and better consent lower the score.
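A minimal Python version of this calculator follows; the preset parameter values are carried over from the interactive version and are illustrative estimates, not measured data.

```python
# Permission risk score: R = (S * P * M) / C
# S = data sensitivity (0-10), P = re-identification probability (0-1),
# M = regulatory-context multiplier, C = consent quality (0-1).
PRESETS = {
    "ACCESS_FINE_LOCATION (always)":      dict(s=10, p=0.95, m=3.0, c=0.3),
    "ACCESS_FINE_LOCATION (while using)": dict(s=7,  p=0.5,  m=2.0, c=0.7),
    "CAMERA":                             dict(s=9,  p=0.1,  m=1.5, c=0.8),
    "READ_CONTACTS":                      dict(s=8,  p=0.8,  m=2.0, c=0.4),
}

def permission_risk(s, p, m, c):
    """Score one permission; low consent quality (small c) inflates risk."""
    r = (s * p * m) / c
    level = ("LOW" if r < 5 else "MODERATE" if r < 20
             else "HIGH" if r < 50 else "CRITICAL")
    return r, level

for name, params in PRESETS.items():
    r, level = permission_risk(**params)
    print(f"{name}: R = {r:.1f} ({level})")
```

Running this shows always-on fine location at the critical threshold (R = 95.0) while the “while using” variant drops to the moderate band — the quantitative argument for denying background location.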
14.10 Summary
Mobile data collection presents significant privacy challenges:
Data Types Collected:
Continuous sensor data (location, accelerometer, microphone)
Network information (Wi-Fi, Bluetooth, cellular)
Usage patterns and app activity
Android Permission Tiers:
Normal permissions: Auto-granted, low privacy risk
Dangerous permissions: User consent required, high privacy risk
Special permissions: Settings configuration, system-level access
Key Risks:
Permission granularity too coarse (all-or-nothing)
No control over data destinations
Third-party SDKs inherit app permissions
Background collection continues after app closure
Key Takeaway: Permission grants give apps access to sensors, but provide no control over where that data goes or how it’s used.
Common Pitfalls
1. Requesting Excessive Sensor Permissions
Mobile apps routinely request “location always on,” “microphone,” and “contacts” permissions regardless of whether these are needed. Requesting more permissions than necessary violates data minimization and creates unnecessary privacy risk. Request only the minimum sensor access needed for each specific feature.
2. Not Distinguishing Foreground vs. Background Collection
Collecting location “while in use” vs. “always” has dramatically different privacy implications. Many apps collect background location without users understanding the continuous tracking this enables. Limit background collection to the minimum needed and clearly explain background collection to users.
3. Treating Accelerometer as Non-Personal Data
Accelerometer data is commonly handled as non-personal technical sensor data. But accelerometer time series can identify individuals by gait, recognize daily routines, and reveal health conditions. Apply privacy controls to accelerometer data collection appropriate to its actual sensitivity.
4. Ignoring Derived Data Privacy
Raw sensor data may be innocuous, but derived features (activity classification, location clustering, presence detection) are often highly sensitive. Privacy controls applied to raw sensor data may not adequately protect derived features generated from that data. Assess privacy risks for both raw and derived data.
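The raw-versus-derived distinction can be made concrete: raw accelerometer magnitudes look like innocuous numbers, but even a trivial classifier turns them into a sensitive activity label (“walking” at 3 AM says far more than the samples themselves). The thresholds below are illustrative, not from a real model.

```python
import math

def activity_label(samples):
    """Derive an activity label from raw accelerometer samples (m/s^2).

    The derived label, not the raw numbers, is what carries privacy
    risk -- and it is the derived feature that privacy controls on raw
    data can miss.
    """
    mean_mag = sum(math.sqrt(x * x + y * y + z * z)
                   for x, y, z in samples) / len(samples)
    if mean_mag < 10.5:     # close to gravity alone
        return "stationary"
    elif mean_mag < 13.0:   # moderate periodic motion
        return "walking"
    return "running"

print(activity_label([(0.1, 0.2, 9.8), (0.0, 0.1, 9.81)]))
# -> stationary
```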
14.11 What’s Next
Now that you understand how mobile devices collect data and the limitations of permission models, the next chapter explores Privacy Leak Detection where you’ll learn to detect unauthorized data flows using data flow analysis, TaintDroid, and static analysis techniques.