4  Introduction to Privacy in IoT

4.1 Learning Objectives

By the end of this chapter, you will be able to:

  • Distinguish privacy from security: Explain why encryption alone does not guarantee privacy
  • Navigate the privacy curriculum: Choose the right learning path based on your role and goals
  • Identify IoT privacy challenges: Recognize the unique risks of always-on, interconnected devices
  • Apply the six privacy concepts: Use fundamentals, principles, regulations, threats, techniques, and compliance in IoT design
  • Evaluate privacy trade-offs: Balance user convenience with data protection requirements

In 60 Seconds

IoT privacy is fundamentally about control over personal data generated by always-on devices — what is collected, by whom, and for how long. Unlike security (which blocks unauthorized access), privacy requires questioning whether data should exist at all, making privacy-by-design essential from the first line of code.

Key Concepts

  • Privacy vs. Security: Security prevents unauthorized access; privacy controls what data is collected and by whom. A system can be secure but still violate privacy through authorized data misuse.
  • Data Minimization: Principle requiring collection of only the minimum personal data necessary for a specified purpose; fundamental to GDPR and Privacy by Design.
  • Informed Consent: Legal and ethical requirement that users explicitly agree to data collection with clear understanding of what data is collected and how it is used.
  • Data Subject Rights: Individual rights regarding personal data including access, correction, deletion (right to be forgotten), portability, and objection to processing.
  • IoT Privacy Risks: Unique privacy challenges from IoT including always-on sensing, passive data collection, behavioral inference, and lack of user awareness.
  • Privacy by Design: Proactive approach embedding privacy protections into system architecture from the start rather than adding them as an afterthought.
  • Anonymization vs. Pseudonymization: Anonymization removes all identifying information permanently; pseudonymization replaces identifiers with codes while retaining ability to re-identify.
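The last distinction is easiest to see in code. A minimal Python sketch, assuming plain-dict records; the field names, salt handling, and 12-character token are illustrative choices, not any specific library's API:

```python
import hashlib
import secrets

def pseudonymize(record: dict, salt: bytes) -> dict:
    """Replace the direct identifier with a keyed code. Whoever holds
    the salt can re-identify, so this is still 'personal data' under GDPR."""
    out = dict(record)
    token = hashlib.sha256(salt + record["user_id"].encode()).hexdigest()[:12]
    out["user_id"] = token
    return out

def anonymize(record: dict) -> dict:
    """Drop the identifier entirely and coarsen quasi-identifiers.
    Irreversible by design -- but beware behavioral fingerprints."""
    out = dict(record)
    del out["user_id"]
    out["age"] = (record["age"] // 10) * 10  # e.g. 37 -> 30 (decade bracket)
    return out

salt = secrets.token_bytes(16)
reading = {"user_id": "alice@example.com", "age": 37, "heart_rate": 72}
print(pseudonymize(reading, salt))  # same token for every record of this user
print(anonymize(reading))           # no identifier left at all
```

Note that the pseudonymized record is linkable across the dataset (same token each time), which is exactly why regulators still treat it as personal data.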

Most Valuable Understanding (MVU)

Privacy protects you from authorized users; Security protects you from unauthorized users.

Your smart home can have military-grade encryption (security) but zero privacy if the company collects everything you do. A fitness tracker that encrypts your heart rate but shares it with insurers is secure but NOT private. Privacy is about CONTROL: what data is collected, who can access it, how long it’s kept, and your right to delete it.

Remember: Security asks “Who can see my data?” Privacy asks “Should this data exist at all?”

This Chapter Covers:

  • Privacy curriculum overview with 6 focused chapters
  • Quick-start learning paths for different roles
  • Key concepts and essential takeaways


4.2 Overview

Privacy represents the fundamental right of individuals to control their personal information, distinct from security, which protects systems from unauthorized access. IoT systems present unique privacy challenges: always-on sensors enable continuous monitoring, passive data collection happens without explicit user awareness, interconnected devices facilitate data aggregation that reveals sensitive patterns, and cloud processing moves data beyond user control.

This comprehensive guide is organized into six focused chapters covering all aspects of IoT privacy, from foundational concepts to compliance implementation.

Think of privacy like the locks on your diary, not just your front door.

Security (the front door lock) stops strangers from breaking in. But privacy (the diary lock) controls what gets written in the diary in the first place and who is ALLOWED to read it.

Here’s the scary part about IoT:

  • Your smart speaker is like having a friend who writes down EVERYTHING you say
  • Your fitness tracker is like having a doctor who shares your health with insurance companies
  • Your smart TV is like having a stranger sitting on your couch noting what you watch

Real numbers that matter:

  • The average smart home generates 50+ data points per hour
  • A significant proportion of IoT devices transmit data without adequate encryption
  • A single compromised device can expose your entire household’s patterns

The good news: You can design IoT systems that respect privacy. This guide shows you how.


4.3 Privacy in IoT: The Big Picture

IoT privacy ecosystem diagram showing six interconnected domains: Fundamentals explaining what privacy means, Principles providing ethical guidelines, Regulations defining legal requirements, Threats identifying risks, Techniques offering technical solutions, and Compliance showing implementation steps. All connect to a central user icon representing individual data rights.
Figure 4.1: IoT privacy ecosystem showing six interconnected domains

Hey future tech star! Let’s learn about privacy with the Sensor Squad!

Sammy the Sensor loves to measure things - temperature, light, even how many steps you take! But Sammy has an important rule: Always ask before watching!

Imagine this story:

Your friend comes to visit. Would you like it if they:

  • Took photos of everything in your room? NO!
  • Wrote down everything you said? NO!
  • Told everyone at school your secrets? Definitely NO!

That’s what some IoT devices do without asking! They watch, record, and share - without your permission.

The Sensor Squad’s Privacy Pledge:

  1. Sammy says: “I only collect what I NEED” (Not everything I CAN!)
  2. Lila says: “I tell you exactly what I’m watching” (No secrets!)
  3. Max says: “I delete old data I don’t need anymore” (No hoarding!)
  4. Bella says: “I let YOU control your information” (You’re the boss!)

Fun Activity: Look at a smart device in your home. Can you answer:

  • What does it watch or listen to?
  • Who does it share information with?
  • Can you turn off the watching?

If you can’t answer these questions, that device might not respect your privacy!

Remember: Good technology asks permission. Great technology protects your secrets!


4.4 Privacy vs Security: Understanding the Difference

Comparison diagram showing Security on left protecting against hackers with firewall, encryption, authentication icons, and Privacy on right protecting against authorized misuse with data minimization, consent, user control icons. Center shows overlap area labeled Defense in Depth covering both unauthorized and authorized access risks.
Figure 4.2: Privacy vs Security comparison

| Aspect | Security | Privacy |
|---|---|---|
| Protects against | Hackers, malware, unauthorized access | Companies, employees, data brokers, governments |
| Key question | "Who can access my data?" | "Should this data exist at all?" |
| Example | AES-256 encryption on voice recordings | Not recording voice unless user explicitly speaks |
| Failure mode | Data breach (stolen by outsiders) | Data misuse (used by insiders) |
| Solution tools | Firewalls, encryption, authentication | Data minimization, consent, anonymization |

4.5 IoT Privacy Threat Landscape

Understanding where privacy threats come from helps you design better protections.

IoT privacy threat landscape diagram showing four threat categories: Device-level threats (sensors, storage), Network-level threats (data in transit, cloud), Organizational threats (data sharing, retention), and Environmental threats (inference, aggregation). Arrows show how data flows between layers and where privacy can be compromised.
Figure 4.3: IoT privacy threat landscape across four categories

Threat Categories Explained:

| Category | Examples | Mitigation |
|---|---|---|
| Device-level | Always-on microphones, cameras collecting beyond stated purpose | Data minimization, local processing |
| Network-level | Unencrypted transmissions, cloud providers accessing data | End-to-end encryption, zero-knowledge protocols |
| Organizational | Data sold to brokers, kept indefinitely, used for new purposes | Strong privacy policies, data retention limits |
| Environmental | Combining smart meter + fitness data reveals health conditions | Differential privacy, k-anonymity |
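The k-anonymity mitigation listed for environmental threats can be checked in a few lines. A sketch, assuming records are plain dicts; the quasi-identifier column names are hypothetical:

```python
from collections import Counter

def k_anonymity(records, quasi_ids):
    """Smallest equivalence-class size over the quasi-identifier columns.
    A dataset is k-anonymous if this value is >= k: every record shares
    its quasi-identifier combination with at least k-1 others."""
    groups = Counter(tuple(r[q] for q in quasi_ids) for r in records)
    return min(groups.values())

# Toy smart-meter export: ZIP prefix and age bracket are the quasi-identifiers
records = [
    {"zip3": "941", "age_bracket": "35-39", "kwh": 12.1},
    {"zip3": "941", "age_bracket": "35-39", "kwh": 11.8},
    {"zip3": "941", "age_bracket": "40-44", "kwh": 9.5},
]
print(k_anonymity(records, ["zip3", "age_bracket"]))  # 1: one record stands alone
```

A result of 1 means at least one household is uniquely identifiable from the quasi-identifiers alone, before any behavioral fingerprinting.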

4.6 Chapter Guide

4.6.1 1. Privacy Fundamentals

Privacy Fundamentals in IoT

Start here to understand what privacy means in the IoT context and why it matters.

  • What is privacy vs security
  • Why “I have nothing to hide” is wrong
  • Real-world privacy nightmares (Vizio, Ring, Alexa)
  • The five privacy rights you should know
  • IoT-specific privacy challenges

Difficulty: Beginner | Time: 15-20 minutes


4.6.2 2. Privacy Principles

Privacy Principles and Ethics

Learn the foundational principles that guide all privacy regulations and technical implementations.

  • OECD Privacy Principles (1980) - the foundation
  • Fair Information Practice Principles (FIPPs)
  • IEEE Ethically Aligned Design for IoT
  • Applying principles to IoT design decisions

Difficulty: Intermediate | Time: 20-25 minutes


4.6.3 3. Privacy Regulations

Privacy Regulations for IoT

Understand the legal requirements governing IoT privacy globally.

  • GDPR requirements and user rights
  • CCPA compliance obligations
  • HIPAA, COPPA, LGPD, PIPL comparison
  • Handling regulatory conflicts

Difficulty: Intermediate | Time: 25-30 minutes


4.6.4 4. Privacy Threats

Privacy Threats in IoT

Identify and understand the privacy risks specific to IoT systems.

  • Five categories of privacy threats
  • Case study: “The House That Spied On Me”
  • Real-world privacy violations (Strava, Ring, Roomba)
  • The aggregation attack explained

Difficulty: Intermediate | Time: 20-25 minutes


4.6.5 5. Privacy-Preserving Techniques

Privacy-Preserving Techniques for IoT

Learn technical approaches to protect user privacy.

  • Data minimization strategies
  • Anonymization and pseudonymization
  • Differential privacy implementation
  • Edge analytics: security without surveillance
  • Encryption for privacy

Difficulty: Advanced | Time: 30-35 minutes


4.6.6 6. Privacy Compliance

Privacy Compliance for IoT

Implement privacy protection in your IoT systems.

  • Consent management implementation
  • Privacy Impact Assessments (PIAs)
  • Privacy by Default principles
  • Compliance documentation requirements
  • Phased compliance roadmap

Difficulty: Intermediate | Time: 25-30 minutes


4.7 Learning Paths

4.7.1 Quick Start (1 hour)

For a foundational understanding:

  1. Privacy Fundamentals - 20 min
  2. Privacy Threats - 20 min
  3. Privacy Techniques (Edge Analytics section) - 20 min

4.7.2 Compliance Focus (2 hours)

For regulatory compliance:

  1. Privacy Regulations - 30 min
  2. Privacy Compliance - 30 min
  3. Privacy Techniques - 35 min
  4. Review: Privacy Principles - 25 min

4.7.3 Complete Coverage (3+ hours)

For comprehensive understanding, follow chapters 1-6 in order.


4.8 Smart Home Privacy Calculator

Explore how IoT devices in a typical smart home generate data and create privacy exposure. Adjust the device counts below to see the impact.

Scenario: A household installs a smart home hub connecting 14 IoT devices. Calculate the privacy exposure and data generation rate.

Connected Devices:

1. Smart thermostat (temp every 15 min)
2. Smart doorbell (motion + video on trigger)
3. 3x Smart lights (on/off state every 5 min)
4. Smart TV (viewing data every 30 sec when on)
5. 2x Smart speakers (voice commands + ambient audio)
6. Smart refrigerator (door openings, inventory scan)
7. Fitness tracker (heart rate, steps, sleep every 1 min)
8. Smart lock (unlock events)
9. 2x Security cameras (motion alerts, 10-sec clips)
10. Robot vacuum (floor map, cleaning schedule)

Daily Data Generation:

Calculation:
  Thermostat: 96 readings/day x 20 bytes = 1.9 KB/day
  Doorbell: 10 motion events x 500 KB video = 4.9 MB/day
  Lights: 3 x 288 state changes x 10 bytes = 8.4 KB/day
  Smart TV: 4 hrs x 120 readings/hr x 50 bytes = 23.4 KB/day
  Speakers: 20 commands/day x 2 devices x 100 KB audio = 3.9 MB/day
  Fridge: 15 door opens x 1 KB = 15 KB/day
  Fitness tracker: 1,440 readings/day x 50 bytes = 70.3 KB/day
  Smart lock: 8 events/day x 500 bytes = 3.9 KB/day
  Cameras: 2 x 20 clips/day x 1 MB = 40 MB/day
  Vacuum: 1 map/day x 500 KB = 500 KB/day

Total: ~50 MB/day = 1.5 GB/month = 18 GB/year
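The arithmetic above can be reproduced with a short script; the event counts and per-event sizes are the same rough estimates used in the calculation:

```python
# (device count, events/day, bytes/event) per device class, from the scenario
devices = {
    "thermostat":      (1,   96,        20),
    "doorbell":        (1,   10,   500_000),
    "lights":          (3,  288,        10),
    "smart_tv":        (1,  480,        50),   # 4 h on x 120 readings/h
    "speakers":        (2,   20,   100_000),
    "fridge":          (1,   15,     1_000),
    "fitness_tracker": (1, 1440,        50),
    "smart_lock":      (1,    8,       500),
    "cameras":         (2,   20, 1_000_000),
    "vacuum":          (1,    1,   500_000),
}

daily_bytes = sum(n * events * size for n, events, size in devices.values())
print(f"{daily_bytes / 1e6:.1f} MB/day")        # ~50 MB/day
print(f"{daily_bytes * 365 / 1e9:.1f} GB/year") # ~18 GB/year
```

Note how two device classes (cameras and the doorbell) dominate the byte count, while the fitness tracker dominates the number of data points.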

Privacy Insights Derivable:

1. Daily Routine (from aggregate patterns)

6:30 AM: Fitness tracker detects wake-up (heart rate change)
6:45 AM: Smart coffee maker starts (kitchen light on)
7:15 AM: Front door unlocks (leaving for work)
12:30 PM: Thermostat adjusts (home for lunch? or just weather?)
6:00 PM: Front door unlocks (return from work)
7:30 PM: Smart TV on (dinner + entertainment)
11:00 PM: Bedroom lights off (bedtime)

2. Health Conditions (from fitness tracker + other devices)

Indicators:
  - Heart rate elevated at night -> sleep disorder or anxiety
  - No movement detected by vacuum/cameras for 48 hours -> illness or injury
  - Fridge opened 30 times/day (usual: 10) -> stress eating or dietary change
  - Smart scale shows rapid weight loss -> potential health issue

Insurance Impact:
  - Health insurance: Pre-existing conditions detected before disclosure
  - Life insurance: Risky behaviors identified (irregular sleep, poor diet)

3. Socioeconomic Status (from usage patterns)

Indicators:
  - Thermostat always at 16 C -> trying to save money (low income)
  - TV watching 8+ hours/day -> unemployed or retired
  - Smart lock unused during work hours -> work-from-home (professional job)
  - Robot vacuum runs daily -> values cleanliness or has allergies

Marketing Impact:
  - Luxury ads shown to high-usage smart home users
  - Budget ads shown to energy-saving households

Privacy Risk Score:

Data Sensitivity:
  High: Voice recordings, video footage, health data (fitness tracker)
  Medium: Location inference (door/thermostat/lights)
  Low: Device status (lights on/off)

Aggregation Risk: VERY HIGH
  - Individual data points innocuous
  - Combined patterns reveal intimate life details
  - De-identification nearly impossible (unique household patterns)

Annual Privacy Exposure:
  Data points generated: ~1.1 million/year
  Behavioral insights: 47 inferences about occupants
  Re-identification risk: 99.7% (unique household fingerprint)

Mitigation Strategies:

| Strategy | Privacy Gain | Functionality Impact | Recommended? |
|---|---|---|---|
| Edge processing (all inference local) | Very high (no cloud data) | None | YES |
| Aggregate to hourly | Medium (masks fine-grain patterns) | Low (insights still useful) | YES |
| Delete after 7 days | High (limits temporal exposure) | Medium (no long-term trends) | MAYBE |
| Disable cameras | High (eliminates video surveillance) | High (security reduced) | NO |
| Use separate hubs per room | Medium (compartmentalizes data) | Medium (complexity increases) | MAYBE |

Recommended Setup:

  1. Process all data at edge (hub, not cloud)
  2. Store only aggregated hourly summaries
  3. Delete raw data after 48 hours
  4. Encrypt all data with user-controlled keys
  5. Disable cameras unless explicitly activated
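Steps 1-3 of this setup might look like the following sketch. The reading format, sensor names, and helper functions are illustrative assumptions, not a specific hub's API:

```python
from collections import defaultdict
from datetime import datetime, timedelta

RAW_RETENTION = timedelta(hours=48)

def hourly_summaries(readings):
    """Aggregate raw (timestamp, sensor, value) readings into hourly
    averages on the hub -- only these summaries would ever be stored."""
    buckets = defaultdict(list)
    for ts, sensor, value in readings:
        hour = ts.replace(minute=0, second=0, microsecond=0)
        buckets[(hour, sensor)].append(value)
    return {key: sum(vals) / len(vals) for key, vals in buckets.items()}

def purge_raw(readings, now):
    """Drop raw readings older than the 48-hour retention window."""
    return [r for r in readings if now - r[0] <= RAW_RETENTION]

now = datetime(2024, 1, 3, 12, 0)
raw = [
    (datetime(2024, 1, 3, 11, 5),  "temp_c", 20.0),
    (datetime(2024, 1, 3, 11, 35), "temp_c", 22.0),
    (datetime(2024, 1, 1, 9, 0),   "temp_c", 19.0),  # older than 48 h
]
print(hourly_summaries(raw))     # fine-grained timestamps are gone
print(len(purge_raw(raw, now)))  # 2: the stale reading is deleted
```

The design choice here is that aggregation and deletion both happen on the hub, so the cloud never sees raw readings at all.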

Result:

  • Data sent to cloud: 0 MB/day (vs 50 MB)
  • Privacy exposure: 85% reduction
  • Functionality preserved: 95%
  • User control: 100%

Before building any IoT data collection feature, ask these 5 questions:

| Question | If Answer is NO | Action |
|---|---|---|
| 1. Is there a clear, specific purpose? | Purpose is vague ("improve service") | STOP: Define specific purpose or don't collect |
| 2. Can we achieve the goal with less data? | Collecting "just in case" | REDUCE: Collect minimum necessary |
| 3. Are users genuinely benefiting? | Benefit accrues only to company | REDESIGN: Align user and business value |
| 4. Would I want this in my own home? | Feels creepy or invasive | RETHINK: If you wouldn't use it, users won't trust it |
| 5. Can we explain this in one sentence? | Requires paragraph of justification | SIMPLIFY: If you can't explain it simply, it's too complex |

Example: “Always-on camera for ‘better security’”

1. Clear purpose? YES ("Detect intruders")
2. Less data possible? YES (motion sensor instead of video)
3. User benefit? YES (security)
4. Want in own home? NO (privacy invasion)
5. Explain simply? NO ("Records everything but only alerts on motion
   but employees may review for quality assurance...")

Verdict: REDESIGN using motion sensor + triggered camera (not always-on)

Common Mistake: “Anonymous” IoT Data That Isn’t Actually Anonymous

Myth: “We remove usernames and email addresses, so the data is anonymous and GDPR doesn’t apply.”

Reality: IoT data is almost impossible to truly anonymize due to unique behavioral patterns.

Example - Fitness Tracker “Anonymized” Data:

Published Dataset:
  - User ID: ABC123 (randomized, not linked to name)
  - Daily steps, heart rate, sleep patterns
  - GPS coordinates removed
  - Timestamps preserved

Re-identification Attack:
  Step 1: Find unique pattern (Person walks exactly 10,000 steps
          every day at 7 AM)
  Step 2: Cross-reference with Strava public activities
          (same 10k morning walks)
  Step 3: Strava profile reveals: John Smith, lives in Seattle
  Step 4: ABC123 is John Smith (re-identified from "anonymous" data)

Success Rate: 87% of users re-identified in academic studies

Why IoT Data is Uniquely Identifiable:

  1. Behavioral fingerprints: Your daily routines are unique (like a signature)
  2. Temporal patterns: When you do things reveals who you are
  3. Spatial patterns: Where your devices are located narrows identity
  4. Device combinations: The SET of devices you own is unique

Real Incidents:

  • 2018: Strava heatmap revealed secret military bases (from soldiers’ fitness trackers)
  • 2014: NYC taxi data “anonymized” but reporters identified celebrities’ rides
  • 2017: Roomba maps showed apartment layouts, raising data-sharing concerns

Lesson: For IoT data, assume re-identification is possible. Use differential privacy or don’t release data at all.
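Differential privacy, as recommended here, works by adding calibrated noise before any release. A minimal sketch of the Laplace mechanism applied to a count query; the epsilon value and the query itself are illustrative:

```python
import random

def dp_count(true_count: int, epsilon: float = 1.0, sensitivity: int = 1) -> float:
    """Laplace mechanism: release the count plus Laplace(sensitivity/epsilon)
    noise, so no single household's presence or absence can be confidently
    inferred from the published number."""
    scale = sensitivity / epsilon
    # Laplace(0, scale) noise sampled as the difference of two exponentials
    noise = scale * (random.expovariate(1.0) - random.expovariate(1.0))
    return true_count + noise

# e.g. "how many households ran their robot vacuum today?"
print(dp_count(412))  # true value plus noise; smaller epsilon -> more noise
```

Each query spends privacy budget (epsilon), so production systems must also track cumulative epsilon across releases; that bookkeeping is omitted here.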

Test: If you can link the “anonymous” data to ANY external dataset and recover real identities, it’s NOT anonymous.

Re-identification probability quantifies the likelihood that “anonymized” records can be linked back to real individuals.

Uniqueness Metric: \[P(\text{unique}) = \frac{\text{Records with unique quasi-identifier combination}}{\text{Total records}}\]
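This metric can be computed directly from a dataset. A short sketch with hypothetical quasi-identifier columns:

```python
from collections import Counter

def p_unique(records, quasi_ids):
    """Fraction of records whose quasi-identifier combination appears
    exactly once in the dataset: P(unique) from the formula above."""
    combos = Counter(tuple(r[q] for q in quasi_ids) for r in records)
    unique = sum(1 for r in records
                 if combos[tuple(r[q] for q in quasi_ids)] == 1)
    return unique / len(records)

records = [
    {"zip3": "941", "age": "35-39", "hh": 2},
    {"zip3": "941", "age": "35-39", "hh": 2},
    {"zip3": "941", "age": "40-44", "hh": 1},
    {"zip3": "100", "age": "35-39", "hh": 3},
]
print(p_unique(records, ["zip3", "age", "hh"]))  # 0.5: two of four records are unique
```

Adding more quasi-identifier columns can only raise this fraction, which is the mechanism behind the re-identification figures discussed in this section.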

Re-identification result (Sweeney, 2000): In the US, 87% of the population can be uniquely identified by the combination of:

  • 5-digit ZIP code
  • Birth date (month, day, year)
  • Gender

Working through an example:

Given: “Anonymized” smart home dataset with 10,000 users

Quasi-identifiers included:

  • 3-digit ZIP prefix (941**)
  • Age bracket (35-39)
  • Household size (2 people)
  • Average nightly kWh usage

Step 1: Estimate population per equivalence class

US Census data for ZIP 941**:

  • Population: ~340,000
  • Age 35-39: 6.8% -> 23,120 people
  • 2-person households: 34% -> 7,861 households

Step 2: Calculate uniqueness from energy signature

Nighttime energy patterns (11 PM - 6 AM) create unique fingerprints:

  • Device-specific power draw curves (EV charging, medical equipment, etc.)
  • Study (Enev et al., 2011): 90% household identification from 1-week smart meter data

Step 3: Combine quasi-identifiers

\[P(\text{unique | ZIP, Age, HH-size, Energy}) \approx 0.9\]

Even though each individual quasi-identifier has low uniqueness:

  • 3-digit ZIP: \(\frac{1}{7,861} = 0.013\%\) unique
  • Combined with energy signature: \(90\%\) unique

Step 4: Re-identification attack simulation

Cross-reference with public voter registration (has exact address):

  • 7,861 households in equivalence class
  • Match energy pattern to known resident at address
  • Success rate: \(\frac{9,000}{10,000} = 90\%\)

Result: “Anonymized” smart home data has 90% re-identification risk due to unique energy consumption fingerprints, making GDPR compliance impossible without differential privacy or extreme aggregation (k>=5,000).

In practice: IoT devices generate behavioral fingerprints (usage patterns, temporal signatures, device combinations) that are as unique as DNA. Traditional anonymization (removing names/IDs) fails because behavior itself is identifying. This is why GDPR considers pseudonymized data still “personal data” – the risk of re-identification remains high.

4.9 Summary

This chapter introduced the fundamental concepts of privacy in IoT systems and distinguished privacy from security:

Key Concepts Covered:

  1. Privacy vs Security Distinction: Security protects against unauthorized access (hackers), while privacy protects against authorized misuse (companies, employees, data brokers). Both are necessary for complete protection.

  2. Six Privacy Domains: The IoT privacy curriculum covers Fundamentals (what privacy means), Principles (ethical guidelines), Regulations (legal requirements), Threats (privacy risks), Techniques (technical solutions), and Compliance (implementation).

  3. IoT-Specific Challenges: Always-on sensors, passive data collection, device interconnection, and cloud processing create unique privacy risks not present in traditional computing.

  4. Data Minimization: Collect only what’s necessary, retain only as long as needed, and delete when no longer required. This is the single most effective privacy technique.

  5. Aggregation Risk: Combining multiple innocuous data points reveals sensitive patterns. Even anonymous data can be de-anonymized through aggregation.

  6. Edge Processing: Processing data locally on devices and transmitting only anonymized results dramatically reduces privacy attack surface.

Practical Takeaway: Privacy must be designed into IoT systems from the start. Retrofitting privacy controls is significantly more expensive than building them in initially. Use the six-chapter framework to systematically address all aspects of IoT privacy.


4.10 Videos

Explores the ethical considerations surrounding IoT data collection, user consent, and the balance between innovation and privacy protection.

Learn about privacy-enhancing technologies and tools that help users understand and control how their IoT devices collect and share personal information.

4.11 Knowledge Check

Test your understanding of IoT privacy concepts with these questions.

What is the fundamental difference between privacy and security in IoT systems?

  A) Privacy protects against hackers; security protects against authorized users
  B) Security protects against unauthorized access; privacy protects against authorized misuse
  C) Privacy and security are the same thing
  D) Security is for networks; privacy is for devices

B) Security protects against unauthorized access; privacy protects against authorized misuse

Security controls WHO can access your data (keeping hackers out). Privacy controls WHAT data is collected and how authorized users can use it. A fitness tracker can have perfect security (encrypted data, strong authentication) but zero privacy if the company sells your health data to insurers.

Key insight: “I have nothing to hide” only addresses security, not privacy. You may not mind your data being secure, but you should care about how it’s used.

A smart thermostat company wants to improve their product. Which approach follows the data minimization principle?

  A) Collect all sensor data 24/7 and store indefinitely for future analysis
  B) Record audio to detect when users say “I’m cold” or “I’m hot”
  C) Collect only temperature readings needed for the heating algorithm, delete after 30 days
  D) Share anonymized data with third parties for market research

C) Collect only temperature readings needed for the heating algorithm, delete after 30 days

Data minimization requires:

  • Collecting ONLY what’s necessary for the stated purpose
  • Retaining data ONLY as long as needed
  • Deleting data when no longer required

Options A, B, and D all collect more data than necessary or use data beyond the original purpose.

Why is the aggregation attack particularly dangerous for IoT privacy?

  A) It requires physical access to devices
  B) Combining innocuous data points reveals sensitive patterns
  C) It only works on unencrypted data
  D) It targets only industrial IoT systems

B) Combining innocuous data points reveals sensitive patterns

Aggregation attacks combine multiple seemingly harmless data points to reveal sensitive information:

  • Smart meter + motion sensors = When you’re home and away
  • Thermostat + sleep tracker = Bedroom activity patterns
  • Voice assistant + calendar = Confidential meeting topics

Even “anonymous” data can be de-anonymized when combined with other sources. This is why differential privacy and edge processing are crucial techniques.

What is the primary privacy benefit of edge processing in IoT systems?

  A) It makes devices faster
  B) Sensitive data is processed locally and only anonymized results are transmitted
  C) It reduces manufacturing costs
  D) It eliminates the need for encryption

B) Sensitive data is processed locally and only anonymized results are transmitted

Edge processing keeps raw sensitive data on the device. For example:

  • A smart camera detects “person present” locally, sends only this boolean, not video
  • A voice assistant processes wake words locally, not streaming all audio to the cloud
  • A health monitor calculates heart rate locally, sends only the number, not raw ECG

This dramatically reduces the privacy attack surface because sensitive raw data never leaves the device.

Under GDPR, valid consent for IoT data collection must be:

  A) Implied by using the device
  B) Buried in terms and conditions
  C) Freely given, specific, informed, and unambiguous
  D) Obtained once and valid forever

C) Freely given, specific, informed, and unambiguous

GDPR requires consent to be:

  • Freely given: Users can refuse without penalty
  • Specific: Each purpose requires separate consent
  • Informed: Users understand what they’re agreeing to
  • Unambiguous: Clear affirmative action (no pre-checked boxes)
  • Withdrawable: Users can revoke consent at any time

Pre-checked consent boxes, bundled consent, or “use = consent” approaches are NOT valid under GDPR.


Common Pitfalls

Implementing strong encryption while ignoring what data is collected and retained is a common design failure. Security prevents unauthorized access; privacy requires questioning whether data should exist at all. Both are needed, and security does not automatically provide privacy.

Implementing minimal regulatory requirements without building genuinely privacy-respecting systems creates legal risk and erodes user trust. Privacy regulations set a floor, not a ceiling. Design systems users would be comfortable with if data practices were fully transparent.

IoT devices can collect enormous sensor data, and teams often retain everything without a specific purpose. This violates data minimization principles and creates liability for data breaches. Collect only data with a specific, documented, lawful purpose.

IoT users rarely understand what data their devices collect. Don’t assume informed consent from a single terms-of-service agreement at device setup. Provide clear, ongoing notifications and give users meaningful control.
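Per-purpose, withdrawable consent might be modeled as follows. This is a sketch; the field names and helper function are illustrative, not from any consent-management product:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List, Optional

@dataclass
class ConsentRecord:
    """One record per user per purpose: consent must be specific
    (no bundling) and withdrawable at any time."""
    user_id: str
    purpose: str  # e.g. "heating_optimization", never a vague "improve service"
    granted_at: datetime
    withdrawn_at: Optional[datetime] = None

    @property
    def active(self) -> bool:
        return self.withdrawn_at is None

def may_process(consents: List[ConsentRecord], user_id: str, purpose: str) -> bool:
    """Process data only under an active, purpose-specific consent --
    never inferred from device use or a bundled terms-of-service click."""
    return any(c.user_id == user_id and c.purpose == purpose and c.active
               for c in consents)

consents = [ConsentRecord("u1", "heating_optimization", datetime(2024, 5, 1))]
print(may_process(consents, "u1", "heating_optimization"))  # True
print(may_process(consents, "u1", "ad_targeting"))          # False: needs its own consent
```

Gating every processing path through a check like `may_process` is one way to make "withdrawable at any time" real: setting `withdrawn_at` immediately stops further use.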

4.12 What’s Next

| If you want to… | Read this |
|---|---|
| Learn the foundational privacy concepts | Privacy Fundamentals |
| Understand ethical privacy principles | Privacy Principles and Ethics |
| Navigate GDPR and CCPA compliance | Privacy Regulations |
| Apply privacy-by-design architecture | Privacy by Design Schemes |
| Understand security foundations alongside privacy | Security and Privacy Overview |

4.13 Resources

4.13.1 Regulations

4.13.2 Tools

  • Privacy Policy Generators: Termly, PrivacyPolicies.com
  • Consent Management: OneTrust, Cookiebot
  • Data Mapping: BigID, OneTrust
  • Differential Privacy: Google DP Library, OpenDP

4.13.3 Standards

  • ISO/IEC 27701: Privacy Information Management
  • ISO/IEC 29100: Privacy framework
  • IEEE P7002: Data privacy process