1420 Privacy by Design: Implementation Examples
1420.1 Learning Objectives
By the end of this chapter, you should be able to:
- Implement GDPR-compliant consent management systems with layered architecture
- Design pseudonymization strategies that balance privacy with operational utility
- Apply data minimization techniques to reduce collection by 99%+
- Configure privacy-by-default settings for smart home and IoT devices
- Build tier-aware consent flows for healthcare IoT systems
Related chapters:
- Privacy Foundations - Review Privacy by Design: Foundations for the seven foundational principles
- Design Patterns - Review Privacy Design Patterns and Data Tiers for the Three-Tier model
- Anti-Patterns - Review Privacy Anti-Patterns and Assessment for what to avoid
- Encryption - Pair with Encryption Principles and Crypto Basics for implementing encryption techniques
In one sentence: Privacy by Design implementation requires concrete techniques - these worked examples demonstrate GDPR-compliant consent, pseudonymization, data minimization, and tier-aware systems in real-world IoT scenarios.
Remember this rule: Always start with “Do we need this data?” before asking “How do we protect this data?”
1420.2 Prerequisites
Before diving into this chapter, you should be familiar with:
- Privacy by Design: Foundations: Understanding of the seven foundational principles
- Privacy Design Patterns and Data Tiers: Knowledge of privacy patterns and the Three-Tier model
- Privacy Anti-Patterns and Assessment: Understanding of PIAs and what to avoid
1420.3 Worked Example: GDPR-Compliant Consent Flow
Scenario: A smart home company is launching a voice assistant in the EU market. Under GDPR Article 7, consent must be freely given, specific, informed, and unambiguous. Design a consent management system that complies with GDPR while providing a smooth user experience during device setup.
Given:
- Device: Smart speaker with always-on microphone
- Data processed: Voice commands, wake word detection, speech-to-text, command history
- Third-party integrations: Music streaming, smart home control, shopping, calendar
- Regulatory requirements: GDPR Articles 6 (lawful basis), 7 (consent conditions), 9 (special categories)
- User demographics: Non-technical consumers, including elderly users
1420.3.1 Step 1: Design Layered Consent Architecture
GDPR Article 7(2) requires consent to be “distinguishable from other matters”:
| Consent Layer | Purpose | Legal Basis | Default | Required for Device Function? |
|---|---|---|---|---|
| Layer 1: Essential | Wake word detection, local command processing | Contract (Art. 6(1)(b)) | ON | Yes (device won’t work without) |
| Layer 2: Core Features | Cloud speech-to-text, command execution | Contract | ON | Yes (primary use case) |
| Layer 3: Personalization | Voice recognition, preference learning | Legitimate Interest (Art. 6(1)(f)) | OFF | No (device works without) |
| Layer 4: History | Command history storage for review | Consent (Art. 6(1)(a)) | OFF | No |
| Layer 5: Improvement | Anonymous voice samples for ML training | Consent | OFF | No |
| Layer 6: Third-party | Sharing with music, shopping, calendar | Consent (per integration) | OFF | No |
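This layer table maps naturally onto a small data structure that the consent manager in Step 3 can consult. A minimal sketch, assuming a hypothetical `CONSENT_LAYERS` constant and the legal-basis strings used later in this chapter:

```python
# Hypothetical encoding of the consent-layer table above.
# "contract" layers are notice-only and always on; the rest are user-controlled.
CONSENT_LAYERS = {
    "essential":       {"legal_basis": "contract",            "default": True},
    "core_features":   {"legal_basis": "contract",            "default": True},
    "personalization": {"legal_basis": "legitimate_interest", "default": False},
    "history":         {"legal_basis": "consent",             "default": False},
    "improvement":     {"legal_basis": "consent",             "default": False},
    "third_party":     {"legal_basis": "consent",             "default": False},
}

def user_may_toggle(layer: str) -> bool:
    """Contract-basis layers cannot be switched off without losing the device."""
    return CONSENT_LAYERS[layer]["legal_basis"] != "contract"
```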
1420.3.2 Step 2: Implement Progressive Consent Flow
```
SETUP FLOW - GDPR COMPLIANT CONSENT
+------------------------------------------------------------------+
| STEP 1: Welcome & Privacy Overview (Art. 13 - Information) |
+------------------------------------------------------------------+
| "Hi! Before we start, let's talk about your privacy. |
| |
| This device listens for 'Hey Assistant' and processes commands. |
| You control what happens with your voice data. |
| |
| [Continue] - Takes 2 minutes |
| [Read full privacy policy] - Opens detailed document |
+------------------------------------------------------------------+
+------------------------------------------------------------------+
| STEP 2: Essential Processing (Contract - notice only) |
+------------------------------------------------------------------+
| "For the device to work, we need to: |
| - Listen for 'Hey Assistant' (on-device, not sent to cloud) |
| - Send your command to our servers for processing |
| - Return the response to your device |
| |
| Commands are processed but NOT stored by default. |
| |
| [I Understand - Continue] |
+------------------------------------------------------------------+
+------------------------------------------------------------------+
| STEP 3: Optional Features (Consent required - Art. 6(1)(a)) |
+------------------------------------------------------------------+
| "These features are OPTIONAL. The device works without them." |
| |
| [ ] Save my command history (I can review and delete anytime) |
| What this means: Commands stored for 30 days. |
| |
| [ ] Learn my voice (personalized responses) |
| What this means: Voice profile stored until you delete it. |
| |
| [ ] Help improve voice recognition (anonymous samples) |
| What this means: Random 1% of commands help train models. |
| |
| [Skip All - Continue with defaults (most private)] |
| [Save My Choices] |
+------------------------------------------------------------------+
```
1420.3.3 Step 3: Implement Consent Enforcement
```python
from datetime import datetime, timezone


class GDPRViolation(Exception):
    """Raised when an operation would violate GDPR consent requirements."""


class GDPRConsentManager:
    """GDPR Article 7 compliant consent management."""

    def __init__(self, user_id):
        self.user_id = user_id
        self.consent_record = self._load_consent()

    def _load_consent(self):
        """Load consent from secure storage (defaults shown here)."""
        return {
            "essential": True,               # Not consent, but acknowledgment
            "history": False,                # Default: OFF
            "voice_learning": False,
            "ml_improvement": False,
            "third_party": {},               # Per-integration consent
            "consent_version": "2.1",
            "consent_timestamp": None,
            "consent_method": "setup_wizard",
            "withdrawal_available": True,
        }

    def check_consent(self, consent_type):
        """Current consent state for a purpose; unknown purposes are False."""
        return self.consent_record.get(consent_type) is True

    def grant_consent(self, consent_type, explicit_action=True):
        """Grant consent with GDPR Article 7 requirements."""
        if not explicit_action:
            raise GDPRViolation("Consent must be unambiguous (Art. 7)")
        self.consent_record[consent_type] = True
        self.consent_record["consent_timestamp"] = datetime.now(timezone.utc)
        self._log_consent_event("GRANT", consent_type)
        self._save_consent()

    def withdraw_consent(self, consent_type):
        """Withdraw consent - must be as easy as granting (Art. 7(3))."""
        self.consent_record[consent_type] = False
        self._stop_processing(consent_type)
        self._delete_consented_data(consent_type)
        self._log_consent_event("WITHDRAW", consent_type)
        self._save_consent()

    def can_process(self, data_type, purpose):
        """GDPR-compliant processing check before any data processing."""
        consent_mapping = {
            "voice_command": ("essential", "contract"),
            "command_history": ("history", "consent"),
            "voice_profile": ("voice_learning", "consent"),
            "ml_training": ("ml_improvement", "consent"),
        }
        consent_key, legal_basis = consent_mapping.get(data_type, (None, None))
        if legal_basis == "contract":
            return True  # Essential processing, no consent needed
        if legal_basis == "consent":
            if not self.check_consent(consent_key):
                self._log_blocked_processing(data_type, "NO_CONSENT")
                return False
            return True
        return False  # Unknown data type = block by default

    # Storage, logging, and enforcement hooks (backend-specific stubs):
    def _save_consent(self): ...
    def _log_consent_event(self, action, consent_type): ...
    def _log_blocked_processing(self, data_type, reason): ...
    def _stop_processing(self, consent_type): ...
    def _delete_consented_data(self, consent_type): ...
```
1420.3.4 Step 4: Calculate Compliance Metrics
| GDPR Requirement | Implementation | Compliance Status |
|---|---|---|
| Freely given (Art. 7(4)) | Device works without optional consent | PASS |
| Specific (Art. 7(2)) | Separate consent per purpose | PASS |
| Informed (Art. 7) | Plain language + “Learn More” links | PASS |
| Unambiguous (Art. 7) | Explicit opt-in, no pre-checked boxes | PASS |
| Withdrawable (Art. 7(3)) | One-click disable, same UI as enable | PASS |
| Documented (Art. 7(1)) | Timestamp + method + version logged | PASS |
Result: Users can complete setup in 2 minutes with all optional features disabled (maximum privacy), or take 5 minutes to enable specific features with informed consent.
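A brief usage sketch of the consent manager from Step 3. The storage and logging hooks are stubs, so this illustrates the API contract rather than a production flow:

```python
mgr = GDPRConsentManager(user_id="user-001")

# During the setup wizard: user ticks "Save my command history"
mgr.grant_consent("history", explicit_action=True)
assert mgr.can_process("command_history", purpose="user_review")

# Later, from settings: one action withdraws consent and deletes the data
mgr.withdraw_consent("history")
assert not mgr.can_process("command_history", purpose="user_review")

# Essential processing needs no consent record (contract basis)
assert mgr.can_process("voice_command", purpose="command_execution")

# Pre-checked boxes are a GDPR violation -- the API refuses them
try:
    mgr.grant_consent("ml_improvement", explicit_action=False)
except GDPRViolation as err:
    print(err)  # Consent must be unambiguous (Art. 7)
```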
1420.4 Worked Example: Pseudonymization for Fleet Tracking
Scenario: A logistics company deploys GPS trackers on 5,000 delivery vehicles across Europe. Drivers are concerned about being personally tracked, but the company needs location data for route optimization, delivery ETAs, and theft recovery. Design a GDPR-compliant pseudonymization strategy.
Given:
- Fleet size: 5,000 vehicles, 8,000 drivers (rotating shifts)
- Data collected: GPS coordinates (every 30 seconds), speed, route, delivery stops
- Data retention: 90 days for operational analytics, 7 years for financial audit
- Privacy concerns: Drivers don’t want personal movement tracked; union has raised objections
- Legal context: GDPR (vehicles cross FR, DE, NL, BE), local labor laws
- Business needs: Route optimization, fuel management, customer ETAs, incident investigation
1420.4.1 Step 1: Apply GDPR Article 4(5) Pseudonymization
“Pseudonymization means the processing of personal data in such a manner that the personal data can no longer be attributed to a specific data subject without the use of additional information.”
| Data Element | Current State | Pseudonymization Method | Re-identification Risk |
|---|---|---|---|
| Driver ID | “Pierre Martin, ID#12345” | HMAC-SHA256 with rotating salt -> “drv_a3f5d8e2” | Low (requires salt access) |
| Vehicle ID | “License: AB-123-CD” | Static hash -> “veh_7b9c1e4f” | Low (one-way hash) |
| GPS coordinates | Precise lat/long | Round to 100m grid, exclude home/union locations | Medium (route pattern analysis) |
| Timestamps | Exact second | Round to 5-minute intervals | Low |
| Speed data | Exact km/h | Categorize: “normal/speeding/stopped” | Low |
1420.4.2 Step 2: Implement Two-Tier Pseudonymization System
```python
import hashlib
import hmac


def hmac_sha256(message: str, salt: bytes) -> str:
    """Keyed hash: reproducible only by whoever holds the same salt."""
    return hmac.new(salt, message.encode(), hashlib.sha256).hexdigest()


class FleetPseudonymizer:
    """GDPR Article 4(5) compliant pseudonymization for fleet data."""

    def __init__(self):
        # Tier 1: Operational pseudonyms (for day-to-day use)
        self.operational_salt = self._get_daily_salt()   # Rotates daily
        # Tier 2: Audit pseudonyms (for long-term storage)
        self.audit_salt = self._get_secure_salt()        # Stored in HSM
        # Geofence exclusion zones (never record precise location)
        self.exclusion_zones = self._load_exclusion_zones()

    def pseudonymize_driver(self, driver_id, tier="operational"):
        """Pseudonymize driver ID with the appropriate salt."""
        if tier == "operational":
            # Reversible by operations team for incident investigation
            pseudo_id = hmac_sha256(driver_id, self.operational_salt)
            return f"drv_{pseudo_id[:16]}"
        elif tier == "audit":
            # Only reversible with court order + HSM access
            pseudo_id = hmac_sha256(driver_id, self.audit_salt)
            return f"aud_{pseudo_id[:16]}"
        raise ValueError(f"Unknown tier: {tier}")

    def pseudonymize_location(self, lat, lon, driver_pseudo_id):
        """Apply location privacy measures."""
        # Step 1: Check exclusion zones (home, union hall, medical)
        for zone in self.exclusion_zones:
            if self._in_zone(lat, lon, zone):
                return {
                    "lat": None, "lon": None,
                    "zone_type": zone["type"],
                    "suppressed": True,
                }
        # Step 2: Grid snapping (100m precision)
        lat_rounded = round(lat, 3)   # 0.001 deg latitude ~ 111m
        lon_rounded = round(lon, 3)
        # Step 3: Population density check
        if self._low_density_area(lat_rounded, lon_rounded):
            lat_rounded = round(lat, 2)   # Reduce to ~1km precision
            lon_rounded = round(lon, 2)
        return {"lat": lat_rounded, "lon": lon_rounded, "suppressed": False}

    # Salt management and geodata lookups (deployment-specific stubs):
    def _get_daily_salt(self): ...
    def _get_secure_salt(self): ...
    def _load_exclusion_zones(self): ...
    def _in_zone(self, lat, lon, zone): ...
    def _low_density_area(self, lat, lon): ...
```
1420.4.3 Step 3: Implement Re-identification Controls
| Access Level | Can Access | Cannot Access | Use Case |
|---|---|---|---|
| Dispatch Operators | Pseudonymous routes, delivery ETAs | Driver identity, home locations | Day-to-day operations |
| Fleet Managers | Operational pseudonyms + daily salt | Audit pseudonyms, HSM salt | Incident investigation (same-day) |
| HR/Legal | Audit pseudonyms + HSM (with approval) | Raw data (never stored) | Disciplinary, legal proceedings |
| Analytics Team | Aggregated/anonymized data only | Any pseudonyms or salts | Route optimization, ML training |
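The access matrix can be enforced technically rather than by policy alone. A minimal sketch, assuming hypothetical role names and the two pseudonym tiers from Step 2:

```python
# Hypothetical mapping of roles to the pseudonym tiers they may resolve.
# An empty "reverse_tiers" set means the role can never reverse a pseudonym.
ROLE_ACCESS = {
    "dispatch":  {"view_tiers": {"operational"}, "reverse_tiers": set()},
    "fleet_mgr": {"view_tiers": {"operational"}, "reverse_tiers": {"operational"}},
    "hr_legal":  {"view_tiers": {"audit"},       "reverse_tiers": {"audit"}},
    "analytics": {"view_tiers": set(),           "reverse_tiers": set()},  # aggregates only
}

def may_reverse(role: str, tier: str, approval: bool = False) -> bool:
    """Check whether a role may re-identify a pseudonym of the given tier."""
    allowed = tier in ROLE_ACCESS.get(role, {}).get("reverse_tiers", set())
    if tier == "audit":
        # Audit-tier reversal additionally requires documented approval
        # (court order / legal sign-off in the scenario above).
        return allowed and approval
    return allowed

assert may_reverse("fleet_mgr", "operational")
assert not may_reverse("hr_legal", "audit")             # no approval recorded
assert may_reverse("hr_legal", "audit", approval=True)
assert not may_reverse("analytics", "operational")
```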
1420.4.4 Step 4: Calculate Privacy-Utility Tradeoff
| Metric | Before Pseudonymization | After Pseudonymization | Impact |
|---|---|---|---|
| Driver re-identification risk | 100% (direct ID) | <0.1% (requires salt + HSM) | 99.9% reduction |
| Route optimization accuracy | 100% | 98.5% (100m grid acceptable) | Minimal impact |
| Delivery ETA accuracy | 100% | 97% (5-min jitter acceptable) | Acceptable |
| Theft recovery capability | 100% | 100% (operational tier reversible) | No impact |
| GDPR compliance status | Non-compliant | Compliant (Art. 32 appropriate measures) | Critical improvement |
Result: The pseudonymization strategy protects driver privacy while maintaining 97%+ business functionality. The union approved the implementation after reviewing exclusion zones and re-identification controls.
1420.5 Worked Example: Data Minimization for Health Wearable
Scenario: You are designing a fitness wearable that tracks heart rate, steps, sleep, and GPS location. The product manager wants to collect raw sensor data at maximum resolution and store it indefinitely in the cloud for “future AI features.” Apply Privacy by Design principles to minimize data collection while preserving core functionality.
Given:
- Heart rate sensor: 1 Hz sampling (1 reading/second)
- Accelerometer: 50 Hz sampling (for step detection)
- GPS: 1 Hz when exercising
- User expectations: Daily summaries, workout history, sleep quality scores
- Regulatory context: GDPR (EU users), CCPA (California users)
- Storage: Cloud database with 7-year retention policy
1420.5.1 Step 1: Apply the Privacy Hierarchy (Eliminate First)
| Data Point | Current Collection | Can Eliminate? | Reasoning |
|---|---|---|---|
| Raw accelerometer (50 Hz) | Yes | YES | Only step count needed, not raw motion |
| Continuous heart rate | Yes | YES | Only exercise HR and resting HR averages needed |
| Precise GPS coordinates | Yes | PARTIAL | Route shape needed, not exact addresses |
| Sleep raw data | Yes | YES | Only sleep stages and duration needed |
1420.5.2 Step 2: Apply Data Minimization
Heart Rate:
- BEFORE: 86,400 readings/day (1 Hz x 24 hours) = 691 KB/day raw
- AFTER: Calculate on-device (see the sketch after this list):
- Resting HR average (1 value/day)
- Exercise HR zones (5 values per workout)
- HR variability score (1 value/day)
- Data sent to cloud: ~20 values/day = 400 bytes/day
- Reduction: 99.94%
GPS Location:
- BEFORE: 3,600 coordinates/hour of exercise = exact route with home address visible
- AFTER: Apply privacy techniques on-device:
- Geofence exclusion: Suppress GPS within 500m of “Home” and “Work”
- Route generalization: Snap to 100m grid, remove first/last 500m
- Store only: Distance, elevation gain, pace per km (derived metrics)
- Reduction: 95% data volume, 100% home address protection
Step Data:
- BEFORE: 4.3 million accelerometer readings/day (50 Hz x 24 hours)
- AFTER: On-device step counter chip outputs:
- Hourly step counts (24 values/day)
- Daily total steps (1 value/day)
- Data sent to cloud: 25 integers = 100 bytes/day
- Reduction: 99.998%
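A minimal sketch of the on-device heart-rate aggregation described above, assuming 1 Hz samples arrive as plain integers; the `ZONE_BOUNDS` values and the HRV calculation are simplified placeholders, not a clinical method:

```python
from statistics import mean, stdev

# Hypothetical exercise HR zone boundaries (bpm); real devices derive
# these from user age or measured max HR.
ZONE_BOUNDS = [100, 120, 140, 160, 180]

def daily_summary(hr_samples: list, workout_samples: list) -> dict:
    """Reduce ~86,400 raw readings to the ~20 values sent to the cloud."""
    zones = [0] * (len(ZONE_BOUNDS) + 1)
    for bpm in workout_samples:
        idx = sum(bpm >= bound for bound in ZONE_BOUNDS)
        zones[idx] += 1  # seconds spent in each zone
    return {
        "resting_hr_avg": round(mean(hr_samples)),    # 1 value/day
        "exercise_zones_sec": zones,                  # a handful of values/workout
        "hrv_score": round(stdev(hr_samples), 1),     # simplified proxy
    }

# Raw samples never leave the device; only this dict is transmitted.
summary = daily_summary(hr_samples=[62] * 86_400, workout_samples=[135] * 1_800)
```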
1420.5.3 Step 3: Implement Privacy-by-Default Settings
```yaml
# Privacy-by-Default Configuration
data_collection:
  heart_rate_raw: false        # Only aggregates
  gps_precise: false           # 100m grid snapping
  gps_home_exclusion: true     # Auto-exclude home area
  sleep_raw: false             # Only sleep stages

data_retention:
  detailed_workouts: 90_days   # Not 7 years
  daily_summaries: 2_years     # Rolling window
  account_deletion: immediate  # GDPR right to erasure

data_sharing:
  third_party_analytics: false # Opt-in only
  anonymized_research: false   # Explicit consent required
  cloud_backup: true           # Core functionality
```
1420.5.4 Step 4: Calculate Privacy Impact
| Metric | Before (Cloud-First) | After (Privacy-by-Design) |
|---|---|---|
| Daily data to cloud | 847 KB | 550 bytes |
| Data reduction | - | 99.93% |
| Sensitive data exposed | Home address, health patterns | Aggregated health metrics only |
| Retention liability | 7 years of raw data | 90 days detailed, 2 years summary |
| GDPR compliance risk | High (excessive collection) | Low (minimization demonstrated) |
Result: By processing data on-device and transmitting only derived metrics, the wearable achieves 99.93% reduction in cloud data storage while maintaining full functionality.
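A minimal sketch of how the retention settings in Step 3 might be enforced server-side, assuming a hypothetical record store keyed by creation timestamp; a real deletion job would run against whatever database the product uses:

```python
from datetime import datetime, timedelta, timezone

# Mirrors the data_retention section of the configuration above.
RETENTION = {
    "detailed_workouts": timedelta(days=90),
    "daily_summaries": timedelta(days=730),  # 2-year rolling window
}

def purge_expired(records, now=None):
    """Keep only records still inside their category's retention window."""
    now = now or datetime.now(timezone.utc)
    return [
        r for r in records
        if now - r["created_at"] <= RETENTION[r["category"]]
    ]
```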
1420.6 Worked Example: Privacy-by-Default for Smart Home Hub
Scenario: A smart home company is launching a new home hub that integrates with cameras, door locks, thermostats, and voice assistants. Design the default privacy configuration that protects users who never touch settings.
Given:
- Hub integrates: 4 cameras, 2 door locks, 3 thermostats, 1 voice assistant
- Data generated: Video streams (24/7), audio (voice commands), presence patterns, temperature schedules
- Business model: Hardware sales + optional premium cloud features (not advertising)
- User persona: Non-technical homeowner who uses defaults
1420.6.1 Step 1: Audit Data Flows and Apply Privacy Hierarchy
| Data Type | Industry Default | Privacy-by-Design Default | Justification |
|---|---|---|---|
| Camera video | Cloud upload 24/7, 30-day retention | Local storage ONLY, 7-day auto-delete | Video is Tier 3 (biometric). Cloud = liability |
| Voice commands | Send all audio to cloud | On-device wake word, local processing | Only transcribed intent sent (not audio) |
| Door lock events | Cloud logging with user names | Local log only, pseudonymous IDs | Who enters home = sensitive pattern |
| Presence detection | Share with ecosystem partners | Never shared, local automation | Occupancy = home invasion enabler |
| Temperature schedule | Cloud analytics for “energy insights” | On-device optimization only | Schedule reveals when home is empty |
1420.6.2 Step 2: Design Default Configuration
```yaml
# PRIVACY-BY-DEFAULT CONFIGURATION
# Active on first power-on, before user creates account

camera:
  recording_enabled: true            # Core functionality works
  storage_location: "local_only"     # NOT cloud (user can opt-in later)
  retention_days: 7                  # Auto-delete after 7 days
  cloud_backup: false                # OFF by default
  facial_recognition: false          # OFF (opt-in required)
  sharing_with_family: false         # Must explicitly invite
  law_enforcement_access: "require_warrant"

voice_assistant:
  wake_word_detection: "on_device"   # No audio leaves hub until wake word
  voice_processing: "on_device"      # Local NLU when possible
  voice_history_stored: false        # Don't store recordings
  improve_recognition_sharing: false

door_locks:
  event_logging: "local_only"        # Log stays on hub
  user_identification: "pseudonymous"
  remote_access: false               # Must be on local network
  guest_access_tracking: "minimal"

presence_detection:
  enabled: true                      # For automations
  granularity: "home/away"           # Not room-level tracking
  sharing_with_apps: false           # Third-party apps cannot see
  historical_patterns: false         # Don't build occupancy model

data_sharing:
  analytics_to_manufacturer: false   # No telemetry by default
  third_party_integrations: false    # Must explicitly connect
  advertising_data: "never"          # Hardcoded, cannot be enabled
```
1420.6.3 Step 3: Implement Transparent Privacy Dashboard
PRIVACY DASHBOARD (accessible from hub touchscreen + app):
```
+------------------------------------------------------------------+
| YOUR PRIVACY STATUS |
+------------------------------------------------------------------+
| DATA STORED ON YOUR HUB (never leaves your home): |
| [=====] Camera recordings: 47 GB (6 days of footage) |
| [== ] Activity logs: 2.3 MB (28 days) |
| [= ] Voice transcripts: 0 KB (disabled) |
+------------------------------------------------------------------+
| DATA SHARED WITH CLOUD: |
| Account info (email, password hash): Required for remote access |
| Device health telemetry: OFF [Enable for better support] |
| Usage analytics: OFF [Enable to help improve product] |
+------------------------------------------------------------------+
| DATA SHARED WITH THIRD PARTIES: |
| No third-party integrations connected |
| [+ Connect an integration] |
+------------------------------------------------------------------+
| QUICK ACTIONS: |
| [Delete all recordings] [Export my data] [Delete account] |
+------------------------------------------------------------------+
```
Result: Out of the box, zero data leaves the home (except for account creation). Competitor comparison: most hubs require changing 15+ privacy settings to reach this level; this hub requires zero changes.
1420.7 Worked Example: Consent Management for Healthcare IoT
Scenario: A hospital is deploying IoT patient monitoring devices (heart rate monitors, blood glucose sensors, fall detectors). Design a consent management system that respects patient autonomy while enabling life-saving care.
Given:
- 500 patient rooms with 3-5 IoT devices each
- Data types: Heart rate, blood pressure, blood glucose, movement patterns, fall events
- Users: Patients (data subjects), nurses, doctors, family members, researchers
- Regulations: GDPR (EU patients), HIPAA (US), local health data laws
- Challenge: Patients may be incapacitated, minors, or non-English speakers
1420.7.1 Step 1: Design Consent Matrix
| Data Type | Care Team (Medical Need) | Family (Patient Choice) | Research (Opt-in) |
|---|---|---|---|
| Heart rate | ALWAYS (vital) | CONFIGURABLE, Default: Share | OPT-IN ONLY |
| Blood glucose | ALWAYS (vital) | CONFIGURABLE, Default: Share | OPT-IN ONLY |
| Fall detection | ALWAYS (safety) | CONFIGURABLE, Default: Alert | OPT-IN ONLY |
| Movement patterns | CARE TEAM ONLY | NEVER (sensitive) | OPT-IN ONLY |
| Location in room | CARE TEAM ONLY (fall response) | NEVER | NEVER |
Consent Types:
- ALWAYS: Required for care, cannot opt out (legal basis: vital interests)
- CONFIGURABLE: Patient chooses, default varies by sensitivity
- OPT-IN ONLY: Must explicitly agree, default is NO
- NEVER: Data category too sensitive for this recipient type
1420.7.2 Step 2: Handle Special Consent Scenarios
| Scenario | Consent Approach | Legal Basis | Implementation |
|---|---|---|---|
| Unconscious patient | Care team access enabled by default | GDPR Art. 6(1)(d): Vital interests | Minimal data, care purposes only |
| Minor patient (under 16) | Parent/guardian provides consent | GDPR Art. 8: Child consent | Parent signs, child informed |
| Dementia patient | Legal guardian + patient assent | Best interests assessment | Simplified assent form |
| Emergency situation | Override consent for life-threatening | GDPR Art. 6(1)(d): Vital interests | Time-limited, documented |
| Patient withdraws consent | Immediately stop non-essential sharing | GDPR Art. 7(3): Right to withdraw | Technical enforcement |
1420.7.3 Step 3: Implement Real-Time Consent Enforcement
```
CONSENT ENFORCEMENT ENGINE:
When: Doctor requests patient glucose readings
System checks:
1. Is requestor authenticated? --> Yes (Dr. Smith, badge #12345)
2. Is requestor in patient's care team? --> Yes (assigned physician)
3. Is data type covered by consent? --> Yes (glucose = vital interests)
4. Is purpose valid? --> Yes (medical treatment)
5. ACCESS GRANTED
When: Family member requests movement patterns
System checks:
1. Is requestor authenticated? --> Yes (Maria Jr., verified family)
2. Is family access enabled for this data? --> NO (movement = NEVER for family)
3. ACCESS DENIED (Reason: Data category not shareable with family)
When: Researcher requests heart rate data
System checks:
1. Is requestor authenticated? --> Yes (Dr. Research, ethics approval #789)
2. Did patient opt-in to research? --> Check consent record --> NO
3. ACCESS DENIED (Reason: Patient did not consent to research use)
AUDIT LOG (every access attempt):
{
  "timestamp": "2026-01-11T14:32:17Z",
  "patient_id": "P-98765",
  "requestor": "Dr. Smith",
  "data_requested": "glucose_readings",
  "consent_basis": "vital_interests",
  "decision": "GRANTED"
}
```
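The decision sequence above can be expressed directly in code. A minimal sketch, assuming hypothetical record structures for the care-team roster and consent flags; a production system would back each check with authenticated services:

```python
from datetime import datetime, timezone

# Hypothetical consent matrix mirroring Step 1.
VITAL_DATA = {"heart_rate", "blood_glucose", "fall_detection"}
FAMILY_NEVER = {"movement_patterns", "room_location"}
RESEARCH_NEVER = {"room_location"}

def check_access(requestor, patient, data_type, purpose, audit_log):
    """Return (granted, reason) and append an audit entry for every attempt."""
    granted, reason = False, "unknown_requestor_role"
    if requestor["role"] == "care_team" and requestor["id"] in patient["care_team"]:
        # Vital-interest data is always available to the assigned care team
        granted = data_type in VITAL_DATA or purpose == "medical_treatment"
        reason = "vital_interests" if granted else "not_care_relevant"
    elif requestor["role"] == "family":
        if data_type in FAMILY_NEVER:
            reason = "category_never_shared_with_family"
        else:
            granted = patient["family_sharing"].get(data_type, False)
            reason = "patient_choice" if granted else "sharing_disabled"
    elif requestor["role"] == "researcher":
        granted = patient["research_opt_in"] and data_type not in RESEARCH_NEVER
        reason = "research_opt_in" if granted else "no_research_consent"
    audit_log.append({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "patient_id": patient["id"],
        "requestor": requestor["id"],
        "data_requested": data_type,
        "consent_basis": reason,
        "decision": "GRANTED" if granted else "DENIED",
    })
    return granted, reason
```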
Result: Patients understand exactly what data is collected and who sees it. Granular control lets them share heart rate with one family member but not another. Every access is logged and reviewable by the patient.
Key techniques demonstrated across these examples:
- Layered Consent: Separate consent for essential, core features, personalization, history, improvement, and third-party sharing
- Two-Tier Pseudonymization: Operational tier (daily salt, reversible by operations) and audit tier (HSM salt, reversible only with legal authority)
- On-Device Processing: Calculate derived metrics locally, send only results to cloud (99%+ data reduction possible)
- Privacy-by-Default Configuration: Most protective settings active out-of-box, users opt-in to sharing
- Consent Enforcement Engine: Technical controls checking consent before every data access
This chapter demonstrated five comprehensive Privacy by Design implementations:
1. GDPR-Compliant Consent Flow: Layered architecture separating essential (contract basis) from optional (consent basis) features, with progressive disclosure and easy withdrawal.
2. Fleet Tracking Pseudonymization: Two-tier system with rotating operational salts and HSM-protected audit salts, achieving 99.9% re-identification risk reduction while maintaining 97%+ operational utility.
3. Health Wearable Data Minimization: On-device processing reducing cloud transmission from 847 KB/day to 550 bytes/day (99.93% reduction) while maintaining full user functionality.
4. Smart Home Privacy-by-Default: Zero data leaves home out-of-box, with transparent dashboard showing exactly what data exists and where it goes.
5. Healthcare Consent Management: Tier-aware consent matrix distinguishing vital interests (always allowed), patient-configurable sharing, and opt-in-only research access, with real-time enforcement and comprehensive audit logging.
Key Implementation Pattern: Start with “Do we need this data?” then “Can we process locally?” then “Can we aggregate?” then “Can we anonymize?” - only then consider encryption as the last resort for data that truly must be collected and stored.
1420.8 What’s Next
With Privacy by Design principles and implementation techniques mastered, continue to Encryption Principles and Crypto Basics where you’ll learn:
- Symmetric encryption algorithms (AES) for efficient data protection
- Asymmetric cryptography (RSA/ECC) for key exchange and digital signatures
- Secure key management and storage using hardware security modules
- TLS/DTLS implementation for IoT communication security
- End-to-end encryption protecting data throughout its lifecycle
Continue to Encryption Principles and Crypto Basics ->
1420.9 Resources
1420.9.1 Standards
- ISO/IEC 29100: Privacy framework
- ISO/IEC 27701: Privacy information management
- NIST Privacy Framework
1420.9.2 Tools
- Privacy by Design Toolkit: Privacy Commissioner Ontario
- LINDDUN: Privacy threat modeling methodology
- GDPR Compliance Checkers: OneTrust, TrustArc