34  IoT Security Practice Labs

34.1 Learning Objectives

By the end of this chapter, you should be able to:

  • Conduct systematic IoT device security audits using OWASP-based checklists
  • Configure network segmentation (VLANs and guest networks) to isolate IoT devices
  • Verify HTTPS/TLS certificate validity and cipher strength using command-line tools
  • Document security findings in professional audit reports with risk ratings
  • Design and implement compensating controls for identified vulnerabilities

In 60 Seconds

IoT security labs bridge the gap between theoretical security knowledge and practical competence by providing hands-on experience implementing, attacking, and defending real IoT systems in controlled environments. Each lab is designed to produce a concrete, observable security outcome that validates or falsifies a security hypothesis, building the empirical intuition that underpins expert security decision-making.

Key Concepts

  • Lab environment: An isolated, controlled network and hardware setup for conducting IoT security experiments without risk to production systems — essential for safely performing attacks and defence testing.
  • Capture The Flag (CTF): A competitive security exercise where participants find and exploit vulnerabilities in deliberately vulnerable systems to capture flag strings, developing practical attack and defence skills.
  • Penetration testing: A structured security assessment simulating real attacker techniques against an IoT system to identify exploitable vulnerabilities before malicious actors do — requires explicit written authorisation.
  • Traffic analysis lab: A hands-on exercise capturing and analysing IoT network traffic with Wireshark or tcpdump to verify encryption, identify protocol weaknesses, and detect anomalous communication patterns.
  • Threat modelling workshop: A collaborative exercise applying STRIDE or attack tree methodology to a specific IoT system to systematically identify threats, rate their risk, and design mitigating controls.

34.2 Introduction

Security knowledge becomes valuable only through practical application. These hands-on labs guide you through real security assessment techniques used by professionals. Each lab includes step-by-step instructions, verification checklists, and templates for documenting findings.

These labs are designed to be safe:

  • Only assess devices you own - never scan or test others’ networks without permission
  • Document everything - good notes help you learn and provide evidence
  • Start simple - complete Lab 1 before attempting advanced labs

If you’re uncomfortable with any step, skip it and move to the next. The goal is learning, not completing every checkbox.

34.3 Lab 1: IoT Device Security Audit Checklist

Objective: Learn to assess the security posture of an IoT device using a systematic checklist approach.

Time Required: 30-45 minutes

Materials Needed:

  • Any IoT device you own (smart plug, camera, sensor, etc.)
  • Computer with network scanning capability
  • Notepad for recording findings

34.3.1 Step 1: Physical Security Assessment (5 min)

Check for physical security vulnerabilities:

| Check Item | Pass/Fail | Notes |
|---|---|---|
| Are there exposed debug ports (UART, JTAG)? | [ ] | Document port locations |
| Can the device be opened without tools? | [ ] | Note tamper evidence |
| Are there any printed credentials on device/packaging? | [ ] | Document if found |
| Is the firmware chip accessible/removable? | [ ] | Note chip type if visible |

What to Look For

Debug ports often appear as:

  • 4-pin header (UART: TX, RX, VCC, GND)
  • 10-20 pin header (JTAG)
  • Unpopulated solder pads on PCB

Printed credentials may include:

  • Default password on sticker
  • Setup code for pairing
  • Serial number that doubles as password

34.3.2 Step 2: Network Security Assessment (10 min)

Analyze network behavior:

Network Scanning and Port Attack Surface Calculation

Understanding the mathematical scope of network vulnerabilities helps prioritize security efforts:

\[\text{Attack Surface} = P_{\text{open}} \times V_{\text{avg}} \times E_{\text{exploit}}\]

where \(P_{\text{open}}\) is the number of open ports, \(V_{\text{avg}}\) is average vulnerabilities per service, and \(E_{\text{exploit}}\) is the exploitation probability.

Example network scan results:

  • Device has 5 open ports: SSH (22), HTTP (80), Telnet (23), MQTT (1883), Unknown (8080)
  • Telnet: \(V = 8\) known CVEs, \(E = 0.9\) (trivial to exploit)
  • HTTP: \(V = 3\) CVEs, \(E = 0.4\) (depends on web app)
  • MQTT: \(V = 2\) CVEs, \(E = 0.6\) (authentication bypass)

Risk calculation (SSH and port 8080 have no scored CVEs in this example, so only three services contribute non-zero terms): \[R_{\text{total}} = \sum_{i} P_i \times V_i \times E_i = (1 \times 8 \times 0.9) + (1 \times 3 \times 0.4) + (1 \times 2 \times 0.6) = 9.6 \text{ risk units}\]
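The risk sum can be reproduced with a one-line calculation. The figures below are the worked-example values from this lab (Telnet, HTTP, MQTT), not real scan data:

```shell
# Attack-surface risk sum: R = sum of P_i * V_i * E_i
# Values are the worked-example figures; substitute your own scan results.
risk=$(awk 'BEGIN { printf "%.1f", (1*8*0.9) + (1*3*0.4) + (1*2*0.6) }')
echo "Total risk: $risk risk units"
```

Swapping in your own port count, CVE count, and exploitability estimate per service turns this into a quick triage score for any device you audit.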

Subnet scanning efficiency: \[T_{\text{scan}} = \frac{2^{(32-n)} \times t_{\text{host}}}{c}\]

where \(n\) is the CIDR prefix, \(t_{\text{host}}\) is time per host (2s for nmap -sn), and \(c\) is concurrency (default 100).

For /24 network: \(T = \frac{2^{8} \times 2}{100} = \frac{512}{100} = 5.12\) seconds

This quick scan identifies all potential devices in the subnet, enabling rapid IoT fleet audits.

# Find your IoT device's IP address (run on same network)
nmap -sn 192.168.1.0/24

# Scan open ports on the device (replace IP)
nmap -sV 192.168.1.XXX

# Check for unencrypted traffic (if you have Wireshark)
# Filter: ip.addr == 192.168.1.XXX

| Check Item | Pass/Fail | Notes |
|---|---|---|
| Does device use HTTPS for web interface? | [ ] | Check certificate validity |
| Are unnecessary ports open? | [ ] | List open ports |
| Does device phone home to unexpected servers? | [ ] | Note domains contacted |
| Is traffic encrypted (TLS/SSL)? | [ ] | Check with Wireshark |

34.3.3 Step 3: Authentication Assessment (10 min)

Test authentication mechanisms:

| Check Item | Pass/Fail | Notes |
|---|---|---|
| Did device ship with default password? | [ ] | Was change forced? |
| Is password complexity enforced? | [ ] | Test weak passwords |
| Does device support 2FA/MFA? | [ ] | Enable if available |
| Are there hidden admin accounts? | [ ] | Check documentation |
| Does device lock after failed attempts? | [ ] | Test brute force protection |

34.3.4 Step 4: Firmware and Updates (10 min)

| Check Item | Pass/Fail | Notes |
|---|---|---|
| Is automatic update enabled? | [ ] | Enable if available |
| When was last update released? | [ ] | Check manufacturer site |
| Are updates signed/verified? | [ ] | Check update process |
| Can you roll back firmware? | [ ] | Note if possible |

34.3.5 Step 5: Privacy Assessment (5 min)

| Check Item | Pass/Fail | Notes |
|---|---|---|
| What data does device collect? | [ ] | Read privacy policy |
| Can you disable data sharing? | [ ] | Check settings |
| Is data stored locally or cloud? | [ ] | Note storage location |
| Can you delete your data? | [ ] | Test data deletion |

34.3.6 Scoring Your Device

| Score Range | Risk Level | Recommended Action |
|---|---|---|
| 0-5 checks passed | HIGH Risk | Consider replacing or isolating |
| 6-10 checks passed | MEDIUM Risk | Implement compensating controls |
| 11-15 checks passed | LOWER Risk | Maintain vigilance |
| 16+ checks passed | GOOD Security Posture | Continue monitoring |
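If you audit many devices, a small helper keeps the banding consistent. This is a sketch that assumes you tally the passed checks by hand; the wording of each band paraphrases the table:

```shell
# Map the number of passed checks to the risk band from the scoring table.
risk_band() {
  passed=$1
  if   [ "$passed" -le 5 ];  then echo "HIGH risk - consider replacing or isolating"
  elif [ "$passed" -le 10 ]; then echo "MEDIUM risk - implement compensating controls"
  elif [ "$passed" -le 15 ]; then echo "LOWER risk - maintain vigilance"
  else                            echo "GOOD posture - continue monitoring"
  fi
}

risk_band 4
risk_band 17
```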

34.3.7 Audit Report Template

Use this template to document your findings:

## IoT Security Audit Report

**Device:** [Name and Model]
**Date:** [Date]
**Auditor:** [Your Name]

### Executive Summary
[1-2 sentence overall assessment]

### Findings
| Category | Score | Critical Issues |
|----------|-------|-----------------|
| Physical | X/4 | |
| Network | X/4 | |
| Authentication | X/5 | |
| Firmware | X/4 | |
| Privacy | X/4 | |
| **Total** | **X/21** | |

### Recommendations
1. [Most critical fix]
2. [Second priority]
3. [Third priority]

### Risk Acceptance
[Note any risks accepted and justification]

34.4 Lab 2: Network Segmentation for IoT Devices

Objective: Create a separate network segment for IoT devices to limit breach impact.

Time Required: 45-60 minutes

Materials Needed:

  • Router with VLAN or guest network capability
  • IoT devices to move to new network
  • Computer for configuration

34.4.1 Why Segment IoT Devices?

IoT devices on your main network can access everything:

BEFORE (Risky):
IoT Device ←→ Your Computer ←→ Your Files
                    ↑
         No protection between them

AFTER (Safer):
IoT Device ←→ [Firewall] ←→ Your Computer
                    ↑
    IoT cannot reach your computer directly

34.4.2 Option A: Guest Network (Easiest)

Most routers support guest networks. This is the quickest way to isolate IoT devices.

Step 1: Access router admin (usually 192.168.1.1)

Step 2: Enable guest network

Step 3: Configure settings:

  • Name: IoT_Devices
  • Password: [Strong unique password]
  • Enable client isolation
  • Disable access to main network

Step 4: Connect IoT devices to guest network

Step 5: Verify isolation - IoT devices shouldn’t see your computer

34.4.3 Option B: VLAN (More Secure)

For advanced users with managed switches:

| VLAN | Purpose | Devices |
|---|---|---|
| VLAN 1 (Default) | Trusted | Computers, phones |
| VLAN 10 (IoT) | Smart home | Lights, thermostats, speakers |
| VLAN 20 (Cameras) | Most restricted | Security cameras |

34.4.4 Firewall Rules Template

# Allow IoT to reach internet
ALLOW: VLAN_IoT → Internet (ports 80, 443, 8883)

# Block IoT from main network
DENY: VLAN_IoT → VLAN_Main (all ports)

# Allow main network to control IoT
ALLOW: VLAN_Main → VLAN_IoT (specific ports only)

# Block IoT-to-IoT lateral movement (optional paranoid mode)
DENY: VLAN_IoT → VLAN_IoT (all ports)
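On a Linux-based router, the template might translate to iptables roughly as follows. This is a sketch only: the interface names (eth0.10 for the IoT VLAN, eth0.1 for the main LAN, ppp0 for the WAN) and the control port 8080 are assumptions to adapt to your own setup, and rule order matters because the final DROP is a catch-all.

```shell
# Sketch: enforcing the firewall template with iptables.
# ASSUMED interfaces: IoT VLAN = eth0.10, main LAN = eth0.1, WAN = ppp0.

# Allow IoT to reach the internet on HTTP, HTTPS, and MQTT-over-TLS
iptables -A FORWARD -i eth0.10 -o ppp0 -p tcp \
  -m multiport --dports 80,443,8883 -j ACCEPT

# Allow return traffic for connections the main network initiated
iptables -A FORWARD -i eth0.10 -o eth0.1 \
  -m conntrack --ctstate ESTABLISHED,RELATED -j ACCEPT

# Allow the main network to control IoT devices on one specific port (assumed 8080)
iptables -A FORWARD -i eth0.1 -o eth0.10 -p tcp --dport 8080 -j ACCEPT

# Block everything else from the IoT VLAN to the main network
iptables -A FORWARD -i eth0.10 -o eth0.1 -j DROP
```

Note that the ESTABLISHED,RELATED rule must precede the DROP rule, otherwise replies to legitimate control traffic are discarded.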

34.4.5 Verification Checklist

| Test | Expected Result | Actual |
|---|---|---|
| IoT device reaches internet | Works | [ ] |
| IoT device pings your computer | Blocked | [ ] |
| Your computer controls IoT device | Works | [ ] |
| IoT device scans network | Only sees IoT VLAN | [ ] |

Common Issues

Problem: IoT device can’t be controlled after segmentation

Solutions:

  1. Check firewall allows traffic FROM main network TO IoT
  2. Some devices require broadcast/multicast - enable mDNS relay
  3. Cloud-based devices may work fine; local-control devices need direct access

34.5 Lab 3: HTTPS Certificate Verification

Objective: Verify that your IoT devices use proper TLS/HTTPS encryption.

Time Required: 20 minutes

Materials Needed:

  • Browser with developer tools
  • IoT device with web interface

34.5.1 Step-by-Step Verification

Step 1: Access device web interface

Navigate to: https://192.168.1.XXX (note: HTTPS not HTTP)

Step 2: Check certificate in browser

  • Click padlock icon → “Certificate”
  • Note issuer, expiration, and validity

Step 3: Use OpenSSL to inspect certificate

# Check certificate details (</dev/null closes the connection after the handshake)
openssl s_client -connect 192.168.1.XXX:443 -showcerts </dev/null

# Check supported TLS versions
nmap --script ssl-enum-ciphers -p 443 192.168.1.XXX

Step 4: Evaluate results

| Check | Secure | Insecure |
|---|---|---|
| Protocol | TLS 1.2 or 1.3 | SSL 3.0, TLS 1.0/1.1 |
| Certificate | Valid, not expired | Self-signed, expired |
| Cipher Suite | AES-256-GCM | RC4, DES, 3DES |
| Key Size | RSA 2048+ or ECC 256+ | RSA 1024 or less |
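When auditing many devices, save the nmap output to a file and grep it for legacy protocols. The sample text below is an illustrative fragment shaped like ssl-enum-ciphers output, not a real scan:

```shell
# Flag legacy protocols in saved scan output.
# scan_output is an ILLUSTRATIVE sample, not real ssl-enum-ciphers results.
scan_output='
| ssl-enum-ciphers:
|   TLSv1.0:
|     ciphers: TLS_RSA_WITH_3DES_EDE_CBC_SHA
|   TLSv1.2:
|     ciphers: TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384'

if printf '%s\n' "$scan_output" | grep -qE 'SSLv3|TLSv1\.0|TLSv1\.1'; then
  echo "FAIL: legacy protocol offered"
else
  echo "PASS: only modern TLS offered"
fi
```

The same pattern works for flagging weak ciphers (RC4, DES, 3DES) by extending the grep expression.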

34.5.2 Common Issues and Fixes

| Issue | Risk Level | Fix |
|---|---|---|
| Self-signed certificate | Medium | Accept for local only, or install custom CA |
| Expired certificate | High | Update firmware or contact manufacturer |
| TLS 1.0/1.1 only | Medium | Disable old protocols if possible |
| HTTP only (no HTTPS) | Critical | Use VPN or replace device |

Understanding Self-Signed Certificates

Many IoT devices use self-signed certificates because:

  1. They don’t have domain names (just IP addresses)
  2. Getting CA-signed certs requires internet access during manufacturing
  3. Certificate renewal is complex for embedded devices

Self-signed is acceptable IF:

  • Device is on isolated network
  • You verify the certificate fingerprint manually
  • Traffic is already on a VPN

Self-signed is risky IF:

  • Device is internet-accessible
  • You haven’t verified the fingerprint
  • You’re sending sensitive data

34.5.3 Certificate Fingerprint Verification

For self-signed certificates, manually verify the fingerprint:

Step 1: Get fingerprint from device (usually in admin interface or documentation)

Step 2: Compare with OpenSSL output:

openssl s_client -connect 192.168.1.XXX:443 </dev/null 2>/dev/null | \
  openssl x509 -fingerprint -sha256 -noout

Step 3: If fingerprints match, certificate is authentic (not MITM attack)
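You can rehearse the comparison logic offline before touching a real device by generating a throwaway self-signed certificate and fingerprinting it. The CN and file paths are arbitrary; on a real device the "published" value comes from the admin page and the "observed" value from openssl s_client:

```shell
# Offline rehearsal of fingerprint verification (throwaway cert, arbitrary CN).
dir=$(mktemp -d)
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -subj "/CN=iot-device.local" \
  -keyout "$dir/key.pem" -out "$dir/cert.pem" 2>/dev/null

# "Published" fingerprint (on a real device: from the admin UI or documentation)
expected=$(openssl x509 -in "$dir/cert.pem" -fingerprint -sha256 -noout)
# "Observed" fingerprint (on a real device: from openssl s_client output)
observed=$(openssl x509 -in "$dir/cert.pem" -fingerprint -sha256 -noout)

if [ "$expected" = "$observed" ]; then
  echo "MATCH: certificate is the one the device published"
else
  echo "MISMATCH: possible man-in-the-middle"
fi
rm -rf "$dir"
```

The string comparison is the whole check: any difference, even one hex pair, means you are not talking to the certificate you expected.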

34.6 Resources for Further Learning

34.6.1 Books

  • “Practical IoT Hacking” by Fotios Chantzis
  • “IoT Penetration Testing Cookbook” by Aaron Guzman
  • “Abusing the Internet of Things” by Nitesh Dhanjani

34.6.2 Standards and Frameworks

34.6.3 Tools

| Category | Tools |
|---|---|
| Vulnerability Scanning | Nmap, Nessus, OpenVAS |
| Firmware Analysis | Binwalk, Firmwalker, FACT |
| Network Analysis | Wireshark, tcpdump |
| Penetration Testing | Metasploit, Burp Suite |

34.6.4 Online Resources

34.6.5 Certifications

| Certification | Focus |
|---|---|
| GIAC GICSP | Critical Infrastructure Protection |
| CISM | Information Security Management |
| CEH | Ethical Hacking (IoT module) |
| IoT Security Practitioner | IoT Security Foundation |

Scenario: A manufacturing plant operates 500 IoT devices across production lines. The security team has 40 hours available this quarter for hands-on security labs. They must choose which labs to prioritize.

Device Inventory:

  • 200 industrial sensors (temperature, pressure, vibration)
  • 150 IP cameras (perimeter + production monitoring)
  • 100 RFID readers (asset tracking)
  • 50 PLCs (programmable logic controllers)

Step 1: Assess Risk Profile

| Device Type | Internet-Exposed? | Default Creds Risk | Network Segmentation | Priority Lab |
|---|---|---|---|---|
| Industrial Sensors | No | Low | Isolated OT network | Lab 2 (Segmentation verification) |
| IP Cameras | Yes | HIGH | Mixed IT/OT network | Lab 1 (Device audit) + Lab 3 (Certificate check) |
| RFID Readers | No | Medium | Shared VLAN | Lab 2 (Segmentation) |
| PLCs | No | CRITICAL | Air-gapped | Lab 1 (Physical security audit) |

Step 2: Calculate Impact×Likelihood

IP Cameras (Risk Score: 8/10):

  • Likelihood: 9/10 (internet-exposed, common Mirai targets)
  • Impact: 7/10 (privacy breach, surveillance compromise)
  • Action: Allocate 16 hours to Lab 1 audit for all 150 cameras

PLCs (Risk Score: 7.5/10):

  • Likelihood: 3/10 (air-gapped, physical access only)
  • Impact: 10/10 (production halt, safety incidents)
  • Action: Allocate 12 hours to Lab 1 physical security for all 50 PLCs

Industrial Sensors (Risk Score: 4/10):

  • Likelihood: 5/10 (isolated network reduces exposure)
  • Impact: 5/10 (sensor spoofing causes false alarms)
  • Action: Allocate 8 hours to Lab 2 verification for a sample of 20 sensors

RFID Readers (Risk Score: 3.5/10):

  • Likelihood: 4/10 (local network only)
  • Impact: 4/10 (asset tracking disruption)
  • Action: Defer to next quarter (lowest risk)

Step 3: Lab Execution Plan

| Week | Lab Focus | Devices | Hours | Expected Findings |
|---|---|---|---|---|
| 1-2 | Lab 1: Camera Audit | 150 IP cameras (10 samples initially) | 16h | Default credentials, HTTP-only, expired certs |
| 3 | Lab 3: Certificate Verification | Same 10 cameras | 4h | Self-signed certs, TLS 1.0 |
| 4 | Lab 1: PLC Physical Security | 50 PLCs (all) | 12h | Exposed JTAG, unlocked cabinets |
| 5 | Lab 2: Sensor Segmentation | 20 sensors (sample) | 8h | VLAN leakage, crossover traffic |

Step 4: Calculate Return on Investment

Before Labs:

  • 150 cameras: 80% with default credentials = 120 vulnerable
  • 50 PLCs: 60% with accessible JTAG = 30 vulnerable
  • Total: 150 critical vulnerabilities

After Labs + Remediation:

  • Cameras: Fixed credentials on all 150, enforced HTTPS, renewed certificates
  • PLCs: Disabled JTAG on all 50, locked control cabinets, added tamper sensors
  • Total: 8 residual vulnerabilities (cameras with non-standard ports, PLCs with legacy firmware)

Risk Reduction: 150 → 8 vulnerabilities (a 94.7% reduction)

Time Investment: 40 hours of lab work plus 120 hours of remediation

Cost Avoidance: Prevented potential Mirai-style botnet enrollment (cameras) and unauthorized PLC firmware modification (a safety incident)
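The headline reduction figure checks out with quick arithmetic on the numbers above:

```shell
# Verify the before/after figures: 150 vulnerabilities down to 8 residual.
awk 'BEGIN {
  printf "Risk reduction: %.1f%%\n", (150 - 8) / 150 * 100   # 94.7%
  printf "Total hours invested: %d\n", 40 + 120              # labs + remediation
}'
```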

Key Insight: Risk-based lab prioritization ensures limited resources address the highest-impact vulnerabilities first. Internet-exposed cameras required immediate attention despite being “low-value” devices, while air-gapped PLCs needed physical security checks despite network isolation. The 40-hour lab investment identified 150 critical issues that would have cost millions in breach response.

Different IoT deployments require different lab priorities. Use this framework to select appropriate labs:

| Deployment Context | Primary Risk | Recommended Labs | Justification |
|---|---|---|---|
| Consumer Smart Home (10-50 devices, home network) | Network exposure, default credentials | Lab 1 (Device Audit), Lab 2 (Guest Network Setup) | Devices are often consumer-grade with weak defaults; network segmentation prevents lateral movement after a breach. |
| Small Business IoT (50-500 devices, managed network) | Insider threats, unpatched firmware | Lab 1 (Device Audit), Lab 3 (Certificate Verification) | Employee network access increases insider risk; certificate checks ensure encrypted management traffic. |
| Enterprise Campus (500-5,000 devices, complex network) | Scale of attack surface, configuration drift | Lab 2 (VLAN Segmentation), Lab 1 (Automated Audit Scripts) | Manual audits don't scale, so automation is required; complex networks need strict segmentation across device classes. |
| Industrial/Manufacturing (100-1,000 devices, OT network) | Safety-critical systems, physical access | Lab 1 (Physical Security focus), Lab 2 (OT/IT Isolation) | Physical tampering can cause safety incidents; absolute isolation between IT and OT prevents ransomware spread. |
| Healthcare IoT (medical devices, PHI data) | Regulatory compliance, patient safety | Lab 3 (Certificate/Encryption Verification), Lab 1 (Audit with HIPAA checklist) | HIPAA mandates encryption for PHI; medical devices require FDA-validated security configurations. |
| Public Infrastructure (sensors, cameras, public spaces) | Vandalism, coordinated attacks | Lab 1 (Physical Security + Tamper Detection), Lab 2 (Segmentation with VPN) | Public deployment increases physical access risk; VPN tunnels ensure secure communication over untrusted networks. |

Decision Tree for Lab Selection:

START: What is the primary deployment risk?

├─ Internet-exposed devices?
│  └─ YES → Lab 1 (Device Audit) + Lab 3 (Certificate Check)
│  └─ NO → Continue
│
├─ Physically accessible to public/untrusted persons?
│  └─ YES → Lab 1 (Physical Security focus)
│  └─ NO → Continue
│
├─ More than 100 devices?
│  └─ YES → Lab 2 (Network Segmentation) + Automated Lab 1 scripts
│  └─ NO → Continue
│
├─ Regulatory compliance required? (HIPAA/PCI-DSS/GDPR)
│  └─ YES → Lab 3 (Certificate Verification) + Compliance-focused Lab 1
│  └─ NO → Continue
│
└─ Default: Lab 1 (Device Audit) for baseline security posture
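The decision tree above can be sketched as a small function. The argument names and the yes/no encoding are assumptions for illustration, not part of any standard:

```shell
# Sketch of the lab-selection decision tree.
# Usage: select_labs <internet yes|no> <public_access yes|no> <device_count> <compliance yes|no>
select_labs() {
  if   [ "$1" = "yes" ]; then echo "Lab 1 (Device Audit) + Lab 3 (Certificate Check)"
  elif [ "$2" = "yes" ]; then echo "Lab 1 (Physical Security focus)"
  elif [ "$3" -gt 100 ]; then echo "Lab 2 (Network Segmentation) + automated Lab 1 scripts"
  elif [ "$4" = "yes" ]; then echo "Lab 3 (Certificate Verification) + compliance-focused Lab 1"
  else                        echo "Lab 1 (Device Audit) for baseline security posture"
  fi
}

select_labs yes no 150 no    # internet exposure wins regardless of fleet size
select_labs no no 500 no     # large fleet, no exposure: segmentation first
```

Encoding the tree this way makes the priority ordering explicit: internet exposure trumps everything else, and the baseline audit is the fallback.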

Time Allocation Guidelines:

| Available Time | Recommended Labs | Coverage |
|---|---|---|
| 4 hours | Lab 1 sample audit (5-10 devices) | Baseline understanding |
| 8 hours | Lab 1 full audit + Lab 3 certificates | Critical device classes |
| 16 hours | Lab 1 + Lab 2 + Lab 3 | Comprehensive security assessment |
| 40 hours | All labs + remediation + documentation | Production-ready security posture |

Key Principle: Labs are not one-size-fits-all. Context dictates which labs provide maximum security ROI.

Common Mistake: Treating Lab Findings as Theoretical

The Mistake: After completing Lab 1 and discovering 15 vulnerabilities, a security team documents findings in a report, files it, and considers the lab “complete.” Six months later, the same vulnerabilities remain unpatched.

Why This Happens:

  • Lab activities feel like training exercises, not real security work
  • Findings aren’t integrated into remediation workflows
  • No accountability for fixing discovered issues
  • Labs are done “for compliance” rather than to improve security

Real-World Consequence:

Case Study: A smart building operator completed Lab 1 on 200 HVAC controllers, discovering:

  • 180 devices with default password “admin:admin”
  • 150 devices with HTTP-only management (no HTTPS)
  • 120 devices with firmware 3+ years out of date

Lab report was filed. No remediation occurred.

8 months later: Ransomware outbreak (unrelated initial infection) spread from corporate network to HVAC controllers via default credentials. Attackers shut down HVAC to all 50 floors, demanding $500K ransom. Building evacuation required. Total cost: $2.1M (downtime + remediation + ransom payment).

Post-Incident Analysis: Every vulnerability exploited by the ransomware was documented in the Lab 1 report 8 months earlier.

The Fix: Integrate Labs into Security Lifecycle

1. Treat Lab Findings as Incident Tickets

Every vulnerability discovered in Lab 1 becomes a Jira ticket with:

  • Severity (Critical/High/Medium/Low based on DREAD score)
  • Owner (assigned to the IT/OT team responsible for the device)
  • Due date (30/60/90 days based on severity)
  • Acceptance criteria (how to verify the fix)

2. Link Labs to Remediation Budget

  • Lab 1 audit cost: 8 hours ($800 labor)
  • Remediation cost: 40 hours ($4,000 labor) + password management system ($2,000)
  • Total security investment: $6,800
  • Risk reduction: 150 critical vulnerabilities → 5 residual
  • Breach avoidance: $2.1M (ransomware case study above)
  • ROI: 309x return on security investment

3. Schedule Follow-Up Labs

After remediation, re-run Lab 1 to verify fixes:

  • Initial audit: 150 vulnerabilities found
  • Remediation: 145 fixes claimed by the IT team
  • Verification audit: 15 vulnerabilities still present (incorrect fixes)
  • Second remediation: 10 additional fixes
  • Final audit: 5 accepted residual risks (documented)

4. Integrate Lab Results into Risk Register

| Vulnerability | Initial Risk Score | After Remediation | Residual Risk | Accepted By |
|---|---|---|---|---|
| Default credentials | 9.2 (CRITICAL) | 2.1 (LOW) | 2.1 | CISO |
| HTTP-only management | 7.8 (HIGH) | 3.5 (LOW) | 3.5 | IT Director |
| Outdated firmware | 8.5 (CRITICAL) | 4.2 (MEDIUM) | 4.2 | CTO (5 legacy devices) |

Key Takeaway: Security labs discover vulnerabilities. Security organizations FIX vulnerabilities. Without remediation, lab work is waste. Track lab findings through to resolution with the same rigor as production incidents.

Common Pitfalls

Security lab activities — especially packet injection, fuzzing, and credential testing — must never be performed on production systems. Even passive network scanning on a production IoT network can disrupt sensitive real-time communications.

Without defined objectives, lab time is consumed by tool configuration rather than security learning. Define what security property each lab is intended to demonstrate before beginning, and evaluate whether the objective was achieved after completing it.

Completing a lab by following instructions demonstrates that you can follow instructions, not that you understand the security concepts. After each lab, explain in your own words what vulnerability was demonstrated and how the defence works.

A lab demonstrating MQTT without TLS demonstrates a specific protocol misconfiguration. Generalise: this is an instance of the broader class ‘cleartext transmission of sensitive data’ — applicable to HTTP, Modbus, Telnet, and any other unencrypted protocol.

34.7 Summary

Hands-on security labs develop practical skills that complement theoretical knowledge:

  • Device audits reveal real vulnerabilities using systematic checklists
  • Network segmentation limits breach impact through isolation
  • Certificate verification ensures encrypted communications
  • Documentation creates professional audit trails

These labs can be repeated with different devices to build experience across IoT ecosystems.

Concept Relationships

Understanding how practice lab concepts interconnect:

| Lab Focus | Prerequisite Skills | Validates Concepts From | Enables Next Steps |
|---|---|---|---|
| Lab 1: Device Audit | Basic networking, command line | OWASP IoT Top 10, security fundamentals | Vulnerability remediation, compliance audits |
| Lab 2: Network Segmentation | VLAN concepts, firewall rules | Defence in depth, attack surface reduction | Industrial zone-and-conduit (IEC 62443), zero-trust architecture |
| Lab 3: Certificate Verification | Public-key cryptography, TLS basics | ETSI Provision 5 (secure communication), NIST DC-4 (data protection) | Certificate pinning, mutual TLS (mTLS), PKI deployment |
| Risk-Based Lab Selection | Threat modelling, DREAD scoring | Asset criticality, exposure assessment | Resource allocation, remediation prioritisation |
| Lab Documentation | Audit report writing, risk communication | Residual risk acceptance, compliance evidence | Penetration testing reports, certification audits |

Key Insight: Lab findings are worthless without remediation. Always integrate lab results into your issue tracker (Jira, GitHub Issues) with priority levels and owners. A 6-month-old vulnerability report with no fixes is worse than no audit at all.

See Also

Real-World Context:

  • Case Studies - Mirai, Jeep, Ring: What audits would have found
  • Compliance - ETSI EN 303 645 checklist overlaps with Lab 1

Tools and Resources:

  • Nmap, Wireshark, OpenSSL documentation
  • OWASP IoT Security Testing Guide
  • Simulations Hub - Virtual lab environments

34.8 Knowledge Check

34.9 What’s Next

| If you want to… | Read this |
|---|---|
| Review the security concepts the labs implement | Security Foundations |
| Study the threat models that the labs address | Threat Modelling and Mitigation |
| Explore hands-on labs specific to IoT security | IoT Security Hands-On Labs |
| Practise with additional exercises and assessments | Security Practice |
| Return to the security module overview | IoT Security Fundamentals |