26  Firmware Security and Secure Updates

26.1 Learning Objectives

By the end of this chapter, you will be able to:

  • Design Secure Boot Chains: Implement hardware root of trust with verified bootloader, kernel, and application stages
  • Implement Code Signing: Use cryptographic signatures to verify firmware authenticity before installation
  • Deploy Anti-Rollback Protection: Prevent downgrade attacks using OTP fuse-based version counters
  • Build Resilient OTA Systems: Design over-the-air update mechanisms with automatic rollback on failure
  • Apply Defense in Depth: Combine encryption, signing, and verification for comprehensive firmware protection

In 60 Seconds

Secure firmware updates for IoT data handling ensure that the software processing sensitive sensor data cannot be replaced with malicious code that exfiltrates, modifies, or destroys data — making firmware update security a critical component of data protection strategy. The three-part protection requires: cryptographic signing (proving authenticity), integrity verification (detecting tampering), and rollback prevention (blocking downgrade to vulnerable versions).

What is Firmware Security? Firmware is the permanent software programmed into IoT devices—the code that runs when devices boot up. Firmware security ensures that only authentic, unmodified code executes on devices. This includes secure boot (verifying code before running), code signing (cryptographically proving authenticity), and secure updates (safely delivering new firmware over-the-air).

Why does it matter? Attackers who can install malicious firmware gain complete control over devices. They can steal data, manipulate sensors, join botnets, or even cause physical damage. Secure firmware prevents these attacks by ensuring devices only run code from trusted sources.

Key terms:

| Term | Definition |
|------|------------|
| Secure Boot | Verification chain ensuring only signed firmware executes, starting from hardware root of trust |
| Code Signing | Cryptographically signing firmware with private key so devices can verify authenticity with public key |
| OTA Update | Over-The-Air firmware update delivered via network without physical access |
| Anti-Rollback | Prevention of downgrade attacks using version counters in OTP (One-Time Programmable) memory |
| Root of Trust | Immutable hardware component (Boot ROM) that cannot be modified and starts the verification chain |

“Every morning when I wake up, the first thing I do is check myself,” Max the Microcontroller explained. “My Boot ROM – the tiny unchangeable part of me – verifies that my bootloader is genuine. Then my bootloader checks my operating system. Then my OS checks my apps. It is like a chain of trust, where each link verifies the next one!”

“Why go through all that trouble?” Sammy the Sensor asked.

“Because if a bad guy sneaks fake software onto me, they control EVERYTHING I do!” Max said seriously. “Secure boot is like having a bouncer at every door who checks your ID. No valid ID? You do not get in. The private signing key is kept in a super-secure vault called an HSM, and my public key is burned into hardware that nobody can change.”

Lila the LED added, “And when it is time for updates, code signing makes sure the new software is really from the manufacturer. It is like getting a letter with a royal seal – you can verify the seal is real, but you cannot forge it. If someone tampers with even one tiny bit of the update, the signature will not match, and Max will reject it.”

“I keep special watch during updates,” Bella the Battery said. “If power dies mid-update, the device could be bricked forever! That is why we use A/B partitions – two copies of the software. We update the backup copy first, test it, and only then switch over. If something goes wrong, we roll back to the working copy. Safety first!”

26.2 Prerequisites

Before diving into this chapter, you should be familiar with:

Key Concepts

  • Code signing: The process of applying a cryptographic signature to firmware code using the manufacturer’s private key, allowing devices to verify authenticity before execution.
  • Hash verification: Computing a cryptographic hash (SHA-256) of received firmware and comparing it to the hash in the signed update manifest, confirming the binary has not been corrupted or modified during transmission.
  • Bootloader security: The security of the lowest-level software component that runs before the OS or application firmware, responsible for verifying signatures during secure boot — the root of trust for the entire software stack.
  • Anti-rollback counter: A value stored in write-once hardware (e-fuses, OTP memory) that records the minimum acceptable firmware version, preventing downgrade to older vulnerable versions.
  • Trusted execution environment (TEE): A hardware-isolated execution environment (ARM TrustZone, Intel SGX) providing a secure area for storing update keys and verifying firmware signatures, isolated from the main OS.
  • Update authorisation: The access control mechanism determining which entities are permitted to initiate, approve, and apply firmware updates — preventing unauthorised parties from pushing updates to production devices.

26.3 Introduction

Firmware is the foundational software layer that controls every aspect of an IoT device’s operation – from sensor readings and network communication to safety-critical actuator commands. Unlike application software on general-purpose computers, firmware on embedded devices often runs without an operating system’s protection mechanisms, making it a high-value target for attackers. A single compromised firmware image can turn thousands of deployed devices into surveillance tools, botnet nodes, or safety hazards.

This chapter examines the complete firmware security lifecycle: how devices verify code authenticity at boot time through hardware-backed chains of trust, how manufacturers cryptographically sign firmware to prevent tampering, how anti-rollback mechanisms block downgrade attacks, and how over-the-air update systems deliver patches safely to devices that cannot be physically accessed. Each concept is illustrated through detailed worked examples drawn from real-world IoT deployments including smart meters, door locks, medical devices, and connected vehicles.

26.4 Secure Boot Chain Design

A secure boot chain ensures that every piece of code executing on a device has been verified as authentic before it runs. The chain starts from an immutable hardware root of trust and extends through the bootloader to the application firmware, with each stage cryptographically verifying the next.
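
As a minimal sketch of this staged hand-off (the stage names, key handling, and two-stage layout are illustrative, not a real Boot ROM implementation), each stage can be modeled as verifying the next image's Ed25519 signature before jumping to it, using the same `cryptography` library as the exercises later in this chapter:

```python
# Minimal chain-of-trust sketch: each stage verifies the NEXT stage's
# signature before executing it. Keys, images, and names are illustrative.
from cryptography.hazmat.primitives.asymmetric import ed25519
from cryptography.exceptions import InvalidSignature

rom_key = ed25519.Ed25519PrivateKey.generate()         # public half burned into Boot ROM
bootloader_key = ed25519.Ed25519PrivateKey.generate()  # public half embedded in bootloader

bootloader_img = b"bootloader v1.4 binary"
application_img = b"application v2.3.1 binary"

# Each entry: (stage name, image, signature, verifying public key)
chain = [
    ("bootloader", bootloader_img,
     rom_key.sign(bootloader_img), rom_key.public_key()),
    ("application", application_img,
     bootloader_key.sign(application_img), bootloader_key.public_key()),
]

def secure_boot(chain):
    """Walk the chain; halt on the first stage that fails verification."""
    for name, image, sig, pubkey in chain:
        try:
            pubkey.verify(sig, image)   # raises InvalidSignature on tampering
        except InvalidSignature:
            return f"HALT: {name} signature invalid"
    return "BOOT OK"

print(secure_boot(chain))  # BOOT OK
```

Tampering with either image (or its signature) makes the corresponding `verify` call raise, halting boot at exactly that link of the chain.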

[Figure: Code-signing workflow from executable to signed code. An executable is run through a hash algorithm to produce a one-way digest; the digest is encrypted with the code signer's private key to create a digital signature; and the executable, the signature, and the signer's certificate are combined into the final signed code package. Devices can then verify both authenticity (who signed it) and integrity (not modified) before installation.]

Source: University of Edinburgh IoT Security Course.

[Figure 26.1: Digital signature verification. The sender hashes the document and signs the hash with a private key; the recipient verifies the signature with the sender's public key and compares document hashes to confirm authenticity and integrity.]

26.5 Worked Example: Implementing Secure Boot Chain for Smart Meter

Scenario: A utility company deploys 100,000 smart electricity meters across a metropolitan area. After discovering that attackers could flash malicious firmware to manipulate energy readings (causing billing fraud and grid instability), the company must implement a secure boot chain to ensure only authenticated firmware runs on deployed meters.

Given:

  • 100,000 smart meters deployed in field (cannot be physically recalled)
  • Current state: Firmware updates via unencrypted HTTP, no signature verification
  • Meters have ARM Cortex-M4 MCU with 256KB flash, 64KB RAM
  • Hardware constraint: No dedicated HSM, but MCU has hardware AES and SHA-256 acceleration
  • Attack vector: Attacker with physical access can connect JTAG and flash arbitrary firmware
  • Requirement: Prevent unauthorized firmware from executing, even with physical access

Steps:

  1. Design the boot chain hierarchy:

    SECURE BOOT CHAIN (Root of Trust to Application):
    
    +------------------+
    | Boot ROM (8KB)   | <-- Immutable, burned into silicon at factory
    | - Contains:      |     - Cannot be modified by software or JTAG
    |   Public key hash|     - First code to execute after power-on
    |   SHA-256 verify |
    +--------+---------+
             |
             | Verifies signature of:
             v
    +------------------+
    | Bootloader (16KB)| <-- Stored in protected flash region
    | - Contains:      |     - Can be updated (signed updates only)
    |   Manufacturer   |
    |   public key     |
    |   Update logic   |
    +--------+---------+
             |
             | Verifies signature of:
             v
    +------------------+
    | Application FW   | <-- Main meter firmware (200KB)
    | - Meter logic    |     - Regular updates for features/security
    | - Comms stack    |
    | - Crypto libs    |
    +------------------+
  2. Implement firmware signing at build time:

    BUILD PIPELINE (at manufacturer):
    
    Source Code --> Compiler --> firmware.bin (unsigned)
                                      |
                                      v
    +--------------------------------------------------+
    | SIGNING PROCESS (HSM-protected, air-gapped)      |
    |                                                  |
    | 1. Calculate SHA-256 hash of firmware.bin       |
    |    hash = SHA256(firmware.bin)                  |
    |    = "a3f2b8c1d4e5..."                         |
    |                                                  |
    | 2. Sign hash with Ed25519 private key (in HSM)  |
    |    signature = Ed25519_Sign(private_key, hash)  |
    |    = "7b8a9c0d1e2f..." (64 bytes)              |
    |                                                  |
    | 3. Create signed firmware package               |
    |    firmware_signed.bin = {                      |
    |      magic: "SMFW"                              |
    |      version: 2.3.1                             |
    |      timestamp: 2026-01-11T10:30:00Z            |
    |      firmware_size: 204,800                     |
    |      signature: "7b8a9c0d1e2f..."              |
    |      firmware_data: [binary]                    |
    |    }                                            |
    +--------------------------------------------------+
  3. Implement anti-rollback protection:

    ANTI-ROLLBACK MECHANISM:
    
    Problem: Attacker flashes old firmware with known vulnerabilities
    
    Solution: Monotonic counter in OTP (one-time programmable) memory
    
    OTP Fuse Bank (can only be incremented, never decremented):
    +---+---+---+---+---+---+---+---+
    | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 |  = Version 5
    +---+---+---+---+---+---+---+---+
    
    Firmware header contains: minimum_version = 5
    
    Boot verification:
    1. Read OTP counter value: current_version = 5
    2. Read firmware minimum_version from header: 5
    3. If firmware.min_version < current_version: REJECT (rollback attempt)
    4. Proceed with signature verification
    5. After a successful boot, if firmware.version > current_version: burn additional fuse bits
    
    Result: Once version 5 boots successfully, versions 1-4 can NEVER boot again
  4. Handle secure recovery for bricked devices:

     | Failure Scenario | Recovery Action | Security Consideration |
     |------------------|-----------------|------------------------|
     | Signature verification fails | Enter recovery mode, wait for signed recovery image | Recovery image also requires valid signature |
     | OTP counter mismatch | Reject boot, require factory service | Prevents rollback even in recovery |
     | Flash corruption | Boot to recovery partition | Recovery partition is read-only, signed |
     | Update interrupted | Dual-bank: boot from previous partition | A/B partitioning ensures always-bootable |
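
The signed firmware package layout from the signing step can be sketched as a pack/parse pair. Field names and sizes here are illustrative (a 3-byte version, a 64-byte Ed25519 signature slot), not a fixed wire format:

```python
# Sketch of packing/parsing a signed firmware package header.
# Field names and sizes are illustrative, not a fixed wire format.
import struct

HEADER_FMT = ">4s 3B I 64s"   # magic, version major/minor/patch, size, signature
HEADER_LEN = struct.calcsize(HEADER_FMT)

def pack_package(version, signature, firmware):
    """Build magic + version + size + signature header, then append the image."""
    major, minor, patch = version
    header = struct.pack(HEADER_FMT, b"SMFW", major, minor, patch,
                         len(firmware), signature)
    return header + firmware

def parse_package(blob):
    """Parse and sanity-check a package before signature verification."""
    magic, major, minor, patch, size, sig = struct.unpack_from(HEADER_FMT, blob)
    if magic != b"SMFW":
        raise ValueError("bad magic: not a firmware package")
    firmware = blob[HEADER_LEN:HEADER_LEN + size]
    if len(firmware) != size:
        raise ValueError("truncated package")
    return (major, minor, patch), sig, firmware
```

On the device, parsing runs first; the extracted signature is then checked against the image hash before any flash write occurs.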

Result:

  • Chain of trust from hardware root (Boot ROM) to application firmware
  • Every boot verifies firmware signature before execution
  • Physical access (JTAG) cannot bypass signature verification (Boot ROM is immutable)
  • Anti-rollback prevents downgrade attacks to vulnerable versions
  • Dual-bank update ensures meters remain operational even if update fails

Key Insight: Secure boot is only as strong as its root of trust. The Boot ROM must be immutable (burned into silicon) and contain the verification logic and public key hash. Everything above the Boot ROM can be updated, but only if signed by the trusted key.

26.6 Tradeoffs in Secure Boot Design

Tradeoff: Full Secure Boot vs Verified Boot

Option A: Full secure boot with hardware root of trust (Boot ROM verification, TPM/secure enclave, fused keys)

Option B: Verified boot with software-only signature checking (bootloader verifies firmware signature without hardware backing)

Decision Factors: Choose full secure boot for high-security applications (medical devices, industrial controllers, vehicles, payment terminals) where physical attacks are a concern and compliance requires hardware-backed security. Choose verified boot for cost-sensitive consumer IoT where software verification provides sufficient protection against remote attacks. Full secure boot adds $2-10 per device in hardware costs (secure element, TPM) but prevents key extraction via side-channel attacks and provides tamper evidence. Verified boot can be bypassed by attackers with physical access who can modify the bootloader itself.

Tradeoff: Aggressive Anti-Rollback vs Rollback Flexibility

Option A: Strict anti-rollback policy using OTP fuses that permanently block all previous firmware versions

Option B: Flexible rollback allowing reversion to previous N versions with time-limited validity windows

Decision Factors: Choose strict anti-rollback for security-critical devices (pacemakers, vehicle ECUs, industrial safety systems) where running vulnerable firmware poses unacceptable risk. Choose flexible rollback for consumer devices and development environments where a bad update might brick devices and user experience matters more than preventing sophisticated downgrade attacks. Strict anti-rollback cannot be undone (OTP fuses are permanent), so a single bad firmware release can brick entire device fleets. Flexible rollback reduces bricking risk but allows attackers to exploit known vulnerabilities by downgrading firmware. Consider hybrid approaches: strict anti-rollback for security patches, flexible rollback for feature updates.

26.7 Worked Example: Implementing Automatic Rollback for Smart Lock

Scenario: A smart lock manufacturer deploys 80,000 Bluetooth-enabled door locks in residential buildings. After a firmware update (v3.2.0) introduced a bug causing 2.3% of locks to fail Bluetooth pairing after reboot, customers were locked out of their homes. The company must implement automatic rollback to prevent future incidents.

Given:

  • Fleet size: 80,000 smart locks (ESP32-based, 4MB flash)
  • Firmware size: 1.2 MB per partition
  • Flash layout: Bootloader (64KB) + Partition A (1.5MB) + Partition B (1.5MB) + NVS (64KB)
  • Boot failure rate threshold: 0.5% (400 devices)
  • Average boot time: 3.2 seconds
  • Watchdog timeout: 30 seconds
  • Health check duration: 10 seconds post-boot

Steps:

  1. Design boot counter and health check mechanism:

    • Boot counter stored in NVS (non-volatile storage)
    • Counter increments on every boot attempt
    • Counter resets to 0 only after successful health check
    • Maximum boot attempts before rollback: 3
  2. Define health check criteria:

    • Bluetooth stack initialization: PASS/FAIL (critical)
    • Motor driver response: PASS/FAIL (critical)
    • NFC reader initialization: PASS/FAIL (non-critical)
    • Wi-Fi association (if configured): PASS/FAIL (non-critical)
    • Health check pass: All critical checks pass within 10 seconds
  3. Calculate failure detection time:

    • Boot attempt 1: 3.2s boot + 10s health check = 13.2s (fail)
    • Boot attempt 2: 3.2s boot + 10s health check = 13.2s (fail)
    • Boot attempt 3: 3.2s boot + 10s health check = 13.2s (fail, trigger rollback)
    • Rollback boot: 3.2s boot + 10s health check = 13.2s (pass)
    • Maximum time to recovery: 52.8 seconds
  4. Implement rollback state machine:

    State transitions:
    BOOT_NEW -> (health_pass) -> COMMIT_NEW
    BOOT_NEW -> (health_fail && attempts < 3) -> RETRY_NEW
    RETRY_NEW -> (health_pass) -> COMMIT_NEW
    RETRY_NEW -> (health_fail && attempts >= 3) -> ROLLBACK
    ROLLBACK -> (switch to partition A) -> BOOT_OLD
    BOOT_OLD -> (health_pass) -> SAFE_MODE
    SAFE_MODE -> (report to cloud) -> AWAIT_FIX
  5. Calculate fleet-wide rollback impact:
    • v3.2.0 failure rate: 2.3% = 1,840 devices
    • With automatic rollback: 1,840 devices recover in <60 seconds
    • Without rollback: 1,840 customers locked out (support tickets, locksmith calls)
    • Support cost avoided: 1,840 x $150 average incident cost = $276,000
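
The boot-counter logic and timing in the steps above can be condensed into a short sketch (the thresholds are taken from the example; function and constant names are illustrative):

```python
# Sketch of the boot-counter rollback decision, using the example's numbers.
MAX_ATTEMPTS = 3
BOOT_S, HEALTH_S = 3.2, 10.0   # seconds per boot and per health check

def run_boot_sequence(health_checks):
    """health_checks: one boolean per boot attempt of the NEW firmware.
    Returns (final_state, attempts_used)."""
    attempts = 0
    for healthy in health_checks:
        attempts += 1
        if healthy:
            return "COMMIT_NEW", attempts        # counter resets, update kept
        if attempts >= MAX_ATTEMPTS:
            return "ROLLBACK", attempts          # revert to known-good partition
    return "RETRY_NEW", attempts                 # still inside the retry budget

# Worst case: 3 failed attempts plus one rollback boot that passes
worst_case = MAX_ATTEMPTS * (BOOT_S + HEALTH_S) + (BOOT_S + HEALTH_S)
print(run_boot_sequence([False, False, False]))    # ('ROLLBACK', 3)
print(f"max time to recovery: {worst_case:.1f}s")  # max time to recovery: 52.8s
```

Note how a single transient failure ([False, True]) still commits the update; only three consecutive health-check failures trigger rollback.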

Result: The automatic rollback mechanism detects Bluetooth pairing failures within 3 boot attempts (52.8 seconds worst case) and reverts to the previous known-good firmware. The 2.3% of affected devices (1,840 locks) automatically recover without customer intervention.

Key Insight: Automatic rollback is about time to recovery, not just failure detection. A 3-attempt limit with 10-second health checks provides enough retry margin for transient failures while ensuring genuine firmware bugs trigger rollback within one minute.

26.8 Worked Example: Signature Verification Chain for Medical Device

Scenario: A medical device manufacturer produces 12,000 insulin pumps that receive firmware updates via Bluetooth from a companion smartphone app. FDA regulations (21 CFR Part 11) and the IEC 62443 industrial cybersecurity standard require cryptographic verification of all software updates.

Given:

  • Fleet size: 12,000 insulin pumps (ARM Cortex-M4, 256KB flash, 64KB RAM)
  • Firmware update size: 180 KB
  • Bluetooth transfer rate: 125 Kbps (BLE 4.2)
  • CPU clock: 64 MHz
  • Cryptographic library: mbedTLS (optimized for embedded)
  • Regulatory: FDA Class II, IEC 62443 SL2
  • Threat model: Malicious app, compromised smartphone, supply chain attack

Steps:

  1. Design three-level certificate hierarchy:

    Root CA (HSM-protected, offline)
      +-- Intermediate CA (Code Signing)
           +-- Firmware Signing Key (per-product)
    
    Key algorithms:
    - Root CA: Ed25519 (256-bit, fast verification)
    - Intermediate: Ed25519
    - Firmware key: Ed25519
    
    Certificate lifetimes:
    - Root: 20 years (never rotated)
    - Intermediate: 5 years (rotated with firmware)
    - Firmware: 2 years (per-version)
  2. Calculate signature verification time:

    • Ed25519 verification on Cortex-M4 @ 64MHz: ~12 ms
    • Three signatures to verify: Root, Intermediate, Firmware
    • Hash calculation (SHA-256, 180KB): ~45 ms
    • Total verification time: 3 x 12ms + 45ms = 81 ms
  3. Design update package structure:

    SIGNED_PACKAGE (~183 KB total):
    +------------------------------------------+
    | Header (256 bytes)                       |
    |   - Magic: 0x494E5355 ("INSU")           |
    |   - Version: 4.2.1                        |
    |   - Target HW: PM-300                     |
    |   - Min version: 4.0.0 (anti-rollback)   |
    |   - Firmware size: 180,224 bytes          |
    |   - SHA-256 hash: [32 bytes]              |
    +------------------------------------------+
    | Certificate chain (2 KB)                 |
    |   - Intermediate cert: 512 bytes          |
    |   - Firmware cert: 512 bytes              |
    |   - Signatures: 64 + 64 bytes             |
    +------------------------------------------+
    | Firmware image (180 KB)                  |
    |   - Encrypted with AES-256-GCM            |
    |   - IV: [12 bytes]                        |
    |   - Auth tag: [16 bytes]                  |
    +------------------------------------------+
    | Package signature (64 bytes)             |
    +------------------------------------------+
  4. Implement verification sequence:
    • Step 1: Verify package signature against firmware cert (12 ms)
    • Step 2: Verify firmware cert against intermediate cert (12 ms)
    • Step 3: Verify intermediate cert against root CA (embedded) (12 ms)
    • Step 4: Calculate SHA-256 hash of firmware image (45 ms)
    • Step 5: Compare hash with signed header value (0.1 ms)
    • Step 6: Check version >= min_version (anti-rollback) (0.1 ms)
    • Step 7: Decrypt firmware with device-specific key (95 ms)
    • Total verification: 176 ms (before writing to flash)
  5. Calculate security coverage:
    • Attack blocked: Unsigned firmware (signature check fails)
    • Attack blocked: Old firmware replay (anti-rollback check)
    • Attack blocked: Firmware for different product (target HW check)
    • Attack blocked: Tampered firmware (hash mismatch)
    • Attack blocked: Stolen signing key (cert expiry + revocation)
    • Remaining risk: Compromised HSM (mitigated by HSM security)
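
The verification sequence above can be sketched with raw Ed25519 keys standing in for full X.509 certificates (an illustrative simplification: a real implementation also parses certificates and checks expiry and revocation):

```python
# Sketch of the three-level verification chain with raw Ed25519 keys
# standing in for certificates. Names and layout are illustrative.
from cryptography.hazmat.primitives.asymmetric import ed25519
from cryptography.hazmat.primitives import serialization
from cryptography.exceptions import InvalidSignature
import hashlib
import os

def raw(pub):
    """Raw 32-byte encoding of an Ed25519 public key."""
    return pub.public_bytes(serialization.Encoding.Raw,
                            serialization.PublicFormat.Raw)

root = ed25519.Ed25519PrivateKey.generate()          # offline, HSM-protected
intermediate = ed25519.Ed25519PrivateKey.generate()  # code-signing CA
fw_key = ed25519.Ed25519PrivateKey.generate()        # per-product key

# Each level "certifies" the level below by signing its public key
cert_intermediate = root.sign(raw(intermediate.public_key()))
cert_fw = intermediate.sign(raw(fw_key.public_key()))

firmware = os.urandom(1024)                          # stand-in firmware image
fw_sig = fw_key.sign(hashlib.sha256(firmware).digest())

def verify_chain(root_pub, inter_pub, fw_pub, image, sig):
    """Walk the chain top-down; any broken link rejects the update."""
    try:
        root_pub.verify(cert_intermediate, raw(inter_pub))   # step 3
        inter_pub.verify(cert_fw, raw(fw_pub))               # step 2
        fw_pub.verify(sig, hashlib.sha256(image).digest())   # steps 1 + 4
        return True
    except InvalidSignature:
        return False

print(verify_chain(root.public_key(), intermediate.public_key(),
                   fw_key.public_key(), firmware, fw_sig))   # True
```

Flipping any byte of the firmware, or substituting any key in the chain, makes one of the three `verify` calls fail and the whole update is rejected.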

Result: The insulin pump verifies firmware authenticity in 176 ms using a three-level certificate chain with Ed25519 signatures. This verification runs entirely on-device before any flash writes occur, ensuring that malicious firmware cannot execute.

Key Insight: Medical device firmware security is about defense in depth with regulatory compliance. The 176 ms verification overhead is negligible compared to the roughly 11.5-second BLE transfer time for a 180 KB update. The critical design choice is using Ed25519 over RSA: at an equivalent 128-bit security level, Ed25519 uses 32-byte public keys and 64-byte signatures versus 256 bytes each for RSA-2048, shrinking both the update package and the key storage footprint on resource-constrained MCUs.

26.9 Worked Example: Secure OTA Update System for Connected Vehicles

Scenario: An automotive OEM operates a fleet of 50,000 connected vehicles with telematics control units (TCUs) that receive over-the-air firmware updates. The OEM must design a robust, secure OTA update system that ensures integrity, prevents attacks, and guarantees vehicle safety even if updates fail.

Given:

  • 50,000 vehicles with ARM-based TCUs (1GB flash, 512MB RAM)
  • Cellular connectivity (4G LTE) with average 10 Mbps bandwidth
  • Update sizes: 50MB (minor patches) to 500MB (major releases)
  • Safety requirement: Vehicle must remain drivable even if update fails
  • Regulatory: UN R155 (cybersecurity), UN R156 (software updates)
  • Threat model: Nation-state attackers, criminal hackers, insider threats

Key Design Elements:

  1. Signed Update Package Structure:
    • Outer envelope signed by Release Management key
    • Manifest with target vehicles, minimum version, install conditions
    • Per-component encryption with device-specific keys
  2. Staged Rollout Strategy:
    • Phase 1: Internal testing (100 employee vehicles, 7 days)
    • Phase 2: Early adopter (1,000 opt-in customers, 7 days)
    • Phase 3: Regional rollout (10,000 vehicles, 14 days)
    • Phase 4: Full fleet (remaining 39,000, rolling 10,000/week)
    • Automatic pause if rollback rate >0.1% or safety event detected
  3. Vehicle-Side Update Process:
    • Download: Background while driving, resume on interruption
    • Verify: Check signatures, version anti-rollback, hash integrity
    • Stage: Decrypt and write to inactive partition
    • User Consent: Display changelog, schedule installation
    • Install: Only when parked, battery >50%, switch boot partition
    • Validate: Run self-test, commit or auto-rollback
  4. A/B Partition Scheme for Atomic Updates:
    • Partition A: Currently active system
    • Partition B: Staging area for new firmware
    • Boot Config Block: Tracks active partition, boot count, status
    • Auto-rollback to previous partition if boot fails 3x
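
The A/B scheme and its Boot Config Block can be sketched as a small state holder (field names are illustrative; the 3-boot threshold mirrors the scheme described above):

```python
# Sketch of the Boot Config Block driving A/B updates with auto-rollback.
# Field names are illustrative; the 3-boot threshold matches the design above.
class BootConfig:
    MAX_BOOTS = 3

    def __init__(self):
        self.active = "A"       # currently committed partition
        self.staged = None      # partition holding a pending update
        self.boot_count = 0     # boots attempted on the staged partition

    def stage_update(self, partition):
        """Write the new image to the inactive partition and mark it staged."""
        self.staged = partition
        self.boot_count = 0

    def on_boot(self, health_ok):
        """Record one boot of the staged partition and decide what to do."""
        if self.staged is None:
            return f"boot {self.active} (no update pending)"
        self.boot_count += 1
        if health_ok:
            self.active, self.staged = self.staged, None   # atomic commit
            self.boot_count = 0
            return f"committed {self.active}"
        if self.boot_count >= self.MAX_BOOTS:
            self.staged, self.boot_count = None, 0         # auto-rollback
            return f"rollback to {self.active}"
        return f"retry {self.staged} ({self.boot_count}/{self.MAX_BOOTS})"
```

Because `active` only changes inside the commit branch, the vehicle always has a known-good partition to boot from, no matter when power is lost.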

Result:

  • Updates download in background while vehicle is in use
  • Installation only occurs when vehicle is safely parked
  • Atomic A/B updates mean the vehicle is ALWAYS bootable
  • Automatic rollback if new firmware fails to boot 3 times
  • Staged rollout catches issues before fleet-wide deployment

Key Insight: Vehicle OTA updates have a critical safety dimension. Key principles: (1) Never modify the running system—always stage to inactive partition; (2) Never leave the vehicle unbootable—A/B partitions guarantee this; (3) Automatic rollback is mandatory; (4) Staged rollout is not optional.

Objective: Simulate how an IoT device verifies firmware integrity using SHA-256 hashing before installation. Even a single-bit change in the firmware binary produces a completely different hash, preventing tampered code from executing.

import hashlib
import os

def compute_firmware_hash(firmware_data):
    """Compute SHA-256 hash of firmware binary, matching what the
    manufacturer publishes alongside each official release."""
    return hashlib.sha256(firmware_data).hexdigest()

def verify_firmware(firmware_data, expected_hash):
    """Device-side verification: compare computed hash against the
    manufacturer-published hash embedded in the signed update manifest."""
    actual_hash = compute_firmware_hash(firmware_data)
    match = actual_hash == expected_hash
    return match, actual_hash

# Simulate original firmware binary (200 KB)
original_firmware = os.urandom(204800)
official_hash = compute_firmware_hash(original_firmware)

print("=== Firmware Hash Verification ===")
print(f"  Firmware size: {len(original_firmware):,} bytes")
print(f"  Official SHA-256: {official_hash[:32]}...")

# Test 1: Verify unmodified firmware
valid, computed = verify_firmware(original_firmware, official_hash)
print(f"\n  Test 1 - Original firmware:")
print(f"    Computed: {computed[:32]}...")
print(f"    Result: {'PASS - Safe to install' if valid else 'FAIL - Rejected'}")

# Test 2: Attacker modifies a single byte (e.g., injects backdoor)
tampered = bytearray(original_firmware)
tampered[1024] = (tampered[1024] + 1) % 256  # Modify one byte
tampered = bytes(tampered)

valid, computed = verify_firmware(tampered, official_hash)
print(f"\n  Test 2 - One byte modified (offset 1024):")
print(f"    Computed: {computed[:32]}...")
print(f"    Result: {'PASS - Safe to install' if valid else 'FAIL - Tampered!'}")

# Test 3: Attacker appends malicious payload
appended = original_firmware + b"\x00" * 1024  # 1 KB payload
valid, computed = verify_firmware(appended, official_hash)
print(f"\n  Test 3 - Malicious payload appended:")
print(f"    Computed: {computed[:32]}...")
print(f"    Result: {'PASS - Safe to install' if valid else 'FAIL - Tampered!'}")

# Show the avalanche effect: tiny change -> completely different hash
print(f"\n  Avalanche effect (1-byte change):")
print(f"    Original:  {official_hash[:48]}")
print(f"    Tampered:  {compute_firmware_hash(tampered)[:48]}")
print(f"    Matching chars: "
      f"{sum(a==b for a,b in zip(official_hash, compute_firmware_hash(tampered)))}/64")

What to Observe:

  1. SHA-256 produces a 64-character hex string (256 bits) regardless of firmware size
  2. Changing even one byte out of 200,000 produces a completely different hash (avalanche effect)
  3. Appending data also changes the hash – attackers cannot add payloads undetected
  4. Hash verification is computationally cheap (~45 ms for 200 KB on a Cortex-M4) compared to full signature verification
  5. Hash alone does not prove authenticity (an attacker could publish a new hash) – that requires digital signatures (see next exercise)

Objective: Implement the full firmware signing workflow used in production IoT: the manufacturer signs firmware with ECDSA, and devices verify the signature before installation. ECDSA is preferred over RSA for embedded devices because 256-bit ECC provides 128-bit security with much smaller keys and signatures (a P-256 signature is roughly 70 bytes, versus 256 bytes for RSA-2048).

from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives import hashes
from cryptography.exceptions import InvalidSignature
import hashlib
import os
import time

# ── Manufacturer Side (build server with HSM) ──

# Generate manufacturer's signing key pair (done once, stored in HSM)
manufacturer_key = ec.generate_private_key(ec.SECP256R1())
manufacturer_public = manufacturer_key.public_key()

# Simulate firmware binary
firmware_v2 = os.urandom(204800)  # 200 KB firmware
firmware_hash = hashlib.sha256(firmware_v2).digest()

# Sign the firmware hash with manufacturer's private key
start = time.perf_counter()
signature = manufacturer_key.sign(firmware_hash, ec.ECDSA(hashes.SHA256()))
sign_time = (time.perf_counter() - start) * 1000

print("=== ECDSA Firmware Signing (Manufacturer) ===")
print(f"  Firmware size: {len(firmware_v2):,} bytes")
print(f"  SHA-256 hash: {firmware_hash.hex()[:32]}...")
print(f"  ECDSA signature: {signature.hex()[:32]}... ({len(signature)} bytes)")
print(f"  Signing time: {sign_time:.1f} ms")

# ── Device Side (ESP32 / Cortex-M4 with embedded public key) ──

def verify_firmware_update(fw_data, sig, public_key):
    """Device-side firmware verification:
    1. Hash the received firmware
    2. Verify the signature against manufacturer's public key
    3. Only install if verification passes"""
    fw_hash = hashlib.sha256(fw_data).digest()
    try:
        start = time.perf_counter()
        public_key.verify(sig, fw_hash, ec.ECDSA(hashes.SHA256()))
        verify_time = (time.perf_counter() - start) * 1000
        return True, f"VERIFIED in {verify_time:.1f}ms - safe to install"
    except InvalidSignature:
        return False, "REJECTED - signature invalid, firmware tampered"

# Test 1: Verify authentic firmware
print("\n=== Device-Side Verification ===")
ok, msg = verify_firmware_update(firmware_v2, signature, manufacturer_public)
print(f"  Authentic firmware: {msg}")

# Test 2: Attacker modifies firmware (e.g., inserts backdoor)
tampered_fw = bytearray(firmware_v2)
tampered_fw[0] = (tampered_fw[0] + 1) % 256
ok, msg = verify_firmware_update(bytes(tampered_fw), signature, manufacturer_public)
print(f"  Tampered firmware:  {msg}")

# Test 3: Attacker signs with different key (impersonation)
rogue_key = ec.generate_private_key(ec.SECP256R1())
rogue_hash = hashlib.sha256(firmware_v2).digest()
rogue_sig = rogue_key.sign(rogue_hash, ec.ECDSA(hashes.SHA256()))
ok, msg = verify_firmware_update(firmware_v2, rogue_sig, manufacturer_public)
print(f"  Rogue signature:    {msg}")

print(f"\n  ECDSA-P256 signature size: {len(signature)} bytes "
      f"(vs ~256 bytes for RSA-2048)")

What to Observe:

  1. ECDSA signatures are ~70 bytes compared to 256 bytes for RSA-2048 – saving bandwidth for OTA updates
  2. Only the manufacturer’s private key can create valid signatures; the public key embedded in the device can only verify
  3. Modifying even one byte of firmware invalidates the signature – the hash changes, breaking verification
  4. A rogue attacker with their own key pair cannot forge a valid signature without the manufacturer’s private key
  5. Verification is fast enough to run on microcontrollers (~12 ms on Cortex-M4 with hardware acceleration)

Objective: Simulate the OTP (One-Time Programmable) fuse-based anti-rollback mechanism that prevents attackers from downgrading firmware to older versions with known vulnerabilities. Once a fuse is burned, it cannot be unburned – providing hardware-enforced version floors.

class AntiRollbackSimulator:
    """Simulates OTP fuse-based anti-rollback protection.
    In real hardware, each 'fuse' is a physical circuit element
    that can be permanently blown (1) but never reset to (0)."""

    def __init__(self, num_fuses=16):
        self.fuses = [0] * num_fuses  # All fuses start unburned
        self.boot_log = []

    @property
    def minimum_version(self):
        """Count burned fuses = minimum allowed firmware version."""
        return sum(self.fuses)

    def burn_fuses_to_version(self, target_version):
        """Burn fuses up to target version. Irreversible."""
        current = self.minimum_version
        if target_version <= current:
            return  # Already at or above this version
        for i in range(current, min(target_version, len(self.fuses))):
            self.fuses[i] = 1  # Permanent -- cannot be unset

    def attempt_boot(self, firmware_version, firmware_name=""):
        """Simulate boot attempt with rollback check."""
        min_ver = self.minimum_version
        allowed = firmware_version >= min_ver

        result = {
            "firmware": firmware_name or f"v{firmware_version}.0.0",
            "fw_version": firmware_version,
            "min_required": min_ver,
            "allowed": allowed,
        }

        if allowed:
            # Successful boot -- burn fuses to new version
            self.burn_fuses_to_version(firmware_version)
            result["action"] = "BOOT OK -- fuses updated"
        else:
            result["action"] = "BOOT REJECTED -- rollback attempt"

        self.boot_log.append(result)
        return result

# ── Simulate a firmware lifecycle ──
sim = AntiRollbackSimulator(num_fuses=16)

print("=== Anti-Rollback Protection Simulation ===")
print(f"  OTP fuses: {len(sim.fuses)} available\n")

# Normal upgrade path
boot_sequence = [
    (1, "v1.0.0 (initial release)"),
    (2, "v2.0.0 (security patch)"),
    (3, "v3.0.0 (feature update)"),
    (5, "v5.0.0 (skip v4, major release)"),
    (3, "v3.0.0 (ATTACKER: rollback to vulnerable version)"),
    (4, "v4.0.0 (ATTACKER: try intermediate version)"),
    (5, "v5.0.0 (re-flash current version)"),
    (6, "v6.0.0 (legitimate update)"),
]

for fw_ver, fw_name in boot_sequence:
    result = sim.attempt_boot(fw_ver, fw_name)
    status = "OK" if result["allowed"] else "BLOCKED"
    fuse_state = ''.join(str(f) for f in sim.fuses[:8])
    print(f"  [{status:>7}] {fw_name:<50} "
          f"min_ver={result['min_required']} fuses=[{fuse_state}...]")

# Show why rollback is permanently blocked
print(f"\n  Final fuse state: "
      f"{''.join(str(f) for f in sim.fuses)}")
print(f"  Minimum bootable version: {sim.minimum_version}")
print(f"  Versions 1-{sim.minimum_version - 1} can NEVER boot again")
print(f"  Remaining fuse capacity: "
      f"{len(sim.fuses) - sim.minimum_version} more versions")

What to Observe:

  1. OTP fuses can only transition from 0 to 1, never back – this is a physical property of the silicon
  2. Each successful boot at a higher version burns additional fuses, raising the minimum version floor
  3. Attempting to boot older firmware is rejected before any code executes – the check happens in Boot ROM
  4. Skipping versions (v3 to v5) works correctly – fuses 3 and 4 are both burned
  5. Fuse capacity is finite (16 in this simulation, 32-256 in real chips) – plan your version numbering carefully
  6. A bad firmware release that burns a fuse cannot be “un-released” – test thoroughly before incrementing
Try It: OTP Fuse Anti-Rollback Visualizer

26.11 Secure Boot Lab: ESP32 Firmware Security Simulation

This hands-on lab provides an interactive simulation of secure boot concepts using an ESP32 microcontroller in the Wokwi simulator. You will explore boot verification, firmware integrity checking, rollback protection, and secure key storage through practical code examples.

What You Will Learn

In this lab, you will gain practical experience with:

  1. Boot Verification Process: Understand how devices verify firmware authenticity before execution
  2. Firmware Integrity Checking: Implement SHA-256 hash verification to detect tampering
  3. Rollback Protection: Learn how anti-rollback counters prevent downgrade attacks
  4. Secure Key Storage: Explore secure storage simulation for cryptographic keys
  5. Boot State Indicators: Visualize boot states through LED indicators and serial output
  6. Attack Simulation: See what happens when firmware is tampered with

Prerequisites: Basic Arduino/C++ syntax, understanding of secure boot concepts from this chapter

Estimated Time: 45-60 minutes

26.11.1 Lab Components

| Component | Purpose | Wokwi Part |
|-----------|---------|------------|
| ESP32 DevKit V1 | Main microcontroller simulating secure boot | board-esp32-devkit-c-v4 |
| Green LED | Indicates successful boot/verification | led |
| Red LED | Indicates boot failure/tampering detected | led |
| Yellow LED | Indicates verification in progress | led |
| Blue LED | Indicates secure key operations | led |
| White LED | Indicates rollback protection active | led |
| Push Button 1 | Trigger normal boot sequence | button |
| Push Button 2 | Simulate firmware tampering attack | button |
| Push Button 3 | Simulate rollback attack | button |
| Push Button 4 | Reset boot state | button |
| Resistors (5x 220 ohm) | Current limiting for LEDs | resistor |

26.11.2 Interactive Wokwi Simulator

Use the embedded simulator below to build and test your secure boot simulation. Wire the circuit as shown, paste the code, and click “Start Simulation”.

Simulator Tips
  • Wire first: Connect all components before pasting code
  • Paste code: Copy the complete code into the editor
  • Run: Click the green “Play” button to compile and run
  • Serial Monitor: View detailed boot process output in the Serial Monitor panel
  • Buttons: Use the on-screen buttons to simulate different boot scenarios
  • Save: Create a free Wokwi account to save your projects

26.11.3 Circuit Connections

Wire the circuit in Wokwi before entering the code:

ESP32 Pin Connections:
----------------------
GPIO 2  --> Green LED (+)  --> 220 ohm Resistor --> GND  (Boot Success)
GPIO 4  --> Red LED (+)    --> 220 ohm Resistor --> GND  (Boot Failure)
GPIO 5  --> Yellow LED (+) --> 220 ohm Resistor --> GND  (Verification)
GPIO 18 --> Blue LED (+)   --> 220 ohm Resistor --> GND  (Key Operations)
GPIO 19 --> White LED (+)  --> 220 ohm Resistor --> GND  (Rollback Protection)
GPIO 15 --> Button 1       --> GND  (Normal Boot)
GPIO 16 --> Button 2       --> GND  (Tamper Attack)
GPIO 17 --> Button 3       --> GND  (Rollback Attack)
GPIO 21 --> Button 4       --> GND  (Reset State)
Figure 26.2: Circuit diagram for the Secure Boot Lab showing ESP32 connections to status LEDs and control buttons.

26.11.4 Step-by-Step Lab Instructions

Follow these steps to complete the secure boot lab:

Step 1: Build the Circuit
  1. Open the Wokwi simulator above
  2. Add components from the parts panel:
    • 1x ESP32 DevKit V1
    • 5x LEDs (green, red, yellow, blue, white)
    • 5x 220 ohm resistors
    • 4x push buttons
  3. Wire according to the circuit diagram:
    • Connect LEDs through resistors to GPIO 2, 4, 5, 18, 19
    • Connect buttons to GPIO 15, 16, 17, 21 (other pin to GND)
  4. Verify all connections before proceeding
Step 2: Execute Normal Boot Sequence
  1. Press Button 1 (GPIO 15) to trigger a normal boot
  2. Observe in the Serial Monitor:
    • Power-on and ROM initialization
    • Bootloader hash computation and verification
    • Kernel hash computation and verification
    • Application hash and rollback check
  3. Watch the Yellow LED blink during each verification stage
  4. When boot completes successfully:
    • Green LED turns on (boot success)
    • White LED turns on (rollback protection active)
  5. Note how the anti-rollback counter increments
Step 3: Simulate Firmware Tampering Attack
  1. Press Button 2 (GPIO 16) to simulate tampering
  2. The simulation injects “malicious code” into the firmware
  3. Watch the Serial Monitor show:
    • Modified firmware data strings
    • Hash computation of tampered data
    • HASH MISMATCH detection
    • Detailed explanation of why boot was halted
  4. The Red LED blinks and stays on (boot failure)
  5. Device refuses to execute untrusted code!
Step 4: Simulate Rollback Attack
  1. Press Button 4 (GPIO 21) to reset the state first
  2. Execute a normal boot (Button 1) to update the anti-rollback counter
  3. Press Button 3 (GPIO 17) to simulate a rollback attack
  4. The simulation attempts to install “version 1.0.0” (old firmware)
  5. Observe:
    • Rollback protection check FAILS
    • Current version (1.0.0) < Minimum required version
    • Attack blocked before firmware can execute
  6. Red LED indicates the attack was prevented

26.11.5 Expected Outcomes

After completing this lab, you should be able to:

| Outcome | Verification |
|---------|--------------|
| Understand the chain of trust concept | Can explain how ROM verifies bootloader, bootloader verifies kernel, etc. |
| Explain hash-based integrity checking | Can describe how SHA-256 detects any modification to firmware |
| Describe rollback protection | Can explain why anti-rollback counters prevent downgrade attacks |
| Identify secure boot failure modes | Can list at least 3 ways secure boot can fail and what each means |
| Understand boot attempt limiting | Can explain why devices limit boot attempts and enter recovery mode |
| Visualize the secure boot process | Can draw a diagram showing verification flow from ROM to application |
Key Security Principles Demonstrated

This lab illustrates these essential secure boot principles:

  1. Immutable Root of Trust: The ROM code and embedded public key cannot be modified
  2. Chain of Trust: Each stage verifies the next before transferring control
  3. Fail-Secure Design: Any verification failure halts the boot process
  4. Anti-Rollback Protection: Prevents attackers from exploiting old vulnerabilities
  5. Cryptographic Verification: SHA-256 hashes and digital signatures ensure authenticity
  6. Defense in Depth: Multiple verification layers protect against different attack vectors
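Principles 2 and 3 can be sketched in a few lines of Python. This is a hash-only model for illustration (real secure boot verifies signatures over the hashes, as the chapter's earlier ECDSA example shows); the stage images are made-up placeholders:

```python
import hashlib

# Illustrative stage images; in real hardware the first verifier lives in immutable ROM.
stages = {"bootloader": b"BL v2 image", "kernel": b"kernel v2 image", "app": b"app v2 image"}

# Expected digests, recorded at signing time and shipped signed alongside the firmware.
expected = {name: hashlib.sha256(image).hexdigest() for name, image in stages.items()}

def boot(stages, expected):
    """Verify each stage before handing control to it; halt on the first mismatch."""
    for name, image in stages.items():
        if hashlib.sha256(image).hexdigest() != expected[name]:
            return f"HALT at {name}: hash mismatch"  # fail-secure: never run unverified code
    return "BOOT OK"

print(boot(stages, expected))                     # BOOT OK
stages["kernel"] = b"kernel v2 image + backdoor"  # tamper with one stage
print(boot(stages, expected))                     # HALT at kernel: hash mismatch
```

Because each stage is checked before control transfers, a single tampered byte anywhere in the chain stops the boot at that stage rather than after the malicious code has run.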

Real-world application: ESP32 devices in production use similar mechanisms with hardware-backed eFuse storage, making keys truly immutable. This simulation helps you understand the concepts before working with production-level security.


26.11.6 Further Exploration

To learn more about secure boot in production systems:

26.12 Knowledge Check

Explore how boot time, health check duration, and retry count affect worst-case recovery time for automatic rollback systems:
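If the interactive explorer is unavailable, a simple model can be computed by hand. The model below is an assumption for illustration: the device burns through every allowed boot attempt (each a full boot plus a failed health check) and then performs one final boot into the fallback partition:

```python
def worst_case_recovery_s(boot_s, health_check_s, max_attempts):
    """Worst-case seconds from the first bad boot to a healthy fallback boot
    (illustrative model, not a formula from this chapter)."""
    failed_cycles = max_attempts * (boot_s + health_check_s)
    return failed_cycles + boot_s  # plus one successful boot of the old partition

# Illustrative numbers: 8 s boot, 30 s health check, 3 attempts (the typical default)
print(worst_case_recovery_s(8, 30, 3))  # 122 seconds of downtime, worst case
```

Long health-check windows dominate this figure, so tightening the health check matters more for recovery time than shaving boot time.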

Data cost savings with delta updates over cellular networks:

\[\text{Cost}_{\text{total}} = N_{\text{devices}} \times S_{\text{update}} \times C_{\text{data}} \times F_{\text{updates/year}}\]

Use the interactive calculator below to explore how fleet size, firmware size, data costs, and delta update ratios affect total OTA update expenses:

Key insight: A/B partitioning is justified by preventing device bricking (avoiding truck rolls and support costs), not by bandwidth savings from delta updates. Design decisions must consider total cost of ownership – choose optimizations based on the primary risk, not minor cost reduction.
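The cost formula is easy to evaluate directly; the fleet size, image sizes, and per-MB data price below are illustrative assumptions:

```python
def annual_ota_cost(n_devices, update_mb, cost_per_mb, updates_per_year):
    """Cost_total = N_devices x S_update x C_data x F_updates/year."""
    return n_devices * update_mb * cost_per_mb * updates_per_year

full  = annual_ota_cost(10_000, 2.0, 0.10, 4)  # full 2.0 MB images
delta = annual_ota_cost(10_000, 0.3, 0.10, 4)  # deltas ~15% of full size
print(f"full: ${full:,.0f}/yr  delta: ${delta:,.0f}/yr")
```

Even with delta updates cutting data costs severalfold, the key insight still holds: a single bricked fleet costs far more than a year of full-image bandwidth.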

26.13 Concept Relationships

How Firmware Security Concepts Connect
| Core Concept | Foundation | Protects | Enables |
|--------------|------------|----------|---------|
| Secure Boot | Hardware root of trust (ROM) | Against malware at startup | Verified firmware execution |
| Code Signing | PKI, digital signatures | Against unauthorized firmware | OTA update authenticity |
| Anti-Rollback | OTP fuses, version counters | Against downgrade attacks | Security patch enforcement |
| A/B Partitions | Dual flash banks | Against update failures | Safe rollback capability |
| Health Checks | Post-boot verification | Against broken updates | Automatic recovery |

Dependency Chain: Hardware RoT → Secure Boot → Code Signing → Anti-Rollback → Safe OTA. Each mechanism builds on the previous to create defense-in-depth for firmware integrity.

26.14 See Also

Related Security Topics:

Industry Standards:

  • NIST SP 800-147B: BIOS Protection Guidelines for Servers
  • ISO/IEC 19678: BIOS protection guidelines (international adoption of NIST SP 800-147)
  • IEC 62443: Industrial communication networks security

Implementation Resources:

  • ESP-IDF Secure Boot documentation
  • Android Verified Boot (AVB) specification
  • Microsoft Azure Sphere security whitepaper

Common Pitfalls

TLS encrypts the firmware during transmission but does not prove the firmware itself is authentic if the update server is compromised. Sign firmware with a hardware-protected key and verify the signature on device in addition to using TLS for transmission.

Signing keys in server code are vulnerable to server compromise. Store production signing keys in a hardware security module (HSM) with access controls limiting who can initiate signing operations.

A power outage during firmware flashing will leave the device in a partially updated, non-bootable state if the bootloader does not implement an A/B partition scheme or fallback recovery. Test interrupted update scenarios explicitly.

Deploying a firmware update to all devices simultaneously risks bricking the entire fleet if the update contains a critical bug. Always stage updates to a small canary group, monitor for errors, then roll out to the full fleet.
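The canary-then-fleet process in the last pitfall can be sketched as a rollout plan; the wave fractions below are illustrative, not prescriptive:

```python
def plan_rollout(device_ids, waves=(0.01, 0.10, 1.0)):
    """Split a fleet into cumulative waves: 1% canary, then 10%, then everyone.
    Proceed to the next wave only after the previous one reports healthy."""
    plan, start = [], 0
    for fraction in waves:
        end = int(len(device_ids) * fraction)
        plan.append(device_ids[start:end])
        start = end
    return plan

fleet = [f"device-{i:04d}" for i in range(1000)]
for wave_num, wave in enumerate(plan_rollout(fleet), start=1):
    print(f"wave {wave_num}: {len(wave)} devices")
```

In practice the gate between waves would be an error-rate check against telemetry, so a bad update is caught while it affects only the canary group.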

26.15 Summary

This chapter covered firmware security and secure updates:

Secure Boot Chain:

  • Hardware root of trust (Boot ROM) is immutable foundation
  • Each stage verifies the next: ROM -> Bootloader -> Kernel -> Application
  • Cryptographic signatures (Ed25519/RSA) ensure authenticity

Code Signing:

  • Sign firmware with private key in HSM
  • Device verifies with embedded public key before execution
  • Certificate chains allow key rotation without changing root

Anti-Rollback Protection:

  • OTP fuses store minimum allowed firmware version
  • Prevents downgrade attacks to vulnerable versions
  • Cannot be reset—design firmware release process carefully

OTA Update Best Practices:

  • A/B partition scheme for atomic updates
  • Automatic rollback on boot failure (3 attempts typical)
  • Staged rollout to catch issues before fleet-wide deployment
  • Health checks verify critical functionality post-update
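The A/B-plus-rollback behavior in the first two bullets can be modeled with a small state machine. The three-attempt limit matches the typical value above; the rest is an illustrative sketch, not a production bootloader:

```python
class ABBootloader:
    """Minimal model of A/B slots with automatic rollback after repeated failures."""
    MAX_ATTEMPTS = 3

    def __init__(self):
        self.active, self.fallback = "A", "B"
        self.attempts = 0

    def boot(self, health_ok):
        """One boot cycle; health_ok is the result of post-update health checks."""
        self.attempts += 1
        if health_ok:
            self.attempts = 0  # stable boot resets the counter
            return f"boot {self.active}: OK"
        if self.attempts >= self.MAX_ATTEMPTS:
            # Attempts exhausted: swap slots so the known-good image runs next
            self.active, self.fallback = self.fallback, self.active
            self.attempts = 0
            return f"rollback: switching to slot {self.active}"
        return f"boot {self.active}: failed (attempt {self.attempts})"

bl = ABBootloader()
for _ in range(3):
    print(bl.boot(health_ok=False))  # bad update in slot A fails health checks
print(bl.boot(health_ok=True))       # old image in slot B boots fine
```

Because the swap is driven by a counter the bootloader persists across resets, recovery needs no network connectivity and no user intervention.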

26.16 What’s Next

| If you want to… | Read this |
|-----------------|-----------|
| Understand OTA update security mechanisms | Secure OTA Updates |
| Explore IoT protocol security for update communications | Secure Data IoT Protocols |
| Study software vulnerability management | Secure Data Software Vulnerabilities |
| Learn hardware-level protection for update keys | Hardware Vulnerabilities |
| Return to the security module overview | IoT Security Fundamentals |