8  HIL Testing for IoT

8.1 Learning Objectives

By the end of this chapter, you will be able to:

  • Architect HIL Systems: Design HIL test setups for IoT devices
  • Build Sensor Simulators: Create DAC-based sensor simulation for automated testing
  • Implement Failure Injection: Test device behavior under fault conditions
  • Integrate HIL with CI/CD: Automate HIL tests in continuous integration pipelines

In 60 Seconds

Hardware-in-the-Loop (HIL) testing connects real IoT device hardware to a test harness that simulates the external environment (sensors, actuators, network) under controlled, automated conditions. HIL testing validates actual hardware/firmware integration that software-only tests cannot — interrupt handling, SPI/I2C timing, power consumption under real RF conditions, and sensor signal processing with realistic signal noise. Automated HIL suites run on CI infrastructure to catch hardware-dependent regressions.

Testing and validation ensure your IoT device works correctly and reliably in the real world, not just on your workbench. Think of it like test-driving a car in rain, snow, and heavy traffic before buying it. Thorough testing catches problems before your devices are deployed to thousands of locations where fixing them becomes expensive and disruptive.

“Hardware-in-the-Loop testing is brilliant!” said Max the Microcontroller. “Instead of needing a real temperature chamber to test if I handle extreme cold correctly, a DAC board feeds me a fake sensor signal that looks exactly like minus 40 degrees. My firmware has no idea it is being tricked!”

Sammy the Sensor explained further. “A test computer controls a DAC – a Digital-to-Analog Converter – that generates exact voltage levels mimicking any sensor reading. Want to test what happens at 150% humidity? The DAC produces that voltage. Power failure during data transmission? The test rig cuts power at the exact right moment.”

Lila the LED described the automation. “The whole setup runs in a CI/CD pipeline. Every time a developer changes the firmware, the test computer automatically flashes the real microcontroller, simulates 100 different sensor scenarios through the DAC, checks the outputs, and reports pass or fail. All in minutes, no human needed.”

Bella the Battery highlighted failure injection. “The most valuable HIL tests simulate things going wrong – sensor disconnection, corrupted data, power brownouts, network timeouts. If Max handles these gracefully instead of crashing, the firmware is ready for the real world!”

8.2 Prerequisites

Before diving into this chapter, you should be familiar with:

Key Takeaway

In one sentence: HIL testing runs real firmware against simulated sensors, enabling automated validation of hardware-software interactions.

Remember this rule: If you can’t test it in CI, you can’t prevent regressions. HIL makes hardware testing as automated as software testing.


8.3 What is Hardware-in-the-Loop Testing?

HIL testing involves running your actual device (the Device Under Test, or DUT) while simulating the external world through controlled inputs and monitored outputs.

Figure 8.1: HIL test architecture: Test PC controls interface board, which simulates sensors and monitors device outputs

8.3.1 Why HIL Testing Matters

Testing Approach    Coverage        Speed       Cost        Repeatability
------------------  --------------  ----------  ----------  -------------
Unit Tests (host)   Logic only      Fast        Free        Perfect
Manual Testing      Full system     Slow        High        Poor
Field Testing       Real-world      Very slow   Very high   Impossible
HIL Testing         Firmware + HW   Medium      Medium      Excellent

HIL fills the gap: You can test firmware behavior under conditions that are dangerous, expensive, or impossible to create manually (e.g., sensor failure, extreme temperatures, network outages).

HIL testing cost-benefit is quantified by comparing test automation ROI against manual testing labor costs.

\[\text{Annual Savings} = (\text{Manual Test Hours} - \text{HIL Maintenance Hours}) \times \text{Labor Rate} - \text{HIL Setup Cost}\]

For a team running 10 regression tests weekly (each taking 2 hours manually) at $75/hour engineer cost:

\[ \begin{align} \text{Manual cost/year:} & \quad 10 \times 2 \times 52 \times 75 = \$78,000 \\ \text{HIL automation cost:} & \quad \$1,500\text{ (hardware)} + 40\text{ hrs} \times 75\text{ (setup)} = \$4,500 \\ \text{HIL maintenance/year:} & \quad 2\text{ hrs/week} \times 52 \times 75 = \$7,800 \end{align} \]

Net first-year savings: $78,000 - $4,500 - $7,800 = $65,700. After setup, the ongoing return is roughly 10:1 annually ($78,000 of manual testing avoided for $7,800 of maintenance). HIL automation pays for itself in 3-4 weeks for teams with frequent hardware testing needs.

Calculate the return on investment for your HIL testing setup:
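The savings formula above can be checked with a few lines of Python. The figures below are the ones used in this section's example (10 weekly tests of 2 hours each, 2 hours/week of maintenance, $75/hour, $4,500 setup):

```python
def hil_first_year_savings(manual_hours, maintenance_hours, labor_rate, setup_cost):
    """Net first-year savings = (manual hours - HIL maintenance hours) x rate - setup."""
    return (manual_hours - maintenance_hours) * labor_rate - setup_cost

manual_hours = 10 * 2 * 52    # 10 tests x 2 h, every week of the year
maintenance_hours = 2 * 52    # 2 h/week of HIL rig upkeep
savings = hil_first_year_savings(manual_hours, maintenance_hours, 75, 4500)
print(savings)  # 65700
```

Plug in your own team's numbers to see where the break-even point lands.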


8.4 HIL Hardware Setup

8.4.1 Minimum HIL Components

Component        Purpose                  Example Products                 Cost
---------------  -----------------------  -------------------------------  --------
Test Host        Run test scripts         Any PC/laptop                    Existing
Interface Board  Generate/read signals    Arduino Mega, Raspberry Pi Pico  $25-50
DAC Module       Simulate analog sensors  MCP4725 (12-bit)                 $5-10
ADC Module       Monitor DUT outputs      ADS1115 (16-bit)                 $10-15
Relay Module     Control power cycling    4-channel relay                  $10
Logic Analyzer   Debug protocol issues    Saleae Logic 8 (optional)        $500

8.4.2 Wiring Architecture

Test Host (USB) ────► Interface Board (Arduino Mega)
                              │
        ┌─────────────────────┼─────────────────────┐
        │                     │                     │
        ▼                     ▼                     ▼
     DAC (I2C)           GPIO Pins            Relay Module
        │                     │                     │
        ▼                     ▼                     ▼
  DUT Analog In        DUT Digital I/O       DUT Power Supply

8.4.3 Example: Simulating a Temperature Sensor (NTC Thermistor)

Real NTC thermistors output analog voltage based on temperature. Your HIL can simulate this:

# Runs on the test side: a CircuitPython interface board, or a
# Raspberry Pi using the Blinka compatibility layer
# MCP4725 DAC connected via I2C

import board
import busio
import adafruit_mcp4725
import math

i2c = busio.I2C(board.SCL, board.SDA)
dac = adafruit_mcp4725.MCP4725(i2c)

def simulate_temperature(celsius):
    """
    Simulate NTC thermistor voltage for given temperature.
    Assumes 10k NTC with 10k pull-up, 3.3V reference.
    """
    # Steinhart-Hart approximation
    T = celsius + 273.15
    B = 3950  # B-constant for typical NTC
    R25 = 10000  # Resistance at 25C
    R_pullup = 10000

    R = R25 * math.exp(B * (1/T - 1/298.15))
    voltage = 3.3 * R / (R + R_pullup)

    # Convert to DAC value (12-bit, 0-4095)
    dac_value = int((voltage / 3.3) * 4095)
    dac.raw_value = dac_value
    return voltage

See how temperature maps to voltage for HIL sensor simulation:
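The mapping can be explored without any hardware attached. The helper below repeats the beta-model math from simulate_temperature() as a pure function and sweeps the test points used later in this chapter:

```python
import math

def ntc_divider_voltage(celsius, beta=3950, r25=10_000, r_pullup=10_000, vref=3.3):
    """Voltage at the NTC/pull-up junction for a given temperature (beta model)."""
    t_kelvin = celsius + 273.15
    r_ntc = r25 * math.exp(beta * (1 / t_kelvin - 1 / 298.15))
    return vref * r_ntc / (r_ntc + r_pullup)

for temp in (-40, 0, 25, 50, 85):
    volts = ntc_divider_voltage(temp)
    print(f"{temp:>4} C -> {volts:.3f} V -> DAC {int(volts / 3.3 * 4095)}")
```

At 25 C the divider sits at exactly half scale (1.65 V, DAC code 2047) because the NTC equals the pull-up resistance there -- a handy sanity check when first wiring the rig.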


8.5 HIL Test Framework

8.5.1 Python Test Framework Structure

hil_tests/
├── conftest.py           # pytest fixtures
├── fixtures/
│   ├── interface.py      # Arduino communication
│   ├── sensors.py        # Sensor simulators
│   └── network.py        # Network emulator
├── tests/
│   ├── test_temperature.py
│   ├── test_wifi_recovery.py
│   └── test_power_cycle.py
└── requirements.txt

8.5.2 conftest.py - Shared Fixtures

import pytest
import serial
import time

class HILInterface:
    def __init__(self, port='/dev/ttyUSB0', baud=115200):
        self.serial = serial.Serial(port, baud, timeout=1)
        time.sleep(2)  # Wait for Arduino reset

    def send_command(self, cmd):
        self.serial.write(f"{cmd}\n".encode())
        return self.serial.readline().decode().strip()

    def set_temperature(self, celsius):
        return self.send_command(f"TEMP:{celsius:.1f}")

    def power_cycle_dut(self, off_seconds=2):
        self.send_command("RELAY:OFF")
        time.sleep(off_seconds)
        self.send_command("RELAY:ON")

    def read_dut_led(self, pin):
        return self.send_command(f"GPIO_READ:{pin}") == "HIGH"

@pytest.fixture(scope="session")
def hil():
    interface = HILInterface()
    yield interface
    interface.serial.close()

@pytest.fixture
def mqtt_monitor():
    """Monitor MQTT messages from DUT."""
    import paho.mqtt.client as mqtt
    messages = []

    def on_message(client, userdata, msg):
        messages.append({
            'topic': msg.topic,
            'payload': msg.payload.decode(),
            'timestamp': time.time()
        })

    client = mqtt.Client()
    client.on_message = on_message
    client.connect("localhost", 1883)
    client.subscribe("dut/#")
    client.loop_start()

    yield messages

    client.loop_stop()
    client.disconnect()

8.6 Writing HIL Test Cases

8.6.1 Temperature Sensor Tests

import pytest
import time

class TestTemperatureSensor:
    """Test DUT temperature reading and reporting behavior."""

    def test_normal_reading(self, hil, mqtt_monitor):
        """DUT should report correct temperature within tolerance."""
        # Simulate 25C
        hil.set_temperature(25.0)
        time.sleep(5)  # Wait for DUT to read and publish

        # Check MQTT message
        temp_msgs = [m for m in mqtt_monitor if 'temperature' in m['topic']]
        assert len(temp_msgs) > 0, "No temperature message received"

        reported_temp = float(temp_msgs[-1]['payload'])
        assert abs(reported_temp - 25.0) < 1.0, \
            f"Temperature {reported_temp}C not within 1C of expected 25C"

    def test_sensor_out_of_range(self, hil, mqtt_monitor):
        """DUT should flag invalid readings outside sensor range."""
        # Simulate impossible temperature (sensor failure)
        hil.set_temperature(-100.0)  # Below NTC range
        time.sleep(5)

        status_msgs = [m for m in mqtt_monitor if 'status' in m['topic']]
        assert any('sensor_error' in m['payload'] for m in status_msgs), \
            "DUT should report sensor error for out-of-range reading"

    @pytest.mark.parametrize("temp", [-40, 0, 25, 50, 85])
    def test_full_range(self, hil, mqtt_monitor, temp):
        """DUT should accurately read across full operating range."""
        hil.set_temperature(temp)
        time.sleep(5)

        temp_msgs = [m for m in mqtt_monitor if 'temperature' in m['topic']]
        assert temp_msgs, f"No temperature message received at {temp}C"

        reported = float(temp_msgs[-1]['payload'])
        assert abs(reported - temp) < 2.0, \
            f"At {temp}C, reported {reported}C (off by {abs(reported-temp):.1f}C)"
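For reference, the DUT-side computation these tests exercise is the inverse of the divider model: recover temperature from the measured ADC voltage. A standalone sketch, using the same beta-model constants as the simulator:

```python
import math

def divider_voltage_to_celsius(voltage, beta=3950, r25=10_000, r_pullup=10_000, vref=3.3):
    """Invert the NTC divider: measured junction voltage -> temperature in C."""
    r_ntc = r_pullup * voltage / (vref - voltage)        # solve the divider for R_ntc
    inv_t = 1 / 298.15 + math.log(r_ntc / r25) / beta    # invert the beta equation
    return 1 / inv_t - 273.15

print(round(divider_voltage_to_celsius(1.65), 2))  # 25.0
```

If the firmware under test uses a different linearization (lookup table, full Steinhart-Hart), the HIL tolerance bands above should be widened accordingly.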

8.6.2 Power Resilience Tests

class TestPowerResilience:
    """Test DUT behavior during power failures."""

    def test_clean_boot_after_power_loss(self, hil, mqtt_monitor):
        """DUT should boot cleanly and resume operation after power loss."""
        mqtt_monitor.clear()

        # Power cycle
        hil.power_cycle_dut(off_seconds=5)
        time.sleep(60)  # Allow full boot sequence

        # Verify operational
        assert len(mqtt_monitor) > 0, \
            "DUT should resume MQTT publishing after power cycle"

    def test_power_during_ota_update(self, hil, mqtt_monitor):
        """DUT should recover if power lost during OTA update."""
        # Trigger OTA update
        hil.send_command("TRIGGER_OTA")
        time.sleep(5)  # Wait for update to start

        # Pull power mid-update
        hil.power_cycle_dut(off_seconds=3)
        time.sleep(90)  # Allow recovery boot

        # Check DUT is operational (either old or new firmware)
        assert len(mqtt_monitor) > 0, \
            "DUT should recover to operational state after interrupted OTA"

    @pytest.mark.parametrize("off_duration", [0.1, 0.5, 1.0, 2.0, 5.0])
    def test_brownout_recovery(self, hil, mqtt_monitor, off_duration):
        """DUT should handle various brownout durations."""
        mqtt_monitor.clear()
        hil.power_cycle_dut(off_seconds=off_duration)
        time.sleep(60)

        assert len(mqtt_monitor) > 0, \
            f"DUT should recover from {off_duration}s power loss"

Try It: Power Cycle Timing Explorer

Explore how brownout duration, boot time, and test timeout interact in power resilience testing. A test fails when the total recovery time (power-off + boot + first-message delay) exceeds the test timeout.
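That pass/fail rule is simple enough to model directly. A minimal sketch (the names are illustrative, not part of the test framework above):

```python
def power_test_passes(off_seconds, boot_seconds, first_msg_delay, timeout):
    """A power-resilience test passes when off-time + boot + first publish fits the timeout."""
    return off_seconds + boot_seconds + first_msg_delay <= timeout

# 5 s outage, 12 s boot, 3 s until the first MQTT publish, 60 s test timeout
print(power_test_passes(5, 12, 3, 60))   # True
print(power_test_passes(5, 50, 10, 60))  # False
```

Budgeting the timeout this way up front avoids the flaky-test trap in Section 8.10: a timeout barely above the typical recovery time fails intermittently.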


8.7 Failure Injection Scenarios

8.7.1 Advanced HIL Scenarios

import time

STATUS_LED_PIN = 8  # interface-board pin wired to the DUT's status LED

def test_i2c_sensor_nack(hil):
    """DUT should handle I2C sensor not responding (NACK)."""
    hil.send_command("I2C_DISCONNECT:TEMP_SENSOR")
    time.sleep(10)

    # DUT should report error, not crash
    assert hil.read_dut_led(STATUS_LED_PIN), \
        "Status LED should indicate error"

def test_ntp_sync_failure(hil, mqtt_monitor):
    """DUT should handle NTP server unreachable."""
    hil.send_command("DNS_BLOCK:pool.ntp.org")
    time.sleep(3600)  # Soak for 1 hour -- reserve this test for nightly runs

    # Check timestamps are still monotonic
    timestamps = [m['timestamp'] for m in mqtt_monitor]
    assert all(t1 <= t2 for t1, t2 in zip(timestamps, timestamps[1:])), \
        "Timestamps should remain monotonic even without NTP"

8.8 Arduino Interface Firmware

// interface_board.ino
#include <Wire.h>
#include <Adafruit_MCP4725.h>

Adafruit_MCP4725 dac;
const int RELAY_PIN = 7;
const int DUT_LED_PIN = 8;

void setup() {
    Serial.begin(115200);
    dac.begin(0x62);
    pinMode(RELAY_PIN, OUTPUT);
    pinMode(DUT_LED_PIN, INPUT);
    digitalWrite(RELAY_PIN, HIGH);  // DUT power ON by default
}

void loop() {
    if (Serial.available()) {
        String cmd = Serial.readStringUntil('\n');
        handleCommand(cmd);
    }
}

void handleCommand(String cmd) {
    if (cmd.startsWith("TEMP:")) {
        float temp = cmd.substring(5).toFloat();
        setTemperature(temp);
        Serial.println("OK");
    }
    else if (cmd == "RELAY:OFF") {
        digitalWrite(RELAY_PIN, LOW);
        Serial.println("OK");
    }
    else if (cmd == "RELAY:ON") {
        digitalWrite(RELAY_PIN, HIGH);
        Serial.println("OK");
    }
    else if (cmd.startsWith("GPIO_READ:")) {
        int pin = cmd.substring(10).toInt();
        Serial.println(digitalRead(pin) == HIGH ? "HIGH" : "LOW");
    }
    else {
        Serial.println("ERROR:UNKNOWN_COMMAND");
    }
}

void setTemperature(float celsius) {
    // Convert temperature to DAC value for NTC simulation
    float T = celsius + 273.15;
    float B = 3950;
    float R25 = 10000;
    float R_pullup = 10000;

    float R = R25 * exp(B * (1/T - 1/298.15));
    float voltage = 3.3 * R / (R + R_pullup);
    int dacValue = (voltage / 3.3) * 4095;

    dac.setVoltage(dacValue, false);
}

Try It: HIL Serial Command Simulator

Simulate the serial command protocol between the test host and the Arduino interface board. Enter commands to see how the interface firmware parses and responds.
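A rough host-side model of that parser is useful for developing tests with no board attached. This sketch mirrors handleCommand() from interface_board.ino; the fake pin table is hypothetical, and like the firmware's toFloat(), any TEMP payload is accepted:

```python
def handle_command(cmd: str) -> str:
    """Model of the interface firmware's command parser, for offline test development."""
    fake_pins = {8: "HIGH"}  # pretend the DUT status LED pin currently reads HIGH
    if cmd.startswith("TEMP:"):
        return "OK"          # firmware's toFloat() accepts any payload (defaults to 0)
    if cmd in ("RELAY:OFF", "RELAY:ON"):
        return "OK"
    if cmd.startswith("GPIO_READ:"):
        return fake_pins.get(int(cmd[10:]), "LOW")
    return "ERROR:UNKNOWN_COMMAND"

print(handle_command("TEMP:25.0"))    # OK
print(handle_command("GPIO_READ:8"))  # HIGH
print(handle_command("BOGUS"))        # ERROR:UNKNOWN_COMMAND
```

Swapping this model in behind HILInterface.send_command() lets the pytest suite run end-to-end on a developer laptop before the rig exists.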


8.9 CI/CD Integration

8.9.1 GitHub Actions Workflow for HIL Tests

name: HIL Tests

on:
  push:
    branches: [main]
  schedule:
    - cron: '0 2 * * *'  # Nightly at 2 AM

jobs:
  hil-test:
    runs-on: [self-hosted, hil-runner]  # Physical machine with HIL setup
    steps:
      - uses: actions/checkout@v3

      - name: Flash DUT with latest firmware
        run: |
          pio run -e esp32dev
          pio run -e esp32dev -t upload

      - name: Run HIL Tests
        run: pytest hil_tests/ -v --tb=short --junitxml=hil-results.xml

      - name: Upload Test Results
        uses: actions/upload-artifact@v3
        with:
          name: hil-test-results
          path: hil-results.xml

Try It: CI/CD HIL Pipeline Time Estimator

Estimate how long your HIL test pipeline takes in CI and whether it fits within your deployment window. Adjust parameters to find the right balance between test coverage and pipeline speed.
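A back-of-envelope estimate is easy to script. The model below is illustrative: it assumes tests spread evenly across DUTs that run in parallel, plus fixed flash and checkout/upload overhead:

```python
import math

def pipeline_minutes(flash_min, test_count, min_per_test, parallel_duts, overhead_min=2):
    """Rough wall-clock time for a HIL CI job: flash, parallel test batches, overhead."""
    batches = math.ceil(test_count / parallel_duts)  # serial rounds of parallel tests
    return flash_min + batches * min_per_test + overhead_min

# 85 tests at ~1 min each across 6 DUTs, 4 min to flash, 2 min checkout/upload
print(pipeline_minutes(4, 85, 1, 6))  # 21
```

If the estimate exceeds your deployment window, the parallel_duts term is usually the cheapest lever: adding DUTs shrinks the batch count linearly.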


8.10 Common HIL Pitfalls and Solutions

Pitfall            Symptoms                     Solution
-----------------  ---------------------------  ------------------------------------------
Timing issues      Flaky tests                  Add explicit waits, use message-based sync
Ground loops       Noise in analog signals      Use optoisolators, common ground
USB disconnect     Interface board resets       Use powered USB hub, monitor connection
DUT boot timing    Tests fail on first message  Wait for “ready” message from DUT
DAC settling time  Wrong readings               Add 10ms delay after DAC write
Relay bounce       Multiple power cycles        Add debounce delay (50ms)

8.11 HIL Test Coverage Checklist

Sensor Simulation:
[ ] Normal operating range readings
[ ] Edge of range (min/max)
[ ] Out of range (sensor failure)
[ ] Noisy sensor (add random noise)
[ ] Disconnected sensor (I2C NACK / no response)

Network Scenarios:
[ ] Normal connectivity
[ ] Wi-Fi disconnect/reconnect
[ ] Cloud server unreachable
[ ] DNS failure
[ ] High latency (1000ms+)
[ ] Packet loss (10%, 50%)

Power Scenarios:
[ ] Clean boot
[ ] Power cycle (various durations)
[ ] Brownout (voltage dip)
[ ] Power during OTA update
[ ] Power during flash write

Environmental:
[ ] Temperature extremes (via thermal chamber)
[ ] Humidity (if applicable)
[ ] EMI (via signal injection)

Try It: HIL Test Coverage Scorer

Rate your HIL test coverage across the four key areas. The scorer calculates an overall readiness percentage and highlights gaps that need attention before production release.
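A sketch of such a scorer, assuming each checklist area reports a (items checked, items total) pair; the sample numbers below are illustrative:

```python
def coverage_score(areas):
    """Overall readiness: percentage of checklist items covered across all areas."""
    done = sum(c for c, _ in areas.values())
    total = sum(t for _, t in areas.values())
    return 100 * done / total

areas = {"sensor": (4, 5), "network": (3, 6), "power": (5, 5), "environmental": (1, 3)}
print(f"{coverage_score(areas):.0f}%")  # 68%
```

A weighted variant (e.g. power scenarios counting double for battery-backed products) is a one-line change if some failure modes matter more to your product than others.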


8.12 Worked Example: HIL Investment ROI for a Smart HVAC Company

Scenario: An HVAC manufacturer ships 3 firmware releases per quarter for their smart thermostat product line (4 thermostat models, 2 gateway models = 6 DUTs). Each release requires testing across all models. Calculate the ROI of building an automated HIL device farm versus continuing manual testing.

Current Manual Testing Process:

Test Phase                             Duration per DUT  DUTs  Total per Release
-------------------------------------  ----------------  ----  -----------------
Flash firmware + verify boot           15 min            6     1.5 hours
Sensor accuracy (5 temp points)        45 min            6     4.5 hours
Wi-Fi provisioning + cloud connect     30 min            6     3.0 hours
HVAC control sequence (heat/cool/fan)  60 min            6     6.0 hours
Power failure recovery (3 scenarios)   40 min            6     4.0 hours
OTA update + rollback                  30 min            6     3.0 hours
Regression tests (previous bugs)       90 min            6     9.0 hours
Total per release                                              31 hours

Annual manual testing cost:
  Releases per year: 12 (3 per quarter)
  Hours per release: 31
  Total hours: 372 hours/year
  Engineer cost: $65/hour (loaded rate)
  Annual cost: $24,180/year

Hidden costs of manual testing:
  Missed bugs (avg 2 per release reaching customers): 2 x 12 = 24 bugs/year
  Bug remediation cost: 24 x $500 (hotfix + deployment) = $12,000/year
  Customer returns from firmware bugs: 0.3% of 50,000 units = 150 returns
  Return cost: 150 x $45 (shipping + processing) = $6,750/year
  Support tickets from field failures: 800/year x $12/ticket = $9,600/year

Total cost of manual testing: $52,530/year

HIL Device Farm Investment:

Component                                Quantity   Unit Cost  Total
---------------------------------------  ---------  ---------  -------
Raspberry Pi 4 (test controllers)        6          $55        $330
MCP4725 DAC modules (sensor simulation)  12         $8         $96
ADS1115 ADC modules (output monitoring)  6          $12        $72
4-channel relay modules (power cycling)  6          $10        $60
Climate chamber (small, -10C to 60C)     1          $1,200     $1,200
Custom PCB interface boards              6          $35        $210
Wiring, enclosures, USB hubs             1 lot      $250       $250
Hardware subtotal                                              $2,218
Test framework development               160 hours  $65/hr     $10,400
Test case authoring (85 tests)           80 hours   $65/hr     $5,200
CI/CD integration                        20 hours   $65/hr     $1,300
Development subtotal                                           $16,900
Total investment                                               $19,118

Automated Testing Performance:

Test Phase                                 Automated Duration  Improvement
-----------------------------------------  ------------------  -----------
Flash + boot (parallel, all 6 DUTs)        4 min               22x faster
Sensor accuracy (automated DAC sweep)      8 min               34x faster
Wi-Fi provisioning (scripted)              6 min               30x faster
HVAC control sequence (programmatic)       15 min              24x faster
Power failure recovery (relay-controlled)  10 min              24x faster
OTA update + rollback (automated)          8 min               22x faster
Regression suite (85 tests, parallel)      12 min              45x faster
Total per release                          63 min              30x faster

Annual automated testing cost:
  Releases per year: 12
  Time per release: 63 min (automated, unattended)
  Engineer review time: 30 min per release (review results)
  Total engineer hours: 12 x 0.5h = 6 hours/year
  Annual labor cost: $390/year
  Hardware maintenance: $500/year
  Annual cost: $890/year

Bug detection improvement:
  Pre-release bug catch rate: 67% (manual) -> 94% (automated)
  Customer-facing bugs: 2/release -> 0.3/release
  Bug remediation reduced: 24 -> 4 bugs/year (savings: $10,000)
  Customer returns reduced: 150 -> 22/year (savings: $5,760)
  Support tickets reduced: 800 -> 120/year (savings: $8,160)

Total cost with HIL: $890 + $2,000 incidents + $990 returns
                     + $1,440 support = $5,320/year

ROI Summary:

Year 1:
  Investment: $19,118
  Savings: $52,530 - $5,320 = $47,210
  Year 1 ROI: ($47,210 - $19,118) / $19,118 = 147%
  Payback period: 4.9 months

Year 2+:
  Annual savings: $47,210 (no further investment)
  Cumulative 3-year savings: 3 x $47,210 - $19,118 = $122,512

Intangible benefits:
  - Engineers freed from 372 hours/year of manual testing
  - Same-day release confidence (63 min vs 31 hours)
  - 94% pre-release bug detection (vs 67% manual)
  - Nightly regression runs catch issues before they compound

Decision: The HIL investment pays for itself in 4.9 months. The primary value is not cost savings alone but the shift from 31-hour release cycles to 63-minute automated runs, enabling weekly releases instead of quarterly.
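The headline figures above can be recomputed in a few lines of Python (all inputs taken from the tables in this example):

```python
manual_labor = 12 * 31 * 65                       # releases x hours x loaded rate
hidden = 24 * 500 + 150 * 45 + 800 * 12           # bug fixes, returns, support tickets
manual_total = manual_labor + hidden              # $52,530

investment = 2_218 + 16_900                       # hardware + development = $19,118
hil_annual = 890 + 4 * 500 + 22 * 45 + 120 * 12   # run cost + residual incidents

savings = manual_total - hil_annual               # $47,210
roi_pct = round((savings - investment) / investment * 100)  # 147
payback_months = round(investment / savings * 12, 1)        # 4.9

print(manual_total, savings, roi_pct, payback_months)  # 52530 47210 147 4.9
```

Keeping the model as executable code makes it trivial to re-run when unit volumes, labor rates, or release cadence change.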

8.13 Knowledge Check


8.14 Concept Relationships

Builds on:

Relates to:

Leads to:

Part of:

  • Automated Validation Strategy: Bridges unit tests (pure software) and field tests (pure hardware)

8.15 See Also

HIL Hardware:

Commercial HIL Systems:

Software Tools:

Learning Resources:

8.16 Try It Yourself

Challenge: Build a minimal HIL test setup for an ESP32 temperature monitor using Arduino Mega as the interface board.

Required Hardware ($50 budget):

  • Arduino Mega 2560 ($25)
  • MCP4725 DAC breakout ($8)
  • 4-channel relay module ($8)
  • Jumper wires ($5)
  • ESP32 DevKit (you already have)

Your Task (3 hours):

  1. Wire the HIL Setup:

    • Connect Arduino Mega to PC via USB
    • Connect MCP4725 DAC to Arduino (I2C: SDA/SCL)
    • Connect DAC output to ESP32 ADC pin (simulate NTC thermistor)
    • Connect relay to ESP32 power rail (for power cycling)
  2. Flash Arduino Interface Firmware:

    • Use the provided interface_board.ino code
    • Implement commands: TEMP:25.0, RELAY:ON, RELAY:OFF
  3. Write 3 Python HIL Tests:

    def test_temperature_reading(hil):
        hil.set_temperature(25.0)
        time.sleep(5)
        # Verify ESP32 reported correct temp via serial/MQTT
    
    def test_power_cycle_recovery(hil):
        hil.power_cycle_dut(off_seconds=2)
        time.sleep(60)
        # Verify ESP32 boots and resumes operation
    
    def test_sensor_out_of_range(hil):
        hil.set_temperature(-100.0)  # Impossible temp
        time.sleep(5)
        # Verify ESP32 reports sensor error
  4. Run Tests and Document:

    • Execute all 3 tests
    • Screenshot serial output showing pass/fail
    • Measure test execution time

Deliverable:

  • Wiring diagram photo
  • Arduino firmware source
  • Python test code
  • Test execution log with results

Success Criteria:

  • All 3 tests run automatically with no manual intervention
  • Tests complete in <90 seconds total
  • Power cycle test reliably resets ESP32

Bonus Challenge:

  • Add a 4th test for sensor disconnect (pull DAC output LOW)
  • Integrate tests into GitHub Actions (mock HIL with stubs)

8.17 Summary

Hardware-in-the-Loop testing enables automated firmware validation:

  • HIL Architecture: Test PC + Interface Board + DUT enables controlled testing
  • Sensor Simulation: DACs simulate analog sensors for repeatable tests
  • Failure Injection: Test device behavior under power loss, sensor failure, network issues
  • CI/CD Integration: Run HIL tests nightly on self-hosted runners
  • Coverage: Use checklists to ensure comprehensive failure mode testing

8.18 Common Pitfalls

HIL test rigs that require physical intervention to recover from a failed device state (firmware crash, USB enumeration failure, device stuck in bootloader) cannot be used for unattended CI testing. Every HIL rig must have: hardware reset capability (GPIO-controlled power relay or dedicated reset line from CI controller), USB reconnection capability (USB hub with power switching), and JTAG/SWD debug interface for firmware recovery. A HIL rig that needs human reset after every crash provides no automation value.

HIL testing with unmodified production firmware limits observability: you can only observe external behavior (outputs, communication) without visibility into internal state. Maintain a test-enabled firmware build: add debug logging via ITM/RTT, expose internal state via diagnostic UART commands, implement fault injection interfaces (force sensor read failure on next call), and enable coverage instrumentation. Test builds should have the same code path as production, with added observability hooks.

HIL tests that require a human to: start the test, watch a serial terminal, and manually record pass/fail are not scalable beyond 10 test cases. Automate result collection: redirect device serial output to a log file, parse test result lines (pass/fail markers) with a script, and post results to the CI system as JUnit XML or pytest results. CI integration is the only way to achieve continuous HIL testing across a team of developers.

HIL rigs that simulate sensor inputs using DACs or signal generators require periodic calibration to ensure simulated signals match real-world values. A simulated 3.3V from a DAC that has drifted to 3.1V will cause the firmware under test to calculate incorrect sensor readings, potentially causing false test failures or, worse, passing tests with incorrect behavior. Calibrate HIL signal generators against a reference at the start of each test session, or implement automatic calibration using a known-good firmware baseline measurement.

8.19 What’s Next?

Continue your testing journey with these chapters:

Previous                             Current              Next
-----------------------------------  -------------------  ------------------------------
Integration Testing for IoT Systems  HIL Testing for IoT  Environmental & Physical Tests