Problem: CI/CD pipelines run on servers, but firmware must ultimately be validated on real, physical devices.
Solution: A device farm: racks of real devices wired into the CI/CD infrastructure so every build can be tested on hardware.
1569.4.1 Commercial Device Farms
| Service | Focus | Pricing |
|---|---|---|
| AWS Device Farm | Cloud-based testing on real devices | Pay per minute |
| Firebase Test Lab | Android/iOS app testing | Free tier available |
| Golioth Device Test Lab | IoT-specific testing | Enterprise |
1569.4.2 DIY Device Farm Setup
| Component | Purpose | Example |
|---|---|---|
| USB hubs | Connect multiple devices | 20-port powered USB hub |
| Power relays | Reboot devices remotely | USB-controlled relay board |
| UART adapters | Serial console access | FTDI FT232 (one per device) |
| Wi-Fi access point | Isolated test network | Raspberry Pi 4 as AP |
| Test automation server | Run tests in parallel | Jenkins on Ubuntu server |
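Two rows in this table, the power relay and the UART adapter, do most of the work in a DIY farm. A minimal sketch of how they combine is shown below, assuming pyserial is installed; the relay byte commands and the `/dev/ttyUSB*` port paths are illustrative and vary by relay board and host, so check your board's protocol before reusing them.

```python
# power_cycle.py -- minimal sketch: remote reboot via a serial-controlled
# relay, then watching the UART console for the boot banner.
import time
import serial  # pyserial

RELAY_PORT = "/dev/ttyUSB0"    # hypothetical: relay board's serial port
CONSOLE_PORT = "/dev/ttyUSB1"  # hypothetical: device's UART console

RELAY_OFF = bytes([0xA0, 0x01, 0x00, 0xA1])  # command bytes vary by board
RELAY_ON = bytes([0xA0, 0x01, 0x01, 0xA2])   # command bytes vary by board

def power_cycle(off_seconds=2):
    """Cut power to the device, wait, then restore it."""
    with serial.Serial(RELAY_PORT, 9600, timeout=1) as relay:
        relay.write(RELAY_OFF)
        time.sleep(off_seconds)
        relay.write(RELAY_ON)

def wait_for_boot_banner(banner=b"Boot complete", timeout=30):
    """Read the UART console until the firmware's boot banner appears."""
    deadline = time.monotonic() + timeout
    with serial.Serial(CONSOLE_PORT, 115200, timeout=1) as console:
        while time.monotonic() < deadline:
            if banner in console.readline():
                return True
    return False

if __name__ == "__main__":
    power_cycle()
    assert wait_for_boot_banner(), "Device did not boot after power cycle"
```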
1569.4.3 Device Farm Test Example
```python
# device_farm_test.py
import pytest
from device_farm import DeviceFarm

@pytest.fixture(scope="module")
def farm():
    return DeviceFarm(config="farm_config.yml")

def test_firmware_on_all_devices(farm):
    """Flash and test firmware on all available devices"""
    devices = farm.get_available_devices()  # Returns 10 ESP32 dev boards
    assert len(devices) >= 10, "Not enough devices in farm"

    results = []
    for device in devices:
        # Flash firmware
        farm.flash_device(device, "firmware.bin")
        # Reboot
        farm.reboot_device(device)
        # Run test suite
        test_result = farm.run_tests(device, timeout=300)
        results.append({
            'device_id': device.id,
            'passed': test_result.passed,
            'failed': test_result.failed
        })

    # All devices must pass
    failures = [r for r in results if r['failed'] > 0]
    assert len(failures) == 0, f"Devices failed: {failures}"
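The loop above exercises devices one at a time, so total runtime grows linearly with farm size. Since flashing and test runs are I/O-bound waits from the host's perspective, a thread pool parallelizes them cheaply. A sketch, assuming the same (hypothetical) DeviceFarm API as above:

```python
# Parallel variant of the device-farm test, one worker thread per device.
from concurrent.futures import ThreadPoolExecutor

def exercise_device(farm, device):
    """Flash, reboot, and test one device; return its result summary."""
    farm.flash_device(device, "firmware.bin")
    farm.reboot_device(device)
    result = farm.run_tests(device, timeout=300)
    return {'device_id': device.id, 'passed': result.passed, 'failed': result.failed}

def test_firmware_in_parallel(farm):
    devices = farm.get_available_devices()
    with ThreadPoolExecutor(max_workers=len(devices)) as pool:
        results = list(pool.map(lambda d: exercise_device(farm, d), devices))
    failures = [r for r in results if r['failed'] > 0]
    assert not failures, f"Devices failed: {failures}"
```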
1569.5 Test Metrics and Documentation
1569.5.1 Key Test Metrics
Track these metrics to measure test effectiveness:
| Metric | Formula | Target | Purpose |
|---|---|---|---|
| Code Coverage | (Lines executed / Total lines) × 100% | 80%+ | Ensure adequate test breadth |
| Defect Density | Bugs found / 1,000 lines of code | <5 per KLOC | Measure code quality |
| Mean Time to Detect (MTTD) | Average time from bug introduction to detection | <1 week | Measure test effectiveness |
| Test Pass Rate | (Passed tests / Total tests) × 100% | >95% | Identify flaky tests |
| Field Failure Rate | Failures in field / Devices deployed | <1% in first year | Validate pre-release testing |
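These formulas are simple enough to compute directly in a CI reporting step. A small sketch, using the numbers from the dashboard in the next section as illustrative inputs:

```python
# metrics.py -- sketch of the table's formulas; input numbers are illustrative.
def code_coverage(lines_executed, total_lines):
    return 100.0 * lines_executed / total_lines

def defect_density(bugs_found, total_loc):
    """Bugs per 1,000 lines of code (KLOC)."""
    return bugs_found / (total_loc / 1000)

def test_pass_rate(passed, total):
    return 100.0 * passed / total

def field_failure_rate(failures, deployed):
    return 100.0 * failures / deployed

# Cross-check against the v2.3.1 dashboard below:
assert round(defect_density(23, 15_000), 1) == 1.5       # target: <5/KLOC
assert round(test_pass_rate(84, 89)) == 94               # integration tests
assert round(field_failure_rate(47, 10_000), 2) == 0.47  # target: <1%
```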
1569.5.2 Metrics Dashboard Example
```
Firmware Version 2.3.1 - Test Metrics

Code Coverage:
  Unit tests:        87% (target: 80%)  PASS
  Integration tests: 42% (additional coverage)
  Total coverage:    92%

Defect Density:
  Total LOC:             15,000
  Bugs found in testing: 23
  Density: 1.5 bugs/KLOC  PASS (target: <5)

Test Execution:
  Unit tests:        1,247 tests, 1,247 passed (100%)  PASS
  Integration tests:    89 tests,    84 passed (94%)   WARNING
  End-to-end tests:     12 tests,    10 passed (83%)   FAIL

MTTD:
  Average: 4.2 days  PASS (target: <7 days)
  Longest: 18 days (memory leak in sleep mode)

Field Metrics (from v2.3.0):
  Deployed devices: 10,000
  Field failures:   47 (0.47%)  PASS
  Top failure: Wi-Fi reconnection timeout (18 devices)
```
1569.6 Requirements Traceability
Why traceability matters:

- Regulatory compliance: FDA and automotive (ISO 26262) processes require proof of testing
- Audit trail: understand why tests exist and what they validate
- Regression prevention: ensure tests cover all requirements
1569.6.1 Traceability Matrix
| Requirement ID | Requirement | Test ID | Test Type | Status |
|---|---|---|---|---|
| REQ-001 | Device boots in <5 s | UT-015, IT-003 | Unit, Integration | Pass |
| REQ-002 | Wi-Fi reconnects after dropout | IT-022, E2E-005 | Integration, E2E | Pass |
| REQ-003 | Battery life >2 years | IT-030, SOAK-001 | Integration, Field | Pending |
| REQ-004 | Firmware signed with RSA-2048 | ST-012 | Security | Pass |
| REQ-005 | Temperature range: -40 °C to +85 °C | ENV-001, ENV-002 | Environmental | Pass |
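A matrix like this stays current only if it is generated from the tests themselves. One way to do that with pytest is to tag tests with requirement IDs via a custom marker and dump the mapping at collection time. A sketch (the "requirement" marker name is our own convention, not a pytest built-in, so register it in pytest.ini; the collection hook is standard pytest API):

```python
# conftest.py -- sketch of machine-readable requirements traceability.
import csv

def pytest_collection_modifyitems(config, items):
    """After test collection, dump a requirement -> test mapping to CSV."""
    with open("traceability.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["requirement_id", "test_id"])
        for item in items:
            for marker in item.iter_markers(name="requirement"):
                for req_id in marker.args:
                    writer.writerow([req_id, item.nodeid])

# In a test module, tag each test with the requirements it validates:
#
#   @pytest.mark.requirement("REQ-002")
#   def test_wifi_reconnects_after_dropout():
#       ...
```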
1569.6.2 Traceability Tools
| Tool | Purpose | Integration |
|---|---|---|
| JIRA + Xray | Link requirements to test cases | CI/CD reporting |
| TestRail | Test management with requirements linking | Automation APIs |
| Polarion | Full ALM (Application Lifecycle Management) | Enterprise |
1569.7 Test Strategy Optimization
1569.7.1 Tiered Testing Approach
Not all tests should run on every commit:
| Tier | Trigger | Duration | Tests |
|---|---|---|---|
| Tier 1 (Commit) | Every commit | <10 min | Unit tests, lint, build |
| Tier 2 (PR) | Pull request | <30 min | Integration, security scan |
| Tier 3 (Nightly) | Scheduled | <4 hours | HIL, extended integration |
| Tier 4 (Release) | Release candidate | <24 hours | Soak test, full regression |
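This table maps naturally onto a small dispatcher that each CI job invokes with its tier name. A sketch, where the TEST_TIER environment variable and the pytest paths and marker names are our own illustrative choices rather than any CI system's convention:

```python
# tier_runner.py -- sketch of dispatching the tiers above from CI.
import os
import subprocess
import sys

# Hypothetical mapping of tier name -> test command.
TIERS = {
    "commit":  ["pytest", "tests/unit"],
    "pr":      ["pytest", "tests/unit", "tests/integration", "-m", "not hil"],
    "nightly": ["pytest", "tests/integration", "tests/hil"],
    "release": ["pytest", "tests"],  # full regression, soak tests included
}

if __name__ == "__main__":
    tier = os.environ.get("TEST_TIER", "commit")  # set by each CI job
    if tier not in TIERS:
        sys.exit(f"Unknown test tier: {tier}")
    sys.exit(subprocess.run(TIERS[tier]).returncode)
```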
1569.7.2 Test Selection Optimization
```python
# Intelligent test selection based on changed files
def select_tests(changed_files):
    tests_to_run = set()
    for file in changed_files:
        if file.startswith("src/wifi/"):
            tests_to_run.add("tests/unit/test_wifi.py")
            tests_to_run.add("tests/integration/test_wifi_connectivity.py")
        elif file.startswith("src/sensor/"):
            tests_to_run.add("tests/unit/test_sensor.py")
            tests_to_run.add("tests/integration/test_sensor_accuracy.py")
        elif file.startswith("src/mqtt/"):
            tests_to_run.add("tests/unit/test_mqtt.py")
            tests_to_run.add("tests/integration/test_mqtt_publish.py")
            tests_to_run.add("tests/integration/test_mqtt_subscribe.py")

    # Always run critical path tests
    tests_to_run.add("tests/unit/test_boot.py")
    tests_to_run.add("tests/integration/test_critical_path.py")

    return list(tests_to_run)
```
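Wiring select_tests() into CI is a matter of feeding it the commit's changed files and handing the result to the test runner. A usage sketch that continues from the function above, assuming git and pytest are on PATH; "origin/main" is an illustrative base branch:

```python
# Usage sketch: diff against main, pick affected tests, run them.
import subprocess

def changed_files_since(ref="origin/main"):
    out = subprocess.run(
        ["git", "diff", "--name-only", ref],
        capture_output=True, text=True, check=True,
    )
    return out.stdout.splitlines()

if __name__ == "__main__":
    tests = select_tests(changed_files_since())
    raise SystemExit(subprocess.run(["pytest", *tests]).returncode)
```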
1569.8 Knowledge Check
Question 1. Your IoT firmware CI/CD pipeline runs 350 unit tests on every commit (8 seconds), 80 integration tests nightly (45 minutes), and 25 HIL tests weekly (4 hours). A developer commits a Wi-Fi reconnection fix. The commit passes unit tests and is merged. The next day, nightly integration tests reveal the fix broke Zigbee fallback mode. Production deployment is blocked for 24 hours while the bug is fixed and retested. How should you restructure the testing pipeline?

a) Run all 455 tests on every commit to catch integration bugs immediately, even if CI takes 5+ hours.
b) Move critical integration tests (Wi-Fi, Zigbee, protocol switching) to run on every commit (15 min).
c) Keep the pipeline as-is; nightly integration testing is industry standard and 24-hour delays are acceptable.
d) Eliminate integration tests entirely; unit tests with mocks should catch all bugs if written properly.

Hint: Consider the cost of fast feedback (longer CI) vs. slow feedback (24-hour delays). What is the minimum set of tests needed to catch 80% of integration bugs?

Answer: (b). The optimal strategy is fast per-commit gates (unit tests at 8 s plus critical integration tests at ~15 min, 15-20 min total), comprehensive nightly gates (all 80 integration tests), and expensive weekly gates (HIL tests). The failure pattern reveals that Wi-Fi and Zigbee have runtime dependencies that unit tests cannot catch. Running all 455 tests per commit (a) creates 5-hour feedback loops that kill developer productivity. Accepting 24-hour delays (c) means a developer who commits at 10 a.m. discovers the failure the next morning, with context long lost. And unit tests with mocks (d) cannot catch integration bugs by definition; Wi-Fi and Zigbee sharing a radio resource is exactly what integration tests are designed to validate.

Question 2. Your smart thermostat project has 85 requirements tracked in JIRA. Your test suite has 420 test cases with 78% code coverage. In a regulatory audit (UL certification), the auditor asks: "How do you know requirement REQ-042 (must reconnect to Wi-Fi within 30 s after router reboot) is tested?" Your team searches for 2 hours but cannot definitively link any specific test to REQ-042. What is the root cause, and what is the solution?

a) The test coverage is only 78%; increase it to 100% and the requirement will be covered.
b) A requirements-to-tests traceability matrix is missing; tests exist but are not linked to requirements.
c) REQ-042 is untested; add a new integration test specifically for router reboot scenarios.
d) This is a documentation problem, not a testing problem; write better test case descriptions.

Hint: Think about the difference between "tests exist" and "I can prove which requirements each test validates."

Answer: (b). This is a requirements traceability failure. The test probably exists (78% coverage is decent), but you cannot prove it. The solution is to create a traceability matrix (e.g., REQ-042 -> IT-089, IT-090), add requirement IDs to test metadata, and generate automated reports. Code coverage (a) measures lines executed, not requirements validated; you can have 100% coverage and still never check the 30-second reconnection time. Adding a new test (c) is premature: establish traceability first, and you may discover three tests already cover REQ-042. Better descriptions (d) help but are not sufficient for regulatory compliance, which needs structured linking with automated reporting.
1569.9 Summary
Test automation enables quality at scale:
- CI/CD Pipelines: Automated build, test, and deployment on every commit
- Device Farms: Real hardware testing integrated into CI/CD
- Tiered Testing: Fast tests per-commit, comprehensive tests nightly/weekly
- Metrics: Track coverage, defect density, MTTD, and field failure rate
- Traceability: Link requirements to tests for regulatory compliance
1569.10 What's Next?
Continue your testing journey with these chapters: