28  Tech Selection & Energy

In 60 Seconds

Technology selection follows a constraint-first approach: start with power source (battery vs. harvested), then range (meters vs. km), then bandwidth (bytes vs. megabytes). Power budget formula: average current = (sleep% x sleep_current) + (active% x active_current). At 99.9% sleep (10 uA) and 0.1% active (20 mA), average is 0.03 mA, enabling 3+ years on a 2000 mAh battery. Outdoor solar harvesting provides 10-100 mW/cm2; indoor solar only 10-100 uW/cm2.

Minimum Viable Understanding
  • Technology selection is driven by constraints (battery life, range, data rate), not by maximizing specs – start with power source, then range, then bandwidth to pick the right protocol.
  • Power budgets determine battery life: calculate average current from duty cycle percentages (sleep%, active%, TX%) to predict how long a device will run.
  • Energy harvesting (solar, thermal, vibration, RF) can extend or eliminate battery replacement, with outdoor solar providing 10-100 mW/cm2 and indoor solar only 10-100 uW/cm2.

Bella the Battery was worried. “I only have so much energy to share!” she told the team.

Max the Microcontroller had an idea. “What if I sleep most of the time? I’ll set an alarm to wake up, quickly read Sammy the Sensor’s data, and then go right back to sleep!”

Sammy the Sensor agreed. “I only need a tiny moment to take a measurement. If Max sleeps 99.9% of the time and only wakes me up when needed, Bella’s energy could last for YEARS!”

Lila the LED chimed in, “And if we put a tiny solar panel on top of our treehouse, we can recharge Bella during the day – she might never run out!”

The lesson: IoT devices save energy by sleeping most of the time and only waking up briefly. It is like how you save phone battery by turning off the screen – except IoT devices are so good at it, they can last for years on a single battery!

28.1 Learning Objectives

By the end of this chapter, you will be able to:

  • Apply Selection Frameworks: Use constraint-first decision trees to select and justify appropriate communication technologies based on power, range, data rate, and cost requirements
  • Calculate Power Budgets: Compute average current consumption from duty cycle percentages to predict battery lifetime for multi-mode IoT devices
  • Design Energy Systems: Architect energy harvesting and battery systems for autonomous IoT devices with worst-case seasonal analysis
  • Evaluate Miniaturization Trends: Trace the historical progression of hardware miniaturization and assess its impact on IoT cost and capability
Key Concepts
  • Technology Selection Framework: Systematic approach to choosing communication protocols based on battery constraints, range requirements, and data rate needs
  • Duty Cycling: Energy optimization technique where devices sleep most of the time, waking briefly for sensing and transmission
  • Energy Harvesting: Collecting power from environmental sources (solar, thermal, vibration, RF) to extend or eliminate battery replacement
  • Power Budget: Analysis of energy consumption across sleep, sensing, and transmission modes to predict battery lifetime
  • Miniaturization: Progressive shrinking of electronic components enabling smaller, cheaper, more efficient IoT devices

28.2 Prerequisites

Before diving into this chapter, you should be familiar with:

  • Basic electrical quantities: current (mA, uA), battery capacity (mAh), and power (mW)
  • The communication protocols introduced in the previous chapter (Bluetooth LE, Zigbee, Wi-Fi, LoRaWAN, NB-IoT, cellular)

Chapter Position in Series

This is the third chapter in the Architectural Enablers series:

  1. IoT Evolution and Enablers Overview - History and convergence
  2. IoT Communications Technology - Protocols and network types
  3. Technology Selection and Energy Management (this chapter) - Decision frameworks
  4. Labs and Assessment - Hands-on practice

Think of choosing an IoT communication technology like picking the right vehicle for a trip:

| Trip Type | Vehicle | IoT Equivalent |
|---|---|---|
| Walk to school (short, simple) | Walking | Bluetooth LE – short range, very low power |
| Drive across town (medium) | Car | Wi-Fi – medium range, more power needed |
| Cross-country journey (long) | Train | LoRaWAN – long range, efficient for small messages |

The key question is always: How far does the data need to travel, and how long must the battery last? Start with those two answers and the right technology becomes clear!

How It Works: Constraint-First Technology Selection

Most engineers start by asking “Which technology is best?” – but IoT selection works backward from constraints, not forward from features.

Step 1 - Power Source Constraint: Start with the most limiting factor. Is the device battery-powered with a 5-year replacement interval? This immediately eliminates high-power protocols (Wi-Fi, LTE) and forces you toward ultra-low-power options (LoRa, Zigbee, NB-IoT). If mains-powered, skip to Step 2.

Step 2 - Range Constraint: What’s the maximum distance from device to gateway? < 100m suggests short-range (BLE, Zigbee). 100m-10km suggests mid-range (LoRaWAN, NB-IoT). >10km suggests wide-area cellular (LTE-M, 5G). Range eliminates options: BLE cannot cover a farm, and cellular is overkill for a 20-meter smart home.

Step 3 - Data Rate Constraint: How much data per transmission, and how often? Sending 20 bytes every hour = 480 bytes/day (suits LoRa). Streaming 1080p video = 3-8 Mbps (requires Wi-Fi or LTE). Low data rates (< 1 kbps) enable the most power-efficient protocols.

Step 4 - Cost Constraint: What’s the budget per device at target volume (1,000 units)? Cellular modules cost $15-30 + $1-5/month service. LoRa modules cost $8-15 + $0/month (private network). At 1,000 devices over 5 years, LoRa saves $60,000-300,000 in connectivity fees alone.

Why this order matters: Each step eliminates whole categories of options. Battery life eliminates Wi-Fi. Range eliminates BLE. By the time you reach cost, you’re comparing 2-3 viable options instead of 10+ protocols. This systematic approach prevents “We’ll use Wi-Fi because we know it” mistakes that kill projects during deployment.
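The four steps above amount to successive filtering. This can be sketched as a small Python function; the protocol figures below are illustrative ballpark values chosen for this sketch, not vendor specifications:

```python
# Constraint-first selection as successive filters (illustrative figures only).
PROTOCOLS = {
    # name: (battery_friendly, max_range_m, max_data_rate_bps, module_cost_usd)
    "Wi-Fi":   (False, 100,    100_000_000, 5),
    "BLE":     (True,  100,    1_000_000,   3),
    "Zigbee":  (True,  300,    250_000,     4),
    "LoRaWAN": (True,  10_000, 5_000,       10),
    "NB-IoT":  (True,  10_000, 60_000,      15),
    "LTE-M":   (False, 30_000, 1_000_000,   25),
}

def select(battery_powered, range_m, data_rate_bps):
    """Filter in the order: power source -> range -> data rate, then rank by cost."""
    names = list(PROTOCOLS)
    if battery_powered:                                               # Step 1: power
        names = [n for n in names if PROTOCOLS[n][0]]
    names = [n for n in names if PROTOCOLS[n][1] >= range_m]          # Step 2: range
    names = [n for n in names if PROTOCOLS[n][2] >= data_rate_bps]    # Step 3: data rate
    return sorted(names, key=lambda n: PROTOCOLS[n][3])               # Step 4: cost

# Battery-powered sensor, 1.2 km to gateway, tiny payloads:
print(select(battery_powered=True, range_m=1200, data_rate_bps=100))
# → ['LoRaWAN', 'NB-IoT']
```

Note how the battery constraint removes Wi-Fi and LTE-M before range is even considered, leaving only two candidates to compare on cost.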

28.3 Technology Selection Decision Framework

~12 min | Advanced | P04.C08.U10

Use this flowchart to select appropriate communication technologies:

IoT communication technology selection decision tree flowchart starting with system requirements definition, branching on battery power constraint (if battery-powered for >1 year prioritize long battery life, if mains-powered proceed to range), then range requirement (< 100m suggests Bluetooth LE/Zigbee/Thread, 100m-10km suggests LoRaWAN/Sigfox/NB-IoT, >10km suggests Cellular/LTE-M), and for mains-powered systems data rate requirement (<1 Mbps suggests Sub-GHz mesh networks, 1-100 Mbps suggests Wi-Fi 6/7 or Ethernet, >100 Mbps suggests Fiber or 5G), guiding systematic selection based on power budget, coverage area, and bandwidth needs
Figure 28.1: Decision tree for selecting IoT communication technology based on power source, range requirements, and data rate needs

28.4 Energy Management Guidelines

28.4.1 Power Consumption Tiers

Typical IoT device power consumption:

| Tier | Power Range | Battery Life (2000 mAh) | Example Applications |
|---|---|---|---|
| Ultra-Low | 1-100 uW | 10-20 years | Soil sensors, leak detectors |
| Low | 100 uW - 10 mW | 1-5 years | Weather stations, parking sensors |
| Medium | 10-100 mW | Weeks-months | Wearables, smart locks |
| High | 100 mW - 1 W | Days-weeks | Cameras, Wi-Fi devices |

28.4.2 Power Budget Calculation

Formula for average current:

Average Current = (Sleep% x Sleep_I) + (Active% x Active_I) + (TX% x TX_I)

Example: LoRa soil sensor

  • Sleep: 99.9% at 5 uA = 0.005 mA
  • Sensing: 0.05% at 5 mA = 0.0025 mA
  • Transmit: 0.05% at 100 mA = 0.05 mA
  • Average: 0.058 mA
  • Battery life (2000 mAh): 34,483 hours = 3.9 years

Duty cycle formula for multi-mode IoT devices:

\[ I_{\text{avg}} = \sum_{i=1}^{n} (d_i \times I_i) \]

Where \(d_i\) is duty cycle fraction for mode \(i\), \(I_i\) is current in that mode.

For the LoRa sensor (15-min cycle = 900 s; sensing and transmitting for 0.45 s each, i.e., 0.05% duty cycle per active mode, matching the figures above):

\[ I_{\text{avg}} = \left(\frac{899.1}{900} \times 0.005\right) + \left(\frac{0.45}{900} \times 5\right) + \left(\frac{0.45}{900} \times 100\right) \]

\[ = 0.005 + 0.0025 + 0.05 \approx 0.058 \text{ mA} \]

Battery life: \(L = C / I_{\text{avg}} = 2000 / 0.058 = 34{,}483\) hours \(\approx 3.9\) years.

Impact of polling frequency: Doubling TX frequency (7.5-min cycle) doubles the sensing and transmit contributions: \(I_{\text{avg}} \approx 0.11\) mA → \(L \approx 18{,}200\) hours \(\approx 2.1\) years (about a 47% reduction).
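The duty-cycle formula is easy to re-run for different intervals with a short script. This is a minimal sketch using the bullet-example figures above (0.05% duty per active mode, i.e., 0.45 s of a 900 s cycle); small differences from the rounded 0.058 mA figure are rounding only:

```python
def average_current_ma(modes):
    """I_avg = sum(d_i * I_i); modes = [(seconds_per_cycle, current_mA), ...]."""
    period = sum(t for t, _ in modes)
    return sum((t / period) * i for t, i in modes)

# LoRa soil sensor: 900 s cycle; sleep at 5 uA, sense 0.45 s at 5 mA, TX 0.45 s at 100 mA
modes = [(899.1, 0.005), (0.45, 5.0), (0.45, 100.0)]
i_avg = average_current_ma(modes)      # ~0.058 mA
hours = 2000 / i_avg                   # 2000 mAh battery
print(round(i_avg, 4), round(hours / 8760, 2))  # → 0.0575 3.97
```

Changing the cycle period or mode durations in `modes` immediately shows the battery-life impact of each design choice.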

28.4.3 Interactive: Power Budget Calculator

Use the sliders to explore how duty cycle choices affect IoT battery life:

28.5 IoT Evolution Timeline

~8 min | Foundational | P04.C08.U11

IoT technology evolution timeline showing seven key milestones from 1999 to 2025: 1999 RFID Technology and Auto-ID Labs establishing object identification, 2008 First IoT Wave with smart devices emerging, 2012 IPv6 Adoption enabling unlimited device addressing, 2015 LPWAN Networks with LoRa and Sigfox deployed for long-range low-power connectivity, 2018 5G and Edge Computing providing ultra-low latency infrastructure, 2020 AI at Edge with TinyML enabling inference on resource-constrained devices, and 2025 Massive IoT Deployments achieving billions of connected devices at scale
Figure 28.2: Timeline showing key milestones in IoT evolution from RFID technology in 1999 to massive IoT deployments in 2025

28.6 Communication Technology Selection Flowchart

Communication technology selection flowchart based on range requirements showing three-branch decision: short range (0-100m) leads to Bluetooth LE for personal area networks or Zigbee for home automation mesh networks, medium range (100m-1km) leads to Wi-Fi for high-bandwidth local area or Thread for low-power mesh, and long range (>1km) leads to LoRaWAN for city-scale deployments, NB-IoT for cellular infrastructure, or traditional Cellular for wide-area mobile connectivity, systematically matching protocol to coverage area needs
Figure 28.3: Flowchart showing communication technology selection based on range requirements with short-range, medium-range, and long-range options

28.7 Energy Harvesting Architecture

Energy harvesting architecture diagram showing four environmental energy sources (Solar Panel, Thermoelectric generator, Piezoelectric vibration harvester, and RF Harvesting antenna) all feeding into central Power Management Unit which regulates and conditions harvested energy, then charges Energy Storage (Battery or Capacitor) buffer, finally powering Microcontroller and Sensors subsystem, enabling autonomous operation without battery replacement by converting ambient energy into usable electrical power
Figure 28.4: Energy harvesting architecture showing multiple energy sources (solar, thermal, vibration, RF) feeding through power management to battery storage and microcontroller

28.7.1 Energy Harvesting Power Density

| Source | Typical Power | Best Application |
|---|---|---|
| Outdoor Solar | 10-100 mW/cm2 | Environmental sensors, agricultural |
| Indoor Solar | 10-100 uW/cm2 | Building sensors, retail tags |
| Vibration | 10-500 uW | Industrial machinery, bridges |
| Thermal | 10-50 uW/cm2 | Body heat, industrial processes |
| RF | 1-100 uW | Near wireless power transmitters |
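A quick feasibility check compares a device's average power draw against what a harvester can supply. The figures here are illustrative assumptions (a 5 cm2 cell, 50 uW/cm2 indoors, 10 mW/cm2 outdoors, a 3 V supply) and ignore conversion losses, which in practice can cost 20-50%:

```python
# Can a 5 cm^2 solar cell sustain the 0.058 mA LoRa sensor from Section 28.4.2?
device_uw  = 0.058 * 3.0 * 1000   # I x V: ~174 uW average draw at 3 V
indoor_uw  = 5 * 50               # 5 cm^2 at ~50 uW/cm^2 (mid-range indoor)
outdoor_uw = 5 * 10_000           # same cell outdoors at ~10 mW/cm^2
print(round(device_uw), indoor_uw, outdoor_uw)  # → 174 250 50000
```

Indoors the margin is thin (250 uW harvested vs. ~174 uW consumed), so indoor solar only works for ultra-low-power designs; outdoors the same cell has orders of magnitude of headroom.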

28.8 Miniaturization Trend

Hardware miniaturization trend timeline showing four eras of semiconductor evolution: 1970s-1980s with Large ICs featuring 100+ pin packages, 2000s achieving System-on-Chip SoC Integration combining multiple functions on single die, 2010s delivering Wearable MCUs in compact 5mm x 5mm packages, and 2020s onward reaching Sub-mm Sensors approaching dust-sized IoT devices, connected by arrows labeled Moore's Law, Integration, and Nano-scale showing progressive miniaturization driven by transistor scaling and manufacturing advances
Figure 28.5: Historical progression of hardware miniaturization from large integrated circuits in the 1970s to dust-sized IoT sensors in 2020s driven by Moore’s Law

28.9 Architecture Enablers Ecosystem

Architecture enablers ecosystem mind map showing central Architecture Enablers root branching into five primary categories: Miniaturization (SoC Integration, MEMS Sensors, Reduced Cost enabling compact affordable devices), Computing Power (Edge Processing, Low-power MCUs, AI Acceleration for local intelligence), Energy (Batteries, Harvesting, Power Management for autonomous operation), Communications (Short-range BLE, Long-range LoRa, Cellular NB-IoT for diverse connectivity), and Development (Platforms, Tools & IDEs, Cloud Services for rapid deployment), illustrating how five foundational enablers combine to make modern IoT systems possible
Figure 28.6: Mind map showing the five key architectural enablers (miniaturization, computing power, energy, communications, development) and their sub-components
Layered dependency diagram showing architectural enablers as a stack: Foundation layer at bottom with Miniaturization (SoC, MEMS, nano-scale components), middle layer with Computing Power (MCUs, edge processors) and Energy Management (batteries, harvesting, power conditioning) side by side, upper layer with Communications Infrastructure (wireless protocols, network stacks), and top layer with Development Platforms (cloud services, IDEs, frameworks) supporting Smart Applications (industrial, healthcare, smart cities), demonstrating how lower-level enablers provide the foundation that higher layers depend upon to create complete IoT solutions
Figure 28.7: Layered dependency diagram showing how architectural enablers stack to support IoT applications. Foundation layer (miniaturization) enables computing and energy management, which support communications infrastructure, accessible through development platforms that power smart applications.

28.10 Power Consumption Comparison

Power consumption comparison across IoT communication technologies showing three tiers: High Power category (1-10W) includes Wi-Fi Active at 500mW and LTE Cat-M at 800mW for high-bandwidth applications, Medium Power category (10-100mW) includes Bluetooth LE at 15mW and Zigbee at 30mW for moderate-range mesh networks, and Low Power category (<10mW) includes LoRa TX at 5mW for long-range and Sleep Mode at 1uW for idle state, demonstrating 500,000x power range across IoT protocols with clear trade-offs between range, data rate, and energy consumption
Figure 28.8: Power consumption comparison across IoT communication technologies showing high power (Wi-Fi, cellular), medium power (BLE, Zigbee), and low power (LoRa, sleep mode) tiers

28.11 Knowledge Check

Common Misconception: “More Power = Better IoT Device”

Myth: Higher processing power and data rates always result in better IoT systems.

Reality: IoT design is all about optimization for constraints, not maximizing specifications. A Wi-Fi-enabled device might have 100x the data rate of a LoRa device, but it’s useless for a remote agricultural sensor that needs 5-year battery life.

Why this matters:

  • Battery life trumps performance: A soil sensor transmitting 20 bytes/hour needs years of battery life, not megabit speeds
  • Cost at scale: Deploying 10,000 sensors? A $5 difference per device = $50,000 total cost difference
  • Network effects: Choosing cellular ($2/month/device) vs. LoRaWAN (private network, $0/month) = $240,000/year for 10,000 devices

The right approach: Start with constraints (range, battery life, cost), then select the minimum technology that meets requirements.

28.12 Worked Example: Smart Agriculture Technology Selection

Scenario: A vineyard in Napa Valley needs to monitor soil moisture, temperature, and humidity across 50 hectares. The nearest building with power and internet is 1.2 km from the farthest sensor location. Budget: $15,000 for hardware, $200/month for connectivity.

Step 1: Apply Constraint-First Selection

| Constraint | Requirement | Eliminates |
|---|---|---|
| Power source | Battery (no power lines in vineyard) | Wi-Fi, Ethernet |
| Battery life | >2 years (harvest cycle) | Cellular LTE (weeks), Wi-Fi (days) |
| Range | Up to 1.2 km to gateway | Bluetooth LE (100m), Zigbee (300m) |
| Data rate | 20 bytes every 15 min | Not a limiting factor |
| Cost per node | <$50 (300 sensors needed) | NB-IoT ($15/year subscription each) |

Result: LoRaWAN wins on all constraints.

Step 2: Power Budget Calculation

Using a LoRa module (SX1276) + soil sensor + ESP32 in deep sleep:

Sleep mode (99.95%):     10 uA x 0.9995 = 9.995 uA
Sensor reading (0.03%):  12 mA x 0.0003 = 3.6 uA
LoRa TX SF7 (0.02%):    120 mA x 0.0002 = 24 uA
                         ─────────────────────────
Average current:                           37.6 uA

Battery: 2x AA lithium (3000 mAh each, parallel = 6000 mAh)
Lifetime: 6000 mAh / 0.0376 mA = 159,574 hours = 18.2 years (theoretical)
Derated to 60% for self-discharge and temperature: 10.9 years
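The same budget can be reproduced in a few lines, using the duty-cycle fractions and currents from the calculation above (the 60% derating factor is the assumption stated there):

```python
def battery_life_years(capacity_mah, i_avg_ma, derate=1.0):
    """Battery life in years, optionally derated for self-discharge/temperature."""
    return derate * capacity_mah / i_avg_ma / 8760

# Duty-cycle fractions x currents (in uA): sleep, sensor read, LoRa TX at SF7
i_avg_ua = 0.9995 * 10 + 0.0003 * 12_000 + 0.0002 * 120_000
years = battery_life_years(6000, i_avg_ua / 1000, derate=0.60)
print(round(i_avg_ua, 1), round(years, 1))  # → 37.6 10.9
```

Note that the TX term dominates: the radio is active only 0.02% of the time yet contributes nearly two thirds of the average current.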

Step 3: Cost Comparison (5-year TCO)

| Solution | Hardware | Connectivity | 5-Year TCO |
|---|---|---|---|
| LoRaWAN (private gateway) | 300 nodes x $35 + 2 gateways x $150 = $10,800 | $0/month (private network) | $10,800 |
| NB-IoT | 300 nodes x $28 = $8,400 | $15/device/year x 300 = $4,500/year | $30,900 |
| Cellular (LTE-M) | 300 nodes x $42 = $12,600 | $2/device/month x 300 = $7,200/year | $48,600 |
| Wi-Fi mesh | 300 nodes x $18 + 300 solar panels x $25 = $12,900 | $50/month (internet) | $15,900 |

LoRaWAN saves $20,100 over NB-IoT and $37,800 over cellular across 5 years. The vineyard chose LoRaWAN with 2 gateways on the main building and a hillside barn.
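The TCO comparison above reduces to one formula: hardware plus five years of connectivity fees. A sketch using the table's figures:

```python
def tco_5yr(hardware_usd, yearly_connectivity_usd):
    """5-year total cost of ownership: up-front hardware + 5 years of fees."""
    return hardware_usd + 5 * yearly_connectivity_usd

lorawan = tco_5yr(300 * 35 + 2 * 150, 0)       # private gateways, no monthly fees
nb_iot  = tco_5yr(300 * 28, 15 * 300)          # $15/device/year subscription
lte_m   = tco_5yr(300 * 42, 2 * 12 * 300)      # $2/device/month cellular plan
print(lorawan, nb_iot, lte_m)  # → 10800 30900 48600
```

The pattern generalizes: recurring per-device fees compound with fleet size and deployment lifetime, so connectivity cost, not hardware cost, usually decides large deployments.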

Practice power budget calculation with a realistic scenario.

Scenario: Vineyard soil sensor with 3-year battery target (26,280 hours)

Components:

  • ESP32 microcontroller
  • Capacitive soil moisture sensor
  • LoRa SX1276 radio
  • 2x AA lithium batteries (3000 mAh each = 6000 mAh total)

Operating profile:

  • Wake every 15 minutes to read sensor
  • Transmit reading via LoRa
  • Go back to deep sleep

Your task: Calculate whether this meets the 3-year target.

Power measurements (you would measure these with a power profiler):

  • Deep sleep: 10 uA
  • Sensor reading (5 seconds): 15 mA
  • LoRa transmit (2 seconds at SF7): 120 mA
  • Active time per cycle: 7 seconds total
  • Cycle period: 900 seconds (15 minutes)

Calculation steps:

  1. Calculate duty cycle percentages: Active% = (7 / 900) = 0.78%, Sleep% = 99.22%
  2. Calculate average current contribution from each mode:
    • Sleep: 0.9922 × 0.010 mA = 0.0099 mA
    • Sensor: (5 / 900) × 15 mA = 0.083 mA
    • LoRa: (2 / 900) × 120 mA = 0.267 mA
  3. Total average: 0.0099 + 0.083 + 0.267 = 0.360 mA
  4. Battery life: 6000 mAh / 0.360 mA = 16,667 hours = 694 days = 1.9 years

Result: Falls short of 3-year target. What changes would you make to reach the goal? (Hints: longer transmit interval, lower spreading factor, eliminate sensor warm-up time, add solar panel)
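The calculation steps above can be verified, and the hinted fixes explored, with a short script. The 30-minute interval and 1 s warm-up below are illustrative choices from the hints, not the only valid fix:

```python
def avg_current_ma(period_s, sleep_ma, phases):
    """phases = [(seconds, mA), ...]; the remainder of the period is deep sleep."""
    active = sum(t for t, _ in phases)
    sleep_part = ((period_s - active) / period_s) * sleep_ma
    return sleep_part + sum((t / period_s) * i for t, i in phases)

# Baseline: 15-min cycle, 5 s sensor read at 15 mA, 2 s LoRa TX at 120 mA
base = avg_current_ma(900, 0.010, [(5, 15), (2, 120)])
print(round(6000 / base / 8760, 1))   # → 1.9 years: misses the 3-year target

# One hinted fix: 30-min interval plus trimming sensor warm-up to 1 s
fixed = avg_current_ma(1800, 0.010, [(1, 15), (2, 120)])
print(round(6000 / fixed / 8760, 1))  # → 4.5 years: target met
```

Doubling the interval alone roughly doubles battery life because the active-mode terms dominate; the sleep contribution barely changes.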

| Concept | Relationship | Connected Concept |
|---|---|---|
| Constraint-First Selection | Filters protocols by eliminating | High-Power / Short-Range / High-Cost Options – each constraint (battery, range, budget) removes whole categories |
| Duty Cycle Optimization | Extends battery life by maximizing | Sleep Time Percentage – raising sleep from 99% to 99.9% cuts awake time 10x, reducing average power by nearly 10x when active current dominates |
| Energy Harvesting | Eliminates battery replacement via | Environmental Energy Sources – solar (outdoor), thermal (industrial), vibration (machinery) converted to electrical power |
| LoRa Spreading Factor | Trades range for power/time via | SF7 to SF12 Range – SF12 reaches roughly 4x farther but takes about 32x longer on air, using about 32x more energy per message |
| Indoor vs Outdoor Solar | Differs by 1000x in power density via | Light Intensity – outdoor 100 mW/cm² vs indoor 0.1 mW/cm², making indoor solar viable only for ultra-low-power devices |
| Power Tier Selection | Matches device complexity to | Energy Budget – ultra-low (µW) for 10-year sensors, low (mW) for multi-year, medium (10s of mW) for weeks, high (W) for hours |

28.13 See Also

28.14 Chapter Summary

This chapter presented frameworks for selecting IoT technologies and managing energy:

  • Selection Framework: Use decision trees starting with power constraints, then range, then data rate
  • Power Budget Analysis: Calculate average current from duty cycle percentages to predict battery lifetime
  • Energy Harvesting: Solar, thermal, vibration, and RF sources can extend or eliminate battery replacement
  • Miniaturization Impact: Moore’s Law continues driving smaller, cheaper, more efficient IoT devices

These frameworks enable practical design decisions for real-world IoT deployments.

28.15 What’s Next

| Direction | Chapter | Focus |
|---|---|---|
| Next | Labs and Assessment | Hands-on labs for technology selection, energy system design, and exam preparation |
| Related | Architecture Enablers Review | Synthesis and production readiness review across all four enablers |
| Related | LoRaWAN Deep Dive | Spreading factor, adaptive data rate, and Class A/B/C power implications |

Common Pitfalls

Calculating battery life using only MCU sleep current (5 µA) while ignoring always-on peripherals (LDO quiescent: 50 µA, pull-up resistors: 100 µA, RTC module: 2 µA). Peripheral quiescent currents often dominate the sleep power budget and must be measured individually.

Computing battery life using peak active current (e.g., ESP32 peak: 500 mA) instead of average active current during typical operation (80–120 mA during Wi-Fi transmission). Peak current determines fuse selection; average current determines battery life.

Sizing a solar system for summer peak irradiance (1,000 W/m²) rather than winter minimum (100 W/m² at high latitudes). Size the battery to bridge consecutive cloudy days, not sunny averages — the system must survive worst-case energy input.
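A worst-case sizing check has two parts: does winter harvest cover the daily load, and can the battery bridge a run of cloudy days? All figures below (panel area, winter irradiance, sun hours, 7-day cloudy stretch) are illustrative assumptions for a sketch, not measured values:

```python
# Worst-case solar sizing sketch: winter harvest vs. load, plus cloudy-day reserve.
load_mwh_per_day = 0.058 * 3.0 * 24       # 0.058 mA at 3 V -> ~4.2 mWh/day
panel_cm2 = 10                            # assumed panel area
winter_mw_per_cm2 = 1.0                   # assumed winter output (~1% of summer peak)
sun_hours = 3                             # assumed usable winter sun hours/day
harvest_mwh_per_day = panel_cm2 * winter_mw_per_cm2 * sun_hours

cloudy_days = 7                           # consecutive zero-harvest days to survive
reserve_mah = cloudy_days * load_mwh_per_day / 3.0   # mWh -> mAh at 3 V
print(harvest_mwh_per_day > load_mwh_per_day, round(reserve_mah, 1))  # → True 9.7
```

Even this tiny load needs a battery reserve sized for the cloudy stretch, and the winter check, not the summer average, decides whether the panel is big enough.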

Selecting a low-power protocol based on advertised RX/TX current without accounting for total energy per message. Measure total energy per message (current × duration) across all protocol phases including startup, header exchange, and acknowledgment.