28 Technology Selection and Energy Management
Sensor Squad: The Great Power Race
Bella the Battery was worried. “I only have so much energy to share!” she told the team.
Max the Microcontroller had an idea. “What if I sleep most of the time? I’ll set an alarm to wake up, quickly read Sammy the Sensor’s data, and then go right back to sleep!”
Sammy the Sensor agreed. “I only need a tiny moment to take a measurement. If Max sleeps 99.9% of the time and only wakes me up when needed, Bella’s energy could last for YEARS!”
Lila the LED chimed in, “And if we put a tiny solar panel on top of our treehouse, we can recharge Bella during the day – she might never run out!”
The lesson: IoT devices save energy by sleeping most of the time and waking only briefly. It's like saving phone battery by turning off the screen, except IoT devices are so good at it that they can last for years on a single battery!
28.1 Learning Objectives
By the end of this chapter, you will be able to:
- Apply Selection Frameworks: Use constraint-first decision trees to select and justify appropriate communication technologies based on power, range, data rate, and cost requirements
- Calculate Power Budgets: Compute average current consumption from duty cycle percentages to predict battery lifetime for multi-mode IoT devices
- Design Energy Systems: Architect energy harvesting and battery systems for autonomous IoT devices with worst-case seasonal analysis
- Evaluate Miniaturization Trends: Trace the historical progression of hardware miniaturization and assess its impact on IoT cost and capability
Key Concepts
- Technology Selection Framework: Systematic approach to choosing communication protocols based on battery constraints, range requirements, and data rate needs
- Duty Cycling: Energy optimization technique where devices sleep most of the time, waking briefly for sensing and transmission
- Energy Harvesting: Collecting power from environmental sources (solar, thermal, vibration, RF) to extend or eliminate battery replacement
- Power Budget: Analysis of energy consumption across sleep, sensing, and transmission modes to predict battery lifetime
- Miniaturization: Progressive shrinking of electronic components enabling smaller, cheaper, more efficient IoT devices
28.2 Prerequisites
Before diving into this chapter, you should be familiar with:
- IoT Evolution and Enablers Overview: Understanding the four core enablers
- IoT Communications Technology: Knowledge of PAN/LAN/MAN/WAN protocols
Chapter Position in Series
This is the third chapter in the Architectural Enablers series:
- IoT Evolution and Enablers Overview - History and convergence
- IoT Communications Technology - Protocols and network types
- Technology Selection and Energy Management (this chapter) - Decision frameworks
- Labs and Assessment - Hands-on practice
For Beginners: How Do You Pick the Right Technology?
Think of choosing an IoT communication technology like picking the right vehicle for a trip:
| Trip Type | Vehicle | IoT Equivalent |
|---|---|---|
| Walk to school (short, simple) | Walking | Bluetooth LE – short range, very low power |
| Drive across town (medium) | Car | Wi-Fi – medium range, more power needed |
| Cross-country journey (long) | Train | LoRaWAN – long range, efficient for small messages |
The key question is always: How far does the data need to travel, and how long must the battery last? Start with those two answers and the right technology becomes clear!
How It Works: Constraint-First Technology Selection
Most engineers start by asking “Which technology is best?” – but IoT selection works backward from constraints, not forward from features.
Step 1 - Power Source Constraint: Start with the most limiting factor. Is the device battery-powered with a 5-year replacement interval? This immediately eliminates high-power protocols (Wi-Fi, LTE) and forces you toward ultra-low-power options (LoRa, Zigbee, NB-IoT). If mains-powered, skip to Step 2.
Step 2 - Range Constraint: What’s the maximum distance from device to gateway? < 100m suggests short-range (BLE, Zigbee). 100m-10km suggests mid-range (LoRaWAN, NB-IoT). >10km suggests wide-area cellular (LTE-M, 5G). Range eliminates options: BLE cannot cover a farm, and cellular is overkill for a 20-meter smart home.
Step 3 - Data Rate Constraint: How much data per transmission, and how often? Sending 20 bytes every hour = 480 bytes/day (suits LoRa). Streaming 1080p video = 3-8 Mbps (requires Wi-Fi or LTE). Low data rates (< 1 kbps) enable the most power-efficient protocols.
Step 4 - Cost Constraint: What’s the budget per device at target volume (1,000 units)? Cellular modules cost $15-30 + $1-5/month service. LoRa modules cost $8-15 + $0/month (private network). At 1,000 devices over 5 years, LoRa saves $60,000-300,000 in connectivity fees alone.
Why this order matters: Each step eliminates whole categories of options. Battery life eliminates Wi-Fi. Range eliminates BLE. By the time you reach cost, you’re comparing 2-3 viable options instead of 10+ protocols. This systematic approach prevents “We’ll use Wi-Fi because we know it” mistakes that kill projects during deployment.
28.3 Technology Selection Decision Framework
The selection flowchart applies the four constraint-first steps in order: power source, then range, then data rate, then cost.
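As a companion to the flowchart, here is a minimal sketch of the constraint-first filter in code. The protocol table, its range/rate/cost figures, and the `select` helper are illustrative assumptions loosely based on this chapter's examples, not vendor specifications.

```python
# Constraint-first protocol selection: each step eliminates whole categories.
# All figures below are illustrative assumptions, not vendor specifications.
CANDIDATES = {
    # name: (battery_friendly, max_range_m, max_kbps, module_cost_usd)
    "BLE":     (True,    100,    1000,  4),
    "Zigbee":  (True,    300,     250,  6),
    "Wi-Fi":   (False,   100,  100000, 10),
    "LoRaWAN": (True,  10000,       5, 12),
    "NB-IoT":  (True,  10000,      60, 20),
    "LTE-M":   (False, 30000,    1000, 25),
}

def select(battery_powered: bool, range_m: float, kbps: float, budget_usd: float):
    """Filter in constraint order: power -> range -> data rate -> cost."""
    viable = dict(CANDIDATES)
    if battery_powered:  # Step 1: battery life eliminates high-power protocols
        viable = {n: p for n, p in viable.items() if p[0]}
    viable = {n: p for n, p in viable.items() if p[1] >= range_m}     # Step 2
    viable = {n: p for n, p in viable.items() if p[2] >= kbps}        # Step 3
    viable = {n: p for n, p in viable.items() if p[3] <= budget_usd}  # Step 4
    return sorted(viable, key=lambda n: viable[n][3])  # cheapest first

# Battery device, 1.2 km to the gateway, tiny data rate, $15 module budget:
print(select(battery_powered=True, range_m=1200, kbps=0.1, budget_usd=15))
# -> ['LoRaWAN']
```

Each filter line mirrors one step of the framework: by the time cost is compared, only two or three candidates remain.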
28.4 Energy Management Guidelines
28.4.1 Power Consumption Tiers
Typical IoT device power consumption:
| Tier | Power Range | Battery Life (2000 mAh) | Example Applications |
|---|---|---|---|
| Ultra-Low | 1-100 µW | 10-20 years | Soil sensors, leak detectors |
| Low | 100 µW - 10 mW | 1-5 years | Weather stations, parking sensors |
| Medium | 10-100 mW | Weeks-months | Wearables, smart locks |
| High | 100 mW - 1 W | Days-weeks | Cameras, Wi-Fi devices |
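The lifetime column follows from converting average power to average current and dividing into capacity. A quick sketch, assuming a 3 V supply and ideal discharge (both assumptions; real cells derate):

```python
# Rough battery-life estimate for a power tier, assuming a 3 V supply and
# ideal discharge; real lifetimes are lower due to self-discharge and derating.
HOURS_PER_YEAR = 8766

def battery_life_years(avg_power_w: float, capacity_mah: float = 2000,
                       volts: float = 3.0) -> float:
    avg_current_ma = avg_power_w / volts * 1000  # P = V * I, so I = P / V (in mA)
    return capacity_mah / avg_current_ma / HOURS_PER_YEAR

print(f"{battery_life_years(50e-6):.1f} years")   # 50 uW, ultra-low tier: ~13.7
print(f"{battery_life_years(200e-6):.1f} years")  # 200 uW, low tier: ~3.4
```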
28.4.2 Power Budget Calculation
Formula for average current:
Average Current = (Sleep% × Sleep_I) + (Active% × Active_I) + (TX% × TX_I)
Example: LoRa soil sensor
- Sleep: 99.9% at 5 µA = 0.005 mA
- Sensing: 0.05% at 5 mA = 0.0025 mA
- Transmit: 0.05% at 100 mA = 0.05 mA
- Average: 0.058 mA
- Battery life (2000 mAh): 34,483 hours = 3.9 years
Putting Numbers to It
Duty cycle formula for multi-mode IoT devices:
\[ I_{\text{avg}} = \sum_{i=1}^{n} (d_i \times I_i) \]
Where \(d_i\) is duty cycle fraction for mode \(i\), \(I_i\) is current in that mode.
For the LoRa sensor (15-min cycle = 900 s, sense 0.45 s, TX 0.45 s, matching the 0.05% duty cycles above):
\[ I_{\text{avg}} = \left(\frac{899.1}{900} \times 0.005\right) + \left(\frac{0.45}{900} \times 5\right) + \left(\frac{0.45}{900} \times 100\right) \]
\[ = 0.005 + 0.0025 + 0.05 = 0.058 \text{ mA} \]
Battery life: \(L = C / I_{\text{avg}} = 2000 / 0.058 = 34{,}483\) hours \(\approx 3.9\) years.
Impact of polling frequency: Doubling TX frequency (7.5-min cycle) → \(I_{\text{avg}} \approx 0.11\) mA → \(L \approx 2.1\) years (roughly half the lifetime).
28.4.3 Power Budget Calculator
Explore how duty cycle choices affect IoT battery life:
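A minimal calculator sketch in Python, using the mode currents and durations from this section; the function and its names are illustrative, not from a specific library.

```python
# Duty-cycle power budget: I_avg = sum(d_i * I_i); lifetime = capacity / I_avg.
HOURS_PER_YEAR = 8766

def average_current_ma(cycle_s: float, modes: list[tuple[float, float]],
                       sleep_current_ma: float) -> float:
    """modes: (duration_s, current_mA) pairs for each active mode per cycle."""
    active_s = sum(d for d, _ in modes)
    i_avg = (cycle_s - active_s) / cycle_s * sleep_current_ma  # sleep share
    i_avg += sum(d / cycle_s * i for d, i in modes)            # active shares
    return i_avg

# LoRa soil sensor: 15-min cycle, 0.45 s sensing at 5 mA, 0.45 s TX at 100 mA
i_avg = average_current_ma(900, [(0.45, 5), (0.45, 100)], sleep_current_ma=0.005)
life_h = 2000 / i_avg  # 2000 mAh battery
print(f"I_avg = {i_avg:.3f} mA, life = {life_h:,.0f} h = {life_h / HOURS_PER_YEAR:.1f} years")
# -> I_avg = 0.057 mA, life = 34,786 h = 4.0 years
# (the text rounds I_avg up to 0.058 mA before dividing, giving 34,483 h)
```

Try halving `cycle_s` or raising `sleep_current_ma` to see how quickly lifetime collapses.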
28.5 IoT Evolution Timeline
28.6 Communication Technology Selection Flowchart
28.7 Energy Harvesting Architecture
28.7.1 Energy Harvesting Power Density
| Source | Typical Power | Best Application |
|---|---|---|
| Outdoor Solar | 10-100 mW/cm² | Environmental sensors, agricultural |
| Indoor Solar | 10-100 µW/cm² | Building sensors, retail tags |
| Vibration | 10-500 µW | Industrial machinery, bridges |
| Thermal | 10-50 µW/cm² | Body heat, industrial processes |
| RF | 1-100 µW | Near wireless power transmitters |
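To check whether a source in this table can power a device, compare harvested power (density × area × derating) with the device's average draw. A sketch with assumed round numbers for panel area, derating, and supply voltage:

```python
# Can a small solar panel sustain a sensor's average draw? Compare harvested
# power (density * area * derating) against the load. Numbers are assumed.

def harvested_power_uw(density_uw_per_cm2: float, area_cm2: float,
                       derating: float = 0.5) -> float:
    """Derating (assumed 50%) covers angle, dirt, and converter losses."""
    return density_uw_per_cm2 * area_cm2 * derating

avg_draw_uw = 0.058 * 3.0 * 1000          # 0.058 mA at 3 V -> 174 uW average
outdoor = harvested_power_uw(10_000, 4)   # 10 mW/cm2 (low end), 4 cm2 panel
indoor = harvested_power_uw(50, 4)        # 50 uW/cm2 indoor light, same panel

print(f"draw {avg_draw_uw:.0f} uW, outdoor {outdoor:.0f} uW, indoor {indoor:.0f} uW")
# Outdoor comfortably sustains the sensor; indoor falls short, so a battery
# (or a much lower duty cycle) is still required indoors.
```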
28.8 Miniaturization Trend
28.9 Architecture Enablers Ecosystem
28.10 Power Consumption Comparison
28.11 Knowledge Check
Common Misconception: “More Power = Better IoT Device”
Myth: Higher processing power and data rates always result in better IoT systems.
Reality: IoT design is all about optimization for constraints, not maximizing specifications. A Wi-Fi-enabled device might have 100x the data rate of a LoRa device, but it’s useless for a remote agricultural sensor that needs 5-year battery life.
Why this matters:
- Battery life trumps performance: A soil sensor transmitting 20 bytes/hour needs years of battery life, not megabit speeds
- Cost at scale: Deploying 10,000 sensors? A $5 difference per device = $50,000 total cost difference
- Network effects: Choosing cellular ($2/month/device) vs. LoRaWAN (private network, $0/month) = $240,000/year for 10,000 devices
The right approach: Start with constraints (range, battery life, cost), then select the minimum technology that meets requirements.
28.12 Worked Example: Smart Agriculture Technology Selection
Scenario: A vineyard in Napa Valley needs to monitor soil moisture, temperature, and humidity across 50 hectares. The nearest building with power and internet is 1.2 km from the farthest sensor location. Budget: $15,000 for hardware, $200/month for connectivity.
Step 1: Apply Constraint-First Selection
| Constraint | Requirement | Eliminates |
|---|---|---|
| Power source | Battery (no power lines in vineyard) | Wi-Fi, Ethernet |
| Battery life | >2 years (harvest cycle) | Cellular LTE (weeks), Wi-Fi (days) |
| Range | Up to 1.2 km to gateway | Bluetooth LE (100m), Zigbee (300m) |
| Data rate | 20 bytes every 15 min | Not a limiting factor |
| Cost per node | <$50 (300 sensors needed) | NB-IoT ($15/year subscription each) |
Result: LoRaWAN wins on all constraints.
Step 2: Power Budget Calculation
Using a LoRa module (SX1276) + soil sensor + ESP32 in deep sleep:
Sleep mode (99.95%): 10 µA × 0.9995 = 9.995 µA
Sensor reading (0.03%): 12 mA × 0.0003 = 3.6 µA
LoRa TX SF7 (0.02%): 120 mA × 0.0002 = 24 µA
─────────────────────────
Average current: 37.6 µA
Battery: 2× AA lithium (3000 mAh each, parallel = 6000 mAh)
Lifetime: 6000 mAh / 0.0376 mA = 159,574 hours = 18.2 years (theoretical)
Derated to 60% of theoretical for self-discharge and temperature: 10.9 years
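The same budget as a short script, including the derating step; the duty cycles and currents are the assumed figures above:

```python
# Vineyard node budget in script form: per-mode contributions, then derating.
contrib_sleep_ua = 0.9995 * 10        # 10 uA sleep current -> 9.995 uA
contrib_sense_ua = 0.0003 * 12_000    # 12 mA sensing, in uA -> 3.6 uA
contrib_tx_ua    = 0.0002 * 120_000   # 120 mA TX, in uA -> 24.0 uA
i_avg_ua = contrib_sleep_ua + contrib_sense_ua + contrib_tx_ua  # 37.6 uA

life_h = 6000 / (i_avg_ua / 1000)     # 6000 mAh battery, current in mA
print(f"{i_avg_ua:.1f} uA -> {life_h:,.0f} h = {life_h / 8766:.1f} years theoretical")
print(f"derated to 60%: {0.6 * life_h / 8766:.1f} years")
# -> 37.6 uA -> 159,596 h = 18.2 years theoretical; derated to 60%: 10.9 years
```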
Step 3: Cost Comparison (5-year TCO)
| Solution | Hardware | Connectivity | 5-Year TCO |
|---|---|---|---|
| LoRaWAN (private gateway) | 300 nodes × $35 + 2 gateways × $150 = $10,800 | $0/month (private network) | $10,800 |
| NB-IoT | 300 nodes × $28 = $8,400 | $15/device/year × 300 = $4,500/year | $30,900 |
| Cellular (LTE-M) | 300 nodes × $42 = $12,600 | $2/device/month × 300 = $7,200/year | $48,600 |
| Wi-Fi mesh | 300 nodes × $18 + 300 solar panels × $25 = $12,900 | $50/month (internet) | $15,900 |
LoRaWAN saves $20,100 over NB-IoT and $37,800 over cellular across 5 years. The vineyard chose LoRaWAN with 2 gateways on the main building and a hillside barn.
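The TCO column reduces to hardware plus years × annual connectivity; a sketch using the table's assumed prices:

```python
# 5-year TCO = hardware + years * annual connectivity, per the table above.
def tco_usd(hardware: float, annual_connectivity: float, years: int = 5) -> int:
    return round(hardware + years * annual_connectivity)

options = {
    "LoRaWAN": tco_usd(300 * 35 + 2 * 150, 0),         # private network, no fees
    "NB-IoT":  tco_usd(300 * 28, 15 * 300),            # $15/device/year
    "LTE-M":   tco_usd(300 * 42, 2 * 12 * 300),        # $2/device/month
    "Wi-Fi":   tco_usd(300 * 18 + 300 * 25, 50 * 12),  # panels + $50/month internet
}
for name, cost in sorted(options.items(), key=lambda kv: kv[1]):
    print(f"{name:8s} ${cost:,}")
# -> LoRaWAN $10,800 | Wi-Fi $15,900 | NB-IoT $30,900 | LTE-M $48,600
```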
Try It Yourself: Smart Agriculture Power Budget
Practice power budget calculation with a realistic scenario.
Scenario: Vineyard soil sensor with 3-year battery target (26,280 hours)
Components:
- ESP32 microcontroller
- Capacitive soil moisture sensor
- LoRa SX1276 radio
- 2x AA lithium batteries (3000 mAh each = 6000 mAh total)
Operating profile:
- Wake every 15 minutes to read sensor
- Transmit reading via LoRa
- Go back to deep sleep
Your task: Calculate whether this meets the 3-year target.
Power measurements (you would measure these with a power profiler):
- Deep sleep: 10 µA
- Sensor reading (5 seconds): 15 mA
- LoRa transmit (2 seconds at SF7): 120 mA
- Active time per cycle: 7 seconds total
- Cycle period: 900 seconds (15 minutes)
Calculation steps:
- Calculate duty cycle percentages: Active% = (7 / 900) = 0.78%, Sleep% = 99.22%
- Calculate average current contribution from each mode:
- Sleep: 0.9922 × 0.010 mA = 0.0099 mA
- Sensor: (5 / 900) × 15 mA = 0.083 mA
- LoRa: (2 / 900) × 120 mA = 0.267 mA
- Total average: 0.0099 + 0.083 + 0.267 = 0.360 mA
- Battery life: 6000 mAh / 0.360 mA = 16,667 hours = 694 days = 1.9 years
Result: Falls short of 3-year target. What changes would you make to reach the goal? (Hints: longer transmit interval, lower spreading factor, eliminate sensor warm-up time, add solar panel)
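One way to test the hints is to rerun the budget under modified profiles; the parameter changes below (30-minute interval, 1-second sensor read) are illustrative assumptions:

```python
# Rerun the vineyard budget under modified operating profiles.
def life_years(cycle_s: float, sense_s: float, sense_ma: float,
               tx_s: float, tx_ma: float,
               sleep_ma: float = 0.010, capacity_mah: float = 6000) -> float:
    i_avg = (cycle_s - sense_s - tx_s) / cycle_s * sleep_ma
    i_avg += sense_s / cycle_s * sense_ma + tx_s / cycle_s * tx_ma
    return capacity_mah / i_avg / 8766

print(f"baseline, 15-min cycle:  {life_years(900, 5, 15, 2, 120):.1f} years")   # ~1.9
print(f"30-min cycle:            {life_years(1800, 5, 15, 2, 120):.1f} years")  # ~3.7
print(f"1-s sensor read, 15 min: {life_years(900, 1, 15, 2, 120):.1f} years")   # ~2.3
```

Doubling the interval alone clears the 3-year target; trimming sensor warm-up helps but is not sufficient on its own.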
Concept Connections
| Concept | Relationship | Connected Concept |
|---|---|---|
| Constraint-First Selection | Filters protocols by eliminating | High-Power / Short-Range / High-Cost Options – each constraint (battery, range, budget) removes whole categories |
| Duty Cycle Optimization | Extends battery life by maximizing | Sleep Time Percentage – when active-mode current dominates, halving awake time (e.g., 99.8% → 99.9% sleep) roughly halves average current |
| Energy Harvesting | Eliminates battery replacement via | Environmental Energy Sources – solar (outdoor), thermal (industrial), vibration (machinery) converted to electrical power |
| LoRa Spreading Factor | Trades range for power/time via | SF7 to SF12 Range – SF12 reaches 4x farther but takes 64x longer to transmit, draining 64x more energy per message |
| Indoor vs Outdoor Solar | Differs by 1000x in power density via | Light Intensity – outdoor 100 mW/cm² vs indoor 0.1 mW/cm², making indoor solar only viable for ultra-low-power devices |
| Power Tier Selection | Matches device complexity to | Energy Budget – ultra-low (µW) for 10-year sensors, low (mW) for multi-year, medium (10s of mW) for weeks, high (W) for hours |
28.13 See Also
- IoT Communications Technology – Detailed protocol comparison (PAN/LAN/MAN/WAN) used in selection decision tree
- Power Management and Interfaces – ESP32 sleep modes, leakage current, and ADC power consumption details
- Edge Computing Patterns – When local processing reduces transmission frequency (and power consumption)
- Energy-Aware Considerations – Advanced duty cycling, context-aware sampling, and energy-quality tradeoffs
- LoRaWAN Deep Dive – Spreading factor, adaptive data rate, and Class A/B/C power implications
28.14 Chapter Summary
This chapter presented frameworks for selecting IoT technologies and managing energy:
- Selection Framework: Use decision trees starting with power constraints, then range, then data rate
- Power Budget Analysis: Calculate average current from duty cycle percentages to predict battery lifetime
- Energy Harvesting: Solar, thermal, vibration, and RF sources can extend or eliminate battery replacement
- Miniaturization Impact: Moore’s Law continues driving smaller, cheaper, more efficient IoT devices
These frameworks enable practical design decisions for real-world IoT deployments.
28.15 What’s Next
| Direction | Chapter | Focus |
|---|---|---|
| Next | Labs and Assessment | Hands-on labs for technology selection, energy system design, and exam preparation |
| Related | Architecture Enablers Review | Synthesis and production readiness review across all four enablers |
| Related | LoRaWAN Deep Dive | Spreading factor, adaptive data rate, and Class A/B/C power implications |
Common Pitfalls
1. Ignoring Quiescent Current of Always-On Peripherals
Calculating battery life using only MCU sleep current (5 µA) while ignoring always-on peripherals (LDO quiescent: 50 µA, pull-up resistors: 100 µA, RTC module: 2 µA). Peripheral quiescent currents often dominate the sleep power budget and must be measured individually.
2. Using Peak Datasheet Current for Budget Calculations
Computing battery life using peak active current (e.g., ESP32 peak: 500 mA) instead of average active current during typical operation (80–120 mA during Wi-Fi transmission). Peak current determines fuse selection; average current determines battery life.
3. Sizing Solar Harvesting for Average, Not Worst-Case Conditions
Sizing a solar system for summer peak irradiance (1,000 W/m²) rather than winter minimum (100 W/m² at high latitudes). Size the battery to bridge consecutive cloudy days, not sunny averages — the system must survive worst-case energy input.
4. Choosing Protocol by Data Rate Spec, Not Power Profile
Selecting a low-power protocol based on advertised RX/TX current without accounting for total energy per message. Measure total energy per message (current × duration) across all protocol phases including startup, header exchange, and acknowledgment.
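A sketch of the energy-per-message measurement from the last pitfall; the phase durations and currents are invented for illustration and should come from a power profiler in practice:

```python
# Energy per message = sum over protocol phases of current * duration * voltage.
# Phase figures are invented for illustration; measure your own radio with a
# power profiler.
PHASES = [
    # (phase, duration_s, current_mA)
    ("radio wakeup",     0.010,  8),
    ("channel setup",    0.005, 12),
    ("header + payload", 0.060, 95),
    ("RX ack window",    0.030, 40),
]

def energy_per_message_mj(phases, volts: float = 3.3) -> float:
    return sum(i_ma * d_s for _, d_s, i_ma in phases) * volts  # mA*s*V = mJ

print(f"{energy_per_message_mj(PHASES):.1f} mJ per message")  # -> 23.2 mJ
# A radio with a higher TX peak but shorter total airtime can still win;
# compare energy per message, not peak current.
```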