3  Energy Sources for IoT Devices

3.1 Learning Objectives

By the end of this chapter, you will be able to:

  • Compare different battery chemistries and their characteristics for IoT applications
  • Explain energy density, self-discharge, and temperature effects on batteries
  • Evaluate energy harvesting technologies and their practical power outputs
  • Select appropriate power sources based on deployment requirements
  • Calculate battery capacity requirements for target device lifetimes

In 60 Seconds

Selecting the right IoT power source requires matching battery chemistry to deployment conditions (temperature range, discharge rate, shelf life) and evaluating whether energy harvesting from solar, thermal, or vibration sources can supplement or replace the battery for the target deployment lifetime.

Key Concepts

  • Battery Chemistry Types: Primary (non-rechargeable) and secondary (rechargeable) battery types with different energy densities, self-discharge rates, and temperature characteristics relevant to IoT.
  • Lithium Primary (Li-SOCl2): Thionyl chloride lithium battery with the highest energy density among primary cells; ideal for long-life IoT deployments requiring 5-10 years of operation.
  • Lithium-Ion (Li-Ion): Rechargeable battery with high energy density and cycle life; used in devices with regular recharging capability (smartphones, GPS trackers).
  • Energy Density: Energy stored per unit volume (Wh/L) or mass (Wh/kg); higher density enables smaller or lighter batteries for same energy capacity.
  • Self-Discharge Rate: Gradual battery capacity loss without any load; Li-SOCl2 has excellent shelf life (about 1% per year); standard alkaline loses 2-3% per year.
  • Temperature Performance: Battery capacity reduction at low temperatures; critical for outdoor IoT deployments where −20°C temperatures can reduce capacity by 50-70%.
  • Capacity Planning: Process of calculating required battery capacity given average current consumption, duty cycle, deployment duration, and environmental temperature range.

3.2 For Beginners: Where Does IoT Energy Come From?

IoT devices get their energy from two main sources:

  1. Batteries - Store energy chemically, like tiny fuel tanks
  2. Energy Harvesters - Capture energy from the environment (solar, motion, heat)

Most IoT devices use batteries because they’re reliable and predictable. Energy harvesting is exciting but has limitations - you can’t always count on the sun shining or something vibrating!

Key question for any IoT project: How much energy do I need, and where will it come from?

“I come in many flavors,” said Bella the Battery proudly. “Alkaline batteries are cheap and great for easy-access indoor devices. Lithium coin cells last years and work in cold weather. Rechargeable lithium-ion batteries are perfect when you can add a solar panel. Choosing the right chemistry is as important as choosing the right sensor!”

Sammy the Sensor asked about extreme environments: “What if I am deployed in the Arctic where it is minus 40 degrees?” Bella answered, “Lithium thionyl chloride batteries work from minus 55 to plus 85 degrees Celsius and can last over 10 years! They have amazing energy density – 500 watt-hours per kilogram. But they are not rechargeable and cost more.”

Max the Microcontroller brought up harvesting: “For devices that need to last forever without maintenance, energy harvesting is the answer. Solar panels, vibration generators, and thermoelectric devices capture energy from the environment. A solar panel on a parking meter can power it indefinitely!” Lila the LED added, “The key formula is: harvested energy per day must exceed consumed energy per day. If it does, you have a perpetually powered device!”

3.3 Energy Sources

3.3.1 Battery Technologies

Batteries are the most common power source for IoT devices. Understanding battery chemistry characteristics is essential for proper device design.

Hierarchical diagram showing battery technologies branching into primary batteries (alkaline, lithium primary, lithium thionyl chloride) and secondary rechargeable batteries (Li-ion, LiFePO4, NiMH) with their key characteristics listed below each type
Figure 3.1: Overview of battery technologies for IoT applications

3.3.2 Battery Comparison Table

| Chemistry | Voltage | Energy Density | Self-Discharge | Temperature Range | Best For |
|---|---|---|---|---|---|
| Alkaline | 1.5V | 100-150 Wh/kg | 2-3%/year | -18°C to 55°C | Low-cost, moderate life |
| Lithium Primary | 3.0V | 250-300 Wh/kg | <1%/year | -40°C to 60°C | Long life, cold environments |
| Li Thionyl Chloride | 3.6V | 500+ Wh/kg | <1%/10 years | -55°C to 85°C | Extreme environments, 10+ years |
| Li-ion/LiPo | 3.7V | 150-250 Wh/kg | 3-5%/month | 0°C to 45°C | Rechargeable, frequent use |
| LiFePO4 | 3.2V | 90-120 Wh/kg | 2-3%/month | -20°C to 60°C | Safety-critical, high cycle life |
| NiMH | 1.2V | 60-80 Wh/kg | 15-30%/month | -20°C to 50°C | Low-cost rechargeable |

3.3.3 Primary Battery Selection Guide

When to use Alkaline (AA/AAA):

  • Indoor deployments with easy access
  • Cost-sensitive applications
  • Moderate temperature range (-18°C to 55°C)
  • 1-3 year target lifetime
  • Consumer-replaceable batteries preferred

When to use Lithium Primary (CR2032, CR123A):

  • Compact form factor required
  • Wide temperature operation (-40°C to 60°C)
  • 3-5 year target lifetime
  • Low self-discharge essential
  • Stable voltage preferred

When to use Lithium Thionyl Chloride (ER14505, ER34615):

  • Remote/inaccessible deployments
  • Extreme temperatures (-55°C to 85°C)
  • 10+ year target lifetime
  • Industrial/utility applications
  • Higher initial cost acceptable

Tradeoff: Lithium Primary vs Lithium Thionyl Chloride Batteries

Lithium Primary (CR series): Lower cost, widely available, moderate energy density. Good for consumer IoT with 3-5 year targets. Cannot handle high pulse currents well.

Lithium Thionyl Chloride (ER series): Highest energy density, lowest self-discharge (<1%/decade), extreme temperature range. Essential for industrial/utility “deploy and forget” applications. Higher cost, requires careful circuit design for initial voltage delay (passivation), not rechargeable.

Choose Li-SOCl2 when: deployment is permanent (>7 years), temperature extremes expected, replacement is expensive/impossible.

3.3.4 Secondary (Rechargeable) Battery Selection

Lithium-ion/Lithium Polymer:

  • Best for devices with charging infrastructure
  • High energy density, no memory effect
  • 500-1000 charge cycles typical
  • Requires protection circuit (BMS)
  • Not suitable for extreme temperatures

Lithium Iron Phosphate (LiFePO4):

  • Inherently safe chemistry (no thermal runaway)
  • 2000+ charge cycles
  • Lower energy density than Li-ion
  • Wider temperature range than Li-ion
  • Ideal for solar + battery systems

Nickel Metal Hydride (NiMH):

  • Low cost, widely available
  • High self-discharge (use low-self-discharge variants)
  • No toxic materials (easier disposal)
  • Good for frequently charged devices

Tradeoff: Solar Harvesting vs Thermoelectric Harvesting

| Factor | Solar Harvesting | Thermoelectric (TEG) |
|---|---|---|
| Power Output | 10-200 mW/cm² (outdoor) | 0.1-5 mW/cm² (10°C ΔT) |
| Availability | Day only, weather dependent | Continuous if gradient exists |
| Efficiency | 15-22% (Si panels) | 3-8% |
| Form Factor | Flat panel, needs sun exposure | Flexible, hidden installation |
| Best Applications | Outdoor sensors, agriculture | Industrial machinery, body heat |
| Cost | $0.50-$2 per watt | $5-$20 per watt |

Choose Solar when: Outdoor deployment with sun access, moderate power needs (>10mW average), cost-sensitive.

Choose TEG when: Consistent temperature gradient exists (pipes, motors, body), solar access impossible, power needs are <5mW.

Hybrid approach: Some industrial deployments use solar as primary with TEG as backup during low-light periods.

3.4 Energy Harvesting Technologies

Energy harvesting captures ambient energy from the environment to power IoT devices. While promising, realistic expectations are essential.

System block diagram showing energy harvesting flow from multiple sources (solar panel, thermoelectric generator, piezoelectric harvester, RF antenna) through power conditioning and energy storage (battery or supercapacitor) to the IoT device load, with power management IC controlling the entire system
Figure 3.2: Energy harvesting system architecture showing sources, conditioning, storage, and load

3.4.1 Solar Energy Harvesting

Solar harvesting is the most mature and practical energy harvesting technology for IoT:

| Condition | Incident Power Density | Incident Power on a 5 cm² Panel |
|---|---|---|
| Direct sunlight | 100 mW/cm² | 500 mW |
| Overcast outdoor | 10 mW/cm² | 50 mW |
| Bright indoor (window) | 1 mW/cm² | 5 mW |
| Office indoor | 0.01 mW/cm² | 50 µW |
| Dim indoor | 0.001 mW/cm² | 5 µW |

Multiply the incident power by panel efficiency (15-22% for crystalline silicon) to estimate electrical output.

Key Solar Design Considerations:

  1. Panel sizing: Size for worst-case (winter, cloudy) not best-case
  2. Battery buffer: 3-7 days of autonomous operation without sun
  3. MPPT controller: Extracts 20-30% more power than direct connection
  4. Orientation: Fixed panels need optimal angle for location latitude
  5. Cleaning: Dust reduces output by 5-25% over time

3.4.2 Thermoelectric (TEG) Harvesting

Thermoelectric generators convert temperature differences to electricity using the Seebeck effect:

\[P = \alpha^2 \times \Delta T^2 / (4R)\]

Where:

  • α = Seebeck coefficient (V/K)
  • ΔT = Temperature difference (K)
  • R = Internal resistance (Ω)

Practical TEG Power Outputs:

| Temperature Difference | Typical Power Output |
|---|---|
| 5°C (body heat) | 10-50 µW |
| 10°C (warm pipe) | 0.1-1 mW |
| 50°C (industrial) | 5-50 mW |
| 100°C (exhaust) | 50-500 mW |

Why does TEG power scale with temperature difference squared? The Seebeck equation reveals the relationship:

\[P = \frac{\alpha^2 \times \Delta T^2}{4R}\]

where \(\alpha\) is the Seebeck coefficient (V/K), \(\Delta T\) is temperature difference (K), and \(R\) is resistance (Ω).

Example: A small TEG module with \(\alpha = 0.01\) V/K (roughly 50 thermocouples in series) and \(R = 5\) Ω on a warm pipe:

At 10°C difference: \(P = \frac{(0.01)^2 \times (10)^2}{4 \times 5} = \frac{0.0001 \times 100}{20} = 0.0005 \text{ W} = 0.5\) mW

At 20°C difference: \(P = \frac{(0.01)^2 \times (20)^2}{4 \times 5} = \frac{0.0001 \times 400}{20} = 0.002 \text{ W} = 2\) mW

Doubling \(\Delta T\) yields 4× power (quadratic relationship). This is why industrial TEGs targeting 50-100°C gradients deliver much more than body-heat harvesters (5-10°C).
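This scaling is easy to verify numerically. A minimal Python sketch (the teg_power_mw helper and the α = 0.01 V/K, R = 5 Ω values are illustrative, matching the example above):

def teg_power_mw(alpha_v_per_k, delta_t_k, resistance_ohm):
    """Matched-load TEG output, P = alpha^2 * dT^2 / (4R), returned in mW."""
    watts = (alpha_v_per_k ** 2) * (delta_t_k ** 2) / (4 * resistance_ohm)
    return watts * 1000

# Illustrative module: alpha = 0.01 V/K (~50 couples in series), R = 5 ohm
for dt in (5, 10, 20, 50):
    print(f"dT = {dt:>2} K  ->  {teg_power_mw(0.01, dt, 5):.2f} mW")
# dT =  5 K  ->  0.12 mW
# dT = 10 K  ->  0.50 mW
# dT = 20 K  ->  2.00 mW
# dT = 50 K  ->  12.50 mW

The 4× jump from 10 K to 20 K and the 25× jump from 10 K to 50 K reflect the quadratic term, consistent with the practical output table above.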

TEG Design Considerations:

  • Maintain temperature gradient (heat sinks essential)
  • Cold side must dissipate heat to environment
  • Power output is proportional to ΔT²
  • Best for constant temperature sources (machines, pipes)

3.4.3 Piezoelectric (Vibration) Harvesting

Piezoelectric materials generate voltage when mechanically stressed:

Typical Power Outputs:

| Source | Frequency | Power Output |
|---|---|---|
| Human walking | 1-2 Hz | 1-10 mW |
| Machine vibration | 50-200 Hz | 0.1-10 mW |
| Structural vibration | 10-100 Hz | 10-100 µW |
| Traffic vibration | 5-30 Hz | 100 µW - 1 mW |

Design Challenges:

  • Resonant frequency must match vibration source
  • Narrow bandwidth (tuned to specific frequency)
  • Intermittent output (requires energy storage)
  • Mechanical fatigue over time

3.4.4 RF Energy Harvesting

RF harvesting captures ambient radio waves (Wi-Fi, cellular, broadcast):

Realistic Power Levels:

| Source | Distance | Available Power |
|---|---|---|
| Dedicated 1 W transmitter | 1 m | 100 µW |
| Wi-Fi router | 1 m | 10-50 µW |
| Wi-Fi router | 5 m | 0.1-1 µW |
| Cellular tower | 100 m | 0.1-1 µW |
| Broadcast TV | 1 km | 0.01-0.1 µW |
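These figures can be sanity-checked with the free-space Friis equation. A minimal sketch (the friis_received_uw helper, the 2.4 GHz frequency, and unity antenna gains are assumptions for illustration; actual harvested power is lower once rectifier losses are included):

import math

def friis_received_uw(tx_power_w, distance_m, freq_hz=2.4e9, gain_tx=1.0, gain_rx=1.0):
    """Free-space received power, Pr = Pt*Gt*Gr*(lambda / (4*pi*d))^2, in microwatts."""
    wavelength = 3e8 / freq_hz
    pr_w = tx_power_w * gain_tx * gain_rx * (wavelength / (4 * math.pi * distance_m)) ** 2
    return pr_w * 1e6

print(f"Dedicated 1 W transmitter at 1 m: {friis_received_uw(1.0, 1.0):.0f} uW")  # ~99 uW
print(f"100 mW Wi-Fi router at 5 m: {friis_received_uw(0.1, 5.0):.2f} uW")        # ~0.40 uW

Both values land in the ranges listed in the table, which is why dedicated transmitters at close range are the only RF sources worth designing around.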

RF Harvesting Reality Check:

RF harvesting produces microwatts—sufficient only for:

  • Passive RFID tags (backscatter communication)
  • Sensors with very long sleep intervals (hours)
  • Devices with dedicated RF power transmitters nearby

The Million-to-One Power Ratio

Understanding relative power levels helps set realistic expectations:

| Operation | Power Required |
|---|---|
| Ambient RF power (typical) | 1 µW |
| ESP32 deep sleep | 10 µW |
| ESP32 light sleep | 800 µW |
| ESP32 active | 50,000 µW (50 mW) |
| ESP32 Wi-Fi TX | 200,000 µW (200 mW) |

Ratio of Wi-Fi TX to ambient RF: 200,000:1

This is why RF harvesting cannot power Wi-Fi transmission from ambient sources—you’d need a dedicated power transmitter or massive collection area.

3.5 Battery and Energy Storage Visualizations

The following AI-generated diagrams illustrate key concepts in battery management and energy harvesting for IoT systems.

Diagram showing lithium-ion battery charging profiles with constant current (CC) phase at beginning when battery is depleted followed by constant voltage (CV) phase as battery approaches full charge, with current tapering off exponentially until charging terminates at approximately 3 percent of initial charge current.

Battery Charging Profiles
Figure 3.3: Battery charging profiles for lithium-ion cells. The CC-CV (Constant Current-Constant Voltage) charging algorithm prevents overcharging damage while maximizing charging speed. Understanding this profile helps designers select appropriate charge controllers and estimate charging time.

Circular lifecycle diagram of IoT battery management showing phases from initial charge through deployment, discharge monitoring, low battery alerts, replacement or recharging decision, and disposal or recycling, with annotations about capacity degradation over charge cycles.

Battery Lifecycle Management
Figure 3.4: Battery lifecycle management encompasses the entire operational life of IoT devices. Capacity degradation typically follows a predictable curve, losing 20% capacity after 500 charge cycles, enabling proactive maintenance scheduling.

Exploded view of IoT battery pack showing individual lithium cells, battery management system BMS circuit board with cell balancing and protection ICs, temperature sensor, connector, and protective enclosure with ventilation slots for thermal management.

Battery Pack Design
Figure 3.5: Battery pack architecture for multi-cell IoT applications. The Battery Management System (BMS) ensures balanced charging across cells, monitors temperature, and prevents over-discharge that could permanently damage cells or create safety hazards.

Circuit schematic comparing buck and boost DC-DC converter topologies for energy harvesting applications, showing inductor, switch, diode, and capacitor arrangements with efficiency curves and typical input/output voltage ranges for solar and thermoelectric harvester integration.

DC-DC Converters for Energy Harvesting
Figure 3.6: DC-DC converter topologies enable energy harvesters operating at millivolt levels to power 3.3V microcontrollers. Boost converters step up low solar panel voltages, while buck converters efficiently reduce higher voltage sources to logic levels.

3.6 Battery Capacity Calculation

To determine required battery capacity for a target lifetime:

Step 1: Calculate Average Current

\[I_{avg} = \frac{(I_{active} \times T_{active}) + (I_{sleep} \times T_{sleep})}{T_{cycle}}\]

Step 2: Apply Efficiency Factors

Real battery capacity is typically 60-80% of rated capacity due to:

  • Temperature effects (especially cold)
  • Voltage cutoff (can’t use full capacity)
  • Self-discharge over time
  • Aging/degradation

Step 3: Calculate Required Capacity

\[Capacity = \frac{I_{avg} \times \text{Target Hours}}{\text{Efficiency Factor}}\]

Example Calculation:

  • Active: 50mA for 5s per hour
  • Sleep: 10µA for 3595s per hour
  • Target: 5 years (43,800 hours)
  • Efficiency: 70%

\[I_{avg} = \frac{(50 \times 5) + (0.01 \times 3595)}{3600} = 0.079 \text{ mA}\]

\[Capacity = \frac{0.079 \times 43800}{0.7} = 4,944 \text{ mAh}\]

A 5000+ mAh battery (such as 2× Li-SOCl2 ER14505 in parallel) would meet this requirement.

3.6.1 Interactive Battery Capacity Calculator

Try adjusting the parameters below to see how they affect battery requirements:

3.7 Code Example: Battery Life Calculator with Derating

This Python tool automates the battery capacity calculation from the section above, adding realistic derating factors for temperature, self-discharge, and aging that are often overlooked in initial estimates:

class BatteryLifeCalculator:
    """Calculate IoT device battery life with realistic derating.

    Accounts for duty cycling, temperature effects, self-discharge,
    and capacity aging -- factors that reduce real-world life by
    30-50% compared to naive calculations.
    """
    def __init__(self, battery_mah, chemistry="li_primary"):
        self.battery_mah = battery_mah
        self.chemistry = chemistry
        # Derating factors by chemistry
        self.profiles = {
            "alkaline":       {"temp_derate": 0.70, "self_discharge_pct_yr": 3.0,
                               "aging_factor": 0.85, "voltage": 1.5},
            "li_primary":     {"temp_derate": 0.90, "self_discharge_pct_yr": 1.0,
                               "aging_factor": 0.95, "voltage": 3.0},
            "li_thionyl":     {"temp_derate": 0.95, "self_discharge_pct_yr": 0.1,
                               "aging_factor": 0.98, "voltage": 3.6},
            "li_ion":         {"temp_derate": 0.85, "self_discharge_pct_yr": 5.0,
                               "aging_factor": 0.80, "voltage": 3.7},
        }

    def calculate(self, active_ma, active_sec, sleep_ua, cycle_sec,
                  target_years=None):
        """Calculate battery life with all derating factors.

        Args:
            active_ma: Current during active period (mA).
            active_sec: Duration of active period per cycle (seconds).
            sleep_ua: Sleep current (microamps).
            cycle_sec: Total cycle period (seconds).
            target_years: Optional target to check pass/fail.

        Returns:
            Dict with detailed breakdown.
        """
        profile = self.profiles[self.chemistry]
        sleep_sec = cycle_sec - active_sec
        duty_pct = (active_sec / cycle_sec) * 100

        # Average current (ideal)
        avg_ma = (active_ma * active_sec + (sleep_ua / 1000) * sleep_sec) / cycle_sec

        # Ideal life (no derating)
        ideal_hours = self.battery_mah / avg_ma
        ideal_years = ideal_hours / 8760

        # Apply derating
        usable_mah = self.battery_mah
        usable_mah *= profile["temp_derate"]       # Cold weather
        usable_mah *= profile["aging_factor"]       # Capacity fade

        # Self-discharge loss over estimated life
        est_years = min(ideal_years, 15)  # Cap estimate
        sd_loss = profile["self_discharge_pct_yr"] * est_years / 100
        usable_mah *= (1 - sd_loss)

        real_hours = usable_mah / avg_ma
        real_years = real_hours / 8760

        result = {
            "chemistry": self.chemistry,
            "battery_mah": self.battery_mah,
            "avg_current_ma": round(avg_ma, 4),
            "duty_cycle_pct": round(duty_pct, 3),
            "ideal_life_years": round(ideal_years, 2),
            "usable_mah": round(usable_mah, 1),
            "real_life_years": round(real_years, 2),
            "derating_loss_pct": round((1 - real_years / ideal_years) * 100, 1),
        }
        if target_years:
            result["target_years"] = target_years
            result["pass"] = real_years >= target_years
        return result

# Example: Outdoor soil sensor (from earlier section)
calc = BatteryLifeCalculator(battery_mah=3000, chemistry="li_primary")
result = calc.calculate(
    active_ma=50, active_sec=5,
    sleep_ua=10, cycle_sec=3600,  # 1 reading per hour
    target_years=5
)
for k, v in result.items():
    print(f"  {k}: {v}")

# Compare chemistries for same workload
print("\nChemistry comparison:")
for chem in ["alkaline", "li_primary", "li_thionyl"]:
    c = BatteryLifeCalculator(3000, chemistry=chem)
    r = c.calculate(active_ma=50, active_sec=5, sleep_ua=10, cycle_sec=3600)
    print(f"  {chem:15s}  ideal={r['ideal_life_years']}y  "
          f"real={r['real_life_years']}y  loss={r['derating_loss_pct']}%")
# Output:
#   chemistry: li_primary
#   battery_mah: 3000
#   avg_current_ma: 0.0794
#   duty_cycle_pct: 0.139
#   ideal_life_years: 4.31
#   usable_mah: 2454.4
#   real_life_years: 3.53
#   derating_loss_pct: 18.2
#   target_years: 5
#   pass: False
#
# Chemistry comparison:
#   alkaline         ideal=4.31y  real=2.23y  loss=48.2%
#   li_primary       ideal=4.31y  real=3.53y  loss=18.2%
#   li_thionyl       ideal=4.31y  real=4.0y  loss=7.3%

The derating reveals a critical insight: the “ideal” 4.3-year estimate drops to about 3.5 years for Li-primary (18% loss) and just 2.2 years for alkaline (48% loss from cold weather and self-discharge). Even Li-thionyl chloride falls slightly short of a 5-year target with this 3,000 mAh cell, so meeting it requires a larger battery, a lower duty cycle, or both.

Scenario: Design a solar-powered weather station for deployment in rural Australia. The system must operate year-round including during winter with shorter days and occasional cloudy periods.

Given Requirements:

  • Location: Melbourne, Australia (latitude 37.8°S)
  • System power consumption: 50 mW average (ESP32 + sensors)
  • Must survive 7 consecutive cloudy days without sun
  • Components: Solar panel, LiFePO4 battery, charge controller

Steps:

  1. Calculate daily energy requirement:
    • Power: 50 mW = 0.05 W
    • Daily energy: 0.05 W × 24 hours = 1.2 Wh
  2. Size battery for autonomy:
    • 7-day autonomy: 1.2 Wh × 7 = 8.4 Wh
    • LiFePO4 usable capacity (80% DoD): 8.4 / 0.8 = 10.5 Wh
    • Using 3.2V LiFePO4: 10.5 Wh / 3.2V = 3,281 mAh
    • Selected battery: 3,400 mAh LiFePO4 pack (for example, two ~1,700 mAh 18650 cells in parallel)
  3. Calculate worst-case solar insolation:
    • Melbourne winter (June): 2.5 peak sun hours per day average
    • Cloudy days: 0.5 peak sun hours
    • Panel tilt angle: 38° (latitude + 0° for year-round optimization)
    • Shading factor: 0.9 (assume 10% shading from mounting structure)
  4. Size solar panel:
    • Daily energy needed: 1.2 Wh
    • Must also recharge battery after 7 cloudy days
    • On sunny day, need: 1.2 Wh (daily) + (8.4 Wh / 3 days recovery) = 4.0 Wh/day
    • Panel output needed: 4.0 Wh / (2.5 peak hours × 0.9 shading × 0.75 efficiency) = 2.37 W
    • Selected panel: 5W solar panel (2× margin for safety)
  5. Verify design with simulation:
    • Sunny day: 5W × 2.5h × 0.9 × 0.75 = 8.44 Wh generated, 1.2 Wh consumed → +7.24 Wh net
    • Light overcast day (0.5 peak hours): 5W × 0.5h × 0.9 × 0.75 = 1.69 Wh generated, 1.2 Wh consumed → roughly energy-neutral (+0.49 Wh)
    • Worst case (7 fully overcast days, negligible generation): battery falls by 7 × 1.2 = 8.4 Wh to 2.1 Wh, right at the 80% DoD limit
    • One or two sunny days (+7.24 Wh/day net) restore full charge ✓
  6. Select charge controller:
    • Input: 5W panel at 6V = 0.83A max
    • Output: 3.2V LiFePO4
    • Selected: a single-cell LiFePO4 solar charge controller with 3.6 V charge termination (the common CN3065 terminates at 4.2 V and suits Li-ion, not LiFePO4)

Result: The system operates reliably year-round with a 5W panel and 3.4Ah battery, providing 2× safety margin beyond the 7-day autonomy requirement.

Key Insight: Always size for worst-case (winter) and add margin. A 2.4W panel would be mathematically sufficient, but the 5W panel (2× oversized) ensures reliable operation even with unexpected dust accumulation or panel degradation over years.
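The sizing arithmetic from this worked example can be bundled into a small helper for reuse. A minimal sketch (the size_solar_system function and its default derating figures are assumptions taken from the example above, not universal constants):

def size_solar_system(load_w, autonomy_days, winter_sun_hours,
                      dod=0.8, shading=0.9, system_eff=0.75,
                      recovery_days=3, cell_voltage=3.2):
    """Size the battery (Wh / mAh) and minimum panel (W) for a solar IoT node."""
    daily_wh = load_w * 24                               # daily energy consumption
    battery_wh = daily_wh * autonomy_days / dod          # storage incl. depth-of-discharge limit
    battery_mah = battery_wh / cell_voltage * 1000
    # A sunny day must cover the daily load plus recharging after the autonomy period
    sunny_day_target_wh = daily_wh + (daily_wh * autonomy_days) / recovery_days
    panel_w = sunny_day_target_wh / (winter_sun_hours * shading * system_eff)
    return {"daily_wh": round(daily_wh, 2), "battery_wh": round(battery_wh, 1),
            "battery_mah": round(battery_mah), "panel_w_min": round(panel_w, 2)}

print(size_solar_system(load_w=0.05, autonomy_days=7, winter_sun_hours=2.5))
# {'daily_wh': 1.2, 'battery_wh': 10.5, 'battery_mah': 3281, 'panel_w_min': 2.37}

Doubling the 2.37 W minimum to a 5 W panel reproduces the 2× margin chosen in the example.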

3.7.1 Interactive Solar Panel Sizing Calculator

Calculate solar panel requirements for your IoT deployment:

Decision Factors:

| Factor | Primary Battery Only | Solar + Battery | Energy Harvesting (TEG/Vibration) |
|---|---|---|---|
| Upfront Cost | Low ($5-20) | Medium ($30-80) | High ($100-300) |
| Maintenance | Replace every 1-5 years | Minimal (clean panel yearly) | Minimal |
| Reliability | High (predictable) | Medium (weather dependent) | Low (source dependent) |
| Lifespan | 1-10 years | 10-20 years (panel life) | 5-15 years |
| Best For | Indoor, accessible | Outdoor, sunlight | Industrial, constant heat/vibration |

Decision Tree:

  1. Is the device accessible for maintenance?
    • YES and indoors → Primary battery (alkaline or Li-primary)
    • NO or outdoor → Continue to #2
  2. Does the location get reliable sunlight?
    • YES (outdoor, >3 peak hours/day average) → Solar + rechargeable battery
    • NO (shaded, indoor, far north/south latitude) → Continue to #3
  3. Is there a constant thermal gradient or vibration?
    • YES (pipes, machinery) → Consider TEG or piezo harvesting
    • NO → Large primary battery with LoRa (low power protocol)
  4. What is the total cost of ownership (TCO) over 10 years?

Example Calculation (100 remote sensors, 10-year deployment):

Option A: Primary Battery

  • Hardware: 100 × $15 = $1,500
  • Battery replacements: 100 × 2 replacements × $10 = $2,000
  • Labor (tech visit): 100 × 2 visits × $50 = $10,000
  • Total: $13,500

Option B: Solar + Rechargeable

  • Hardware: 100 × $60 = $6,000
  • No battery replacements assumed (LiFePO4 tolerates 10 years of shallow daily cycling; the panel lasts 20 years)
  • Maintenance: 100 × 1 cleaning visit × $50 = $5,000
  • Total: $11,000

Winner: Solar saves $2,500 over 10 years AND eliminates risk of battery failure before replacement window.
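The comparison is easy to rerun with your own cost assumptions. A minimal sketch (the fleet_tco helper and the dollar figures are the assumptions from the example above):

def fleet_tco(units, hw_cost, replacements=0, replacement_cost=0, visits=0, visit_cost=50):
    """10-year total cost of ownership for a sensor fleet: hardware + parts + labor."""
    return units * (hw_cost + replacements * replacement_cost + visits * visit_cost)

battery_only = fleet_tco(100, 15, replacements=2, replacement_cost=10, visits=2)
solar = fleet_tco(100, 60, visits=1)
print(f"Battery only: ${battery_only:,}  Solar: ${solar:,}  Savings: ${battery_only - solar:,}")
# Battery only: $13,500  Solar: $11,000  Savings: $2,500

Labor dominates both options, which is why reducing site visits usually matters more than the hardware price.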

When Battery-Only Makes Sense:

  • Indoor sensors with easy access (office building)
  • Short deployment timeframe (<2 years)
  • Very low power consumption where Li-SOCl2 lasts 10+ years

Common Mistake: Underestimating Solar Panel Degradation and Soiling

The Problem: You size a solar panel for your daily energy needs based on manufacturer ratings, but the system fails in the field after 6-12 months.

What Went Wrong:

  1. Panel degradation: Solar panels lose 0.5-1% efficiency per year
    • After 5 years: 95% of original output
    • After 10 years: 90% of original output
    • Solution: Add 10-20% oversizing margin to account for aging
  2. Soiling (dust, dirt, bird droppings): Can reduce output by 20-50% in dry climates
    • Monthly cleaning: ~5% loss
    • Yearly cleaning: ~25% loss
    • Never cleaned: ~50% loss
    • Solution: Design for 20-30% soiling loss, or plan cleaning schedule
  3. Temperature derating: Panels lose efficiency at high temps
    • Rated at 25°C (77°F)
    • At 65°C (149°F): 15-20% power loss
    • Solution: Use temperature coefficient from datasheet, add 15% margin for hot climates
  4. Shading: Even 10% shading can reduce output by 50% (series-connected cells)
    • Partial shading causes hotspots and disproportionate losses
    • Solution: Use bypass diodes or microinverters, avoid any shading
  5. Angle/orientation suboptimal: Fixed panels can’t track the sun
    • Optimal angle = latitude, but mounting constraints may force compromises
    • Solution: Simulate with tools (PVWatts) for your specific location and angle

Real-World Example: Agricultural sensor in California desert

  • Initial design: 3W panel where the sizing math called for 1.5W (2× margin)
  • After 8 months: Battery regularly depleted
  • Investigation revealed:
    • Dust accumulation: 30% output loss (never cleaned)
    • Panel temperature: 70°C typical → 18% loss
    • Mounting angle: 15° (should be 35° for latitude) → 12% loss
    • Combined effect: 3W panel performing like 1.5W (50% total loss)

Solution: Upgrade to 6W panel (4× original design), add rain sensor to alert when cleaning needed, adjust tilt angle. System now works reliably even with soiling.

Best Practice: For critical deployments, size solar panels to deliver required power even with:

  • 20% degradation (age + soiling)
  • 15% temperature loss
  • 10% angle suboptimality

Total derating: 0.8 × 0.85 × 0.9 = 0.61 → panels should be at least 1.6× oversized.

3.8 Knowledge Check

How It Works

Battery selection and sizing follows a systematic process based on environmental and operational requirements:

Battery chemistry selection criteria:

  1. Temperature range: Below -20°C → Li-SOCl2 required (alkaline capacity drops 70%, Li-ion cannot charge below 0°C)
  2. Lifetime target: >5 years → Primary battery (Li-SOCl2); <2 years with daily charging → Rechargeable (Li-ion/LiPo)
  3. Cost sensitivity: High-volume consumer → Alkaline; Industrial → Li-SOCl2; Product with charging → Li-ion
  4. Self-discharge tolerance: 10-year deployment → Li-SOCl2 (<1% per decade); Annual replacement OK → Any chemistry
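These criteria can be captured in a small decision helper; a minimal sketch (the pick_chemistry function and its thresholds are the rules of thumb listed above, not hard limits):

def pick_chemistry(min_temp_c, lifetime_years, can_recharge):
    """Rule-of-thumb chemistry selection mirroring the criteria listed above."""
    if min_temp_c < -20 or lifetime_years > 5:
        return "Li-SOCl2 primary"              # extreme cold or deploy-and-forget lifetime
    if can_recharge and min_temp_c >= 0:
        return "Li-ion / LiPo rechargeable"    # charging infrastructure available
    if lifetime_years > 3 or min_temp_c < -18:
        return "Lithium primary (CR series)"
    return "Alkaline"

print(pick_chemistry(-30, 10, False))   # Li-SOCl2 primary
print(pick_chemistry(5, 1, True))       # Li-ion / LiPo rechargeable
print(pick_chemistry(-10, 4, False))    # Lithium primary (CR series)
print(pick_chemistry(10, 2, False))     # Alkaline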

Sizing workflow:

  1. Calculate average current from power profile (Iavg = Σ(state current × duration) / cycle time)
  2. Determine target lifetime in hours (years × 8760)
  3. Raw capacity needed = Iavg × lifetime hours
  4. Apply derating: Temperature (0.7-0.9×), aging (0.8-0.95×), voltage cutoff (0.8-0.9×), self-discharge over time
  5. Final capacity = Raw / (temp × aging × cutoff × self-discharge)

Example: 50µA average, 5-year target, -10°C operation → 50µA × 43,800h = 2,190 mAh raw. With derating: 2,190 / (0.8 temp × 0.9 aging × 0.85 cutoff) ≈ 3,580 mAh → Select 2× ER14505 in parallel (≈5,000 mAh total).

Solar harvesting sizing: Panel must generate daily consumption + battery recharge reserve. For 50mW average device in moderate climate (3 peak sun hours/day), need 50mW × 24h / (3h × 0.75 efficiency) = 533mW panel minimum. Add 2× safety margin for cloudy periods → 1W panel. Battery buffers 7 days: 50mW × 168h = 8.4Wh storage.

3.9 Concept Check

Concept Relationships

Energy sources are the foundation that all other energy design decisions build upon:

  • Constrains: Battery capacity sets the absolute limit for Power Analysis targets—cannot exceed available energy
  • Enables: Proper battery sizing is prerequisite for Energy Harvesting integration (must buffer intermittent solar/thermal input)
  • Temperature Coupling: Battery derating at temperature extremes must align with Environmental Design specs
  • Informs Protocol Selection: Battery life target drives protocol choice—5-year target with 2000mAh battery → LoRaWAN, not WiFi

Design sequence: Requirements → Battery sizing → Average current target → Power profile optimization. Never optimize consumption before knowing battery constraints—may over-engineer (wasting development time) or under-engineer (missing targets).

3.10 See Also

Battery Chemistry Details:

  • Lithium Thionyl Chloride Deep Dive - Passivation, pulse handling
  • Li-ion Safety & BMS - Protection circuits, charging
  • Alkaline vs Lithium Primary - Cost-performance trade-offs

Energy Harvesting:

  • Solar Panel Sizing Calculator - Geographic insolation data
  • Thermoelectric Generators - Industrial heat capture
  • MPPT Charge Controllers - Maximizing solar efficiency


3.11 Try It Yourself

3.11.1 Exercise 1: Battery Life Derating

Given: 2,400 mAh Li-primary battery, 75µA average current, deployed at -10°C for 5 years.

Calculate:

  1. Ideal battery life (room temperature, no aging)
  2. Apply temperature derating (80% capacity at -10°C)
  3. Apply aging factor (95% after 5 years)
  4. Apply voltage cutoff (85% usable to 2.0V cutoff)
  5. Final realistic lifetime

Solution:

  1. Ideal: 2,400 mAh / 0.075 mA = 32,000 hours = 3.65 years
  2. Temp: 2,400 × 0.8 = 1,920 mAh
  3. Aging: 1,920 × 0.95 = 1,824 mAh
  4. Cutoff: 1,824 × 0.85 = 1,550 mAh
  5. Real: 1,550 / 0.075 = 20,667 hours = 2.36 years (35% less than ideal!)

What to observe: Derating compounds multiplicatively (0.8 × 0.95 × 0.85 = 0.646 = 35% loss). Always design with margin.
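A quick check of the arithmetic in Python (the derating factors are those given in the exercise):

usable_mah = round(2400 * 0.80 * 0.95 * 0.85)   # temperature x aging x voltage-cutoff derating
hours = usable_mah / 0.075                       # 75 uA average draw
print(f"{usable_mah} mAh usable -> {hours:,.0f} h = {hours / 8760:.2f} years")
# 1550 mAh usable -> 20,667 h = 2.36 years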

3.11.2 Exercise 2: Solar Harvesting Feasibility

Scenario: Indoor warehouse sensor near window receives 1 mW/cm² average light.

Device specs:

  • Average power: 5 mW (ESP32 with frequent BLE scans)
  • Available solar panel area: 10 cm²

Questions:

  1. Maximum power harvestable: 10 cm² × 1 mW/cm² × 15% efficiency = ?
  2. Can solar alone sustain the device?
  3. What battery capacity needed for 3-day backup?

Answers:

  1. Harvest: 1.5 mW
  2. NO—device needs 5 mW but solar provides 1.5 mW (30% of need)
  3. Energy deficit: 5 mW - 1.5 mW = 3.5 mW. 3-day shortage: 3.5 mW × 72h = 252 mWh = 76 mAh at 3.3V

What to observe: Indoor solar rarely works as primary power. You would need to either (1) reduce average power below 1.5 mW (deep-sleep optimization), (2) increase the panel to about 34 cm² (impractical), or (3) accept routine battery swaps (for example, a 200 mAh cell covering the 3.5 mW deficit lasts only about 190 hours, roughly 8 days).
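The same arithmetic in Python (panel efficiency, backup window, and voltage as given in the exercise):

harvest_mw = 10 * 1.0 * 0.15          # 10 cm^2 panel x 1 mW/cm^2 light x 15% efficiency
deficit_mw = 5.0 - harvest_mw         # device needs 5 mW average
backup_mah = deficit_mw * 72 / 3.3    # 3-day (72 h) shortfall converted to mAh at 3.3 V
print(f"Harvest {harvest_mw} mW, deficit {deficit_mw} mW, 3-day backup {backup_mah:.0f} mAh")
# Harvest 1.5 mW, deficit 3.5 mW, 3-day backup 76 mAh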

3.12 Summary

Key takeaways from this chapter:

  1. Battery Chemistry Matters: Li-SOCl2 for extreme environments (10+ years), Li-ion for rechargeable applications, alkaline for low-cost moderate-life deployments
  2. Energy Harvesting Reality: Solar is practical outdoors (10-200 mW/cm²), but indoor solar (0.01 mW/cm²) is rarely viable as primary power
  3. Temperature Effects: Cold significantly reduces battery capacity (50% at -20°C), requiring careful derating
  4. Storage is Essential: Even with energy harvesting, batteries or supercapacitors are required to buffer intermittent energy availability
  5. Calculate Requirements: Work backwards from target lifetime to determine required capacity, applying realistic efficiency factors (60-80%)

Common Pitfalls

A battery with high mAh may have poor performance at low temperatures, high self-discharge, or inability to deliver pulse current for radio transmissions. Always evaluate chemistry against deployment temperature range, self-discharge rate, and peak discharge capability.

A device that sends one reading per day may use only 10 µAh of useful energy per day, but a primary lithium battery self-discharges at roughly 1% per year (about 55 µAh/day for a 2,000 mAh battery). Self-discharge can dominate total energy consumption for very low-duty-cycle devices, so calculate both.

Sizing a solar harvesting system for average irradiance means it fails during cloudy periods. Size the energy storage to bridge your worst-case cloudy period (typically 5–14 consecutive cloudy days), and size the panel to fully recharge storage in 1–2 sunny days after.

Indoor illuminance (100–500 lux) yields roughly 10–50 µW/cm² from amorphous silicon panels — enough for µW-class sensors but not for nodes with active radio transmissions (1–100 mW). Most indoor solar IoT applications require supplementary battery storage or ultra-aggressive duty cycling.

3.13 What’s Next

| If you want to… | Read this |
|---|---|
| Analyze power consumption across device states | Power Consumption Analysis |
| Learn energy harvesting design in depth | Energy Harvesting Design |
| Apply low-power design strategies | Low-Power Design Strategies |
| Calculate energy budgets for your design | Energy-Aware Considerations |
| Use interactive tools to validate power sources | Interactive Tools |