Compare different battery chemistries and their characteristics for IoT applications
Explain energy density, self-discharge, and temperature effects on batteries
Evaluate energy harvesting technologies and their practical power outputs
Select appropriate power sources based on deployment requirements
Calculate battery capacity requirements for target device lifetimes
In 60 Seconds
Selecting the right IoT power source requires matching battery chemistry to deployment conditions (temperature range, discharge rate, shelf life) and evaluating whether energy harvesting from solar, thermal, or vibration sources can supplement or replace the battery for the target deployment lifetime.
Key Concepts
Battery Chemistry Types: Primary (non-rechargeable) and secondary (rechargeable) battery types with different energy densities, self-discharge rates, and temperature characteristics relevant to IoT.
Lithium Thionyl Chloride (Li-SOCl2): Thionyl chloride lithium battery with the highest energy density among primary cells; ideal for long-life IoT deployments requiring 5-10 years of operation.
Lithium-Ion (Li-Ion): Rechargeable battery with high energy density and cycle life; used in devices with regular recharging capability (smartphones, GPS trackers).
Energy Density: Energy stored per unit volume (Wh/L) or mass (Wh/kg); higher density enables smaller or lighter batteries for same energy capacity.
Self-Discharge Rate: Gradual battery capacity loss without any load; Li-SOCl2 has excellent shelf life (about 1% per year); standard alkaline loses 2-3% per year.
Temperature Performance: Battery capacity reduction at low temperatures; critical for outdoor IoT deployments where −20°C temperatures can reduce capacity by 50-70%.
Capacity Planning: Process of calculating required battery capacity given average current consumption, duty cycle, deployment duration, and environmental temperature range.
3.2 For Beginners: Where Does IoT Energy Come From?
IoT devices get their energy from two main sources:
Batteries - Store energy chemically, like tiny fuel tanks
Energy Harvesters - Capture energy from the environment (solar, motion, heat)
Most IoT devices use batteries because they’re reliable and predictable. Energy harvesting is exciting but has limitations - you can’t always count on the sun shining or something vibrating!
Key question for any IoT project: How much energy do I need, and where will it come from?
Sensor Squad: Choosing the Right Fuel!
“I come in many flavors,” said Bella the Battery proudly. “Alkaline batteries are cheap and great for easy-access indoor devices. Lithium coin cells last years and work in cold weather. Rechargeable lithium-ion batteries are perfect when you can add a solar panel. Choosing the right chemistry is as important as choosing the right sensor!”
Sammy the Sensor asked about extreme environments: “What if I am deployed in the Arctic where it is minus 40 degrees?” Bella answered, “Lithium thionyl chloride batteries work from minus 55 to plus 85 degrees Celsius and can last over 10 years! They have amazing energy density – 500 watt-hours per kilogram. But they are not rechargeable and cost more.”
Max the Microcontroller brought up harvesting: “For devices that need to last forever without maintenance, energy harvesting is the answer. Solar panels, vibration generators, and thermoelectric devices capture energy from the environment. A solar panel on a parking meter can power it indefinitely!” Lila the LED added, “The key formula is: harvested energy per day must exceed consumed energy per day. If it does, you have a perpetually powered device!”
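Lila's energy-balance rule translates directly into a check you can run at design time. A minimal sketch with illustrative numbers (substitute measured values for a real deployment):

```python
# Energy-neutrality check: a device is perpetually powered only if
# daily harvested energy meets or exceeds daily consumed energy.
harvested_mwh_per_day = 120.0     # e.g., small panel on a worst-case winter day
consumed_mwh_per_day = 1.2 * 24   # 1.2 mW average load over 24 hours

if harvested_mwh_per_day >= consumed_mwh_per_day:
    print("Energy-neutral: the device can run indefinitely")
else:
    deficit = consumed_mwh_per_day - harvested_mwh_per_day
    print(f"Deficit of {deficit:.1f} mWh/day -- the battery drains over time")
```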
Interactive: Battery Discharge Curve Animation
3.3 Energy Sources
3.3.1 Battery Technologies
Batteries are the most common power source for IoT devices. Understanding battery chemistry characteristics is essential for proper device design.
Figure 3.1: Overview of battery technologies for IoT applications
3.3.2 Battery Comparison Table
| Chemistry | Voltage | Energy Density | Self-Discharge | Temperature Range | Best For |
|---|---|---|---|---|---|
| Alkaline | 1.5V | 100-150 Wh/kg | 2-3%/year | -18°C to 55°C | Low-cost, moderate life |
| Lithium Primary | 3.0V | 250-300 Wh/kg | <1%/year | -40°C to 60°C | Long life, cold environments |
| Li Thionyl Chloride | 3.6V | 500+ Wh/kg | <1%/10 years | -55°C to 85°C | Extreme environments, 10+ years |
| Li-ion/LiPo | 3.7V | 150-250 Wh/kg | 3-5%/month | 0°C to 45°C | Rechargeable, frequent use |
| LiFePO4 | 3.2V | 90-120 Wh/kg | 2-3%/month | -20°C to 60°C | Safety-critical, high cycle |
| NiMH | 1.2V | 60-80 Wh/kg | 15-30%/month | -20°C to 50°C | Low cost rechargeable |
3.3.3 Primary Battery Selection Guide
When to use Alkaline (AA/AAA):
Indoor deployments with easy access
Cost-sensitive applications
Moderate temperature range (-18°C to 55°C)
1-3 year target lifetime
Consumer-replaceable batteries preferred
When to use Lithium Primary (CR2032, CR123A):
Compact form factor required
Wide temperature operation (-40°C to 60°C)
3-5 year target lifetime
Low self-discharge essential
Stable voltage preferred
When to use Lithium Thionyl Chloride (ER14505, ER34615):
Remote/inaccessible deployments
Extreme temperatures (-55°C to 85°C)
10+ year target lifetime
Industrial/utility applications
Higher initial cost acceptable
Tradeoff: Lithium Primary vs Lithium Thionyl Chloride Batteries
Lithium Primary (CR series): Lower cost, widely available, moderate energy density. Good for consumer IoT with 3-5 year targets. Cannot handle high pulse currents well.
Lithium Thionyl Chloride (ER series): Highest energy density, lowest self-discharge (<1%/decade), extreme temperature range. Essential for industrial/utility “deploy and forget” applications. Higher cost, requires careful circuit design for initial voltage delay (passivation), not rechargeable.
Choose Li-SOCl2 when: deployment is permanent (>7 years), temperature extremes expected, replacement is expensive/impossible.
3.3.4 Secondary (Rechargeable) Battery Selection
Lithium-ion/Lithium Polymer:
Best for devices with charging infrastructure
High energy density, no memory effect
500-1000 charge cycles typical
Requires protection circuit (BMS)
Not suitable for extreme temperatures
Lithium Iron Phosphate (LiFePO4):
Inherently safe chemistry (no thermal runaway)
2000+ charge cycles
Lower energy density than Li-ion
Wider temperature range than Li-ion
Ideal for solar + battery systems
Nickel Metal Hydride (NiMH):
Low cost, widely available
High self-discharge (use low-self-discharge variants)
No toxic materials (easier disposal)
Good for frequently charged devices
Tradeoff: Solar Harvesting vs Thermoelectric Harvesting
| Factor | Solar Harvesting | Thermoelectric (TEG) |
|---|---|---|
| Power Output | 10-200 mW/cm² (outdoor) | 0.1-5 mW/cm² (10°C ΔT) |
| Availability | Day only, weather dependent | Continuous if gradient exists |
| Efficiency | 15-22% (Si panels) | 3-8% |
| Form Factor | Flat panel, needs sun exposure | Flexible, hidden installation |
| Best Applications | Outdoor sensors, agriculture | Industrial machinery, body heat |
| Cost | $0.50-$2 per watt | $5-$20 per watt |
Choose Solar when: Outdoor deployment with sun access, moderate power needs (>10mW average), cost-sensitive.
Choose TEG when: Consistent temperature gradient exists (pipes, motors, body), solar access impossible, power needs are <5mW.
Hybrid approach: Some industrial deployments use solar as primary with TEG as backup during low-light periods.
Interactive: Energy Harvesting Cycle Animation
3.4 Energy Harvesting Technologies
Energy harvesting captures ambient energy from the environment to power IoT devices. While promising, realistic expectations are essential.
Figure 3.2: Energy harvesting system architecture showing sources, conditioning, storage, and load
3.4.1 Solar Energy Harvesting
Solar harvesting is the most mature and practical energy harvesting technology for IoT:
| Condition | Incident Power Density | Incident Power on 5 cm² Panel |
|---|---|---|
| Direct Sunlight | 100 mW/cm² | 500 mW |
| Overcast Outdoor | 10 mW/cm² | 50 mW |
| Bright Indoor (window) | 1 mW/cm² | 5 mW |
| Office Indoor | 0.01 mW/cm² | 50 µW |
| Dim Indoor | 0.001 mW/cm² | 5 µW |

Note: these figures are incident light power, not electrical output; multiply by panel efficiency (15-22% for crystalline silicon) to estimate harvested power.
Key Solar Design Considerations:
Panel sizing: Size for worst-case (winter, cloudy) not best-case
Battery buffer: 3-7 days of autonomous operation without sun
MPPT controller: Extracts 20-30% more power than direct connection
Orientation: Fixed panels need optimal angle for location latitude
Cleaning: Dust reduces output by 5-25% over time
3.4.2 Thermoelectric (TEG) Harvesting
Thermoelectric generators convert temperature differences to electricity using the Seebeck effect:
\[P = \alpha^2 \times \Delta T^2 / (4R)\]
Where:
α = Seebeck coefficient (V/K)
ΔT = Temperature difference (K)
R = Internal resistance (Ω)
Practical TEG Power Outputs:
| Temperature Difference | Typical Power Output |
|---|---|
| 5°C (body heat) | 10-50 µW |
| 10°C (warm pipe) | 0.1-1 mW |
| 50°C (industrial) | 5-50 mW |
| 100°C (exhaust) | 50-500 mW |
Putting Numbers to It
Why does TEG power scale with temperature difference squared? The Seebeck equation reveals the relationship:
\[P = \frac{\alpha^2 \times \Delta T^2}{4R}\]
where \(\alpha\) is the Seebeck coefficient (V/K), \(\Delta T\) is temperature difference (K), and \(R\) is resistance (Ω).
Example: A TEG module with \(\alpha = 0.2\) V/K and \(R = 5\) Ω on a warm pipe: at \(\Delta T = 10\) K, \(P = (0.2^2 \times 10^2)/(4 \times 5) = 0.2\) W at the theoretical matched-load maximum; at \(\Delta T = 20\) K, the same module yields 0.8 W (real modules deliver less after thermal-interface and converter losses).
Doubling \(\Delta T\) yields 4× power (quadratic relationship). This is why industrial TEGs targeting 50-100°C gradients deliver much more than body-heat harvesters (5-10°C).
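The quadratic scaling is easy to verify numerically. A minimal sketch using the module values above (matched-load maximum, ignoring thermal-interface and converter losses):

```python
# Seebeck matched-load power: P = (alpha^2 * dT^2) / (4 * R)
alpha = 0.2  # Seebeck coefficient (V/K), module value from the example
R = 5.0      # internal resistance (ohms)

for dt in [5, 10, 20, 50]:
    p_mw = (alpha**2 * dt**2) / (4 * R) * 1000
    print(f"dT = {dt:3d} K -> P = {p_mw:7.1f} mW")
# Doubling dT from 10 K to 20 K quadruples power: 200 mW -> 800 mW.
```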
TEG Design Considerations:
Maintain temperature gradient (heat sinks essential)
Cold side must dissipate heat to environment
Power output is proportional to ΔT²
Best for constant temperature sources (machines, pipes)
3.4.3 Piezoelectric (Vibration) Harvesting
Piezoelectric materials generate voltage when mechanically stressed:
Typical Power Outputs:
| Source | Frequency | Power Output |
|---|---|---|
| Human walking | 1-2 Hz | 1-10 mW |
| Machine vibration | 50-200 Hz | 0.1-10 mW |
| Structural vibration | 10-100 Hz | 10-100 µW |
| Traffic vibration | 5-30 Hz | 100 µW - 1 mW |
Design Challenges:
Resonant frequency must match vibration source
Narrow bandwidth (tuned to specific frequency)
Intermittent output (requires energy storage)
Mechanical fatigue over time
3.4.4 RF Energy Harvesting
RF harvesting captures ambient radio waves (Wi-Fi, cellular, broadcast):
Realistic Power Levels:
| Source | Distance | Available Power |
|---|---|---|
| Dedicated 1W transmitter | 1m | 100 µW |
| Wi-Fi router | 1m | 10-50 µW |
| Wi-Fi router | 5m | 0.1-1 µW |
| Cellular tower | 100m | 0.1-1 µW |
| Broadcast TV | 1km | 0.01-0.1 µW |
RF Harvesting Reality Check:
RF harvesting produces microwatts—sufficient only for:
Passive RFID tags (backscatter communication)
Sensors with very long sleep intervals (hours)
Devices with dedicated RF power transmitters nearby
The Million-to-One Power Ratio
Understanding relative power levels helps set realistic expectations:
| Operation | Power Required |
|---|---|
| Ambient RF power (typical) | 1 µW |
| ESP32 deep sleep | 10 µW |
| ESP32 light sleep | 800 µW |
| ESP32 active | 50,000 µW (50 mW) |
| ESP32 Wi-Fi TX | 200,000 µW (200 mW) |
Ratio of Wi-Fi TX to ambient RF: 200,000:1
This is why RF harvesting cannot power Wi-Fi transmission from ambient sources—you’d need a dedicated power transmitter or massive collection area.
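To make the ratio concrete, the sketch below estimates how long a device would have to harvest ambient RF to fund a single Wi-Fi transmission. Power levels come from the tables above; the 50 ms burst duration is an assumed illustrative figure, not a measured value:

```python
# Energy cost of one Wi-Fi TX burst vs. ambient RF harvesting rate.
harvest_uw = 1.0       # ambient RF harvest (uW), from the table above
tx_mw = 200.0          # ESP32 Wi-Fi TX power (mW), from the table above
tx_duration_s = 0.05   # assumed 50 ms transmission burst

tx_energy_uj = tx_mw * 1000 * tx_duration_s  # mW -> uW, times seconds = uJ
harvest_time_s = tx_energy_uj / harvest_uw
print(f"One TX burst costs {tx_energy_uj:.0f} uJ")
print(f"Harvest time at 1 uW: {harvest_time_s:.0f} s (~{harvest_time_s / 3600:.1f} h)")
# ~10,000 s (about 2.8 h) of harvesting per 50 ms burst -- before
# accounting for any conversion losses.
```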
3.5 Battery and Energy Storage Visualizations
The following AI-generated diagrams illustrate key concepts in battery management and energy harvesting for IoT systems.
Battery Charging Profiles
Figure 3.3: Battery charging profiles for lithium-ion cells. The CC-CV (Constant Current-Constant Voltage) charging algorithm prevents overcharging damage while maximizing charging speed. Understanding this profile helps designers select appropriate charge controllers and estimate charging time.
Battery Lifecycle Management
Figure 3.4: Battery lifecycle management encompasses the entire operational life of IoT devices. Capacity degradation typically follows a predictable curve, losing 20% capacity after 500 charge cycles, enabling proactive maintenance scheduling.
Battery Pack Design
Figure 3.5: Battery pack architecture for multi-cell IoT applications. The Battery Management System (BMS) ensures balanced charging across cells, monitors temperature, and prevents over-discharge that could permanently damage cells or create safety hazards.
DC-DC Converters for Energy Harvesting
Figure 3.6: DC-DC converter topologies enable energy harvesters operating at millivolt levels to power 3.3V microcontrollers. Boost converters step up low solar panel voltages, while buck converters efficiently reduce higher voltage sources to logic levels.
3.6 Battery Capacity Calculation
To determine required battery capacity for a target lifetime:
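divide the total charge the device will draw over its lifetime by the usable fraction of the battery's rated capacity. One compact way to write this (the symbols mirror the derating factors used in the code example in Section 3.7):

\[C_{required} = \frac{I_{avg} \times t_{lifetime}}{f_{temp} \times f_{aging} \times f_{cutoff} \times (1 - L_{sd})}\]

where \(I_{avg}\) is the average current (mA), \(t_{lifetime}\) the target lifetime (hours), \(f_{temp}\), \(f_{aging}\), and \(f_{cutoff}\) the temperature, aging, and voltage-cutoff derating fractions, and \(L_{sd}\) the cumulative self-discharge loss over the deployment.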
3.7 Code Example: Battery Life Calculator with Derating
This Python tool automates the battery capacity calculation from the section above, adding realistic derating factors for temperature, self-discharge, and aging that are often overlooked in initial estimates:
```python
class BatteryLifeCalculator:
    """Calculate IoT device battery life with realistic derating.

    Accounts for duty cycling, temperature effects, self-discharge, and
    capacity aging -- factors that reduce real-world life by 30-50%
    compared to naive calculations.
    """

    def __init__(self, battery_mah, chemistry="li_primary"):
        self.battery_mah = battery_mah
        self.chemistry = chemistry
        # Derating factors by chemistry
        self.profiles = {
            "alkaline":   {"temp_derate": 0.70, "self_discharge_pct_yr": 3.0,
                           "aging_factor": 0.85, "voltage": 1.5},
            "li_primary": {"temp_derate": 0.90, "self_discharge_pct_yr": 1.0,
                           "aging_factor": 0.95, "voltage": 3.0},
            "li_thionyl": {"temp_derate": 0.95, "self_discharge_pct_yr": 0.1,
                           "aging_factor": 0.98, "voltage": 3.6},
            "li_ion":     {"temp_derate": 0.85, "self_discharge_pct_yr": 5.0,
                           "aging_factor": 0.80, "voltage": 3.7},
        }

    def calculate(self, active_ma, active_sec, sleep_ua, cycle_sec,
                  target_years=None):
        """Calculate battery life with all derating factors.

        Args:
            active_ma: Current during active period (mA).
            active_sec: Duration of active period per cycle (seconds).
            sleep_ua: Sleep current (microamps).
            cycle_sec: Total cycle period (seconds).
            target_years: Optional target to check pass/fail.

        Returns:
            Dict with detailed breakdown.
        """
        profile = self.profiles[self.chemistry]
        sleep_sec = cycle_sec - active_sec
        duty_pct = (active_sec / cycle_sec) * 100

        # Average current (ideal)
        avg_ma = (active_ma * active_sec
                  + (sleep_ua / 1000) * sleep_sec) / cycle_sec

        # Ideal life (no derating)
        ideal_hours = self.battery_mah / avg_ma
        ideal_years = ideal_hours / 8760

        # Apply derating
        usable_mah = self.battery_mah
        usable_mah *= profile["temp_derate"]   # Cold weather
        usable_mah *= profile["aging_factor"]  # Capacity fade

        # Self-discharge loss over estimated life
        est_years = min(ideal_years, 15)  # Cap estimate
        sd_loss = profile["self_discharge_pct_yr"] * est_years / 100
        usable_mah *= (1 - sd_loss)

        real_hours = usable_mah / avg_ma
        real_years = real_hours / 8760

        result = {
            "chemistry": self.chemistry,
            "battery_mah": self.battery_mah,
            "avg_current_ma": round(avg_ma, 4),
            "duty_cycle_pct": round(duty_pct, 3),
            "ideal_life_years": round(ideal_years, 2),
            "usable_mah": round(usable_mah, 1),
            "real_life_years": round(real_years, 2),
            "derating_loss_pct": round((1 - real_years / ideal_years) * 100, 1),
        }
        if target_years:
            result["target_years"] = target_years
            result["pass"] = real_years >= target_years
        return result


# Example: Outdoor soil sensor (from earlier section)
calc = BatteryLifeCalculator(battery_mah=3000, chemistry="li_primary")
result = calc.calculate(
    active_ma=50, active_sec=5, sleep_ua=10,
    cycle_sec=3600,  # 1 reading per hour
    target_years=5,
)
for k, v in result.items():
    print(f"  {k}: {v}")

# Compare chemistries for same workload
print("\nChemistry comparison:")
for chem in ["alkaline", "li_primary", "li_thionyl"]:
    c = BatteryLifeCalculator(3000, chemistry=chem)
    r = c.calculate(active_ma=50, active_sec=5, sleep_ua=10, cycle_sec=3600)
    print(f"  {chem:15s} ideal={r['ideal_life_years']}y "
          f"real={r['real_life_years']}y loss={r['derating_loss_pct']}%")

# Output:
#   chemistry: li_primary
#   battery_mah: 3000
#   avg_current_ma: 0.0794
#   duty_cycle_pct: 0.139
#   ideal_life_years: 4.31
#   usable_mah: 2454.4
#   real_life_years: 3.53
#   derating_loss_pct: 18.2
#   target_years: 5
#   pass: False
#
# Chemistry comparison:
#   alkaline        ideal=4.31y real=2.23y loss=48.2%
#   li_primary      ideal=4.31y real=3.53y loss=18.2%
#   li_thionyl      ideal=4.31y real=4.0y loss=7.3%
```
The derating reveals a critical insight: the “ideal” 4.3-year estimate drops to about 3.5 years for Li-primary (18% loss) and just 2.2 years for alkaline (48% loss from cold weather and self-discharge). Even Li-thionyl chloride falls short of a 5-year target at 4.0 years, so meeting it requires a larger battery or a lower average current.
Worked Example: Sizing Solar Panel for Remote Weather Station
Scenario: Design a solar-powered weather station for deployment in rural Australia. The system must operate year-round including during winter with shorter days and occasional cloudy periods.
Given Requirements:
Location: Melbourne, Australia (latitude 37.8°S)
System power consumption: 50 mW average (ESP32 + sensors)
Must survive 7 consecutive cloudy days without sun
Components: Solar panel, LiFePO4 battery, charge controller
Daily consumption: 50 mW × 24 h = 1.2 Wh
Battery capacity: 3.4 Ah × 3.2 V ≈ 10.9 Wh (≈10.5 Wh usable)
Net drain per cloudy day: ≈0.71 Wh (the panel still harvests a fraction of its rating under overcast skies, offsetting part of the 1.2 Wh consumption)
After 7 cloudy days: Battery at 10.5 - (7 × 0.71) = 5.5 Wh (still 52% charged)
3 sunny recovery days restore full charge ✓
Select charge controller:
Input: 5W panel at 6V = 0.83A max
Output: 3.2V LiFePO4
Selected: CN3065 solar charge controller (suitable for single-cell LiFePO4)
Result: The system operates reliably year-round with a 5W panel and 3.4Ah battery, providing 2× safety margin beyond the 7-day autonomy requirement.
Key Insight: Always size for worst-case (winter) and add margin. A 2.4W panel would be mathematically sufficient, but the 5W panel (2× oversized) ensures reliable operation even with unexpected dust accumulation or panel degradation over years.
3.7.1 Interactive Solar Panel Sizing Calculator
Calculate solar panel requirements for your IoT deployment:
```js
viewof avg_power_mw = Inputs.range([1, 500], {value: 50, step: 5, label: "Device Average Power (mW)"})
viewof peak_sun_hours = Inputs.range([1, 10], {value: 2.5, step: 0.5, label: "Peak Sun Hours/Day (worst case)"})
viewof autonomy_days = Inputs.range([1, 14], {value: 7, step: 1, label: "Battery Autonomy (days)"})
viewof system_efficiency = Inputs.range([0.5, 0.9], {value: 0.75, step: 0.05, label: "System Efficiency"})
viewof battery_voltage = Inputs.select([3.2, 3.3, 3.6, 3.7], {value: 3.2, label: "Battery Voltage (V)"})
```
Winner: Solar saves $2,500 over 10 years AND eliminates risk of battery failure before replacement window.
When Battery-Only Makes Sense:
Indoor sensors with easy access (office building)
Short deployment timeframe (<2 years)
Very low power consumption where Li-SOCl2 lasts 10+ years
Common Mistake: Underestimating Solar Panel Degradation and Soiling
The Problem: You size a solar panel for your daily energy needs based on manufacturer ratings, but the system fails in the field after 6-12 months.
What Went Wrong:
Panel degradation: Solar panels lose 0.5-1% efficiency per year
After 5 years: 95% of original output
After 10 years: 90% of original output
Solution: Add 10-20% oversizing margin to account for aging
Soiling (dust, dirt, bird droppings): Can reduce output by 20-50% in dry climates
Monthly cleaning: ~5% loss
Yearly cleaning: ~25% loss
Never cleaned: ~50% loss
Solution: Design for 20-30% soiling loss, or plan cleaning schedule
Temperature derating: Panels lose efficiency at high temps
Rated at 25°C (77°F)
At 65°C (149°F): 15-20% power loss
Solution: Use temperature coefficient from datasheet, add 15% margin for hot climates
Shading: Even 10% shading can reduce output by 50% (series-connected cells)
Partial shading causes hotspots and disproportionate losses
Solution: Use bypass diodes or microinverters, avoid any shading
Angle/orientation suboptimal: Fixed panels can’t track the sun
Optimal angle = latitude, but mounting constraints may force compromises
Solution: Simulate with tools (PVWatts) for your specific location and angle
Real-World Example: Agricultural sensor in California desert
Initial design: 3W panel for 1.5W daily average load (2× margin)
After 8 months: Battery regularly depleted
Investigation revealed:
Dust accumulation: 30% output loss (never cleaned)
Panel temperature: 70°C typical → 18% loss
Mounting angle: 15° (should be 35° for latitude) → 12% loss
Combined effect: 3W panel performing like 1.5W (50% total loss)
Solution: Upgrade to 6W panel (4× original design), add rain sensor to alert when cleaning needed, adjust tilt angle. System now works reliably even with soiling.
Best Practice: For critical deployments, size solar panels to deliver required power even with:
20% degradation (age + soiling)
15% temperature loss
10% angle suboptimality
Total derating: 0.8 × 0.85 × 0.9 = 0.61 → panels should be at least 1.6× oversized.
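A small helper captures this rule; the percentages are the chapter's rules of thumb, not universal constants:

```python
# Minimum solar panel oversizing factor from compounded derating.
def panel_oversize_factor(degradation=0.20, temp_loss=0.15, angle_loss=0.10):
    """Return how much larger than nominal a panel must be sized."""
    delivered_fraction = (1 - degradation) * (1 - temp_loss) * (1 - angle_loss)
    return 1 / delivered_fraction

factor = panel_oversize_factor()
print(f"Panel delivers {1 / factor:.2f} of nameplate -> oversize by {factor:.1f}x")
# Panel delivers 0.61 of nameplate -> oversize by 1.6x
```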
3.8 Knowledge Check
How It Works
Battery selection and sizing follows a systematic process based on environmental and operational requirements:
Self-discharge tolerance: 10-year deployment → Li-SOCl2 (<1% per decade); Annual replacement OK → Any chemistry
Sizing workflow:
Calculate average current from power profile (Iavg = Σ(state current × duration) / cycle time)
Determine target lifetime in hours (years × 8760)
Raw capacity needed = Iavg × lifetime hours
Apply derating: Temperature (0.7-0.9×), aging (0.8-0.95×), voltage cutoff (0.8-0.9×), self-discharge over time
Final capacity = Raw / (temp × aging × cutoff × self-discharge)
Example: 50µA average, 5-year target, -10°C operation → 50µA × 43,800h = 2,190 mAh raw. With derating: 2,190 / (0.8 temp × 0.9 aging × 0.85 cutoff) ≈ 3,578 mAh → Select 2× ER14505 (3,600 mAh total).
Solar harvesting sizing: Panel must generate daily consumption + battery recharge reserve. For 50mW average device in moderate climate (3 peak sun hours/day), need 50mW × 24h / (3h × 0.75 efficiency) = 533mW panel minimum. Add 2× safety margin for cloudy periods → 1W panel. Battery buffers 7 days: 50mW × 168h = 8.4Wh storage.
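The same arithmetic as a runnable sketch (numbers taken from the paragraph above):

```python
# Solar panel and battery-buffer sizing for a 50 mW device.
avg_power_mw = 50       # device average power
peak_sun_hours = 3.0    # worst-case peak sun hours per day
system_eff = 0.75       # harvesting chain efficiency
autonomy_days = 7       # cloudy-day buffer
safety_margin = 2.0     # margin for extended bad weather

daily_mwh = avg_power_mw * 24                          # 1200 mWh/day
panel_mw = daily_mwh / (peak_sun_hours * system_eff)   # ~533 mW minimum
panel_sized_mw = panel_mw * safety_margin              # ~1067 mW -> 1 W class
battery_wh = avg_power_mw * 24 * autonomy_days / 1000  # 8.4 Wh

print(f"Panel: {panel_mw:.0f} mW minimum, {panel_sized_mw:.0f} mW with margin")
print(f"Battery buffer: {battery_wh:.1f} Wh for {autonomy_days} days")
```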
3.9 Concept Check
Concept Relationships
Energy sources are the foundation that all other energy design decisions build upon:
Constrains: Battery capacity sets the absolute limit for Power Analysis targets—cannot exceed available energy
Enables: Proper battery sizing is prerequisite for Energy Harvesting integration (must buffer intermittent solar/thermal input)
Temperature Coupling: Battery derating at temperature extremes must align with Environmental Design specs
Informs Protocol Selection: Battery life target drives protocol choice—5-year target with 2000mAh battery → LoRaWAN, not WiFi
Design sequence: Requirements → Battery sizing → Average current target → Power profile optimization. Never optimize consumption before knowing battery constraints—may over-engineer (wasting development time) or under-engineer (missing targets).
3.10 See Also
Battery Chemistry Details:
Lithium Thionyl Chloride Deep Dive - Passivation, pulse handling
3.11 Practice Exercises
3.11.1 Exercise 1: Battery Life with Derating
Given: 2,400 mAh Li-primary battery, 75µA average current, deployed at -10°C for 5 years.
Calculate:
Ideal battery life (room temperature, no aging)
Apply temperature derating (80% capacity at -10°C)
Apply aging factor (95% after 5 years)
Apply voltage cutoff (85% usable to 2.0V cutoff)
Final realistic lifetime
Solution:
Ideal: 2,400 mAh / 0.075 mA = 32,000 hours = 3.65 years
Temp: 2,400 × 0.8 = 1,920 mAh
Aging: 1,920 × 0.95 = 1,824 mAh
Cutoff: 1,824 × 0.85 = 1,550 mAh
Real: 1,550 / 0.075 = 20,667 hours = 2.36 years (35% less than ideal!)
What to observe: Derating compounds multiplicatively (0.8 × 0.95 × 0.85 = 0.646 = 35% loss). Always design with margin.
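To see the compounding directly, a few lines of Python reproduce the solution (standalone arithmetic, independent of the calculator class in Section 3.7):

```python
# Exercise 1 check: derating factors compound multiplicatively.
capacity_mah = 2400
avg_ma = 0.075
derate = 0.80 * 0.95 * 0.85   # temp x aging x cutoff = 0.646

ideal_years = capacity_mah / avg_ma / 8760
real_years = capacity_mah * derate / avg_ma / 8760
print(f"Ideal: {ideal_years:.2f} y, real: {real_years:.2f} y "
      f"({(1 - derate) * 100:.0f}% loss)")
# Ideal: 3.65 y, real: 2.36 y (35% loss)
```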
3.11.2 Exercise 2: Solar Harvesting Feasibility
Scenario: Indoor warehouse sensor near window receives 1 mW/cm² average light.
Device specs:
Average power: 5 mW (ESP32 with frequent BLE scans)
Available solar panel area: 10 cm²
Questions:
Maximum power harvestable: 10 cm² × 1 mW/cm² × 15% efficiency = ?
Can solar alone sustain the device?
What battery capacity needed for 3-day backup?
Answers:
Harvest: 1.5 mW
NO—device needs 5 mW but solar provides 1.5 mW (30% of need)
Energy deficit: 5 mW - 1.5 mW = 3.5 mW. 3-day shortage: 3.5 mW × 72h = 252 mWh = 76 mAh at 3.3V
What to observe: Indoor solar rarely works as primary power. Need either (1) reduce power to <1.5 mW (deep sleep optimization), (2) increase panel to 34 cm² (impractical), or (3) accept battery replacement every ~200 hours (8 days).
Matching Quiz: Match Battery Chemistries to Characteristics
Ordering Quiz: Order Battery Selection Decision Steps
Label the Diagram
3.12 Summary
Key takeaways from this chapter:
Battery Chemistry Matters: Li-SOCl2 for extreme environments (10+ years), Li-ion for rechargeable applications, alkaline for low-cost moderate-life deployments
Energy Harvesting Reality: Solar is practical outdoors (10-200 mW/cm²), but indoor solar (0.01 mW/cm²) is rarely viable as primary power
Temperature Effects: Cold significantly reduces battery capacity (50% at -20°C), requiring careful derating
Storage is Essential: Even with energy harvesting, batteries or supercapacitors are required to buffer intermittent energy availability
Calculate Requirements: Work backwards from target lifetime to determine required capacity, applying realistic efficiency factors (60-80%)
Common Pitfalls
1. Selecting Battery Chemistry Based on Capacity Alone
A battery with high mAh may have poor performance at low temperatures, high self-discharge, or inability to deliver pulse current for radio transmissions. Always evaluate chemistry against deployment temperature range, self-discharge rate, and peak discharge capability.
2. Ignoring Self-Discharge for Low-Duty-Cycle Devices
A device that sends one reading per day may use only 10 µAh of useful energy per day, but a primary lithium battery self-discharging at roughly 1% per year loses about 55 µAh/day from a 2,000 mAh cell. Self-discharge can dominate total energy consumption for very low-duty-cycle devices, so calculate both.
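A short calculation makes the comparison concrete (2,000 mAh battery and 10 µAh/day useful load, as above):

```python
# Self-discharge vs. useful load for a very low-duty-cycle device.
battery_mah = 2000
self_discharge_pct_yr = 1.0   # typical primary lithium, ~1%/year
useful_uah_per_day = 10       # one reading per day

sd_uah_per_day = battery_mah * 1000 * (self_discharge_pct_yr / 100) / 365
total = sd_uah_per_day + useful_uah_per_day
print(f"Self-discharge: {sd_uah_per_day:.0f} uAh/day vs. load: {useful_uah_per_day} uAh/day")
print(f"Self-discharge share of total drain: {sd_uah_per_day / total:.0%}")
# Self-discharge (~55 uAh/day) dwarfs the 10 uAh/day useful load (~85% of drain).
```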
3. Sizing Solar for Average Days, Not Worst-Case Days
Sizing a solar harvesting system for average irradiance means it fails during cloudy periods. Size the energy storage to bridge your worst-case cloudy period (typically 5–14 consecutive cloudy days), and size the panel to fully recharge storage in 1–2 sunny days after.
4. Assuming Indoor Solar Can Power Active IoT Nodes
Indoor illuminance (100–500 lux) yields roughly 10–50 µW/cm² from amorphous silicon panels — enough for µW-class sensors but not for nodes with active radio transmissions (1–100 mW). Most indoor solar IoT applications require supplementary battery storage or ultra-aggressive duty cycling.