Scenario: A soil moisture sensor in a smart agriculture deployment sends 50-byte readings every 15 minutes. The sensor runs on a 2000 mAh battery at 3.3V. Calculate how protocol choice affects battery life.
Given:
- Payload size: 50 bytes
- Transmission interval: 15 minutes (96 times/day)
- Battery capacity: 2000 mAh at 3.3V
- Radio TX current: 120 mA (typical for Wi-Fi module)
- Radio RX current: 80 mA
- Sleep current: 10 uA
- Transmission rate: 1 Mbps
Step 1: Calculate UDP overhead per transmission
UDP packet structure:
- IP header: 20 bytes
- UDP header: 8 bytes
- Payload: 50 bytes
- Total: 78 bytes = 624 bits
Transmission time: 624 bits / 1 Mbps = 0.624 ms
Energy per TX: 120 mA × 0.624 ms = 0.075 mAs
Daily energy (UDP):
- 96 transmissions × 0.075 mAs = 7.2 mAs/day
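These figures can be reproduced with a short script (variable names are illustrative; the values mirror the Given list):

```python
# UDP energy per transmission and per day, using the scenario's figures.
PAYLOAD_BYTES = 50
IP_HEADER = 20           # bytes
UDP_HEADER = 8           # bytes
BITRATE_BPS = 1_000_000  # 1 Mbps
TX_CURRENT_MA = 120.0
TX_PER_DAY = 96          # one reading every 15 minutes

packet_bits = (IP_HEADER + UDP_HEADER + PAYLOAD_BYTES) * 8  # 624 bits
tx_time_s = packet_bits / BITRATE_BPS                       # 0.624 ms
energy_mas = TX_CURRENT_MA * tx_time_s                      # ~0.075 mA·s per TX
daily_mas = energy_mas * TX_PER_DAY                         # ~7.2 mA·s/day

print(f"Per TX: {energy_mas:.3f} mA·s, daily: {daily_mas:.1f} mA·s")
```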
Step 2: Calculate TCP overhead per transmission
TCP per-message exchange:
1. SYN: 40 bytes (IP 20 + TCP 20 header, no payload)
2. SYN-ACK: 40 bytes (receive)
3. ACK: 40 bytes (send, acknowledgment of SYN-ACK)
4. DATA: 90 bytes (IP 20 + TCP 20 header + 50-byte payload)
5. DATA-ACK: 40 bytes (receive)
6. FIN: 40 bytes (send, connection teardown)
7. FIN-ACK: 40 bytes (receive)
(Simplified teardown: a full TCP close exchanges a FIN and an ACK in each direction, so this model slightly understates TCP's cost.)
Total TX: 40 + 40 + 90 + 40 = 210 bytes = 1680 bits
Total RX: 40 + 40 + 40 = 120 bytes = 960 bits
TX time: 1680 bits / 1 Mbps = 1.68 ms
RX time: 960 bits / 1 Mbps = 0.96 ms
Energy per exchange:
- TX: 120 mA × 1.68 ms = 0.202 mAs
- RX: 80 mA × 0.96 ms = 0.077 mAs
- Total: 0.279 mAs per transmission
Daily energy (TCP):
- 96 transmissions × 0.279 mAs = 26.8 mAs/day
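The TCP exchange can be checked the same way (the 0.279 mAs above comes from summing the rounded TX and RX figures; the unrounded total is ~0.278 mAs):

```python
# TCP per-message exchange: handshake + data + teardown (scenario's simplified model).
BITRATE_BPS = 1_000_000
TX_CURRENT_MA, RX_CURRENT_MA = 120.0, 80.0
TX_PER_DAY = 96

tx_bytes = 40 + 40 + 90 + 40  # SYN, ACK, DATA, FIN (sent)
rx_bytes = 40 + 40 + 40       # SYN-ACK, DATA-ACK, FIN-ACK (received)

tx_mas = TX_CURRENT_MA * tx_bytes * 8 / BITRATE_BPS  # ~0.202 mA·s
rx_mas = RX_CURRENT_MA * rx_bytes * 8 / BITRATE_BPS  # ~0.077 mA·s
per_exchange = tx_mas + rx_mas                       # ~0.278 mA·s before rounding
daily_mas = per_exchange * TX_PER_DAY                # ~26.7-26.8 mA·s/day

print(f"Per exchange: {per_exchange:.3f} mA·s, daily: {daily_mas:.1f} mA·s")
```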
Step 3: Calculate battery life comparison
Sleep energy (both protocols):
- ≈23.9 hours/day in sleep (radio-on time totals well under one second per day; the remaining ~6 minutes is a conservative allowance for MCU wake-up and sensing)
- 10 uA × 23.9 hours = 0.239 mAh/day = 860.4 mAs/day
UDP total daily consumption:
- TX energy: 7.2 mAs = 0.002 mAh
- Sleep: 860.4 mAs = 0.239 mAh
- Total: 0.241 mAh/day
- Battery life: 2000 mAh / 0.241 mAh/day = 8,299 days = 22.7 years
TCP total daily consumption:
- TX/RX energy: 26.8 mAs = 0.007 mAh
- Sleep: 860.4 mAs = 0.239 mAh
- Total: 0.246 mAh/day
- Battery life: 2000 mAh / 0.246 mAh/day = 8,130 days = 22.3 years
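The battery-life division generalizes to a small helper (names are illustrative; note the exact TCP result is ~22.2 years, and the 22.3 above reflects rounding the daily total to 0.246 mAh first):

```python
# Battery life from daily consumption, with a sleep-dominated budget.
BATTERY_MAH = 2000.0
SLEEP_UA = 10.0
SLEEP_HOURS = 23.9

sleep_mah_day = SLEEP_UA / 1000 * SLEEP_HOURS  # 0.239 mAh/day

def battery_life_years(active_mas_per_day: float) -> float:
    """Years of runtime given the daily radio energy in mA·s."""
    total_mah_day = active_mas_per_day / 3600 + sleep_mah_day
    return BATTERY_MAH / total_mah_day / 365

print(f"UDP: {battery_life_years(7.2):.1f} years")   # ~22.7
print(f"TCP: {battery_life_years(26.8):.1f} years")  # ~22.2 before rounding
```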
Step 4: Factor in real-world overhead (retransmissions, keep-alives)
With 5% packet loss requiring retransmission:
- UDP + CoAP CON (confirmable messages with application-layer retries): TX overhead × 1.05 = 7.56 mAs/day
  Total: (7.56 + 860.4) mAs/day ÷ 3600 ≈ 0.241 mAh/day -> 22.7 years (essentially unchanged)
- TCP: TX/RX overhead × 1.15 = 30.8 mAs/day (head-of-line blocking adds cascading retransmits)
  Total: (30.8 + 860.4) mAs/day ÷ 3600 ≈ 0.248 mAh/day -> 22.1 years
With a TCP keep-alive every 30 minutes (if the connection persisted):
- 48 keep-alives/day × 40 bytes × 8 bits / 1 Mbps × 120 mA ≈ 1.8 mAs/day ≈ 0.0005 mAh/day
- New TCP total: ≈0.248 mAh/day -> 22.1 years
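The loss and keep-alive adjustments can be computed the same way (the 5% and 15% multipliers are the scenario's assumptions, not measured values):

```python
# Effect of 5% loss and periodic keep-alives on the daily budget.
SLEEP_MAS_DAY = 860.4  # 10 uA x 23.9 h, in mA·s
BATTERY_MAH = 2000.0

def years(radio_mas_day: float) -> float:
    """Years of runtime for a given daily radio energy in mA·s."""
    return BATTERY_MAH / ((radio_mas_day + SLEEP_MAS_DAY) / 3600) / 365

udp_lossy = 7.2 * 1.05                       # CoAP CON retries at 5% loss
tcp_lossy = 26.8 * 1.15                      # cascading retransmits at 5% loss
keepalive = 48 * (40 * 8 / 1_000_000) * 120  # 48/day, 40-byte segments: ~1.8 mA·s

print(f"UDP+CoAP: {years(udp_lossy):.1f} y")
print(f"TCP+keepalive: {years(tcp_lossy + keepalive):.1f} y")
```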
Result:
| Configuration              | Daily consumption | Battery life | vs. baseline |
|----------------------------|-------------------|--------------|--------------|
| UDP (ideal)                | 0.241 mAh         | 22.7 years   | Baseline     |
| TCP (ideal)                | 0.246 mAh         | 22.3 years   | -2%          |
| UDP + CoAP (5% loss)       | 0.241 mAh         | 22.7 years   | ≈0%          |
| TCP (5% loss + keepalive)  | 0.248 mAh         | 22.1 years   | -3%          |
Key Insight: Sleep current dominates total energy consumption for low duty-cycle sensors; here it accounts for 97-99% of the daily budget, which compresses the apparent protocol difference to a few percent. TCP's handshake overhead, retransmission behavior, and keep-alive traffic cost more than 3x the radio energy of UDP, but because the radio is active for well under a second per day, this translates to only about a 3% reduction in battery life compared to UDP with application-layer (CoAP) reliability. For a 1,000-sensor deployment, even that margin compounds into hundreds of extra sensor-years of operation, yet the sharper lesson is that optimizing wake time and sleep current has far greater impact than protocol choice alone for very low duty-cycle sensors.
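A quick sensitivity sketch makes the final point concrete (the 5 uA figure is a hypothetical improved sleep mode, not part of the scenario's Given list):

```python
# Compare protocol choice against sleep-current optimization.
def years(radio_mas_day: float, sleep_ua: float) -> float:
    """Runtime in years: daily radio energy (mA·s) plus sleep draw (uA)."""
    daily_mah = radio_mas_day / 3600 + sleep_ua / 1000 * 23.9
    return 2000 / daily_mah / 365  # 2000 mAh battery

print(f"UDP at 10 uA sleep: {years(7.2, 10):.1f} years")
print(f"TCP at 10 uA sleep: {years(26.8, 10):.1f} years")
print(f"TCP at  5 uA sleep: {years(26.8, 5):.1f} years")  # hypothetical sleep mode
```

Halving the sleep current while keeping TCP roughly doubles the runtime, dwarfing the few-percent protocol gap.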