For a sensor sending 10-byte payloads every 60 seconds over a lossy cellular link (15% packet loss), compare UDP vs TCP energy consumption.
UDP transmission cost: \[
\text{Packet size} = 10 \text{ (payload)} + 8 \text{ (UDP)} + 20 \text{ (IP)} = 38 \text{ bytes}
\]
With 15% loss and application-layer retries, expected transmissions per message: \[
\text{TX}_{\text{avg}} = \frac{1}{1 - 0.15} \approx 1.18 \text{ transmissions}
\]
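The UDP per-message cost above can be sketched directly; the retry model assumes independent loss and application-layer retry-until-success (a geometric distribution of attempts):

```python
# UDP on-the-wire cost per message under independent packet loss,
# with application-layer retries until the message gets through.
PAYLOAD, UDP_HDR, IP_HDR = 10, 8, 20   # bytes, from the problem statement
LOSS = 0.15                            # packet loss probability

udp_packet = PAYLOAD + UDP_HDR + IP_HDR   # 38 bytes on the wire
tx_avg = 1 / (1 - LOSS)                   # ~1.18 transmissions per message
expected_bytes = udp_packet * tx_avg      # ~44.7 bytes per delivered message
```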
TCP transmission cost (full connection lifecycle): \[
\text{Handshake} = 3 \times 40 = 120 \text{ bytes (SYN, SYN-ACK, ACK; each = 20 IP + 20 TCP)}
\] \[
\text{Data + ACK} = 50 + 40 = 90 \text{ bytes (data: 20+20+10; ACK: 20+20)}
\] \[
\text{Teardown} = 3 \times 40 = 120 \text{ bytes (FIN, FIN-ACK, ACK; assumes the responder piggybacks its FIN on its ACK — a full 4-segment close adds another 40 bytes)}
\] \[
\text{Total} = 120 + 90 + 120 = 330 \text{ bytes per message}
\]
These figures assume a loss-free exchange; at 15% loss each TCP segment is also subject to retransmission, so 330 bytes understates TCP's real cost and the comparison is conservative in TCP's favor.
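The TCP lifecycle accounting can be checked with the same header sizes the text uses (20-byte IP, 20-byte TCP, no options, loss-free exchange, 3-segment close):

```python
# Per-message TCP byte count over a full connection lifecycle,
# matching the text's model: option-free headers, no retransmissions.
IP_HDR, TCP_HDR, PAYLOAD = 20, 20, 10
SEG = IP_HDR + TCP_HDR                   # 40 bytes per empty segment

handshake = 3 * SEG                      # SYN, SYN-ACK, ACK      = 120 bytes
data_ack  = (SEG + PAYLOAD) + SEG        # data segment + its ACK =  90 bytes
teardown  = 3 * SEG                      # FIN, FIN-ACK, ACK      = 120 bytes
total = handshake + data_ack + teardown  # 330 bytes per message
```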
Radio-on time per message (RF bit-time at 1 Mbps, 200 mA TX current):
\[
t_{\text{UDP}} = \frac{38 \times 1.18 \times 8}{10^6} = 0.359 \text{ ms (RF bit-time)}
\] \[
t_{\text{TCP}} = \frac{330 \times 8}{10^6} = 2.64 \text{ ms (RF bit-time)}
\]
For cellular links (NB-IoT / LTE-M), the total radio-on time includes channel acquisition and protocol setup, scaling the bit-time by a cellular overhead factor (~1000× for small payloads). Effective radio-on times are approximately 360 ms (UDP) and 2,640 ms (TCP). Converting to energy at 200 mA:
\[
E_{\text{UDP}} = 200 \text{ mA} \times \frac{0.360 \text{ s}}{3600 \text{ s/h}} \approx 0.0200 \text{ mAh/reading}
\] \[
E_{\text{TCP}} = 200 \text{ mA} \times \frac{2.64 \text{ s}}{3600 \text{ s/h}} \approx 0.147 \text{ mAh/reading}
\]
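The bit-time-to-energy conversion can be expressed as one small function; the ~1000× cellular overhead factor is the assumption stated above for NB-IoT / LTE-M channel acquisition on small payloads:

```python
# Per-reading charge (mAh) from on-the-wire bytes, following the text's
# model: raw bit-time at 1 Mbps, scaled by an assumed ~1000x cellular
# overhead factor, drawn at 200 mA TX current.
TX_MA = 200          # transmit current, mA
CELL_FACTOR = 1000   # assumed NB-IoT / LTE-M overhead multiplier
BITRATE = 1e6        # bits per second

def energy_mah(wire_bytes, tx_avg=1.0):
    t_bits = wire_bytes * tx_avg * 8 / BITRATE   # raw bit-time, seconds
    t_radio = t_bits * CELL_FACTOR               # effective radio-on time
    return TX_MA * t_radio / 3600                # mAh per reading

e_udp = energy_mah(38, tx_avg=1 / (1 - 0.15))    # ~0.0199 mAh/reading
e_tcp = energy_mah(330)                          # ~0.1467 mAh/reading
```

Note the UDP figure comes out as 0.0199 mAh before rounding; the text's 0.0200 reflects rounding 0.359 ms up to 360 ms.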
Daily energy (1,440 readings/day):
UDP: \(1{,}440 \times 0.0200 = 28.8 \text{ mAh/day}\)
TCP: \(1{,}440 \times 0.147 = 211.7 \text{ mAh/day}\)
Battery life with 2000 mAh:
UDP: \(2000 / 28.8 = 69.4 \text{ days}\), TCP: \(2000 / 211.7 = 9.4 \text{ days}\)
TCP uses ~7.4× more energy due to connection overhead dominating for small, infrequent payloads.
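The daily-energy and battery-life arithmetic can be verified in a few lines, starting from the per-reading figures derived in the text:

```python
# Daily energy and battery life for a reading every 60 s,
# using the per-reading charges computed in the text.
READINGS_PER_DAY = 24 * 60        # one reading per minute = 1440
BATTERY_MAH = 2000

e_udp, e_tcp = 0.0200, 0.147      # mAh/reading, from the text

daily_udp = READINGS_PER_DAY * e_udp   # 28.8 mAh/day
daily_tcp = READINGS_PER_DAY * e_tcp   # ~211.7 mAh/day

life_udp = BATTERY_MAH / daily_udp     # ~69.4 days
life_tcp = BATTERY_MAH / daily_tcp     # ~9.4 days
ratio = daily_tcp / daily_udp          # ~7.35x, rounded to 7.4x in the text
```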