Let’s calculate the real impact of protocol overhead on IoT deployments with concrete numbers.
Scenario: 1,000 temperature sensors sending 4-byte readings every 15 minutes over LoRaWAN.
Payload Analysis:
\[
\text{Data per message} = \begin{cases}
\text{Sensor value:} & 4 \text{ bytes} \\
\text{LoRaWAN header:} & 13 \text{ bytes} \\
\text{Total transmitted:} & 17 \text{ bytes}
\end{cases}
\]
Overhead ratio: \(\frac{13}{17} \approx 76\%\) of every packet is overhead, not sensor data.
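The packet-level numbers are easy to verify in a few lines. The 13-byte figure corresponds to a minimal LoRaWAN uplink (1 B MHDR + 7 B FHDR + 1 B FPort + 4 B MIC, assuming no FOpts); a quick sketch:

```python
# Minimal LoRaWAN uplink framing, matching the 13-byte header above
# (assumes a one-byte FPort and an empty FOpts field).
PAYLOAD_BYTES = 4   # temperature reading
HEADER_BYTES = 13   # MHDR(1) + FHDR(7) + FPort(1) + MIC(4)
TOTAL_BYTES = PAYLOAD_BYTES + HEADER_BYTES

overhead_ratio = HEADER_BYTES / TOTAL_BYTES
print(f"{TOTAL_BYTES} B on air, {overhead_ratio:.0%} overhead")
# -> 17 B on air, 76% overhead
```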
Annual transmission volume:
\[
\begin{aligned}
\text{Messages per sensor per year} &= \frac{24 \times 365}{15/60} = 35,040 \text{ messages} \\
\text{Total network messages} &= 35,040 \times 1,000 = 35.04 \text{ million/year} \\
\text{Data transmitted} &= 35.04M \times 17 \text{ bytes} = 595.7 \text{ MB/year}
\end{aligned}
\]
Cost on cellular IoT (NB-IoT at \$0.10/MB): \(595.7 \times \$0.10 = \$59.57\) per year for the whole 1,000-sensor fleet. Of that, the header bytes alone account for \(35.04\text{M} \times 13 \text{ bytes} = 455.5 \text{ MB}\), or about \$45.55 per year spent transmitting overhead rather than sensor data.
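The annual volume and cost figures can be reproduced directly (the \$0.10/MB price is the assumed NB-IoT rate from above):

```python
SENSORS = 1_000
INTERVAL_MIN = 15
PAYLOAD_BYTES, HEADER_BYTES = 4, 13
PRICE_PER_MB = 0.10  # assumed NB-IoT data price, $/MB

msgs_per_sensor = 24 * 365 * 60 // INTERVAL_MIN            # 35,040 msgs/yr
total_msgs = msgs_per_sensor * SENSORS                     # 35.04 million/yr
total_mb = total_msgs * (PAYLOAD_BYTES + HEADER_BYTES) / 1e6
header_mb = total_msgs * HEADER_BYTES / 1e6                # overhead-only share

print(f"{total_msgs/1e6:.2f}M msgs/yr, {total_mb:.1f} MB/yr")
print(f"${total_mb * PRICE_PER_MB:.2f}/yr total, "
      f"${header_mb * PRICE_PER_MB:.2f}/yr of it pure header overhead")
```

Running this reproduces the 595.7 MB and \$59.57 figures and shows the overhead share (~\$45.55/yr).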
Energy impact (at 20 mW transmit power and 250 kbps; note that 250 kbps is optimistic for LoRaWAN, where LoRa data rates run roughly 0.3 to 50 kbps, so treat this as a best-case airtime):
\[
\begin{aligned}
\text{Transmit time per packet} &= \frac{17 \text{ bytes} \times 8}{250,000 \text{ bps}} = 0.544 \text{ ms} \\
\text{Energy per transmission} &= 20 \text{ mW} \times 0.544 \text{ ms} = 0.0109 \text{ mWs} \\
\text{Annual energy (per sensor)} &= 35,040 \times 0.0109 \text{ mWs} = 382 \text{ mWs} = 0.106 \text{ mWh}
\end{aligned}
\]
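The same energy budget in code, using the best-case 250 kbps link assumed above:

```python
TX_POWER_MW = 20.0
BITRATE_BPS = 250_000   # best case; real LoRa rates are far lower
PACKET_BYTES = 17
MSGS_PER_YEAR = 35_040

airtime_s = PACKET_BYTES * 8 / BITRATE_BPS          # 0.544 ms per packet
energy_per_tx_mws = TX_POWER_MW * airtime_s         # ~0.0109 mWs (= mJ)
annual_mwh = MSGS_PER_YEAR * energy_per_tx_mws / 3600  # mWs -> mWh

print(f"{airtime_s * 1e3:.3f} ms airtime/packet, "
      f"{annual_mwh:.3f} mWh/yr per sensor")
# -> 0.544 ms airtime/packet, 0.106 mWh/yr per sensor
```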
With a CR2032 battery (220 mAh at 3 V = 660 mWh), transmission alone consumes \(0.106/660 \approx 0.016\%\) of the battery per year; in theory, a sensor could run 6,000+ years on transmission energy alone. This is why sleep current, not transmission, dominates IoT battery budgets.
Key insight: For IoT, reducing overhead from 76% to 50% matters less for battery life than reducing sleep current from 10 µA to 1 µA.
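That insight can be checked numerically. A rough battery-life sketch, assuming a constant sleep current at 3 V plus the 0.106 mWh/yr transmit budget derived above (self-discharge and peak-current derating ignored):

```python
BATTERY_MWH = 660.0        # CR2032: 220 mAh * 3 V
TX_MWH_PER_YEAR = 0.106    # annual transmit energy from above
HOURS_PER_YEAR = 8760

def battery_life_years(sleep_ua: float) -> float:
    """Estimated battery life given sleep current in microamps at 3 V."""
    # uA -> mA -> mW, then integrate over a year of sleeping.
    sleep_mwh_per_year = sleep_ua * 1e-3 * 3.0 * HOURS_PER_YEAR
    return BATTERY_MWH / (sleep_mwh_per_year + TX_MWH_PER_YEAR)

print(f"10 uA sleep: {battery_life_years(10):.1f} years")
print(f" 1 uA sleep: {battery_life_years(1):.1f} years")
```

The 10x reduction in sleep current takes the estimate from roughly 2.5 years to roughly 25 years, while halving the packet overhead would change the transmit term by an amount too small to register.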