Calculate IoT data volumes: Estimate raw data generation from sensor deployments
Quantify bandwidth costs: Compare cloud-only vs edge/fog architectures economically
Design data filtering strategies: Reduce transmitted data by 90-99%
Understand data gravity: Explain why moving compute to data is more efficient
Identify bandwidth constraints: Recognize physical limitations of cellular and satellite networks
323.2 The Bandwidth Cost Problem: Why Sending Everything is Impossible
Beyond latency, the second major driver for edge/fog computing is bandwidth limitations and costs. As IoT scales to billions of devices, sending all raw data to the cloud becomes economically and physically impossible.
323.2.1 The Mathematics of IoT Data Volume
Let’s calculate real-world data generation rates:
Example 1: Smart Factory with 1,000 Sensors
Scenario: Manufacturing facility monitoring temperature, vibration, pressure across production lines
Number of sensors: 1,000
Sampling rate: 100 readings/second per sensor (needed for vibration detection)
Data per reading: 50 bytes (timestamp: 8 bytes, sensor ID: 4 bytes, value: 8 bytes, metadata: 30 bytes)
Data Volume Calculation:
- Data per second: 1,000 sensors x 100 readings/s x 50 bytes = 5 MB/second
- Data per hour: 5 MB/s x 3,600 s = 18 GB/hour
- Data per day: 18 GB/hour x 24 hours = 432 GB/day
- Data per month: 432 GB/day x 30 days = 12.96 TB/month
- Data per year: 12.96 TB/month x 12 months = 155.5 TB/year
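The arithmetic above can be reproduced in a few lines. The sketch below is illustrative (the variable names are not from any library) and uses decimal units (1 MB = 10^6 bytes), matching the rounding in the figures.

```python
# Back-of-the-envelope IoT data volume calculator for the smart-factory example.
# Decimal units: 1 MB = 10^6 bytes, 1 GB = 10^9 bytes.

SENSORS = 1_000
READINGS_PER_SEC = 100      # needed for vibration detection
BYTES_PER_READING = 50      # timestamp + sensor ID + value + metadata

bytes_per_sec = SENSORS * READINGS_PER_SEC * BYTES_PER_READING
mb_per_sec = bytes_per_sec / 1e6
gb_per_hour = bytes_per_sec * 3_600 / 1e9
gb_per_day = gb_per_hour * 24
tb_per_month = gb_per_day * 30 / 1_000

print(f"{mb_per_sec:.0f} MB/s, {gb_per_hour:.0f} GB/h, "
      f"{gb_per_day:.0f} GB/day, {tb_per_month:.2f} TB/month")
```

Changing the three constants at the top lets you re-run the estimate for any deployment.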
Cloud-Only Architecture Costs:

| Cost Component | Calculation | Monthly Cost |
|---|---|---|
| Cellular data upload | 12.96 TB x $8/GB (industrial IoT rates) | $103,680 |
| Cloud ingestion | 12.96 TB x $0.05/GB (AWS IoT Core) | $648 |
| Cloud storage (S3 Standard) | 12.96 TB x $0.023/GB | $298 |
| Data processing (Lambda) | 1,000 sensors x 100/s x 86,400 s x $0.20/1M | $1,728 |
| Total monthly cost | | $106,354 |
| Annual cost | | $1,276,248 |
With Edge/Fog Processing Architecture:
The edge/fog layer performs:
- Local filtering: Only send readings that deviate >2°C or show vibration anomalies (typically 1-2% of readings)
- Local aggregation: Send hourly statistics (min, max, average, std dev) instead of every reading
- Anomaly detection: ML model on fog node identifies unusual patterns locally

Resulting data sent to cloud:
- Anomalies: 1,000 sensors x 2% x 100 readings/s x 50 bytes = 100 KB/s
- Hourly summaries: 1,000 sensors x 1 summary/hour x 200 bytes = 200 KB/hour (~0.055 KB/s)
- Total to cloud: ~100 KB/s = 8.64 GB/day = 259 GB/month
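The two reduction strategies described above, threshold filtering and hourly aggregation, can be sketched in a few lines. The setpoint, threshold value, and function names below are illustrative assumptions, not part of any specific product.

```python
import statistics

# Minimal sketch of edge-side data reduction:
# 1) threshold filtering: forward only readings that deviate from the setpoint
# 2) hourly aggregation: forward one 4-number summary instead of every reading

SETPOINT_C = 70.0            # illustrative process temperature target
DEVIATION_THRESHOLD_C = 2.0  # matches the ">2°C" rule in the text

def filter_anomalies(readings):
    """Keep only readings deviating more than the threshold from setpoint."""
    return [r for r in readings if abs(r - SETPOINT_C) > DEVIATION_THRESHOLD_C]

def hourly_summary(readings):
    """Replace a full hour of readings with four statistics."""
    return {
        "min": min(readings),
        "max": max(readings),
        "avg": statistics.mean(readings),
        "std": statistics.pstdev(readings),
    }

# One hour of 1 Hz readings: mostly normal, with two anomalous spikes.
hour = [70.0] * 3598 + [75.5, 64.0]
anomalies = filter_anomalies(hour)
summary = hourly_summary(hour)
reduction = 1 - (len(anomalies) + len(summary)) / len(hour)
print(f"sent {len(anomalies)} anomalies + 1 summary; reduction ~{reduction:.1%}")
```

In a stable process almost all readings sit at the setpoint, so the transmitted fraction is tiny; only the threshold and the summary schedule need tuning per deployment.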
Edge/Fog Architecture Costs:

| Cost Component | Calculation | Monthly Cost |
|---|---|---|
| Cellular data upload | 259 GB x $8/GB | $2,072 |
| Cloud ingestion | 259 GB x $0.05/GB | $13 |
| Cloud storage | 259 GB x $0.023/GB | $6 |
| Data processing | (only anomalies) | $35 |
| Edge/fog hardware | (amortized over 5 years) | $400 |
| Total monthly cost | | $2,526 |
| Annual cost | | $30,312 |
Savings Achieved with Edge/Fog:

| Metric | Cloud-Only | Edge/Fog | Savings |
|---|---|---|---|
| Monthly cost | $106,354 | $2,526 | $103,828/month (98% reduction) |
| Annual cost | $1,276,248 | $30,312 | $1,245,936/year |
| Data transmitted | 12.96 TB/month | 259 GB/month | 98% reduction |
| Bandwidth required | 5 MB/second | 100 KB/second | 98% reduction |
{const container =document.getElementById('kc-edge-3');if (container &&typeof InlineKnowledgeCheck !=='undefined') { container.innerHTML=''; container.appendChild(InlineKnowledgeCheck.create({question:"A factory has 500 temperature sensors, each sending 100 bytes every second to the cloud at $0.10/GB bandwidth cost. What is the monthly bandwidth cost, and what would it be if edge processing filters out 95% of redundant data?",options: [ {text:"Cloud-only: $129.60/month, Edge-filtered: $6.48/month",correct:false,feedback:"This is 10x too high. Data = 500 sensors x 100 B/s x 86,400 s/day x 30 days = 129.6 GB/month, so the cloud-only cost is 129.6 GB x $0.10/GB = $12.96/month, not $129.60."}, {text:"Cloud-only: $1,296/month, Edge-filtered: $64.80/month",correct:false,feedback:"This is 100x too high. Double-check: 500 sensors x 100 bytes x 86,400 seconds x 30 days = 129.6 GB, which costs $12.96 at $0.10/GB."}, {text:"Cloud-only: $12.96/month, Edge-filtered: $0.65/month",correct:true,feedback:"Correct! Data = 500 sensors x 100 B/s x 86,400 s/day x 30 days = 129.6 GB/month. Cloud-only cost = 129.6 GB x $0.10/GB = $12.96/month. With 95% filtering, only 5% remains: $12.96 x 0.05 = $0.65/month."}, {text:"Bandwidth costs are the same regardless of filtering location",correct:false,feedback:"Incorrect. Edge filtering dramatically reduces data transmitted to cloud. Processing locally means only relevant data (5%) incurs bandwidth charges."} ],difficulty:"hard",topic:"edge-fog-bandwidth" })); }}
The key insight: Most IoT sensor data is repetitive and uninteresting.
In a stable factory:
- 98% of temperature readings are “normal” (within +/-1°C of target)
- 99% of vibration readings show “normal operation”
- Equipment failures (the interesting events) occur <0.1% of the time

The cloud doesn’t need to see “everything is normal” 100 times per second!

What the cloud DOES need:
- Real-time alerts when anomalies occur (sent immediately)
- Hourly/daily summaries for trend analysis
- Historical data for ML model training (can be compressed/sampled)
Edge/fog computing intelligently filters data, sending only what matters.
323.2.2 Example 2: Connected Vehicle Fleet
Scenario: Logistics company with 1,000 delivery trucks, each equipped with IoT sensors
Each truck generates:
- GPS location: 1 reading/second x 50 bytes = 50 bytes/s
- Engine diagnostics: 10 readings/second x 100 bytes = 1 KB/s
- Video cameras (4 cameras): 4 streams x 2 Mbps = 8 Mbps = 1 MB/s
- Driver behavior sensors: 5 readings/second x 50 bytes = 250 bytes/s
- Total per truck: ~1 MB/s = 3.6 GB/hour = 86.4 GB/day
Fleet of 1,000 trucks:
- Total data generated: 1,000 x 86.4 GB/day = 86.4 TB/day
- Monthly volume: 86.4 TB x 30 = 2,592 TB/month = 2.59 PB/month
Cloud-Only Cost (Impossible):
- Cellular upload (at $5/GB commercial rates): 2,592,000 GB x $5 = $12,960,000/month
- This is economically impossible and would saturate available cellular bandwidth
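The per-truck and fleet totals can be checked directly. One caveat: the exact sum of the four streams is slightly above the rounded ~1 MB/s figure used in the text, so the script below prints 86.5 GB/day rather than 86.4.

```python
# Per-truck and fleet-wide data rates from the sensor breakdown above.
# Decimal units; the 4 video streams dominate (8 Mbps = 1 MB/s).

streams_bytes_per_sec = {
    "gps": 1 * 50,                 # 1 reading/s x 50 B
    "engine": 10 * 100,            # 10 readings/s x 100 B
    "video": 4 * 2_000_000 // 8,   # 4 cameras x 2 Mbps, bits -> bytes
    "driver": 5 * 50,              # 5 readings/s x 50 B
}

per_truck_bps = sum(streams_bytes_per_sec.values())
per_truck_gb_day = per_truck_bps * 86_400 / 1e9
fleet_tb_day = 1_000 * per_truck_gb_day / 1_000
fleet_pb_month = fleet_tb_day * 30 / 1_000
print(f"{per_truck_gb_day:.1f} GB/day per truck; "
      f"fleet {fleet_tb_day:.1f} TB/day, {fleet_pb_month:.2f} PB/month")
```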
Edge/Fog Solution:
Each truck has onboard edge computing (fog node):
- Video processing: detect events (hard braking, lane departure, near-miss) locally; upload only 10-second clips when incidents occur (99.9% reduction)
- GPS: send every 10 seconds instead of every second during normal transit (90% reduction); real-time during deliveries
- Engine data: send only when anomalies are detected, plus hourly summaries (95% reduction)
Data sent to cloud: ~1,000 trucks x 500 MB/day = 500 GB/day = 15 TB/month
323.2.3 Physical Bandwidth Limitations

Beyond cost, physical bandwidth limitations make cloud-only architectures impossible:
Cellular Network Limits:
- 4G LTE upload speed: 10-50 Mbps theoretical max; real-world often 5-10 Mbps
- 5G upload speed: 100-200 Mbps (not yet widely available)
- A single 4K security camera generates ~25 Mbps
- Result: one 4K camera can saturate an entire 4G uplink!
Satellite IoT (remote locations):
- Typical upload speed: 128 kbps - 1 Mbps
- Monthly data caps: 10-50 GB
- Cost: $100-500/month
- Result: uploading 1 GB takes roughly 17 hours at 128 kbps, and the monthly cap is exhausted in 1-2 days
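A small helper makes the upload-time claims concrete (decimal GB; link speeds in bits per second):

```python
# How long does 1 GB take at typical IoT uplink speeds?
# Shows why satellite backhaul of raw sensor data is impractical.

def upload_hours(gigabytes, link_bps):
    """Hours to upload `gigabytes` (decimal GB) over a link of `link_bps`."""
    return gigabytes * 8e9 / link_bps / 3_600

sat_slow = upload_hours(1, 128_000)     # 128 kbps satellite
sat_fast = upload_hours(1, 1_000_000)   # 1 Mbps satellite
lte = upload_hours(1, 10_000_000)       # 10 Mbps real-world 4G
print(f"1 GB: {sat_slow:.1f} h @ 128 kbps, "
      f"{sat_fast:.1f} h @ 1 Mbps, {lte:.2f} h @ 10 Mbps")
```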
Smart City Scale:
- City of 1 million people with 100,000 surveillance cameras
- Each camera: 5 Mbps average
- Total bandwidth needed: 100,000 x 5 Mbps = 500 Gbps
- Result: impossible to backhaul to a centralized cloud; requires distributed fog processing
323.2.4 Historical Context
Before the timeline, a quick refresher on the key terms:

| Term | Simple Explanation |
|---|---|
| Edge Computing | Processing data right at the source (on the sensor or gateway), like a smart camera detecting motion locally |
| Fog Computing | Intermediate processing between edge and cloud (at network routers/gateways), aggregating data from multiple sensors |
| Cloud Computing | Centralized processing in distant data centers; good for big-picture analytics but slow for real-time decisions |
| Latency | Delay between action and response; edge/fog dramatically reduce this (milliseconds vs seconds) |
| Bandwidth | Amount of data that can be transmitted; fog reduces this by 90-99% through local filtering |
IoT Data Growth Has Outpaced Network Capacity:
| Year | IoT Devices Worldwide | Data Generated | Available Bandwidth | Gap |
|---|---|---|---|---|
| 2015 | 15 billion | 500 EB/year | Sufficient | 0% |
| 2020 | 30 billion | 2,500 EB/year | Insufficient | 40% gap |
| 2025 | 75 billion (projected) | 8,000 EB/year (projected) | Severe shortage | 70% gap |
The “data gravity” problem: It’s become cheaper and faster to move computation to data than to move data to computation.
Warning - Common Misconception: “Fog Computing Is Just About Latency”
The Myth: Students often think latency reduction is the only reason to use fog/edge computing; if latency isn’t critical, just use the cloud.
The Reality: Fog computing addresses four distinct problems, not just latency:
Latency (time-critical): Autonomous vehicles need <10ms collision avoidance -> edge required regardless of bandwidth
Bandwidth (cost/capacity): Smart city with 10,000 cameras generating 5TB/hour -> fog required to avoid $200,000/month cellular costs
Privacy (regulatory): Hospital patient monitoring with HIPAA restrictions -> fog required to keep PHI local, even if latency isn’t critical
Reliability (offline operation): Remote oil rig monitoring must continue during satellite outages -> fog required for autonomous operation
Real-world example that surprised engineers: A smart building deployment initially chose fog computing for latency reasons (HVAC control needs <100 ms responses). But the real benefit turned out to be reliability: during a 6-hour internet outage, the fog gateway kept the building operational while cloud-only competitors’ systems failed completely. Post-analysis showed latency wasn’t even the top concern; offline autonomy was mission-critical.
Another example: Video surveillance systems often use fog not primarily for latency (security guards tolerate 1-2 second delays) but for bandwidth cost: uploading 50 cameras x 2 Mbps around the clock to the cloud costs $15,000/month, while fog processing (motion detection, face blur) reduces it to $500/month.
Key takeaway: When evaluating edge/fog vs cloud, assess all four criteria (latency, bandwidth, privacy, reliability), not just latency. Many successful fog deployments are driven by bandwidth costs or privacy regulations, not real-time requirements.
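The four-criteria assessment recommended above can be sketched as a simple checklist function. The thresholds below are illustrative assumptions, not a standard; a real evaluation would calibrate them to the deployment.

```python
# Sketch of the four-criteria edge/fog evaluation: latency, bandwidth,
# privacy, reliability. Returns every driver that independently justifies
# an edge/fog tier. Thresholds are illustrative only.

def edge_fog_drivers(max_latency_ms, monthly_cloud_cost_usd,
                     data_must_stay_local, must_survive_outages):
    drivers = []
    if max_latency_ms < 100:          # time-critical control loops
        drivers.append("latency")
    if monthly_cloud_cost_usd > 10_000:  # transmission cost dominates
        drivers.append("bandwidth")
    if data_must_stay_local:          # e.g. HIPAA/GDPR constraints
        drivers.append("privacy")
    if must_survive_outages:          # autonomous offline operation
        drivers.append("reliability")
    return drivers

# The smart-building case above: latency looked like the driver,
# but offline autonomy turned out to be mission-critical too.
building = edge_fog_drivers(50, 2_000, False, True)
print(building)
```

The point of the sketch: several criteria can fire at once, and any single one is enough to rule out a cloud-only design.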
{const container =document.getElementById('kc-edge-4');if (container &&typeof InlineKnowledgeCheck !=='undefined') { container.innerHTML=''; container.appendChild(InlineKnowledgeCheck.create({question:"A hospital deploys patient monitoring wearables. Latency is not critical (1-second alerts are acceptable), but the hospital experiences frequent 2-hour internet outages due to construction. What is the PRIMARY reason to use fog computing?",options: [ {text:"Latency reduction - fog provides faster response times",correct:false,feedback:"While fog does reduce latency, the question states 1-second latency is acceptable, which cloud can meet. Latency is not the primary driver here."}, {text:"Bandwidth optimization - reduce data transmission costs",correct:false,feedback:"Bandwidth optimization is a benefit, but it's not the PRIMARY concern for patient monitoring with modest data volumes."}, {text:"Reliability and offline operation - critical alerts must work during outages",correct:true,feedback:"Correct! During 2-hour internet outages, a cloud-only system cannot alert nurses about patient emergencies. Fog computing ensures continuous monitoring and alerts even when internet connectivity fails."}, {text:"Cost reduction - fog hardware is cheaper than cloud services",correct:false,feedback:"Fog hardware has upfront costs. The primary driver here is reliability during network outages, not cost optimization."} ],difficulty:"medium",topic:"edge-fog-reliability" })); }}
323.3 Real-World Data Reduction Example: Smart Factory
To understand bandwidth savings in practice, consider this detailed breakdown:
Raw Data Generated:
- 200 machines at 10,000 Hz sampling rate
- Data rate: 8 MB/sec

After Edge Processing:
- Local filtering removes 97.5% of normal readings
- Data rate: 200 KB/sec

After Fog Aggregation:
- Statistical summaries replace detailed streams
- Data rate: 10 KB/sec

To Cloud:
- Only anomalies and hourly summaries
- Data rate: 10 KB/sec
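The staged reduction above can be tabulated directly (rates in bytes/second, decimal units):

```python
# Staged data reduction for the factory example: each stage's output rate
# and its cumulative reduction versus the raw stream.

stages = [
    ("raw", 8_000_000),       # 200 machines @ 10 kHz
    ("after edge", 200_000),  # 97.5% of normal readings filtered out
    ("after fog", 10_000),    # statistical summaries only
]

raw = stages[0][1]
for name, rate in stages:
    reduction = 1 - rate / raw
    print(f"{name:>11}: {rate:>10,} B/s ({reduction:.3%} cumulative reduction)")
```

After the loop, the final stage shows the headline figure: a 99.875% reduction before anything reaches the cloud.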
| Scenario | Raw Data | After Edge | After Fog | Cloud Receives | Reduction |
|---|---|---|---|---|---|
| Smart Home (15 sensors) | 15 KB/min | N/A | 10 KB/min | 10 KB/min | 33% |
| Building (5,000 sensors) | 3 MB/min | N/A | 300 KB/min | 300 KB/min | 90% |
| Factory (200 machines @ 10 kHz) | 8 MB/sec | 200 KB/sec | 10 KB/sec | 10 KB/sec | 99.875% |
| Autonomous Car (4 GB/sec sensors) | 4 GB/sec | 100 KB/sec | 10 KB/sec | 1 KB/sec | 99.999975% |
| Smart City (100K streetlights) | 100 KB/sec | N/A | 10 KB/sec | 10 KB/sec | 90% |
Cost Implications (cellular data @ $10/GB):
Factory Example (200 machines, 10,000 Hz sampling):
Cloud-Only:
- 8 MB/sec x 86,400 sec/day = 691 GB/day
- 691 GB x $10/GB = $6,910/day
- Annual: $2,522,150
Edge + Fog:
- 10 KB/sec x 86,400 sec/day = 864 MB/day
- 0.864 GB x $10/GB = $8.64/day
- Annual: $3,154
Savings: $2,519,000/year (99.875% cost reduction)
ROI on Edge/Fog Hardware:
- Edge devices: 200 x $50 = $10,000
- Fog gateway: $5,000
- Total investment: $15,000
- Payback period: 2.2 days
Bottom line: edge + fog processing reduces cellular bandwidth costs by roughly 99.9% for high-frequency industrial IoT deployments.
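The ROI figures can be reproduced directly; the rates and hardware costs below are the ones quoted above:

```python
# Factory ROI: daily savings from edge/fog vs cloud-only at $10/GB cellular,
# and payback period on the $15,000 hardware investment.

RATE = 10.0  # $/GB cellular

cloud_gb_day = 8e6 * 86_400 / 1e9   # 8 MB/s raw streamed to cloud
edge_gb_day = 10e3 * 86_400 / 1e9   # 10 KB/s after edge + fog
daily_savings = (cloud_gb_day - edge_gb_day) * RATE

hardware = 200 * 50 + 5_000         # 200 edge devices + fog gateway
payback_days = hardware / daily_savings
print(f"saves ${daily_savings:,.0f}/day; payback in {payback_days:.1f} days")
```

A payback period measured in days, not years, is why the hardware investment is rarely the sticking point in these deployments.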
323.4 Data Gravity and Why Proximity Matters
Concept: Large datasets have “gravity”: moving them is costly in time, bandwidth, and money.
Implication: Bringing computation to data (fog) is often more efficient than bringing data to computation (cloud).
Example: Video surveillance generating 1 TB/day per camera:
- Sending to cloud: massive bandwidth and cost
- Fog processing: extract only motion events, faces, or anomalies
- Result: 1 GB/day instead of 1 TB/day sent to cloud (99.9% reduction)
323.5 Summary
Bandwidth costs and physical network limitations are often the deciding factor in edge/fog architecture decisions, even more than latency in many cases.
Key takeaways:
IoT data volumes quickly become economically impossible to transmit entirely to cloud
Edge/fog processing can reduce transmitted data by 90-99.9%
ROI on edge hardware is often measured in days, not years
Four factors drive edge/fog adoption: latency, bandwidth, privacy, reliability
The “data gravity” principle: move compute to data, not data to compute
323.6 What’s Next?
Now that you understand both latency and bandwidth drivers, the next chapter provides a systematic decision framework for choosing where to process data.