Edge is mandatory when latency is life-critical: autonomous vehicles require under 10 ms for collision avoidance, while a cloud round-trip adds 200 ms or more. A factory with 100 sensors sampling at 10 kHz generates 32 Mbps of raw data, requiring 99.9% local reduction, and GDPR mandates fog-based anonymization so that only aggregate statistics (not raw PII) reach the cloud.
Key Concepts
Industrial IoT (IIoT): Manufacturing and process control applications requiring deterministic sub-10ms response times for machine safety and quality control
Smart City Infrastructure: Large-scale deployments of traffic sensors, parking monitors, and environmental sensors requiring edge aggregation to manage bandwidth
Autonomous Systems: Vehicles, drones, and robots requiring local perception and decision-making because cloud latency (100ms+) is physically unsafe at operational speeds
Healthcare at the Edge: Patient monitoring systems (ECG, SpO2) processing data locally to meet HIPAA data locality requirements and provide offline resilience
Predictive Maintenance: Vibration and temperature analysis on industrial equipment running locally to detect bearing failures 2-4 weeks before breakdown
Precision Agriculture: Soil sensors and drone imagery analyzed at farm-edge gateways in areas with intermittent connectivity to optimize irrigation and fertilization
Retail Analytics: In-store video analytics running at edge to count customers and detect shelf gaps without transmitting sensitive video footage to cloud
Energy Grid Management: Smart meters and grid sensors processing locally for real-time frequency regulation that requires <100ms response to prevent cascading failures
Analyze edge/fog computing requirements: Determine whether a use case demands edge, fog, or cloud processing based on latency, bandwidth, and regulatory constraints
Design predictive maintenance architectures: Apply three-tier fog computing to industrial scenarios with 1kHz sensor data and sub-100ms anomaly detection
Evaluate autonomous vehicle compute placement: Compare on-vehicle edge (10ms), roadside fog (20-50ms), and cloud (200ms+) processing tradeoffs for safety-critical decisions
Implement privacy-preserving data pipelines: Architect fog-based anonymization, differential privacy, and data minimization flows that comply with GDPR requirements
Calculate bandwidth and cost tradeoffs: Quantify data reduction ratios, compute offloading costs, and break-even points for edge vs. cloud deployments
Compare use case architectures: Contrast the deployment patterns across smart factory, autonomous vehicle, agricultural drone, and healthcare monitoring scenarios
Minimum Viable Understanding
Edge is mandatory when latency is life-critical: Autonomous vehicles require under 10ms for collision avoidance; cloud round-trip adds 200ms+, meaning the crash happens before the response arrives
Bandwidth physics often force edge processing: A factory with 100 sensors at 10kHz sampling generates 32 Mbps of raw data, exceeding typical uplink capacity and requiring 99.9% local data reduction before transmission
Privacy regulations make fog a legal requirement: GDPR mandates data minimization; fog nodes anonymize video (face blur, people count) so only aggregate statistics like “47 customers” reach the cloud, never raw PII
Hybrid architectures combine all three tiers: Real-time decisions happen at edge (under 50ms), regional coordination at fog (under 500ms), and model training and trend analysis in the cloud (hours to days)
Sensor Squad: The Three-Layer City!
Hey Future IoT Explorer! Let’s learn about how smart cities work using a story about three layers!
Welcome to EdgeFog City!
Imagine a city with three levels, like a three-layer cake:
Layer 1: The Street Level (Edge)
These are the tiny helpers right where things happen
Traffic lights, streetlights, parking sensors
They’re like crossing guards - they make instant decisions!
When a car is coming, the crossing guard doesn’t call the mayor first!
Layer 2: The Neighborhood Level (Fog)
These are the neighborhood managers
They coordinate several blocks together
Like a school principal who manages many classrooms
They know what’s happening in their area and help things work together
Layer 3: The City Hall Level (Cloud)
This is the mayor’s office
They make big plans for the whole city
Like deciding where to build new roads or when to have festivals
They don’t need to decide RIGHT NOW - they plan ahead
A Real Example - Traffic Jam!
Edge (Street): Traffic light sees cars piling up → turns green immediately (2 seconds)
Fog (Neighborhood): “Streets 1, 2, 3 are busy!” → Coordinates all lights together (30 seconds)
Cloud (City Hall): “Monday mornings are always busy here” → Plans new traffic patterns (days later)
Fun Activity: Next time you’re in a car, watch the traffic lights. Are they making instant decisions (edge) or do they seem coordinated with other lights (fog)?
Remember: The closer the decision-maker is to the problem, the faster the solution!
For Beginners: Understanding Use Cases
Why Study Use Cases?
Instead of just learning theory, use cases show you HOW edge and fog computing solve real problems. Each use case demonstrates:
The Problem: What couldn’t be done with cloud-only?
The Solution: How does distributing compute help?
The Numbers: What improvement do we actually get?
The Three Use Cases in This Chapter:
| Use Case | Main Constraint | Key Insight |
|---|---|---|
| Smart Factory | Bandwidth (too much data) | 99.9% less data sent to cloud |
| Autonomous Vehicles | Latency (life-or-death speed) | 10ms vs 200ms response time |
| Privacy Systems | Regulations (data can't leave) | Process locally, send only results |
Simple Questions to Ask:
When evaluating any IoT system, ask:
- "What happens if the internet goes down?" → If the answer is bad, you need edge
- "How fast must decisions happen?" → If under 100 ms, you need edge
- "Does sensitive data travel?" → If yes, consider fog for privacy
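These three questions can be encoded as a quick checklist function. This is a sketch of the chapter's rules of thumb; the function name and signature are illustrative, not from the text:

```python
def suggest_tier(must_work_offline: bool,
                 max_decision_ms: float,
                 carries_sensitive_data: bool) -> str:
    """Map the three beginner questions to a rough tier suggestion.

    Thresholds follow the rules of thumb in the text: an offline
    requirement or a sub-100 ms deadline points to edge; sensitive
    data points to fog-level anonymization.
    """
    if must_work_offline or max_decision_ms < 100:
        return "edge (with fog/cloud for analytics)"
    if carries_sensitive_data:
        return "fog (anonymize locally, send aggregates)"
    return "cloud-only may be sufficient"

# A collision-avoidance system: must act in 10 ms, even if offline
print(suggest_tier(True, 10, False))  # edge (with fog/cloud for analytics)
```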
9.2 Use Case 1: Smart Factory Predictive Maintenance
9.2.1 Scenario
Manufacturing facility with hundreds of machines, each instrumented with vibration, temperature, and acoustic sensors generating data at 1kHz sampling rate.
9.2.2 Requirements
Real-time anomaly detection (<100ms)
Predictive failure alerts (hours to days advance warning)
Latency: Immediate shutdown on critical failures; real-time anomaly alerts
Bandwidth: 99.9% reduction (1 kHz data → event summaries)
Reliability: Continues operating during internet outages
Value: Reduced downtime, optimized maintenance, extended equipment life
Putting Numbers to It
Data reduction at the fog tier follows the formula \(R = \frac{D_{raw} - D_{filtered}}{D_{raw}} \times 100\%\), where \(R\) is reduction percentage, \(D_{raw}\) is raw data volume, and \(D_{filtered}\) is filtered output volume.
Applying this to the smart-factory numbers, if only anomaly alerts are sent (about 1% of the filtered data), the final reduction is \(\frac{115.2 - 0.074}{115.2} \times 100\% \approx 99.94\%\), saving roughly \$311/month in bandwidth charges.
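The reduction formula translates directly to code (volumes must share the same units):

```python
def data_reduction_pct(d_raw: float, d_filtered: float) -> float:
    """Reduction percentage: R = (D_raw - D_filtered) / D_raw * 100."""
    return (d_raw - d_filtered) / d_raw * 100

# Numbers from the worked example above
r = data_reduction_pct(115.2, 0.074)
print(f"{r:.2f}%")  # 99.94%
```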
9.3 Use Case 2: Autonomous Vehicle Edge Computing
9.3.1 Scenario
Connected autonomous vehicles requiring instant decision-making with sensing, communication, and coordination.
Safety: Life-critical response times achieved
Bandwidth: Terabytes/day of sensor data processed locally
Reliability: Critical functions independent of cloud connectivity
Scalability: Millions of vehicles supported through distributed architecture
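The three placement options can be compared mechanically. A minimal sketch using the representative latencies quoted in this chapter (on-vehicle edge ~10 ms, roadside fog 20-50 ms, cloud 200 ms+); the helper name is illustrative:

```python
# Representative per-tier processing latencies from the chapter (ms);
# fog is taken at the pessimistic end of its 20-50 ms range
TIER_LATENCY_MS = {"on-vehicle edge": 10, "roadside fog": 50, "cloud": 200}

def feasible_tiers(deadline_ms: float) -> list[str]:
    """Return the tiers whose typical latency fits within the deadline."""
    return [tier for tier, lat in TIER_LATENCY_MS.items() if lat <= deadline_ms]

print(feasible_tiers(10))   # ['on-vehicle edge'] -> collision avoidance
print(feasible_tiers(100))  # edge and fog qualify -> cooperative maneuvers
```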
9.4 Use Case 3: Privacy-Preserving Architecture
Fog computing enables privacy-preserving architectures that process sensitive data locally while still providing useful insights and services.
9.4.1 Privacy Challenges in IoT
Personal Data Exposure:
Video surveillance
Health monitoring
Location tracking
Behavioral patterns
Cloud Privacy Risks:
Data breaches
Unauthorized access
Third-party sharing
Government surveillance
9.4.2 Fog-Based Privacy Preservation
Local Processing Principle: “Process data where it’s collected; send only necessary insights”
Techniques:
Data Minimization:
Extract only required features
Discard raw sensitive data
Aggregate individual data
Example: Smart home: Count people in room (1 number) instead of sending video stream
Anonymization:
Remove personally identifiable information
Blur faces in video
Generalize location (area vs. precise GPS)
Differential Privacy:
Add noise to data before transmission
Provide statistical guarantees on privacy
Enable aggregate analytics while protecting individuals
Encryption:
End-to-end encryption for necessary transmissions
Homomorphic encryption for cloud processing of encrypted data
Secure multi-party computation
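The differential-privacy technique above is commonly implemented with the Laplace mechanism: add noise scaled to sensitivity/epsilon before transmission. A minimal sketch for a counting query, assuming sensitivity 1; the epsilon value and function names are illustrative:

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Draw Laplace(0, scale) noise via the inverse-CDF method."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def dp_count(true_count: int, epsilon: float = 1.0) -> float:
    """Laplace mechanism for a counting query (sensitivity = 1).

    A fog node can report dp_count(people_in_store) instead of the
    exact count; smaller epsilon means more noise, stronger privacy.
    """
    return true_count + laplace_noise(1.0 / epsilon)

print(dp_count(47, epsilon=0.5))  # the true count plus random noise
```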
9.4.3 Architecture Pattern
Edge Devices: Collect raw sensitive data
Fog Nodes:
Extract privacy-safe features
Anonymize or aggregate
Encrypt if transmission needed
Cloud:
Receives only privacy-preserved data
Performs authorized analytics
Returns results to fog/devices
Privacy-Preserving Data Flow in Fog Architecture
Example: Healthcare Monitoring
Wearable: Collects heart rate, location, activity
Fog (smartphone): Detects anomalies, triggers alerts
Cloud: Receives only: “Anomaly detected at approximate location X”
Privacy preserved: Raw health data never leaves personal fog node
9.5 Worked Example: Compute Offloading Decision for Agricultural Drone
Scenario: A fleet of agricultural drones surveys 1,000-acre farms for crop disease detection. Each drone carries cameras, processes images to detect diseased plants, and triggers precision pesticide spraying. The question: should image processing happen on the drone (edge), at a ground station (fog), or in the cloud?
Deferred (ground station buffer -> cloud overnight): Full-resolution image archival, detailed analysis for treatment planning, model retraining data collection
Calculate costs:
Option A (Cloud, if it were feasible): Compute: $3/hour × 2 hours/day × 20 drones × 365 days = $43,800/year. Bandwidth: 50.6 TB/day × $0.09/GB × 365 = $1,662,714/year. Total: $1,706,514/year, and real-time processing would still be impossible over the available links.
Option C (Edge + deferred cloud): Drone GPUs: $400 x 20 = $8,000 (one-time), Ground station: $15,000 (one-time), Cloud analysis: $3/hour x 4 hours/day x 365 = $4,380/year, Overnight bandwidth: ~100 GB/day x $0.09 x 365 = $3,285/year
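The per-option arithmetic can be reproduced with a small cost helper. Rates and volumes are taken from the example above; the helper name is illustrative:

```python
def yearly_cost(hourly_rate=0.0, hours_per_day=0.0, units=1,
                gb_per_day=0.0, usd_per_gb=0.09, one_time=0.0):
    """First-year cost: one-time hardware + 365 days of compute + bandwidth."""
    compute = hourly_rate * hours_per_day * units * 365
    bandwidth = gb_per_day * usd_per_gb * 365
    return one_time + compute + bandwidth

# Option C (edge + deferred cloud) from the worked example
option_c = (yearly_cost(one_time=8_000 + 15_000)           # drone GPUs + ground station
            + yearly_cost(hourly_rate=3, hours_per_day=4)  # overnight cloud analysis
            + yearly_cost(gb_per_day=100))                 # overnight uploads
print(f"${option_c:,.0f} first year")  # $30,665 first year
```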
Result: Edge processing is the only viable option for real-time crop disease detection. The hybrid architecture uses on-drone inference for immediate decisions (32ms latency) while deferring full-resolution uploads for overnight cloud analysis.
Key Insight: When bandwidth is the bottleneck (which it almost always is for high-resolution imagery), edge processing becomes mandatory regardless of cost. The compute offloading decision is often determined by physics (data size vs link capacity) rather than economics. In this case, even unlimited budget couldn’t make cloud processing work in real-time. Design for edge-first when dealing with high-bandwidth sensors (cameras, lidar, radar) and use fog/cloud for deferred analytics.
Common Pitfalls and Misconceptions
“Edge means no cloud”: A common mistake is treating edge and cloud as mutually exclusive. In practice, every production edge/fog deployment uses a hybrid architecture. Edge handles real-time decisions, but cloud remains essential for model training, long-term analytics, and fleet management. Removing the cloud tier from a smart factory design eliminates the ability to retrain ML models and improve anomaly detection over time.
“More edge compute is always better”: Adding powerful GPUs to every edge device increases cost, power consumption, and thermal challenges. An agricultural drone with a $400 Jetson module at 15W is viable; adding a $10,000 GPU server would exceed the drone’s weight and power budget. Right-size edge compute to the actual latency and throughput requirements rather than maximizing capability.
“Fog and edge are the same thing”: Fog computing specifically refers to an intermediate layer that aggregates and coordinates data from multiple edge devices. A single traffic camera doing local face detection is edge computing. A roadside unit coordinating 20 vehicles at an intersection is fog computing. Confusing these leads to architectures that miss the coordination benefits of the fog tier.
“Privacy is solved by keeping data in the EU”: Simply hosting cloud servers in the EU does not satisfy GDPR’s data minimization principle. Raw video with identifiable faces stored in an EU data center is still a privacy risk and a potential violation. True privacy-preserving architecture requires fog-level processing: blur faces, extract counts, and transmit only anonymized aggregates. Data minimization means collecting less, not just storing it closer.
“Latency calculations only need network round-trip time”: Real end-to-end latency includes sensor capture time, preprocessing, inference, decision logic, and actuation. A cloud service with 50ms network latency may have 200ms+ total when including serialization, queuing, and processing. For the autonomous vehicle use case, the 10ms budget includes 5ms sensor fusion + 3ms decision + 2ms actuation, leaving zero margin for network hops. Always calculate the full pipeline, not just the network segment.
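This last pitfall can be made concrete: end-to-end latency is the sum of every pipeline stage, not just the network hop. A small sketch using the chapter's autonomous-vehicle budget:

```python
def pipeline_latency_ms(stages: dict[str, float]) -> float:
    """End-to-end latency is the sum of every stage, not just the network hop."""
    return sum(stages.values())

# Autonomous-vehicle budget from the text: no room for a network round-trip
av_budget = {"sensor fusion": 5.0, "decision": 3.0, "actuation": 2.0}
print(pipeline_latency_ms(av_budget))  # 10.0 -> consumes the full 10 ms budget

# Even a fast 50 ms cloud hop blows the budget sixfold
print(pipeline_latency_ms({**av_budget, "network round-trip": 50.0}))  # 60.0
```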
Edge-Fog-Cloud Decision Framework for IoT Use Cases
Worked Example: Smart Factory Bandwidth Cost Analysis
Scenario: Calculate monthly bandwidth costs for sending factory sensor data to cloud, comparing cloud-only vs fog-preprocessing approaches.
Factory specifications:
200 temperature sensors: 1 reading/sec, 4 bytes each
50 pressure sensors: 10 readings/sec, 4 bytes each
10 vibration sensors: 1,000 readings/sec (1 kHz), 4 bytes each
Key insight: Fog preprocessing reduces bandwidth costs by 99.5% (2.5 MB/sec → 0.013 MB/sec). The fog gateway hardware pays for itself in under 4 months through bandwidth savings alone.
Decision Framework: When Edge/Fog Is Mandatory vs Optional
Use this framework to determine if your IoT deployment requires edge/fog or can use cloud-only:
| Factor | Cloud-Only Viable | Edge/Fog Mandatory | Decision Rule |
|---|---|---|---|
| Response time | >200ms acceptable | <100ms required | Safety-critical systems need local processing |
| Bandwidth cost | <10 GB/month per site | >100 GB/month per site | High-volume data (video, high-frequency sensors) needs local filtering |
| Connectivity | 99.9%+ uptime, <50ms latency | Unreliable, rural, or mobile | Systems must work offline (factories, vehicles, remote monitoring) |
| Data sensitivity | Public or anonymized | PII, health, financial | GDPR/HIPAA require local processing of regulated data |
Scenario B: Hospital patient monitoring
Connectivity: Hospital network (reliable, but privacy matters)
Sensitivity: HIPAA-protected health data
Decision: Fog mandatory (local processing for privacy + aggregation)
Scenario C: Autonomous delivery robot
Response time: <10ms for obstacle avoidance
Bandwidth: LIDAR + cameras = 50 Mbps continuous
Connectivity: Cellular (unreliable, high latency)
Sensitivity: Location tracking (privacy concern)
Decision: Edge mandatory (on-robot processing for all real-time decisions)
Rule of thumb: If 2+ factors indicate “Edge/Fog Mandatory,” cloud-only is insufficient. If all factors indicate “Cloud-Only Viable,” edge/fog adds unnecessary cost.
Common Mistake: Assuming “Fog” Means Single Gateway Per Site
The mistake: Deploying one fog gateway per factory/building and assuming it’s sufficient.
Why single-gateway fog fails:
Real scenario: Manufacturing plant with 1,000 sensors across 10 production lines, one fog gateway
Failure modes:
Single point of failure:
Gateway hardware fails → all 1,000 sensors go dark
Video analytics saturates gateway CPU → temperature alerts delayed
Network topology limits:
Gateway 200 meters from far production line
Wi-Fi range issues cause packet loss
Wired Ethernet routing adds 50ms latency
How to design fog correctly:
Distributed fog architecture:
Factory floor layout:
├─ Zone A (Lines 1-3): Fog Gateway A
├─ Zone B (Lines 4-6): Fog Gateway B
├─ Zone C (Lines 7-8): Fog Gateway C
└─ Zone D (Lines 9-10): Fog Gateway D
Each zone gateway:
- Handles 250 sensors locally
- Provides N+1 redundancy (can take over adjacent zone)
- Connects to central coordinator via 10 GbE
Value: Eliminates single-gateway downtime ($10,000/hour production loss × roughly 4 hours/year of expected gateway downtime = $40,000 annual risk avoided)
Rule of thumb: Deploy 1 fog gateway per 100-500 devices, with N+1 redundancy and cross-zone failover. Never rely on single gateway for production systems.
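The sizing rule of thumb can be sketched as a helper; the 250-device capacity default sits inside the 100-500 devices/gateway range, and both defaults are illustrative:

```python
import math

def gateways_needed(devices: int, per_gateway: int = 250, redundancy: int = 1) -> int:
    """Gateways for a site: ceil(devices / capacity) plus N+1 spares.

    per_gateway=250 sits inside the 100-500 devices/gateway rule of
    thumb; redundancy=1 gives N+1 failover capacity.
    """
    return math.ceil(devices / per_gateway) + redundancy

print(gateways_needed(1000))  # 5 -> four zone gateways plus one spare
```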
9.6 Summary and Key Takeaways
Edge and fog computing enable use cases that cloud-only architectures cannot support. From factory floors requiring sub-100ms anomaly detection to autonomous vehicles needing 10ms collision avoidance, local processing is often a fundamental requirement rather than an optimization.
Key takeaways:
Smart factory predictive maintenance: Three-tier architecture achieves 99.9% bandwidth reduction by filtering 1kHz sensor data at edge, running ML anomaly detection at fog, and deferring model training to cloud. Continues operating during internet outages.
Autonomous vehicles require edge processing: Safety-critical collision avoidance demands under 10ms total latency (5ms sensor fusion + 3ms decision + 2ms actuation). Cloud round-trip of 200ms+ means the collision happens before the response arrives. Edge is not optional here.
Privacy compliance demands fog-level processing: GDPR’s data minimization principle requires processing sensitive data locally. Fog nodes anonymize video (face blur, people count) so only aggregate statistics reach the cloud, never raw personally identifiable information.
Bandwidth physics often make the decision: Agricultural drones generating 360 MB/s per unit cannot transmit raw data over 25 Mbps links. When data volume exceeds available bandwidth by 100x+, edge processing is the only viable option regardless of cost considerations.
Hybrid architectures are the production reality: No real-world deployment is purely edge, fog, or cloud. The winning pattern combines real-time edge decisions (under 50ms), regional fog coordination (under 500ms), and deferred cloud analytics (hours to days).
9.7 Knowledge Check
9.8 Concept Relationships
| Concept | Relates To | Relationship Type | Why It Matters |
|---|---|---|---|
| Predictive Maintenance (Use Case 1) | Edge Anomaly Detection | Application domain | Demonstrates 99.9% data reduction (1kHz sensors → event summaries) and sub-100ms anomaly alerts at fog tier |
| Autonomous Vehicles (Use Case 2) | Safety-Critical Latency | Life-or-death constraint | Shows why <10ms edge processing is mandatory: cloud's 200ms+ round-trip means the collision happens before the response |
| Privacy-Preserving Architecture (Use Case 3) | GDPR/HIPAA Compliance | Regulatory driver | Fog-based face blur and anonymization satisfy data minimization requirements that cloud-only designs violate |
| Agricultural Drone Offloading (Worked Example) | Bandwidth Physics | Cost-benefit analysis | Quantifies how 4 GB/s raw imagery makes cloud impossible and edge mandatory despite $8K hardware cost |
| Fog Gateway Redundancy | System Availability | Design pattern | A single fog node is a single point of failure; N+1 redundancy prevents ~$40K/year downtime risk |
| Hybrid Edge-Cloud Pattern | Production Reality | Architectural norm | All three use cases combine edge (real-time), fog (coordination), and cloud (analytics); a pure single-tier design is an anti-pattern |
9.9 See Also
Explore related chapters to deepen your understanding of edge/fog use case patterns:
Edge-Fog Architecture - Three-tier architecture fundamentals and fog node capabilities
Privacy-Preserving IoT - Techniques for GDPR/HIPAA compliance in IoT systems
Edge AI Applications - Machine learning inference patterns at edge and fog tiers
9.10 How It Works: Smart Factory Predictive Maintenance Pipeline
The Challenge: A CNC machine generates vibration sensor data at 10 kHz (10,000 readings/second). Raw data rate: 10,000 Hz × 4 bytes = 40 KB/s = 3.5 GB/day per machine. With 100 machines, that is 350 GB/day or about 10.5 TB/month, costing approximately $10,080/year in cloud bandwidth alone. How does fog computing make this viable?
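The arithmetic in this challenge can be checked directly (values as stated above; note the text rounds 3.456 GB/day up to 3.5 GB/day, which is how it arrives at ~$10,080/year):

```python
SAMPLE_HZ = 10_000        # vibration sampling rate per machine
BYTES_PER_SAMPLE = 4
MACHINES = 100
USD_PER_GB = 0.08

per_machine_bps = SAMPLE_HZ * BYTES_PER_SAMPLE        # 40 KB/s raw
per_machine_gb_day = per_machine_bps * 86_400 / 1e9   # ~3.5 GB/day (3.456 exactly)
fleet_gb_month = per_machine_gb_day * MACHINES * 30   # ~10,500 GB/month
annual_cost = fleet_gb_month * USD_PER_GB * 12        # ~$10,080/year after rounding
print(f"{per_machine_gb_day:.3f} GB/day/machine, ${annual_cost:,.0f}/year")
```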
Step-by-Step Architecture:
Step 1: Edge Tier (Machine PLC with Embedded Vibration Sensor)
Each CNC machine has a Programmable Logic Controller (PLC) with direct sensor connection:
```python
# Edge processing: Fast Fourier Transform (FFT) on vibration data
# (read_sensor_burst, get_amplitude_at, calculate_rms, trigger_alarm,
# etc. are assumed helpers provided by the PLC runtime)
from time import sleep
import numpy as np

def edge_vibration_processing():
    # Collect 1-second window (10,000 samples)
    vibration_samples = read_sensor_burst(samples=10000, rate=10000)

    # Apply FFT to detect frequency anomalies
    fft_result = np.fft.fft(vibration_samples)
    frequencies = np.fft.fftfreq(10000, 1 / 10000)

    # Extract key frequency bands
    bearing_freq = get_amplitude_at(fft_result, frequencies, 1200)  # Bearing frequency
    gear_freq = get_amplitude_at(fft_result, frequencies, 3600)     # Gear mesh frequency

    # Local threshold detection (immediate action)
    if bearing_freq > BEARING_THRESHOLD:
        trigger_alarm()  # <10ms local response
        emergency_stop_if_critical()

    # Send only feature vectors to fog (not raw 10,000 samples)
    features = {
        'timestamp': now(),
        'bearing_amplitude': bearing_freq,
        'gear_amplitude': gear_freq,
        'overall_rms': calculate_rms(vibration_samples),
    }
    send_to_fog(features)  # ~100 bytes vs 40 KB raw

# Execute every 1 second
while True:
    edge_vibration_processing()
    sleep(1)
```
Data reduction at edge: 40 KB raw → 100 bytes features = 99.75% reduction
Step 2: Fog Tier (Factory Floor Gateway - Industrial PC)
Fog gateway aggregates features from 100 machines and runs ML model:
```python
# Fog processing: Machine learning inference for predictive maintenance
# (buffer_last_60_seconds, create_maintenance_work_order, send_to_cloud,
# aggregate_all_machines, last_hour_data are assumed gateway helpers)
import time
import joblib

# Load pre-trained anomaly detection model (trained in cloud, deployed to fog)
anomaly_model = joblib.load('bearing_failure_predictor.pkl')

def fog_aggregation_and_prediction(machine_features):
    # Aggregate last 60 seconds of features from this machine
    feature_window = buffer_last_60_seconds(machine_features)

    # Run ML inference (local, no cloud latency)
    failure_probability = anomaly_model.predict_proba(feature_window)[0][1]

    # Decision logic
    if failure_probability > 0.75:  # High risk of failure within 24 hours
        create_maintenance_work_order(machine_features['machine_id'])
        send_to_cloud({
            'alert_type': 'predictive_maintenance',
            'machine_id': machine_features['machine_id'],
            'failure_probability': failure_probability,
            'recommended_action': 'replace_bearing_before_shift_end',
        })

    # Aggregate hourly summaries for all 100 machines
    if time.localtime().tm_min == 0:  # Every hour
        hourly_summary = aggregate_all_machines(last_hour_data)
        send_to_cloud(hourly_summary)  # ~10 KB for 100 machines
```
Data reduction at fog: 100 machines × 100 bytes/s (≈36 MB/hour) → 10 KB hourly summary, a further ~99.97% reduction
Step 3: Cloud Tier (AWS IoT + ML Training Pipeline)
Cloud receives only alerts and hourly summaries, performs long-term analytics:
```python
# Cloud processing: Model retraining and fleet analytics
def cloud_analytics_pipeline():
    # Aggregate data from 10 factories (1,000 machines total)
    fleet_data = query_last_month_summaries()

    # Identify patterns across the fleet
    failure_correlation = correlate_failures_with_operating_conditions(fleet_data)
    # Result: "Machines running at >80% capacity fail 3x more often"

    # Retrain anomaly detection model weekly
    new_training_data = collect_labeled_failure_events(last_3_months)
    improved_model = train_random_forest_classifier(new_training_data)

    # Deploy updated model to all fog gateways
    for factory in factories:
        deploy_model_to_fog(factory.gateway, improved_model)
```
Total Data Flow:
| Tier | Data Volume | Reduction | Processing |
|---|---|---|---|
| Edge (100 machines) | 3.5 GB/day/machine × 100 = 350 GB/day | – | FFT feature extraction, threshold alerts |
| Fog aggregation | 100 machines × 100 bytes/s = 10 KB/s = 864 MB/day | 99.75% | ML inference, work order generation |
| Cloud receives | Hourly summaries: 24 hours × 10 KB = 240 KB/day | 99.9999% | Model retraining, fleet analytics |
Cost Comparison:
| Architecture | Data Transmitted | Monthly Bandwidth Cost | Annual Cost |
|---|---|---|---|
| Cloud-only | 350 GB/day × 30 = 10,500 GB/month | $840/month @ $0.08/GB | $10,080 |
| Edge + Fog | 240 KB/day × 30 = 7.2 MB/month | ~$0.001/month | ~$0.01 |
| Savings | 99.9999% reduction | ~$840/month | ~$10,080/year |
Plus hardware costs:
Edge PLCs: Already installed (sunk cost)
Fog gateway: $15,000 one-time
Payback period: $15,000 / $840/month = 18 months
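The payback arithmetic generalizes to any fog-gateway proposal; a one-line sketch (the function name is illustrative):

```python
def payback_months(one_time_cost: float, monthly_savings: float) -> float:
    """Months until cumulative savings cover the up-front hardware cost."""
    return one_time_cost / monthly_savings

# Fog gateway from the cost comparison above
print(round(payback_months(15_000, 840)))  # 18 months, as in the text
```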
Key Insight: Edge FFT reduces raw 10kHz stream to 1Hz feature stream (99.75%), fog ML inference prevents failures weeks in advance (sub-100ms response), cloud retrains models to improve accuracy over time. No single tier could achieve this alone.
9.11 Try It Yourself
Exercise 1: Calculate Bandwidth Requirements for Your Use Case
Scenario: Design an edge/fog architecture for a logistics fleet.