343 Fog/Edge Computing: Requirements and When to Use
343.1 Learning Objectives
By the end of this section, you will be able to:
Identify Requirements: Determine IoT system requirements that benefit from fog computing
Make Deployment Decisions: Evaluate when to use edge, fog, or cloud processing
Analyze Tradeoffs: Compare containers vs VMs, edge vs fog placement, and replication strategies
Apply Decision Frameworks: Use structured approaches to fog architecture selection
⏱️ ~12 min | ⭐⭐ Intermediate | 📋 P05.C07.U03
The motivations for fog computing stem from fundamental limitations of purely cloud-based IoT architectures and the unique requirements of modern distributed applications.
Note: Academic Resource: Why Not Cloud? The Case for Edge Computing
A data growth chart shows mobile traffic increasing from 2013 to 2018 at a 22% CAGR, followed by an analysis of why cloud computing alone is insufficient for IoT: it is expensive to build, maintain, and run; it is far from the edge, causing latency issues; and it requires a constant connection. The slide then presents "The New Edge" as the solution: lots of storage, increasing computational power, cheap, and always at hand.
Source: Princeton University, Coursera Fog Networks for IoT (Prof. Mung Chiang)
343.1.1 Latency Reduction
The Latency Problem: Round-trip communication to distant cloud data centers introduces 50-200+ ms of latency, which is unacceptable for time-critical applications.
Fog Solution: Processing at edge nodes reduces latency to single-digit milliseconds by eliminating long-distance network traversal.
Figure 343.1: Total IoT response time breakdown showing device sensing, network transmission, cloud processing, and response delivery. Fog computing reduces latency by processing data at the edge, eliminating long-distance cloud round trips for time-critical applications
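To make the latency budget concrete, the back-of-the-envelope sketch below compares a cloud round trip with local fog processing. All stage timings are illustrative assumptions, not measurements.

```python
# Illustrative latency budget: cloud round trip vs. local fog processing.
# All stage timings are assumed, order-of-magnitude numbers for demonstration only.

def total_response_time_ms(sensing, uplink, processing, downlink):
    """Sum the stages of one sense -> process -> actuate cycle, in milliseconds."""
    return sensing + uplink + processing + downlink

cloud_path = total_response_time_ms(sensing=2, uplink=80, processing=10, downlink=80)
fog_path = total_response_time_ms(sensing=2, uplink=2, processing=5, downlink=2)

print(f"Cloud path: {cloud_path} ms")  # ~172 ms: too slow for collision avoidance
print(f"Fog path:   {fog_path} ms")    # ~11 ms: within a tight real-time control budget
```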
343.1.2 Bandwidth Conservation
The Bandwidth Challenge: Billions of IoT devices generating continuous data streams create enormous bandwidth requirements.
Statistics:
- A single connected vehicle generates 4TB of data per day
- A smart factory with thousands of sensors produces petabytes monthly
- Video surveillance cameras generate terabytes per camera per week
Fog Solution: Local processing, filtering, and aggregation reduce data transmitted to cloud by 90-99%, sending only meaningful insights or anomalies.
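A minimal sketch of this pattern, using made-up vibration readings: the fog node summarizes a raw sampling window and forwards only the summary plus any anomalous values, so the cloud receives a small fraction of the raw stream.

```python
import statistics

# Hypothetical raw sensor window: 1 second of vibration samples at 100 Hz.
raw_window = [0.02 + 0.001 * i for i in range(100)]
raw_window[42] = 0.9  # injected spike so there is something worth forwarding

ANOMALY_THRESHOLD = 0.5  # assumed threshold for this sketch

# The fog node forwards one summary record plus any anomalous samples.
upstream_record = {
    "mean": round(statistics.mean(raw_window), 4),
    "max": max(raw_window),
    "anomalies": [x for x in raw_window if x > ANOMALY_THRESHOLD],
}

values_sent = 2 + len(upstream_record["anomalies"])  # mean, max, plus anomalies
print(upstream_record)
print(f"Upstream volume reduced by roughly {1 - values_sent / len(raw_window):.0%}")
```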
343.1.3 Network Reliability
Cloud Dependency Risk: Purely cloud-based systems fail when internet connectivity is lost or degraded.
Fog Solution: Local fog nodes continue operating independently during network outages, maintaining critical functions and storing data for later synchronization.
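One common realization is store-and-forward: the fog node keeps running its local logic and queues readings while the WAN is down, then drains the queue when connectivity returns. The sketch below assumes a hypothetical `cloud_available` flag and a stub `send_to_cloud` upload function.

```python
from collections import deque

pending = deque()        # readings awaiting cloud synchronization
cloud_available = False  # placeholder for a real connectivity check

def send_to_cloud(reading):
    """Stub standing in for a real upload (e.g., an HTTPS POST or MQTT publish)."""
    print("uploaded:", reading)

def handle_reading(reading):
    # Critical local logic always runs, regardless of WAN state.
    if reading["temp_c"] > 80:
        print("ALERT: local shutdown triggered for", reading["sensor"])
    # Cloud sync is best effort: queue while offline, drain once back online.
    pending.append(reading)
    if cloud_available:
        while pending:
            send_to_cloud(pending.popleft())

handle_reading({"sensor": "boiler-1", "temp_c": 85})  # alert fires even while offline
```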
343.1.4 Privacy and Security
Data Sensitivity Concerns: Transmitting raw sensitive data (video, health information, industrial processes) to cloud raises privacy and security risks.
Fog Solution: Processing sensitive data locally enables anonymization, aggregation, or filtering before cloud transmission, minimizing exposure.
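A sketch of this pattern with invented field names: identifying records stay on the fog node, and only a de-identified aggregate is prepared for cloud transmission.

```python
# Hypothetical raw occupancy events containing personal identifiers.
raw_events = [
    {"badge_id": "A123", "zone": "floor-2", "present": True},
    {"badge_id": "B456", "zone": "floor-2", "present": True},
    {"badge_id": "C789", "zone": "floor-2", "present": False},
]

def anonymize_for_cloud(events):
    """Strip identifiers and keep only the aggregate the cloud actually needs."""
    occupied = sum(1 for e in events if e["present"])
    return {"zone": events[0]["zone"], "occupancy_rate": round(occupied / len(events), 2)}

print(anonymize_for_cloud(raw_events))  # {'zone': 'floor-2', 'occupancy_rate': 0.67}
```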
```{ojs}
viewof kc_fog_7 = {
  const container = html`<div class="inline-knowledge-check"></div>`;
  if (container && typeof InlineKnowledgeCheck !== 'undefined') {
    container.innerHTML = '';
    container.appendChild(InlineKnowledgeCheck.create({
      question: "A European smart building company collects occupancy data from 1,000 office spaces across 10 countries. Under GDPR, personal movement data must not leave its country of origin. How does fog computing help solve this compliance challenge?",
      options: [
        {
          text: "Fog computing cannot help - GDPR requires all data to be processed in a central EU location",
          correct: false,
          feedback: "Incorrect. GDPR actually favors local processing. Data minimization principles encourage processing data locally and only sending anonymized/aggregated results. Fog computing aligns perfectly with GDPR's data locality requirements."
        },
        {
          text: "Deploy fog nodes in each country to process and anonymize data locally, sending only aggregated statistics to the central cloud",
          correct: true,
          feedback: "Correct! Fog computing enables GDPR compliance by processing personal data within national boundaries. Individual occupancy patterns are analyzed locally, and only anonymized aggregates (e.g., 'Building A: 73% average occupancy') cross borders. Raw personal data never leaves the country of origin."
        },
        {
          text: "Encrypt all data with strong cryptography before sending to cloud - encryption satisfies GDPR requirements",
          correct: false,
          feedback: "Incorrect. GDPR's data localization requirements aren't satisfied by encryption alone. Personal data that leaves its country of origin (even encrypted) still represents a cross-border transfer requiring legal basis. Fog-based local processing avoids this issue entirely."
        },
        {
          text: "Store all data on edge devices without any central collection - this eliminates compliance concerns",
          correct: false,
          feedback: "Incorrect. While keeping data on edge devices avoids cross-border transfers, it also prevents valuable cross-building analytics that provide business value. Fog computing provides the optimal balance: local processing for compliance, aggregated insights for analytics."
        }
      ],
      difficulty: "hard",
      topic: "fog-security-compliance"
    }));
  }
  return container;
}
```
343.1.5 Cost Optimization
Cloud Cost Factors:
- Data transmission costs (especially cellular)
- Cloud storage and processing fees
- Bandwidth charges
Fog Solution: Reducing data transmitted to cloud and leveraging local resources lowers operational costs significantly.
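A rough, purely illustrative calculation (the per-GB rates are assumptions, not quoted prices) shows how shrinking the upstream volume drives the savings:

```python
# Assumed unit costs; real tariffs vary widely by provider and region.
CELLULAR_COST_PER_GB = 0.50      # USD, illustrative
CLOUD_INGEST_COST_PER_GB = 0.10  # USD, illustrative

raw_gb_per_day = 40.0   # e.g., one site full of cameras and vibration sensors
fog_reduction = 0.95    # fraction of raw data filtered or aggregated away locally

def monthly_cost(gb_per_day):
    return 30 * gb_per_day * (CELLULAR_COST_PER_GB + CLOUD_INGEST_COST_PER_GB)

print(f"Cloud-only: ${monthly_cost(raw_gb_per_day):,.2f}/month")
print(f"With fog:   ${monthly_cost(raw_gb_per_day * (1 - fog_reduction)):,.2f}/month")
```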
343.1.6 Compliance and Data Sovereignty
Regulatory Requirements: Laws like GDPR, HIPAA, and data localization requirements constrain where data can be stored and processed.
Fog Solution: Processing data locally within jurisdictional boundaries enables compliance while still leveraging cloud for permissible operations.
343.2 Requirements of IoT Supporting Fog Computing
Effective fog computing implementations must address specific IoT requirements that traditional architectures struggle to satisfy.
343.2.1 Real-Time Processing
Requirement: Immediate response to events without cloud round-trip delays.
Applications:
- Industrial automation and control
- Autonomous vehicles and drones
- Smart grid management
- Healthcare monitoring and emergency response
Fog Capability: Local computation enables sub-10ms response times.
Figure 343.2: Data time sensitivity classification showing critical real-time applications requiring millisecond responses, interactive applications needing sub-second latency, and analytical applications tolerating minutes to hours of processing delay
343.2.2 Massive Scale
Requirement: Supporting billions of devices generating exabytes of data.
Fog Capability: Distributed processing across fog nodes scales horizontally, with each node handling local device populations.
343.2.3 Mobility Support
Requirement: Seamless service for mobile devices and vehicles.
Challenges:
- Maintaining connectivity during movement
- Handoff between access points
- Location-aware services
Fog Capability: Distributed fog nodes provide consistent local services as devices move, with nearby nodes handling processing.
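A minimal sketch of location-aware handoff, with invented node names and coordinates: as a device moves, it re-attaches to whichever fog node is currently closest.

```python
import math

# Hypothetical fog nodes with fixed (x, y) positions, in kilometres.
FOG_NODES = {
    "node-north": (0.0, 5.0),
    "node-south": (0.0, -5.0),
    "node-east": (5.0, 0.0),
}

def nearest_node(device_pos):
    """Return the fog node closest to the device's current position."""
    return min(FOG_NODES, key=lambda name: math.dist(device_pos, FOG_NODES[name]))

# A vehicle driving south-east hands off from node-north to node-east.
for position in [(0.0, 4.0), (3.0, 1.0), (4.0, 0.5)]:
    print(position, "->", nearest_node(position))
```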
343.2.4 Heterogeneity
Requirement: Supporting diverse devices, protocols, and data formats.
Challenges:
- Multiple communication protocols
- Various data formats and semantics
- Different capabilities and constraints
Fog Capability: Fog nodes act as protocol gateways and data translators, providing unified interfaces to cloud.
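A sketch of this translation role, using two invented payload formats: the fog gateway maps each device-specific message into one normalized record before anything goes upstream.

```python
import json

def normalize(raw: bytes, protocol: str) -> dict:
    """Translate device-specific payloads into one unified temperature record."""
    if protocol == "binary-legacy":  # hypothetical format: 2-byte value, tenths of a degree C
        return {"metric": "temperature_c", "value": int.from_bytes(raw, "big") / 10.0}
    if protocol == "json-http":      # hypothetical format: JSON with a Fahrenheit reading
        deg_f = json.loads(raw)["temp_f"]
        return {"metric": "temperature_c", "value": round((deg_f - 32) * 5 / 9, 1)}
    raise ValueError(f"unsupported protocol: {protocol}")

print(normalize((225).to_bytes(2, "big"), "binary-legacy"))  # 22.5 C
print(normalize(b'{"temp_f": 72.5}', "json-http"))           # ~22.5 C
```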
343.2.5 Energy Efficiency
Requirement: Minimizing energy consumption of battery-powered IoT devices.
Challenges:
- Radio communication energy costs
- Limited battery capacity
- Recharging/replacement difficulties
Fog Capability: Short-range communication to nearby fog nodes consumes far less energy than long-range cloud transmission.
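A rough energy model (all per-message figures are assumed for illustration only) suggests why short-range transmission to a nearby fog node can extend battery life by orders of magnitude:

```python
# Assumed per-message radio energy, including wake-up and connection overhead.
# Purely illustrative numbers, not measurements of any specific radio or network.
ENERGY_PER_MSG_J = {
    "short_range_to_fog_gateway": 0.005,  # e.g., a low-power local radio
    "cellular_to_cloud": 2.0,             # e.g., a cellular connection per report
}

MSGS_PER_DAY = 6 * 60 * 24    # one reading every 10 seconds
BATTERY_J = 2.0 * 3.6 * 3600  # ~2 Ah cell at 3.6 V is roughly 25,920 J

for radio, joules_per_msg in ENERGY_PER_MSG_J.items():
    days = BATTERY_J / (MSGS_PER_DAY * joules_per_msg)
    print(f"{radio}: ~{days:,.1f} days of battery for radio traffic alone")
```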
343.3 When Should We Use Edge/Fog Computing
Not all IoT applications benefit equally from fog computing. Understanding appropriate use cases ensures effective architectural decisions.
343.3.1 Ideal Fog Computing Scenarios
Latency-Sensitive Applications:
- Autonomous vehicles requiring instant collision avoidance
- Industrial robots with real-time coordination
- Augmented/virtual reality experiences
- Tactile internet and remote surgery
- High-frequency trading and financial systems

Bandwidth-Constrained Environments:
- Remote locations with limited connectivity
- Cellular IoT with data plan limitations
- Video surveillance with hundreds of cameras
- Industrial sites generating massive sensor data

Privacy-Critical Systems:
- Healthcare patient monitoring
- Security and surveillance systems
- Personal home automation
- Enterprise confidential data processing
Intermittent Connectivity:
- Mobile applications with unreliable networks
- Maritime and aviation systems
- Remote industrial facilities
- Disaster response and emergency services

Figure 343.3: Smart home fog computing architecture showing local fog gateway processing data from sensors (temperature, motion, door locks) and controlling actuators (lights, thermostat, security) with low-latency local decisions while synchronizing with cloud for analytics and remote access
Geographically Distributed Deployments:
- Smart city infrastructure across metropolitan areas
- Agricultural monitoring over large farms
- Pipeline and utility monitoring
- Retail chains with distributed locations
```{ojs}
viewof kc_fog_8 = {
  const container = html`<div class="inline-knowledge-check"></div>`;
  if (container && typeof InlineKnowledgeCheck !== 'undefined') {
    container.innerHTML = '';
    container.appendChild(InlineKnowledgeCheck.create({
      question: "An agricultural company monitors soil moisture across 10,000 acres of farmland in a rural area with unreliable cellular coverage. Sensors report every 15 minutes, and irrigation decisions must be made within 1 hour of readings. Which is the most appropriate architecture?",
      options: [
        {
          text: "Cloud-only - 1 hour latency tolerance means cloud processing is perfectly acceptable",
          correct: false,
          feedback: "Incorrect. While 1 hour latency might seem acceptable for cloud, the key constraint is unreliable cellular coverage. During connectivity outages (common in rural areas), the cloud-only system would fail to make irrigation decisions, potentially damaging crops."
        },
        {
          text: "Edge-only - each sensor should make its own irrigation decisions independently",
          correct: false,
          feedback: "Incorrect. Individual sensors making independent decisions would lead to suboptimal irrigation - neighboring sensors might make conflicting choices. Coordination across zones is valuable for efficient water usage. Fog enables this coordination."
        },
        {
          text: "Fog with local gateway - aggregate sensor data locally, make irrigation decisions at fog node, sync to cloud when connectivity available",
          correct: true,
          feedback: "Correct! A fog gateway in the farmland aggregates readings from all sensors, makes coordinated irrigation decisions within the 1-hour window, and operates autonomously during connectivity outages. When cellular is available, it syncs historical data to cloud for long-term analytics. This is the ideal architecture for intermittent connectivity."
        },
        {
          text: "Satellite connectivity to cloud - bypass unreliable cellular with always-available satellite",
          correct: false,
          feedback: "Incorrect. While satellite provides connectivity, it's expensive ($0.50-2.00/MB) and adds 500-700ms latency. For 10,000 sensors reporting every 15 minutes, satellite costs would be prohibitive. Fog computing with local processing is far more cost-effective."
        }
      ],
      difficulty: "medium",
      topic: "fog-use-cases"
    }));
  }
  return container;
}
```
343.3.2 When Cloud-Only May Suffice
Non-Time-Critical Applications:
- Historical data analytics
- Long-term trend analysis
- Batch processing tasks
- Monthly reporting and summaries

Small-Scale Deployments:
- Home automation with few devices
- Personal fitness tracking
- Small business monitoring

High-Complexity Analytics:
- Advanced machine learning model training
- Big data correlation across global datasets
- Complex simulations requiring massive compute
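The use-case lists above can be condensed into a rough rule of thumb. The helper below is a sketch of such a decision aid; its thresholds are illustrative defaults, not a standard.

```python
def suggest_tier(latency_budget_ms, must_work_offline, raw_mb_per_hour, data_must_stay_local):
    """Very rough placement heuristic; thresholds are illustrative, not normative."""
    if latency_budget_ms < 20 or must_work_offline or data_must_stay_local:
        return "fog/edge processing, with cloud backhaul for analytics"
    if raw_mb_per_hour > 1_000:
        return "fog pre-aggregation to save bandwidth, cloud for analytics"
    return "cloud-only is likely sufficient"

# Farmland example from the knowledge check: loose latency budget, but must survive outages.
print(suggest_tier(latency_budget_ms=3_600_000, must_work_offline=True,
                   raw_mb_per_hour=5, data_must_stay_local=False))
```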
Note: Quiz 2: When Should We Use Edge/Fog Computing
Question 5: A healthcare IoT system monitors 100 patients with wearable sensors sending heart rate, blood pressure, and temperature every 10 seconds. Each reading is 50 bytes. The system must alert doctors within 5 seconds if vital signs are critical. Should this use cloud-only or fog computing architecture?
💡 Explanation: Data volume: 100 patients × 50 bytes × 6 readings/min = 30 KB/min = 500 bytes/sec—trivial bandwidth. However, this isn’t about bandwidth! Hospital Wi-Fi outages, cellular dead zones, and network congestion are common. Cloud-only architecture fails during outages, potentially missing critical alerts. Fog computing solution: Local fog node (hospital gateway) continuously monitors vitals and generates alerts within milliseconds, even if cloud connection fails. Alerts trigger local nurse station displays and mobile devices via local network. Cloud receives data for long-term analysis when available but isn’t in the critical alert path. This demonstrates fog computing’s reliability benefit—not just latency/bandwidth, but ensuring critical functions during network failures. Healthcare, industrial safety, and autonomous vehicles all require this local autonomy.
343.4 What’s Next
Now that you understand fog computing requirements:
Design Tradeoffs: Dive deeper into architecture decisions and pitfalls