322 Edge and Fog Computing: Architecture
322.1 Learning Objectives
By the end of this chapter, you will be able to:
- Design three-tier architectures: Structure edge, fog, and cloud layers appropriately
- Identify fog node capabilities: Describe computation, storage, networking, and security functions
- Understand data flow: Trace data paths through hierarchical processing
- Implement processing pipelines: Design collection, aggregation, and forwarding stages
- Apply architectural patterns: Select appropriate configurations for different deployments
322.2 Architecture of Fog Computing
Fog computing architectures organize computing resources across multiple tiers, each optimized for specific functions and constraints.
322.2.1 Three-Tier Fog Architecture
Tier 1: Edge Devices (Things Layer)
- IoT sensors and actuators
- Smart devices and appliances
- Wearables and mobile devices
- Embedded systems
Characteristics:
- Severely resource-constrained
- Typically battery-powered
- Focused on sensing/actuation
- Minimal local processing
Tier 2: Fog Nodes (Fog Layer)
- Gateways and routers
- Base stations and access points
- Micro data centers
- Cloudlets and edge servers

Characteristics:
- Moderate computational resources
- Networking and storage capabilities
- Proximity to edge devices
- Protocol translation and aggregation
Tier 3: Cloud Data Centers (Cloud Layer)
- Large-scale data centers
- Virtually unlimited resources
- Global reach and availability
- Advanced analytics and storage

Characteristics:
- Massive computational power
- Scalable storage
- Rich software ecosystems
- Higher latency from the edge
Three-Tier Architecture Summary:
| Tier | Components | Functions | Data Flow |
|---|---|---|---|
| Cloud | Data centers | Unlimited compute/storage, global analytics, ML training | Receives aggregated insights, sends commands/ML models |
| Fog | Gateways, edge servers, base stations | Data aggregation, filtering, protocol translation | Receives raw data (90-99% reduction), sends to cloud |
| Edge | Sensors, actuators, smart devices | Data collection, simple filtering, threshold detection | Sends raw/filtered data via Bluetooth/Zigbee |
Data Flow: Edge -> (raw data) -> Fog -> (90-99% reduction) -> Cloud

Control Flow: Cloud -> (ML models, commands) -> Fog -> (<10ms response) -> Edge
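The three-tier data flow can be sketched in a few lines of Python. This is a minimal illustration, not a real deployment: the sensor model, window size, and summary fields are all assumptions chosen to show how raw edge readings shrink to a handful of fog summaries before reaching the cloud.

```python
import random

def edge_collect(n_samples):
    """Edge tier: raw readings from a hypothetical temperature sensor."""
    return [20.0 + random.uniform(-5, 5) for _ in range(n_samples)]

def fog_aggregate(raw, window=100):
    """Fog tier: reduce each window of raw readings to a min/mean/max summary."""
    summaries = []
    for i in range(0, len(raw), window):
        chunk = raw[i:i + window]
        summaries.append({
            "min": min(chunk),
            "mean": sum(chunk) / len(chunk),
            "max": max(chunk),
        })
    return summaries

def cloud_receive(summaries):
    """Cloud tier: store only the aggregated insights."""
    return {"windows": len(summaries)}

raw = edge_collect(1000)
summaries = fog_aggregate(raw)
report = cloud_receive(summaries)
# 1000 raw readings become 10 summary records sent upstream: a 99% reduction
print(len(raw), "->", len(summaries))
```

Tuning the window size trades fidelity against bandwidth, which is exactly the knob the 90-99% reduction figure in the table refers to.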
322.2.2 Fog Node Capabilities
Computation:
- Data preprocessing and filtering
- Local analytics and decision-making
- Machine learning inference
- Event detection and correlation

Storage:
- Temporary data buffering
- Caching frequently accessed data
- Local databases for recent history
- Offline operation support

Networking:
- Protocol translation (e.g., Zigbee to IP)
- Data aggregation from multiple sensors
- Load balancing and traffic management
- Quality of Service (QoS) enforcement

Security:
- Local authentication and authorization
- Data encryption/decryption
- Intrusion detection
- Privacy-preserving processing
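The four capability areas can be combined into one minimal sketch of a fog node. Everything here is illustrative: the key-based authentication, the tuple "Zigbee frame," and the threshold filter are stand-ins for real protocols and policies, chosen only to show computation, storage, networking, and security living in one component.

```python
from collections import deque

class FogNode:
    """Minimal sketch of fog-node capabilities; all names are illustrative."""

    def __init__(self, buffer_size=1000, api_keys=None):
        self.buffer = deque(maxlen=buffer_size)   # storage: bounded local history
        self.api_keys = set(api_keys or [])       # security: known device keys

    def authenticate(self, key):
        # security: local authorization, no cloud round trip needed
        return key in self.api_keys

    def translate(self, zigbee_frame):
        # networking: hypothetical Zigbee-payload-to-dict protocol translation
        device_id, value = zigbee_frame
        return {"device": device_id, "value": value}

    def process(self, key, zigbee_frame, threshold=50):
        # computation: filter readings, forwarding only those above threshold
        if not self.authenticate(key):
            return None
        msg = self.translate(zigbee_frame)
        self.buffer.append(msg)                   # storage: buffer for offline use
        return msg if msg["value"] > threshold else None

node = FogNode(api_keys=["sensor-1-key"])
alert = node.process("sensor-1-key", ("sensor-1", 72))
print(alert)  # forwarded because the value exceeds the threshold
```

The bounded `deque` is one simple way to get offline buffering without unbounded memory growth on a resource-limited gateway.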
322.3 Applications of Fog Computing
The deployment patterns below show where fog-tier processing is essential and how these use cases connect to other parts of the book.
- Real-time rail monitoring: Fog nodes along tracks analyze vibration and axle temperature locally to flag anomalies within milliseconds
- Pipeline optimization: Gateways near pumps and valves aggregate high-frequency pressure/flow signals, run anomaly detection, and stream compressed alerts upstream
- Wind farm operations: Turbine controllers optimize blade pitch at the edge; fog aggregators coordinate farm-level balancing
- Smart home orchestration: Gateways fuse motion, environmental, and camera signals to automate lighting/HVAC without WAN dependency; cloud receives summaries and model updates
322.3.1 Hierarchical Processing
Data Flow:
1. Edge devices collect raw data
2. Fog nodes filter, aggregate, and process locally
3. Refined data/insights forwarded to cloud
4. Cloud performs global analytics and long-term storage
5. Results and commands flow back down the hierarchy
Processing Distribution:
- Time-Critical: Processed at fog layer
- Local Scope: Handled by fog nodes
- Global Analytics: Sent to cloud
- Long-Term Storage: Cloud repositories
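This processing split amounts to a routing decision per task. The sketch below encodes it as a plain function; the task fields (`deadline_ms`, `scope`, `kind`) and the 100 ms cutoff are invented for illustration, not a standard schema.

```python
def route_task(task):
    """Decide which tier handles a task, following the distribution above.
    Field names and the deadline cutoff are illustrative assumptions."""
    if task.get("deadline_ms", float("inf")) < 100:
        return "fog"    # time-critical: process near the data source
    if task.get("scope") == "local":
        return "fog"    # local scope: the fog node already has the context
    if task.get("kind") == "long_term_storage":
        return "cloud"  # archival belongs in cloud repositories
    return "cloud"      # global analytics by default

print(route_task({"deadline_ms": 10}))              # time-critical -> fog
print(route_task({"scope": "local"}))               # local scope  -> fog
print(route_task({"kind": "long_term_storage"}))    # archival     -> cloud
```

A real orchestrator would also weigh current node load and network conditions, but the tier-selection logic has this same shape.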
322.4 Working of Fog Computing
Understanding the operational flow of fog computing systems illustrates how distributed components collaborate to deliver responsive, efficient IoT services.
322.4.1 Data Collection Phase
- Sensing:
- Edge devices continuously or periodically sense environment
- Data includes temperature, motion, images, location, etc.
- Sampling rates vary by application requirements
- Local Processing (Device Level):
- Basic filtering and validation
- Analog-to-digital conversion
- Initial compression or feature extraction
- Energy-efficient operation
- Communication:
- Transmission to nearby fog nodes
- Short-range protocols (Bluetooth, Zigbee, Wi-Fi)
- Energy-efficient due to proximity
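The three collection steps can be sketched as a tiny pipeline. The sensor model, the valid range, and the send-on-change delta are all assumptions; the point is the shape: sense, validate locally, then transmit only meaningful changes to save radio energy.

```python
import random

def sense(n, period_s=1.0):
    """Periodic sensing: (timestamp, value) pairs from a hypothetical sensor."""
    return [(i * period_s, 20 + random.gauss(0, 2)) for i in range(n)]

def device_filter(samples, lo=-40.0, hi=85.0):
    """Device-level validation: drop readings outside the sensor's rated range."""
    return [(t, v) for t, v in samples if lo <= v <= hi]

def transmit(samples, delta=0.5):
    """Send-on-change: transmit only when the value moved by at least delta,
    saving energy on short-range links (Bluetooth, Zigbee, Wi-Fi)."""
    out, last = [], None
    for t, v in samples:
        if last is None or abs(v - last) >= delta:
            out.append((t, v))
            last = v
    return out

readings = device_filter(sense(100))
sent = transmit(readings)
print(f"sampled {len(readings)}, transmitted {len(sent)}")
```

Raising `delta` cuts transmissions further at the cost of temporal resolution, a typical device-level trade-off.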
322.4.2 Fog Processing Phase
- Data Aggregation:
- Combining data from multiple sensors
- Time synchronization
- Spatial correlation
- Redundancy elimination
- Preprocessing:
- Noise filtering and smoothing
- Outlier detection and correction
- Data normalization and formatting
- Missing value handling
- Local Analytics:
- Pattern recognition
- Anomaly detection
- Event classification
- Threshold monitoring
- Decision Making:
- Rule-based responses
- Local control commands
- Alert generation
- Adaptive behavior
- Selective Forwarding:
- Sending only relevant data to cloud
- Summaries and statistics instead of raw data
- Triggered transmission on significant events
- Bandwidth optimization
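Three of these fog-phase stages are sketched below: aggregation across sensors, simple z-score outlier removal, and selective forwarding that sends a full alert only on significant events. The data shapes and thresholds are illustrative, not a production design.

```python
import statistics

def aggregate(readings_by_sensor):
    """Data aggregation: combine time-aligned readings per sensor into means."""
    return {s: statistics.mean(vs) for s, vs in readings_by_sensor.items()}

def remove_outliers(values, z=3.0):
    """Preprocessing: drop values more than z standard deviations from the mean."""
    if len(values) < 2:
        return values
    mu, sd = statistics.mean(values), statistics.stdev(values)
    return [v for v in values if sd == 0 or abs(v - mu) <= z * sd]

def selective_forward(record, threshold):
    """Selective forwarding: full data only on significant events,
    otherwise just a compact summary statistic for the cloud."""
    if max(record.values()) > threshold:
        return {"type": "alert", "data": record}
    return {"type": "summary", "mean": statistics.mean(record.values())}

record = aggregate({"s1": [21.0, 22.0], "s2": [95.0, 97.0]})
print(selective_forward(record, threshold=80))  # s2 is hot, so a full alert goes up
```

Sending summaries by default and raw data only on alerts is the mechanism behind the bandwidth reductions quoted earlier in the chapter.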
322.4.3 Cloud Processing Phase
- Global Analytics:
- Cross-location correlation
- Long-term trend analysis
- Complex machine learning
- Predictive modeling
- Storage:
- Long-term archival
- Historical databases
- Data lake creation
- Backup and redundancy
- Coordination:
- Multi-site orchestration
- Resource allocation
- Software updates distribution
- Configuration management
322.4.4 Action Phase
- Local Response (Fog Level):
- Immediate actuator control
- Real-time alerts
- Emergency responses
- Automatic adjustments
- Global Response (Cloud Level):
- Strategic decisions
- Resource optimization across sites
- Long-term planning
- Policy updates
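A fog-level responder is typically rule-based so it can act without waiting for the cloud. The rules below (a cooling threshold, a smoke alarm that also escalates upstream) are hypothetical examples of the split between immediate local action and cloud-level strategic handling.

```python
def fog_respond(reading, limits):
    """Fog-level rule-based response; rules and field names are illustrative."""
    actions = []
    if reading["temp"] > limits["temp_max"]:
        actions.append("activate_cooling")   # immediate actuator control
    if reading["smoke"]:
        actions.append("trigger_alarm")      # emergency response, no WAN needed
        actions.append("notify_cloud")       # escalate for strategic decisions
    return actions

print(fog_respond({"temp": 90, "smoke": False}, {"temp_max": 80}))
print(fog_respond({"temp": 20, "smoke": True}, {"temp_max": 80}))
```

Only the escalation step touches the cloud, which is what keeps emergency latency at the fog tier.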
322.5 Context Awareness and Location
322.5.1 Location Awareness
Proximity-Based Processing: Fog nodes leverage knowledge of device locations for intelligent data routing and processing.
Example: Smart parking system knows which sensors are in which parking lot, enabling lot-specific availability calculations without cloud involvement.
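The parking example reduces to a small local computation once the fog node knows the sensor-to-lot mapping. In this sketch the mapping is assumed to be part of the node's configuration; no cloud call is involved.

```python
def lot_availability(sensor_states, lot_of_sensor):
    """Per-lot availability computed at the fog node from local occupancy sensors.
    sensor_states: sensor id -> occupied (bool); lot_of_sensor: sensor id -> lot."""
    free, total = {}, {}
    for sensor, occupied in sensor_states.items():
        lot = lot_of_sensor[sensor]
        total[lot] = total.get(lot, 0) + 1
        free[lot] = free.get(lot, 0) + (0 if occupied else 1)
    return {lot: f"{free[lot]}/{total[lot]} free" for lot in total}

states = {"s1": True, "s2": False, "s3": False}
lots = {"s1": "A", "s2": "A", "s3": "B"}
print(lot_availability(states, lots))
```

Because the location knowledge lives on the node, availability updates stay fast and keep working if the WAN link drops.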
322.5.2 Environmental Context
Local weather, traffic, events, and conditions provide context for intelligent interpretation.
Example: A smart city fog node near an intersection combines:
- Traffic camera data
- Inductive loop sensors
- Local event calendar
- Weather conditions
-> Optimizes traffic light timing based on complete local context
322.5.3 Data Gravity
Concept: Large datasets have “gravity” - moving them is costly in time, bandwidth, and money.
Implication: Bringing computation to data (fog) is often more efficient than bringing data to computation (cloud).
Example: Video surveillance generating 1 TB/day per camera:
- Sending to cloud: massive bandwidth and cost
- Fog processing: extract only motion events, faces, or anomalies
- Result: 1 GB/day instead of 1 TB/day sent to cloud (99.9% reduction)
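The reduction figure in the example is simple arithmetic, worked out below with the chapter's numbers (1 TB/day raw vs. roughly 1 GB/day of extracted events; the rates are the example's, not measured values).

```python
# Worked numbers for the surveillance example; rates are illustrative.
raw_gb_per_day = 1000   # ~1 TB/day of raw video per camera
fog_gb_per_day = 1      # only motion events/metadata forwarded upstream

reduction = 1 - fog_gb_per_day / raw_gb_per_day
print(f"{reduction:.1%} less data sent to the cloud")  # prints "99.9% less data sent to the cloud"
```

The same calculation scales linearly: a 100-camera site drops from ~100 TB/day to ~100 GB/day of upstream traffic.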
322.6 Edge Computing Architecture: GigaSight Framework
GigaSight represents an exemplary edge computing framework designed for large-scale video analytics, illustrating practical fog computing architecture patterns.
322.6.1 Architecture Overview
Problem: Real-time video processing from thousands of cameras generates petabytes of data with latency requirements incompatible with cloud-only processing.
Solution: Hierarchical edge computing architecture distributing processing across three tiers.
322.6.2 Three-Tier Architecture
Tier 1: Camera Edge Devices
- Smart cameras with embedded processors
- Perform basic video preprocessing
- Motion detection and frame extraction
- H.264/H.265 video compression

Tier 2: Edge Servers (Cloudlets)
- Deployed near camera clusters (e.g., building, floor, or area)
- GPU-accelerated video analytics
- Object detection and tracking
- Face recognition and classification
- Event extraction

Tier 3: Cloud Data Center
- Long-term video storage
- Cross-location analytics
- Model training and updates
- Dashboard and user interfaces
322.6.3 Processing Pipeline
- Capture: Cameras capture video streams
- Filter: Motion detection filters static periods
- Extract: Key frames and events extracted
- Analyze: Edge servers run ML models (YOLO, CNNs)
- Index: Metadata and events indexed
- Store: Relevant clips and metadata stored locally
- Forward: Summaries and alerts sent to cloud
- Query: Users query metadata, retrieve specific clips
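The filter/extract/analyze/forward stages above can be mimicked with toy data. Here frames are plain integers standing in for images, and `analyze` is a stub for an edge-server detector (e.g., a YOLO model); only the pipeline shape is meant to match GigaSight, not the actual implementation.

```python
def motion_filter(frames, threshold=10):
    """Filter: keep only frames that differ enough from the previous frame.
    Frames are integers standing in for image data in this sketch."""
    kept, prev = [], None
    for f in frames:
        if prev is None or abs(f - prev) > threshold:
            kept.append(f)
        prev = f
    return kept

def analyze(frame):
    """Analyze: stand-in for edge-server ML inference on a key frame."""
    return {"frame": frame, "objects": ["person"] if frame % 2 else []}

def pipeline(frames):
    """Capture -> filter -> analyze -> forward only event metadata to the cloud."""
    events = [analyze(f) for f in motion_filter(frames)]
    return [e for e in events if e["objects"]]  # raw video never leaves the edge

print(pipeline([0, 51, 52]))  # only the frame with a detection is forwarded
```

Forwarding metadata rather than video is what delivers both the bandwidth and the privacy benefits listed below.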
322.6.4 Benefits Demonstrated
Latency: Real-time alerts (sub-second) vs. cloud (several seconds)
Bandwidth: 99% reduction through local processing
Privacy: Video stays local; only metadata and specific events sent to cloud
Scalability: Thousands of cameras supported through distributed edge servers
322.7 Summary
Fog computing architecture provides a structured approach to distributing computation across edge, fog, and cloud tiers. Each tier has distinct capabilities and responsibilities, working together to deliver responsive, efficient, and resilient IoT systems.
Key takeaways:
- Three-tier architecture (Edge, Fog, Cloud) provides hierarchical processing
- Fog nodes offer computation, storage, networking, and security capabilities
- Data flows upward with progressive filtering and aggregation
- Control flows downward with ML models and commands
- Context awareness enables intelligent local processing
- GigaSight demonstrates practical edge architecture for video analytics
322.8 What’s Next?
Now that you understand the architecture, the next chapter explores the advantages and challenges of fog computing implementations.