322  Edge and Fog Computing: Architecture

322.1 Learning Objectives

By the end of this chapter, you will be able to:

  • Design three-tier architectures: Structure edge, fog, and cloud layers appropriately
  • Identify fog node capabilities: Describe computation, storage, networking, and security functions
  • Understand data flow: Trace data paths through hierarchical processing
  • Implement processing pipelines: Design collection, aggregation, and forwarding stages
  • Apply architectural patterns: Select appropriate configurations for different deployments

322.2 Architecture of Fog Computing

Fog computing architectures organize computing resources across multiple tiers, each optimized for specific functions and constraints.

322.2.1 Three-Tier Fog Architecture

Tier 1: Edge Devices (Things Layer)

  • IoT sensors and actuators
  • Smart devices and appliances
  • Wearables and mobile devices
  • Embedded systems

Characteristics:

  • Severely resource-constrained
  • Typically battery-powered
  • Focused on sensing/actuation
  • Minimal local processing

Tier 2: Fog Nodes (Fog Layer)

  • Gateways and routers
  • Base stations and access points
  • Micro data centers
  • Cloudlets and edge servers

Characteristics:

  • Moderate computational resources
  • Networking and storage capabilities
  • Proximity to edge devices
  • Protocol translation and aggregation

Tier 3: Cloud Data Centers (Cloud Layer)

  • Large-scale data centers
  • Virtually unlimited resources
  • Global reach and availability
  • Advanced analytics and storage

Characteristics:

  • Massive computational power
  • Scalable storage
  • Rich software ecosystems
  • Higher latency from edge

Three-Tier Architecture Summary:

| Tier | Components | Functions | Data Flow |
|------|------------|-----------|-----------|
| Cloud | Data centers | Unlimited compute/storage, global analytics, ML training | Receives aggregated insights; sends commands and ML models |
| Fog | Gateways, edge servers, base stations | Data aggregation, filtering, protocol translation | Receives raw data, applies 90-99% reduction, sends results to cloud |
| Edge | Sensors, actuators, smart devices | Data collection, simple filtering, threshold detection | Sends raw/filtered data via Bluetooth/Zigbee |

Data Flow: Edge -> (raw data) -> Fog -> (90-99% reduction) -> Cloud

Control Flow: Cloud -> (ML models, commands) -> Fog -> (<10ms response) -> Edge

Detailed three-tier fog computing architecture with bidirectional data flow: Tier 1 Edge (navy, 1-10ms) with sensors, actuators, simple filtering, battery-powered devices sends raw data upward to Tier 2 Fog (teal, 10-100ms) with gateways, edge servers, local analytics, 90-99% data reduction sending 5-10% filtered data to Tier 3 Cloud (gray, 100-300ms) with unlimited compute, ML training, long-term storage which sends ML models downward to fog layer that sends commands (under 10ms) back to edge tier
Figure 322.1: Three-tier fog computing architecture detailing components, functions, and metrics at each layer. Tier 1 (Edge) provides 1-10ms responses with battery-powered sensors performing simple filtering. Tier 2 (Fog) offers 10-100ms local analytics with 90-99% data reduction at gateways and edge servers. Tier 3 (Cloud) delivers unlimited compute with 100-300ms latency for global ML training and long-term storage. Bidirectional data flow shows raw data flowing upward (5-10% selection) while ML models and control commands flow downward.
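
The data-volume figures above can be sketched numerically. The sensor count, per-sensor rate, and the 95% reduction factor below are illustrative assumptions (the 95% sits within the 90-99% range quoted above):

```python
def tier_volumes(sensors: int, bytes_per_sensor_per_s: int, fog_reduction: float) -> dict:
    """Return bytes/s entering each tier of the edge-fog-cloud hierarchy."""
    edge = sensors * bytes_per_sensor_per_s   # raw data produced at the edge
    fog = edge                                # fog ingests everything the edge sends
    cloud = edge * (1 - fog_reduction)        # only the filtered fraction reaches the cloud
    return {"edge": edge, "fog": fog, "cloud": cloud}

# 1,000 sensors, each producing 1 KB/s, with 95% reduction at the fog tier
volumes = tier_volumes(sensors=1000, bytes_per_sensor_per_s=1024, fog_reduction=0.95)
print(volumes)  # the cloud sees only 5% of the raw edge volume
```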

322.2.2 Fog Node Capabilities

Computation:

  • Data preprocessing and filtering
  • Local analytics and decision-making
  • Machine learning inference
  • Event detection and correlation

Storage:

  • Temporary data buffering
  • Caching frequently accessed data
  • Local databases for recent history
  • Offline operation support

Networking:

  • Protocol translation (e.g., Zigbee to IP)
  • Data aggregation from multiple sensors
  • Load balancing and traffic management
  • Quality of Service (QoS) enforcement

Security:

  • Local authentication and authorization
  • Data encryption/decryption
  • Intrusion detection
  • Privacy-preserving processing
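
One of these capabilities, protocol translation, can be sketched as follows. The 4-byte payload layout (2-byte device ID, 2-byte signed temperature in tenths of a degree) is a hypothetical frame format invented for illustration, not a real Zigbee frame:

```python
import json
import struct

def translate_to_ip(payload: bytes) -> str:
    """Translate a compact binary sensor frame into a JSON message for IP transport.

    Assumed layout (illustrative only): big-endian unsigned 16-bit device ID,
    followed by signed 16-bit temperature in tenths of a degree Celsius.
    """
    device_id, temp_tenths = struct.unpack(">Hh", payload)
    return json.dumps({"device": device_id, "temperature_c": temp_tenths / 10})

frame = struct.pack(">Hh", 42, 235)   # device 42 reporting 23.5 C
msg = translate_to_ip(frame)
print(msg)
```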

322.3 Applications of Fog Computing

The deployment patterns below show where fog-tier processing is essential and how it fits into end-to-end IoT systems.

  • Real-time rail monitoring: Fog nodes along tracks analyse vibration and axle temperature locally to flag anomalies within milliseconds
  • Pipeline optimization: Gateways near pumps and valves aggregate high-frequency pressure/flow signals, run anomaly detection, and stream compressed alerts upstream
  • Wind farm operations: Turbine controllers optimize blade pitch at the edge; fog aggregators coordinate farm-level balancing
  • Smart home orchestration: Gateways fuse motion, environmental, and camera signals to automate lighting/HVAC without WAN dependency; cloud receives summaries and model updates

322.3.1 Hierarchical Processing

Data Flow:

  1. Edge devices collect raw data
  2. Fog nodes filter, aggregate, and process locally
  3. Refined data and insights are forwarded to the cloud
  4. Cloud performs global analytics and long-term storage
  5. Results and commands flow back down the hierarchy

Processing Distribution:

  • Time-Critical: Processed at the fog layer
  • Local Scope: Handled by fog nodes
  • Global Analytics: Sent to the cloud
  • Long-Term Storage: Cloud repositories
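
A minimal routing rule reflecting this distribution might look like the sketch below; the latency budget cutoff and tier names are illustrative assumptions:

```python
def route_task(latency_budget_ms: float, scope: str) -> str:
    """Decide which tier should handle a task, per the distribution above.

    latency_budget_ms: how quickly a response is needed.
    scope: "local" (single site) or "global" (cross-site analytics/storage).
    """
    if latency_budget_ms < 100 or scope == "local":
        return "fog"     # time-critical and local-scope work stays near the edge
    return "cloud"       # global analytics and long-term storage go upstream

print(route_task(10, "local"))     # a fast local control loop stays at the fog tier
print(route_task(5000, "global"))  # cross-site trend analysis goes to the cloud
```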

322.4 Working of Fog Computing

Understanding the operational flow of fog computing systems illustrates how distributed components collaborate to deliver responsive, efficient IoT services.

322.4.1 Data Collection Phase

  1. Sensing:
    • Edge devices continuously or periodically sense environment
    • Data includes temperature, motion, images, location, etc.
    • Sampling rates vary by application requirements
  2. Local Processing (Device Level):
    • Basic filtering and validation
    • Analog-to-digital conversion
    • Initial compression or feature extraction
    • Energy-efficient operation
  3. Communication:
    • Transmission to nearby fog nodes
    • Short-range protocols (Bluetooth, Zigbee, Wi-Fi)
    • Energy-efficient due to proximity
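
The device-level filtering in step 2 can be sketched as a simple validate-and-drop pass before transmission; the temperature range and readings below are made up for illustration:

```python
def collect(readings, lo=-40.0, hi=85.0):
    """Keep only in-range samples (basic device-level filtering and validation)."""
    return [r for r in readings if lo <= r <= hi]

raw = [22.5, 23.1, 999.0, 22.8, -120.0]   # two sensor glitches among valid samples
valid = collect(raw)
print(valid)   # glitches dropped before transmission to the fog node
```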

322.4.2 Fog Processing Phase

  1. Data Aggregation:
    • Combining data from multiple sensors
    • Time synchronization
    • Spatial correlation
    • Redundancy elimination
  2. Preprocessing:
    • Noise filtering and smoothing
    • Outlier detection and correction
    • Data normalization and formatting
    • Missing value handling
  3. Local Analytics:
    • Pattern recognition
    • Anomaly detection
    • Event classification
    • Threshold monitoring
  4. Decision Making:
    • Rule-based responses
    • Local control commands
    • Alert generation
    • Adaptive behavior
  5. Selective Forwarding:
    • Sending only relevant data to cloud
    • Summaries and statistics instead of raw data
    • Triggered transmission on significant events
    • Bandwidth optimization
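
Steps 1-5 above can be condensed into a sketch: aggregate a window of samples, run a threshold check, and forward only a summary instead of the raw readings (all numbers are illustrative):

```python
import statistics

def summarize_window(samples, alert_threshold):
    """Aggregate a window of sensor samples into the summary a fog node would forward."""
    peak = max(samples)
    return {
        "count": len(samples),
        "mean": statistics.mean(samples),
        "max": peak,
        "alert": peak > alert_threshold,   # event-triggered transmission
    }

window = [20.1, 20.3, 20.2, 31.7, 20.4]   # one spike in the window
msg = summarize_window(window, alert_threshold=30.0)
print(msg)   # one small summary replaces five raw readings
```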

322.4.3 Cloud Processing Phase

  1. Global Analytics:
    • Cross-location correlation
    • Long-term trend analysis
    • Complex machine learning
    • Predictive modeling
  2. Storage:
    • Long-term archival
    • Historical databases
    • Data lake creation
    • Backup and redundancy
  3. Coordination:
    • Multi-site orchestration
    • Resource allocation
    • Software updates distribution
    • Configuration management

322.4.4 Action Phase

  1. Local Response (Fog Level):
    • Immediate actuator control
    • Real-time alerts
    • Emergency responses
    • Automatic adjustments
  2. Global Response (Cloud Level):
    • Strategic decisions
    • Resource optimization across sites
    • Long-term planning
    • Policy updates

322.5 Context Awareness and Location

322.5.1 Location Awareness

Proximity-Based Processing: Fog nodes leverage knowledge of device locations for intelligent data routing and processing.

Example: Smart parking system knows which sensors are in which parking lot, enabling lot-specific availability calculations without cloud involvement.
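
The parking example can be sketched as follows; the sensor IDs and lot names are hypothetical, and the key point is that the fog node needs only its local sensor-to-lot mapping, with no cloud round trip:

```python
from collections import defaultdict

# Hypothetical mapping of sensor IDs to parking lots, held at the fog node.
SENSOR_LOT = {"s1": "lotA", "s2": "lotA", "s3": "lotB", "s4": "lotB"}

def availability(occupied: dict) -> dict:
    """Count free spaces per lot from raw sensor states (True = occupied)."""
    free = defaultdict(int)
    for sensor, lot in SENSOR_LOT.items():
        if not occupied.get(sensor, False):
            free[lot] += 1
    return dict(free)

free_spaces = availability({"s1": True, "s2": False, "s3": False, "s4": False})
print(free_spaces)
```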

322.5.2 Environmental Context

Local weather, traffic, events, and conditions provide context for intelligent interpretation.

Example: A smart city fog node near an intersection combines:

  • Traffic camera data
  • Inductive loop sensors
  • Local event calendar
  • Weather conditions

-> Optimizes traffic light timing based on complete local context

322.5.3 Data Gravity

Concept: Large datasets have “gravity” - moving them is costly in time, bandwidth, and money.

Implication: Bringing computation to data (fog) is often more efficient than bringing data to computation (cloud).

Example: Video surveillance generating 1TB/day per camera:

  • Sending to cloud: massive bandwidth and cost
  • Fog processing: extract only motion events, faces, or anomalies
  • Result: 1GB/day instead of 1TB/day sent to cloud (99.9% reduction)
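
The reduction arithmetic in the example is easy to verify:

```python
raw_per_day_gb = 1000.0   # 1 TB/day of raw video per camera
sent_per_day_gb = 1.0     # only extracted events forwarded to the cloud

reduction_pct = (1 - sent_per_day_gb / raw_per_day_gb) * 100
print(f"{reduction_pct:.1f}% reduction")
```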

322.6 Edge Computing Architecture: GigaSight Framework

GigaSight represents an exemplary edge computing framework designed for large-scale video analytics, illustrating practical fog computing architecture patterns.

322.6.1 Architecture Overview

Problem: Real-time video processing from thousands of cameras generates petabytes of data with latency requirements incompatible with cloud-only processing.

Solution: Hierarchical edge computing architecture distributing processing across three tiers.

322.6.2 Three-Tier Architecture

Tier 1: Camera Edge Devices

  • Smart cameras with embedded processors
  • Perform basic video preprocessing
  • Motion detection and frame extraction
  • H.264/H.265 video compression

Tier 2: Edge Servers (Cloudlets)

  • Deployed near camera clusters (e.g., building, floor, or area)
  • GPU-accelerated video analytics
  • Object detection and tracking
  • Face recognition and classification
  • Event extraction

Tier 3: Cloud Data Center

  • Long-term video storage
  • Cross-location analytics
  • Model training and updates
  • Dashboard and user interfaces

322.6.3 Processing Pipeline

  1. Capture: Cameras capture video streams
  2. Filter: Motion detection filters static periods
  3. Extract: Key frames and events extracted
  4. Analyze: Edge servers run ML models (YOLO, CNNs)
  5. Index: Metadata and events indexed
  6. Store: Relevant clips and metadata stored locally
  7. Forward: Summaries and alerts sent to cloud
  8. Query: Users query metadata, retrieve specific clips
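
Steps 1-8 can be sketched as a filter-analyze-forward chain over frames. The motion flags and the "person" label below are made-up stand-ins for what a real motion detector and classifier (e.g., a YOLO model) would produce:

```python
# Each frame is (frame_id, has_motion); a real system would run detectors here.
frames = [(1, False), (2, True), (3, True), (4, False), (5, True)]

def pipeline(frames):
    """Filter static frames, classify the rest, and emit cloud-bound metadata."""
    events = []
    for frame_id, has_motion in frames:
        if not has_motion:
            continue                    # step 2: motion filter drops static periods
        label = "person"                # step 4 stand-in: a detector would classify here
        events.append({"frame": frame_id, "label": label})  # steps 5-7: index and forward
    return events

alerts = pipeline(frames)
print(alerts)   # only motion frames produce cloud-bound metadata
```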

322.6.4 Benefits Demonstrated

Latency: Real-time alerts (sub-second) vs. cloud (several seconds)

Bandwidth: 99% reduction through local processing

Privacy: Video stays local; only metadata and specific events sent to cloud

Scalability: Thousands of cameras supported through distributed edge servers

Question: A retail store uses fog computing for inventory tracking with RFID readers. Edge readers detect 1,000 tag scans/second, fog node aggregates duplicate reads (same tag read by multiple readers), reducing data by 85%. How much data reaches the cloud?

Edge RFID readers scan 1,000 tags/second in total across all readers. Many tags are read by multiple readers simultaneously (redundancy for reliability). The fog node performs data aggregation: it identifies unique tags, eliminates duplicates, and tracks actual item movements. Example: an item moves through a doorway covered by 3 readers; all 3 detect it simultaneously, generating 3 scans, which the fog node aggregates into 1 unique item-movement event. If 85% of scans are duplicates, the fog node reduces 1,000 scans/s to 150 unique events/s. The cloud receives “Item X moved from zone A to zone B at timestamp T” (150 events/s) instead of “Reader 1 saw X, Reader 2 saw X, Reader 3 saw X…” (1,000 scans/s). This demonstrates fog computing’s data aggregation benefit: converting high-volume raw sensor data into meaningful business events. Bandwidth reduction: 85%. Semantic value increase: raw scans -> actionable inventory movements.
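
The aggregation in this answer can be sketched by collapsing duplicate reads of the same tag within a short time window into one movement event; the scan tuples and the 1-second window are synthetic illustrations:

```python
def dedupe(scans, window_s=1.0):
    """Collapse scans of the same tag within window_s into one unique event.

    scans: iterable of (timestamp, tag, reader_id) tuples.
    Returns a list of (timestamp, tag) unique movement events.
    """
    events = []
    last_seen = {}                        # tag -> timestamp of last emitted event
    for timestamp, tag, reader in sorted(scans):
        if tag in last_seen and timestamp - last_seen[tag] < window_s:
            continue                      # duplicate read from another nearby reader
        last_seen[tag] = timestamp
        events.append((timestamp, tag))
    return events

# Item "X" passes a doorway with 3 readers; all fire within milliseconds.
scans = [(0.00, "X", "r1"), (0.01, "X", "r2"), (0.02, "X", "r3"), (5.0, "Y", "r1")]
print(dedupe(scans))   # 4 raw scans collapse to 2 unique movement events
```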

322.7 Summary

Fog computing architecture provides a structured approach to distributing computation across edge, fog, and cloud tiers. Each tier has distinct capabilities and responsibilities, working together to deliver responsive, efficient, and resilient IoT systems.

Key takeaways:

  • Three-tier architecture (Edge, Fog, Cloud) provides hierarchical processing
  • Fog nodes offer computation, storage, networking, and security capabilities
  • Data flows upward with progressive filtering and aggregation
  • Control flows downward with ML models and commands
  • Context awareness enables intelligent local processing
  • GigaSight demonstrates practical edge architecture for video analytics

322.8 What’s Next?

Now that you understand the architecture, the next chapter explores the advantages and challenges of fog computing implementations.

Continue to Advantages and Challenges ->