336  Edge and Fog Computing: Use Cases

336.1 Learning Objectives

By the end of this chapter, you will be able to:

  • Apply fog computing to industrial scenarios: Design predictive maintenance systems
  • Understand autonomous vehicle requirements: Recognize why edge is mandatory for safety-critical decisions
  • Implement privacy-preserving architectures: Process sensitive data locally
  • Calculate real-world benefits: Quantify latency, bandwidth, and cost improvements

336.2 Use Case 1: Smart Factory Predictive Maintenance

336.2.1 Scenario

A manufacturing facility with hundreds of machines, each instrumented with vibration, temperature, and acoustic sensors generating data at a 1 kHz sampling rate.

336.2.2 Requirements

  • Real-time anomaly detection (<100ms)
  • Predictive failure alerts (hours to days advance warning)
  • Minimal network load
  • Continued operation during internet outages

336.2.3 Fog Architecture

Edge Tier: Machine Controllers

  • Collect sensor data at 1 kHz
  • Perform basic filtering and feature extraction
  • Detect critical threshold violations (immediate shutdown)

Fog Tier: Factory Edge Servers

  • Deployed per production line
  • Run ML models for anomaly detection
  • Analyze vibration patterns and thermal signatures
  • Predict component failures
  • Store recent data (rolling 24-hour window)
  • Generate maintenance work orders

Cloud Tier: Enterprise Data Center

  • Aggregate data from all factories
  • Train improved ML models
  • Long-term trend analysis
  • Supply chain and inventory optimization
  • Dashboards for management

336.2.4 Benefits

  • Latency: Immediate shutdown on critical failures; real-time anomaly alerts
  • Bandwidth: 99.9% reduction (1 kHz raw data -> event summaries)
  • Reliability: Continues operating during internet outages
  • Value: Reduced downtime, optimized maintenance, extended equipment life
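A minimal sketch of the edge-tier filtering that produces these savings, assuming 1 kHz vibration samples and a hypothetical RMS alarm threshold; only the small event summary ever leaves the machine controller:

```python
import math

SAMPLE_RATE_HZ = 1000          # 1 kHz vibration sampling (from the scenario)
RMS_ALARM_THRESHOLD = 2.5      # hypothetical baseline; tuned per machine in practice

def window_rms(samples):
    """Root-mean-square of one window of raw vibration samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def process_window(machine_id, samples, timestamp):
    """Edge-tier step: keep raw data local, emit a summary only on anomaly."""
    rms = window_rms(samples)
    if rms > RMS_ALARM_THRESHOLD:
        # Only this small dict travels up to the fog tier / cloud.
        return {"machine": machine_id, "t": timestamp,
                "rms": round(rms, 3), "event": "vibration_anomaly"}
    return None  # nothing is transmitted for normal windows

# Rough bandwidth comparison: continuous 4-byte samples vs. occasional ~100-byte summaries.
raw_bytes_per_sec = SAMPLE_RATE_HZ * 4
summary_bytes_per_sec = 100 * (1 / 3600)          # assume roughly one event per hour
print(f"reduction: {1 - summary_bytes_per_sec / raw_bytes_per_sec:.4%}")
```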

336.3 Use Case 2: Autonomous Vehicle Edge Computing

336.3.1 Scenario

Connected autonomous vehicles that must make instant driving decisions while sensing their surroundings, communicating with other vehicles, and coordinating with infrastructure.

336.3.2 Requirements

  • Ultra-low latency (<10ms for critical decisions)
  • High reliability (safety-critical)
  • Massive sensor data (cameras, LIDAR, radar)
  • Vehicle-to-vehicle (V2V) communication
  • Infrastructure coordination

336.3.3 Fog Architecture

Edge Tier: Vehicle On-Board Computing

  • Powerful edge servers in the vehicle
  • Real-time sensor fusion
  • Immediate driving decisions (steering, braking, acceleration)
  • Trajectory planning
  • Collision avoidance

Fog Tier: Roadside Units (RSUs)

  • Deployed along roads and at intersections
  • Coordinate multiple vehicles
  • Provide local traffic information
  • Extend sensor range (communicate what's around the corner)
  • Handle V2V message relay

Fog Tier: Mobile Edge Computing (MEC) at Base Stations

  • Cellular network edge
  • Regional traffic management
  • HD map updates
  • Software updates
  • Non-critical cloud services

Cloud Tier: Central Data Centers

  • Fleet management
  • Route optimization
  • Long-term learning
  • Software development
  • Regulatory compliance

336.3.4 Processing Example

Collision Avoidance Scenario:

  1. Vehicle sensors detect a potential collision (5 ms)
  2. On-board edge processing decides on an evasive action (3 ms)
  3. Action executed (braking/steering) (2 ms)
  4. Total: 10 ms; a cloud round trip would take 200 ms or more, by which time the collision would already have occurred
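A toy check of this timing budget, treating the stage latencies above and the 200 ms cloud round trip as given constants:

```python
DEADLINE_MS = 10  # hard budget for the collision-avoidance loop

edge_pipeline_ms = {"detect": 5, "decide": 3, "actuate": 2}
cloud_round_trip_ms = 200  # round-trip estimate from the scenario

edge_total = sum(edge_pipeline_ms.values())
print(f"edge total: {edge_total} ms -> {'OK' if edge_total <= DEADLINE_MS else 'MISS'}")
print(f"cloud path: {cloud_round_trip_ms}+ ms -> MISS (the vehicle has already travelled several metres)")
```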

Cooperative Perception:

  1. RSU combines sensor data from multiple vehicles
  2. Shares augmented awareness (e.g., blind-spot information)
  3. Vehicles receive enhanced situational awareness
  4. Better decisions through cooperation
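A minimal sketch of RSU-side detection fusion, assuming each vehicle reports object detections as (x, y, label) tuples in a shared road frame; the 1 m grid used for de-duplication is an illustrative choice:

```python
def merge_detections(reports, cell_size_m=1.0):
    """RSU-side sketch: fuse object detections from several vehicles.

    Detections from different vehicles that fall in the same grid cell
    are treated as the same physical object.
    """
    merged = {}
    for vehicle_objects in reports:
        for x, y, label in vehicle_objects:
            cell = (round(x / cell_size_m), round(y / cell_size_m), label)
            merged.setdefault(cell, (x, y, label))
    return list(merged.values())

# Vehicle A sees a pedestrian; vehicle B (around the corner) also sees a cyclist.
shared = merge_detections([
    [(12.1, 4.0, "pedestrian")],
    [(12.3, 4.2, "pedestrian"), (40.0, -3.5, "cyclist")],
])
print(shared)  # both vehicles now "see" the cyclist beyond their own sensor range
```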

336.3.5 Benefits

  • Safety: Life-critical response times achieved
  • Bandwidth: Terabytes per day of sensor data processed locally
  • Reliability: Critical functions independent of cloud connectivity
  • Scalability: Millions of vehicles supported through distributed architecture

336.4 Use Case 3: Privacy-Preserving Architecture

Fog computing enables privacy-preserving architectures that process sensitive data locally while still providing useful insights and services.

336.4.1 Privacy Challenges in IoT

Personal Data Exposure:

  • Video surveillance
  • Health monitoring
  • Location tracking
  • Behavioral patterns

Cloud Privacy Risks:

  • Data breaches
  • Unauthorized access
  • Third-party sharing
  • Government surveillance

336.4.2 Fog-Based Privacy Preservation

Local Processing Principle: “Process data where it’s collected; send only necessary insights”

Techniques:

Data Minimization:

  • Extract only required features
  • Discard raw sensitive data
  • Aggregate individual data

Example: In a smart home, the fog node reports the number of people in a room (a single integer) instead of streaming video to the cloud.
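A sketch of that data-minimization pattern; detect_people stands in for whatever local person detector the fog node runs and is purely illustrative:

```python
def detect_people(frame):
    # Placeholder for a local, on-device person detector (hypothetical);
    # returns one bounding box per detected person.
    return [(0, 0, 40, 80)] if frame is not None else []

def occupancy_message(frame, room_id):
    """Data minimization: the uplink carries one integer, never the frame."""
    count = len(detect_people(frame))
    return {"room": room_id, "occupancy": count}

print(occupancy_message(frame=object(), room_id="kitchen"))  # {'room': 'kitchen', 'occupancy': 1}
```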

Anonymization:

  • Remove personally identifiable information
  • Blur faces in video
  • Generalize location (area rather than precise GPS coordinates)
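A small illustration of anonymization by dropping identifier fields and coarsening location; the record layout is hypothetical:

```python
def anonymize(record, precision=2):
    """Drop direct identifiers and coarsen GPS by rounding degrees.

    Two decimal places of latitude correspond to roughly 1 km.
    """
    return {
        "lat": round(record["lat"], precision),
        "lon": round(record["lon"], precision),
        "activity": record["activity"],
        # name, device_id, and precise coordinates are intentionally not copied
    }

print(anonymize({"name": "Alice", "device_id": "abc-123",
                 "lat": 48.858370, "lon": 2.294481, "activity": "walking"}))
```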

Differential Privacy:

  • Add noise to data before transmission
  • Provide statistical guarantees on privacy
  • Enable aggregate analytics while protecting individuals
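A minimal sketch of the Laplace mechanism for a count query, assuming a sensitivity of 1 and an illustrative privacy budget epsilon:

```python
import numpy as np

def dp_count(true_count, epsilon=0.5, sensitivity=1.0):
    """Laplace mechanism: noise scaled to sensitivity/epsilon masks the
    contribution of any single individual to the reported count."""
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return max(0, round(true_count + noise))

# The fog node reports a noisy occupancy count instead of the exact one.
print(dp_count(true_count=37))
```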

Encryption:

  • End-to-end encryption for necessary transmissions
  • Homomorphic encryption for cloud processing of encrypted data
  • Secure multi-party computation

336.4.3 Architecture Pattern

  1. Edge Devices: Collect raw sensitive data
  2. Fog Nodes:
    • Extract privacy-safe features
    • Anonymize or aggregate
    • Encrypt if transmission needed
  3. Cloud:
    • Receives only privacy-preserved data
    • Performs authorized analytics
    • Returns results to fog/devices

Example: Healthcare Monitoring

  • Wearable: collects heart rate, location, and activity
  • Fog (smartphone): detects anomalies and triggers alerts
  • Cloud: receives only "Anomaly detected at approximate location X"
  • Privacy preserved: raw health data never leaves the personal fog node
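A sketch of the smartphone (fog) step under these assumptions; the 140 bpm threshold and field names are illustrative, not clinical guidance:

```python
import statistics

def fog_heart_rate_monitor(samples_bpm, lat, lon, hr_limit=140):
    """Fog-node sketch: raw samples stay on the phone; only a coarse alert leaves."""
    if max(samples_bpm) <= hr_limit:
        return None  # nothing is transmitted for normal readings
    return {
        "event": "anomaly_detected",
        "approx_location": (round(lat, 1), round(lon, 1)),  # ~10 km grid, not precise GPS
        "summary_bpm": round(statistics.mean(samples_bpm)),
        # the raw sample stream is deliberately never included
    }

print(fog_heart_rate_monitor([72, 75, 158, 150], lat=40.7128, lon=-74.0060))
```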

336.5 Worked Example: Compute Offloading Decision for Agricultural Drone

Scenario: A fleet of agricultural drones surveys 1,000-acre farms for crop disease detection. Each drone carries cameras, processes images to detect diseased plants, and triggers precision pesticide spraying. The question: should image processing happen on the drone (edge), at a ground station (fog), or in the cloud?

Given:

  • 20 drones per farm
  • Each drone: 5 cameras, 12 MP each, 2 fps capture rate
  • Processing requirement: ResNet-50 inference for disease classification
  • Latency requirement: <500ms to identify diseased plant and trigger spray nozzle
  • Flight time: 2 hours per charge
  • Connectivity: 4G LTE (25 Mbps upload, 150ms latency to cloud)
  • Cloud GPU instance cost: $3/hour (NVIDIA T4 equivalent)

Step-by-step Analysis (a short script reproducing these calculations follows the decision matrix):

  1. Calculate raw data rate per drone:

    • 5 cameras x 12 MP x 3 bytes/pixel x 2 fps = 360 MB/s per drone
    • 20 drones x 360 MB/s = 7.2 GB/s total farm
  2. Evaluate cloud processing:

    • Upload bandwidth needed: 7.2 GB/s = 57.6 Gbps
    • Available: 20 drones x 25 Mbps = 500 Mbps (0.5 Gbps)
    • Bandwidth deficit: 115x more data than upload capacity!
    • Even with 10:1 compression: 5.76 Gbps needed, 11.5x shortfall
    • Cloud processing is physically impossible for real-time
  3. Evaluate fog processing (ground station):

    • Wireless to ground: 5.8 GHz link, 100 Mbps per drone = 2 Gbps total
    • Still need: 57.6 Gbps
    • Bandwidth deficit: 29x shortage
    • Latency: roughly 39 seconds per image once images queue behind the ~29x bandwidth shortfall
    • Fog processing is therefore far too slow for the 500ms real-time requirement
  4. Evaluate edge processing (on-drone):

    • NVIDIA Jetson Xavier NX: $400, 15W power
    • 21 TOPS INT8 inference
    • ResNet-50 inference: ~15ms per image
    • Power: 15W
    • Processing pipeline: Image capture (10ms) + Preprocessing (5ms) + Inference (15ms) + Decision logic (2ms) = 32ms
    • Throughput: 5 cameras x 2 fps = 10 images/sec
    • Required: 10 x 15ms = 150ms compute per second
    • Utilization: 15% (sustainable)
  5. Design hybrid edge-cloud architecture:

    • Real-time (on-drone): Disease detection inference, spray trigger decisions, flight path adjustments
    • Deferred (ground station buffer -> cloud overnight): Full-resolution image archival, detailed analysis for treatment planning, model retraining data collection
  6. Calculate costs:

    • Cloud (if it were feasible): GPU compute $3/hour x 2 hours/day x 20 drones x 365 = $43,800/year; bandwidth 34.6 TB/day x $0.09/GB x 365 = $1,136,340/year; total $1,180,140/year (and still impossible in real time)
    • Edge + deferred cloud (the chosen design): drone GPUs $400 x 20 = $8,000 (one-time); ground station $15,000 (one-time); cloud analysis $3/hour x 4 hours/day x 365 = $4,380/year; overnight bandwidth ~100 GB/day x $0.09/GB x 365 = $3,285/year
    • Year 1: $8,000 + $15,000 + $4,380 + $3,285 = $30,665
    • Year 2+: $7,665/year
  7. Summary decision matrix:

     Factor              Cloud       Fog          Edge
     Latency             13+ hours   39 seconds   32 ms
     Meets 500 ms req?   No          No           Yes
     Annual cost         $1.18M      N/A          $7,665
     Works offline?      No          Partial      Yes

Result: Edge processing is the only viable option for real-time crop disease detection. The hybrid architecture uses on-drone inference for immediate decisions (32ms latency) while deferring full-resolution uploads for overnight cloud analysis.
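A short back-of-the-envelope script, using only the figures stated in the scenario, that reproduces the key numbers above:

```python
# Scenario constants from the worked example
CAMERAS, MEGAPIXELS, BYTES_PER_PIXEL, FPS = 5, 12e6, 3, 2
DRONES = 20
LTE_UP_MBPS, FOG_LINK_MBPS = 25, 100

# Step 1: raw data rate
per_drone_Bps = CAMERAS * MEGAPIXELS * BYTES_PER_PIXEL * FPS       # 360 MB/s per drone
farm_gbps = per_drone_Bps * DRONES * 8 / 1e9                       # 57.6 Gbps for the farm

# Steps 2-3: bandwidth deficits for cloud (LTE) and fog (ground-station link)
lte_gbps = DRONES * LTE_UP_MBPS / 1e3                              # 0.5 Gbps
fog_gbps = DRONES * FOG_LINK_MBPS / 1e3                            # 2.0 Gbps
print(f"cloud deficit: {farm_gbps / lte_gbps:.0f}x, fog deficit: {farm_gbps / fog_gbps:.0f}x")

# Step 4: on-drone pipeline latency and utilization
pipeline_ms = 10 + 5 + 15 + 2                                      # capture + preprocess + infer + decide
images_per_s = CAMERAS * FPS
utilization = images_per_s * 15 / 1000                             # 15 ms inference per image
print(f"pipeline: {pipeline_ms} ms, inference utilization: {utilization:.0%}")

# Step 6: costs for the edge + deferred cloud design
year1 = 400 * DRONES + 15_000 + 3 * 4 * 365 + 100 * 0.09 * 365
recurring = 3 * 4 * 365 + 100 * 0.09 * 365
print(f"year 1: ${year1:,.0f}, year 2+: ${recurring:,.0f}/yr")
```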

Key Insight: When bandwidth is the bottleneck (which it almost always is for high-resolution imagery), edge processing becomes mandatory regardless of cost. The compute offloading decision is often determined by physics (data size vs link capacity) rather than economics. In this case, even unlimited budget couldn’t make cloud processing work in real-time. Design for edge-first when dealing with high-bandwidth sensors (cameras, lidar, radar) and use fog/cloud for deferred analytics.
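That rule of thumb can be captured in a crude feasibility screen; the function below is an illustrative sketch, not a complete cost model:

```python
def offload_feasible(sensor_rate_gbps, uplink_gbps, rtt_ms, deadline_ms):
    """Offloading is only worth costing out if the link can carry the data
    AND the network round trip fits inside the latency budget."""
    return sensor_rate_gbps <= uplink_gbps and rtt_ms < deadline_ms

# Drone example: 57.6 Gbps of imagery over a 0.5 Gbps LTE uplink, 150 ms RTT, 500 ms budget.
print(offload_feasible(57.6, 0.5, rtt_ms=150, deadline_ms=500))  # False -> process at the edge
```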

336.6 Summary

Edge and fog computing enable use cases that cloud-only architectures cannot support. From factory floors requiring sub-100ms anomaly detection to autonomous vehicles needing 10ms collision avoidance, local processing is often a fundamental requirement rather than an optimization.

Key takeaways:

  • Smart factory: 99.9% bandwidth reduction with real-time anomaly detection
  • Autonomous vehicles: Edge processing mandatory for safety-critical decisions
  • Privacy: Process sensitive data locally to comply with regulations
  • Agricultural drones: Bandwidth physics make edge processing the only option

336.7 What’s Next?

Learn about common mistakes and how to avoid them in edge/fog implementations.

Continue to Common Pitfalls ->