Centralized Era (1960s-1990s): Computing began with mainframes where all processing happened in central facilities. Data centers emerged in the 1990s, consolidating servers for efficiency.
Cloud Revolution (2006-2015): Amazon Web Services launched in 2006, followed by Azure (2010) and Google Cloud (2012). Cloud computing offered elastic, pay-per-use compute at effectively unlimited scale. But geography imposed hard limits: round-trip network latency to the nearest region often remained 50-200ms.
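The latency floor is physical, not just an engineering shortfall. A back-of-envelope sketch (the ~200,000 km/s figure for light in optical fiber is an assumption of this illustration, not from the text) shows how distance alone bounds the round trip:

```python
# Light in optical fiber travels at roughly 200,000 km/s (~2/3 of c),
# so distance alone sets a hard lower bound on round-trip time,
# before any routing, queuing, or processing delay is added.
SPEED_IN_FIBER_KM_S = 200_000  # assumed propagation speed

def min_rtt_ms(distance_km: float) -> float:
    """Physical minimum round-trip time to a data center at the
    given distance, ignoring all other sources of delay."""
    return 2 * distance_km * 1000 / SPEED_IN_FIBER_KM_S

for km in (100, 1000, 5000):
    print(f"{km:>5} km away: >= {min_rtt_ms(km):.1f} ms RTT")
```

Real-world latency is several times this floor once switching and queuing are included, which is why even a "nearby" region cannot serve sub-10ms control loops.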
IoT Challenge (2010-2015): The explosion of connected devices revealed cloud’s limitations. Autonomous vehicles generate 1-4 TB of sensor data per hour and need <10ms responses for collision avoidance - cloud’s 100-500ms round-trips are 10-50x too slow. Smart factories require <50ms for safety shutdowns.
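A quick sanity check of the numbers above, using the figures from the text:

```python
# Autonomous vehicle at the high end of the 1-4 TB/hour range.
tb_per_hour = 4
mb_per_second = tb_per_hour * 1_000_000 / 3600  # sustained upload rate

# Collision-avoidance budget vs. cloud round-trip range from the text.
deadline_ms = 10
cloud_rtt_ms = (100, 500)
ratios = [rtt / deadline_ms for rtt in cloud_rtt_ms]

print(f"~{mb_per_second:.0f} MB/s sustained; "
      f"cloud is {ratios[0]:.0f}-{ratios[1]:.0f}x over budget")
```

Sustaining roughly a gigabyte per second per vehicle over a cellular uplink is impractical on bandwidth grounds alone, independent of the latency mismatch.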
Fog Computing (2015-2018): Cisco coined “fog computing” in 2012, and the concept gained industry traction with the OpenFog Consortium’s launch in 2015, proposing hierarchical processing layers between edge and cloud. Key insight: not all data needs to reach the cloud - time-critical decisions can be processed locally.
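The key insight can be sketched as a placement rule: send each task to the highest tier whose round-trip latency still fits its deadline. The tier names and latency figures below are illustrative assumptions, not values from the text:

```python
# Tiers ordered nearest-first: (name, assumed round-trip latency in ms).
TIERS = [
    ("device", 1),
    ("fog-gateway", 10),
    ("cloud", 150),
]

def place(deadline_ms: float) -> str:
    """Return the highest tier that still meets the deadline
    (higher tiers have more capacity); fall back to the local
    device when nothing else is fast enough."""
    for name, latency in reversed(TIERS):  # try cloud first
        if latency <= deadline_ms:
            return name
    return TIERS[0][0]

print(place(500))  # batch analytics: plenty of slack -> cloud
print(place(50))   # safety shutdown budget -> fog-gateway
print(place(5))    # collision avoidance -> device
```

Real fog schedulers also weigh bandwidth, cost, and privacy, but latency-tiering is the core of the hierarchy.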
Edge AI (2018-present): Google’s Edge TPU (2018), NVIDIA Jetson, and TensorFlow Lite Micro brought ML inference to edge devices. Edge went from “dumb sensors” to “intelligent nodes.”
Why This Matters: A 2024 industry study projected that 75% of enterprise data will be processed outside traditional data centers by 2025, up from about 10% in 2018.