Operating systems have been around since the beginning of modern computing, but in those early days, the operating system played a secondary role within the infrastructure. For decades, the hardware itself—massive mainframes in dedicated server rooms and laboratories—was the primary consideration. The operating system (OS) was just part of the framework, an efficient way to interact with the physical hardware, the peripherals, and the subsystems. With the shift to cloud-based infrastructure, it feels like we’ve circled back to that time. Today, the operating system is often treated as incidental, a detail overshadowed by the cloud provider and the services it offers.