March 2016
Beginner to intermediate
400 pages
8h 34m
English
Over the past decade, we have witnessed a revolution in the way information technology (IT) works from an infrastructure perspective. Gone are the days of purchasing and deploying a new server for each new application. Instead, IT has become much more adept at sharing existing, underutilized resources across a catalog of IT applications, a technique called virtualization.
Of course, sharing resources is nothing new. Back in the days of mainframe computers, IT would partition processing capacity to provide many smaller logical central processing units (CPUs). Similarly, networks have used virtualization for years to create segregated logical networks that share the same wire (for example, virtual local-area ...