Monday, April 18, 2011

EVOLUTION OF VIRTUALIZATION

In the 1970s, mainframes ruled the datacenter. Partitioning ensured both optimum use and efficient sharing of resources, a great way to get the most out of the considerable sums organizations spent to acquire, implement, and manage these behemoths.

All processing was performed on a single computer, with data retrieved from and written back to storage located in the datacenter. Access to the datacenter was tightly controlled; in many cases, users received reports from the computer operators through a window or slot. Electronic information was accessed through dumb terminals with no local processing capability: simple devices that collected keystrokes and presented data as green-screen text.

Distributed processing began in the 1980s, as personal computers found their way to the desktop. These were fat clients that participated in client/server configurations and connected to the mainframe’s smaller cousin, the minicomputer. Although many companies still performed the bulk of their business processing in a centralized environment, both applications and data began to drift out to endpoint devices.

During the 1990s, another shift in business processing architecture took place with the advent of layered system technology, in which applications were built with distinct presentation and data access logic layers. Data resided in database servers in the datacenter. Still, fat client endpoint devices continued to run applications, and more data than ever before found its way to local hard drives. This was also the period when malware writers began perfecting their art: attacks that eventually spread across entire enterprises often started on an unprotected, or weakly protected, personal computer.

In the twenty-first century, IT managers began to realize that traditional methods of managing desktop and laptop systems were no longer effective in dealing with changing business requirements, rising user demands regarding technology implementations, and black hat hackers transitioning from fun and games to an organized crime business model. Demands for rapid turnaround of application installation and upgrade requests, the need to quickly apply security patches to operating systems and applications, and many other management headaches are driving a new approach to endpoint and server processing and management: virtualization.

Source of Information: Elsevier, Microsoft Virtualization: Master Microsoft Server, Desktop, Application, and Presentation Virtualization
