Generations of operating systems




A computer operating system is the software and/or firmware that manages the hardware of the computer and provides those resources, through an API, to application programs. By taking care of the hardware's needs and restrictions, the operating system frees the application program, and the application programmer, from the extra workload and specialized knowledge needed to deal with the hardware directly. This makes it easier to write application programs for that computer. It also makes it easier to keep the file system intact and working, since all changes to the file system go through the operating system rather than being made by each application program on its own.
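To make this mediation concrete, here is a minimal sketch in C using the POSIX calls open, write, and close as one modern example of such an API (the file name is illustrative). The program never touches the disk hardware itself; it only asks the operating system to act, and the kernel keeps the file system's structures consistent:

    /* Minimal sketch: an application writes a file purely through the
     * operating system's API (POSIX system calls here).  The program never
     * programs the disk controller itself; the kernel does that and keeps
     * the file system consistent. */
    #include <fcntl.h>
    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>

    int main(void)
    {
        const char msg[] = "hello from an application program\n";

        /* Ask the OS to create and open the file for writing. */
        int fd = open("example.txt", O_WRONLY | O_CREAT | O_TRUNC, 0644);
        if (fd < 0) {
            perror("open");
            return 1;
        }

        /* Ask the OS to write the bytes; it handles the device I/O. */
        if (write(fd, msg, strlen(msg)) < 0)
            perror("write");

        close(fd);  /* the OS updates the file system metadata */
        return 0;
    }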

The first computers didn't have operating systems and could run only one program at a time. The earliest electronic computing circuits were little more than collections of separate functions and didn't really need even a programming language to be tested. Around 1945, computers such as the ENIAC were built that filled a room of roughly 1,800 square feet yet had about the same power as a four-function calculator.

By the 1950s, the punched card machine made it easier to read in a small program, but operating the system still required only the push of a few buttons: one to load the cards into memory and another to run the program. The first rudimentary operating systems were designed to smooth the transition between jobs. Before these systems were developed, a great deal of time was lost between the completion of one job and the initiation of the next. This was the beginning of batch processing systems, in which jobs were gathered in groups, or batches. Once a job was running, it had total control of the machine. As each job terminated (either normally or abnormally), control was returned to the operating system, which "cleaned up after the job" and then read in and initiated the next one.
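The control flow just described can be sketched as a loop. The C program below is a loose modern analogy of a resident batch monitor, not a reconstruction of 1950s software; the job list and the use of fork, exec, and wait are illustrative assumptions:

    /* Rough analogy of a batch monitor: run each queued job in turn and
     * regain control when it terminates, normally or abnormally.  Real
     * batch systems read jobs from cards or tape, not a fixed array. */
    #include <stdio.h>
    #include <sys/wait.h>
    #include <unistd.h>

    int main(void)
    {
        const char *jobs[] = { "/bin/echo", "/bin/date", NULL };  /* the "batch" */

        for (int i = 0; jobs[i] != NULL; i++) {
            pid_t pid = fork();           /* load and start the next job */
            if (pid == 0) {
                execl(jobs[i], jobs[i], (char *)NULL);
                _exit(127);               /* the job failed to load */
            }
            int status;
            waitpid(pid, &status, 0);     /* monitor regains control here */
            printf("monitor: job %s ended %s\n", jobs[i],
                   WIFEXITED(status) ? "normally" : "abnormally");
            /* ... "clean up after the job", then loop to the next one ... */
        }
        return 0;
    }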

The operating systems of the 1960s were characterized by the development of shared systems with multiprogramming and the beginnings of multiprocessing. In multiprogramming systems, several user programs are in main storage at once and the processor is switched rapidly between the jobs. In multiprocessing systems, several processors are used in a single computer system to increase the processing power of the machine.
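From the user's side, multiprogramming can still be observed on any modern system. In the C sketch below (the job count and sleep interval are arbitrary illustrative choices), several processes run at once and the operating system's scheduler interleaves them, so their output lines arrive intermixed in a nondeterministic order:

    /* Sketch: several "user programs" (child processes) in memory at once;
     * the OS switches the processor among them, interleaving their output. */
    #include <stdio.h>
    #include <sys/wait.h>
    #include <unistd.h>

    int main(void)
    {
        for (int job = 0; job < 3; job++) {
            if (fork() == 0) {                 /* one "job" per child */
                for (int step = 0; step < 3; step++) {
                    printf("job %d, step %d\n", job, step);
                    usleep(1000);              /* give the scheduler a chance */
                }
                _exit(0);
            }
        }
        while (wait(NULL) > 0)                 /* wait for all jobs to finish */
            ;
        return 0;
    }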
The IBM System/360 was the first computer line to use small-scale integrated circuits, and thus offered a major cut in price over earlier solid-state machines.
Device independence began to appear. In first generation systems, a user wishing to write data on tape had to reference a particular tape drive specifically. In this era, the user program specified only that a file was to be written on a tape drive with a certain number of tracks and a certain density. The operating system located an available tape drive with the desired characteristics and instructed the operator to mount a tape on that drive.
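The same principle survives today as the uniform file interface. In the C sketch below (the paths are illustrative), the identical write call works whether the descriptor refers to a disk file or a device node, because the operating system maps the request onto whatever device is actually behind it:

    /* Sketch of device independence as it appears today: the same write()
     * call works on a regular file or on a device; the OS decides which
     * driver actually handles the bytes. */
    #include <fcntl.h>
    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>

    static void say(const char *path)
    {
        int fd = open(path, O_WRONLY | O_CREAT | O_APPEND, 0644);
        if (fd < 0) { perror(path); return; }
        const char msg[] = "device-independent output\n";
        write(fd, msg, strlen(msg));   /* identical call, different devices */
        close(fd);
    }

    int main(void)
    {
        say("log.txt");       /* a regular disk file */
        say("/dev/null");     /* a character device */
        say("/dev/tty");      /* the controlling terminal, if any */
        return 0;
    }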
Timesharing systems were developed in which users could interact directly with the computer through typewriter-like terminals. Timesharing systems operate in an interactive or conversational mode with users: the user types a request to the computer, the computer processes the request as soon as it can (often within a second or less), and a response (if any) is typed on the user's terminal. Conversational computing made possible great strides in the program development process. A timesharing user could locate and correct errors in seconds or minutes, rather than suffering the delays, often of hours or days, of batch processing environments.
Timesharing didn't become popular until late in the third generation, when the hardware for protection mechanisms became widespread.

In 1964, the IBM System/360 family of third generation computers was designed to be general-purpose systems. They were large, often ponderous systems purporting to be all things to all people. The concept sold a lot of computers, but it took its toll: users running particular applications that did not require this kind of power paid heavily in increased run-time overhead, learning time, debugging time, maintenance, and so on.
Third generation operating systems were multimode systems. Some of them simultaneously supported batch processing, timesharing, real-time processing, and multiprocessing. They were large and expensive; nothing like them had ever been constructed before, and many of the development efforts finished well over budget and long after their scheduled completion.
These systems introduced to computer environments a greater complexity to which users were, at first, unaccustomed. The systems interposed a software layer between the user and the hardware. This software layer was often so thick that a user lost sight of the hardware and saw only the view created by the software. To get one of these systems to perform the simplest useful task, users had to become familiar with complex job control languages in order to specify their jobs and resource requirements. Third generation operating systems represented a great step forward, but a painful one for many users.

Fourth generation systems are the current state of the art. Many designers and users are still smarting from their experiences with third generation operating systems and are wary of getting involved with complex operating systems again.
With the widespread use of computer networking and on-line processing, users gain access to networks of geographically dispersed computers through various types of terminals. The microprocessor has made possible the development of the personal computer, one of the most important developments of social consequence in the last several decades. Now many users have dedicated computer systems available for their own use at any time of the day or night. Computer power that cost hundreds of thousands of dollars in the early 1960s is now available for less than a thousand dollars.
Personal computers are often equipped with data communications interfaces, and so also serve as terminals. The user of a fourth generation system is no longer confined to communicating with a single computer in a timeshared mode; rather, the user may communicate with geographically dispersed systems. Security problems have increased greatly, with information now passing over various types of vulnerable communications lines. Encryption is receiving much attention; it has become necessary to encode highly proprietary or personal data so that, even if the data is compromised, it is of no use to anyone other than the intended receivers.
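As one concrete illustration of that idea, the sketch below encrypts a message with AES-256-GCM through OpenSSL's EVP interface, a modern library chosen purely for illustration rather than anything fourth generation systems actually used; key handling is deliberately simplified, and compiling requires -lcrypto. An eavesdropper who captures the ciphertext on the line learns nothing without the key:

    /* Minimal sketch of line encryption: data encrypted before it leaves
     * the machine is useless to an interceptor without the key. */
    #include <openssl/evp.h>
    #include <openssl/rand.h>
    #include <stdio.h>

    int main(void)
    {
        unsigned char key[32], iv[12];            /* 256-bit key, 96-bit nonce */
        unsigned char plaintext[] = "highly proprietary data";
        unsigned char ciphertext[sizeof plaintext + 16];
        unsigned char tag[16];
        int len, clen;

        RAND_bytes(key, sizeof key);              /* in practice, a shared secret */
        RAND_bytes(iv, sizeof iv);

        EVP_CIPHER_CTX *ctx = EVP_CIPHER_CTX_new();
        EVP_EncryptInit_ex(ctx, EVP_aes_256_gcm(), NULL, key, iv);
        EVP_EncryptUpdate(ctx, ciphertext, &len, plaintext, sizeof plaintext);
        clen = len;
        EVP_EncryptFinal_ex(ctx, ciphertext + len, &len);
        clen += len;
        EVP_CIPHER_CTX_ctrl(ctx, EVP_CTRL_GCM_GET_TAG, sizeof tag, tag);
        EVP_CIPHER_CTX_free(ctx);

        /* What actually travels over the line: noise without the key. */
        for (int i = 0; i < clen; i++)
            printf("%02x", ciphertext[i]);
        putchar('\n');
        return 0;
    }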
The percentage of the population with access to computers in the 1980s is far greater than ever before and growing rapidly. It is common to hear the term "user friendly" used to denote systems that give users of average intelligence easy access to computer power. The highly symbolic, mnemonic, acronym-oriented user environments of the 1960s and 1970s are being replaced in the 1980s by menu-driven systems that guide the user through various options expressed in simple English.
The concept of virtual machines has become widely used. The user is no longer concerned with the physical details of the computer system (or network) being accessed. Instead, the user sees a view, called a virtual machine, created by the operating system. Today's user is more concerned with accomplishing work with a computer and is generally not interested in the internal functioning of the machine.
Database systems have gained central importance. Ours is an information-oriented society, and the job of database systems is to make information conveniently accessible, in a controlled fashion, to those who have a right to access it. Thousands of on-line databases have become available for access via terminals over communications networks.
The concept of distributed data processing has become firmly entrenched. We are now concerned with bringing computing power to the site at which it is needed, rather than bringing the data to some central computer installation for processing.
