
Time-sharing


In computing, time-sharing is the concurrent sharing of a computing resource among many users, achieved by means of multiprogramming and multi-tasking.
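The core idea can be illustrated with a minimal round-robin sketch: the processor grants each user's job a short time slice in turn, so all jobs appear to progress simultaneously. The sketch below is purely illustrative, not a model of any historical system; the job names, work units, and quantum value are invented for the example.

    from collections import deque

    # Hypothetical user jobs: [name, work units remaining].
    jobs = deque([["alice", 5], ["bob", 3], ["carol", 4]])

    QUANTUM = 2  # time slice granted to each job per turn (arbitrary value)

    tick = 0
    while jobs:
        job = jobs.popleft()           # take the next job from the round-robin queue
        name, _ = job
        run = min(QUANTUM, job[1])     # run for one quantum, or less if the job finishes
        tick += run
        job[1] -= run
        print(f"t={tick:2d}  ran {name} for {run} unit(s), {job[1]} remaining")
        if job[1] > 0:
            jobs.append(job)           # unfinished jobs rejoin the back of the queue

Because each turn is short relative to human reaction time, every user sees the machine respond almost immediately, even though only one job runs at any instant.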

Its introduction in the 1960s and emergence as the prominent model of computing in the 1970s represented a major technological shift in the history of computing.

By allowing a large number of users to interact concurrently with a single computer, time-sharing dramatically lowered the cost of providing computing capability, made it possible for individuals and organizations to use a computer without owning one, and promoted the interactive use of computers and the development of new interactive applications.

The earliest computers were extremely expensive devices, and very slow in comparison to later models. Machines were typically dedicated to a particular set of tasks and operated by control panels, the operator manually entering small programs via switches in order to load and run a series of programs. These programs might take hours, or even weeks, to run. As computers grew in speed, run times dropped, and soon the time taken to start up the next program became a concern. Batch processing methodologies evolved to decrease these "dead periods" by queuing up programs so that as soon as one program completed, the next would start.

To support a batch processing operation, a number of comparatively inexpensive card punch or paper tape writers were used by programmers to write their programs "offline". When typing (or punching) was complete, the programs were submitted to the operations team, which scheduled them to be run. Important programs were started quickly; how long before less important programs were started was unpredictable. When the program run was finally completed, the output (generally printed) was returned to the programmer. The complete process might take days, during which time the programmer might never see the computer.

The alternative of allowing the user to operate the computer directly was generally far too expensive to consider, because users would spend long periods entering code while the computer sat idle. This limited interactive development to organizations that could afford to waste computing cycles, for the most part large universities. Programmers at those universities decried the constraints that batch processing imposed, to the point that Stanford students made a short film humorously critiquing it. They experimented with new ways to interact directly with the computer, a field today known as human–computer interaction.
