
History of the GUI


The history of the graphical user interface, understood as the use of graphic icons and a pointing device to control a computer, covers a five-decade span of incremental refinements built on some constant core principles. Several vendors have created their own windowing systems based on independent code, but with basic elements in common that define the WIMP ("window, icon, menu and pointing device") paradigm.

There have been important technological achievements and incremental enhancements to general interaction over previous systems. There have been a few significant breakthroughs in terms of use, but the same organizational metaphors and interaction idioms remain in use. Although many GUI operating systems are controlled using a mouse, the keyboard can also be used, via keyboard shortcuts or arrow keys. The interface developments described below are summarized and omit many details in the interest of brevity. The influence of game computers and joystick operation has been omitted.

Early dynamic information devices such as radar displays, where input devices were used for direct control of computer-created data, set the basis for later improvements of graphical interfaces. Some early cathode-ray-tube (CRT) screens used a light pen, rather than a mouse, as the pointing device.

The concept of a multi-panel windowing system was introduced by the first real-time graphic display systems for computers: the SAGE Project and Ivan Sutherland's Sketchpad.

In the 1960s, Douglas Engelbart's Augmentation of Human Intellect project at the Augmentation Research Center at SRI International in Menlo Park, California developed the oN-Line System (NLS). This system incorporated a mouse-driven cursor and multiple windows used to work on hypertext. Engelbart had been inspired, in part, by the memex desk-based information machine suggested by Vannevar Bush in 1945.

