Television Interface Adaptor


The Television Interface Adaptor (TIA) is the custom chip at the heart of the Atari 2600 game console, generating the screen display and sound effects and reading input from the controllers. Its design was strongly shaped by the goal of reducing the amount of RAM needed to operate the display. The resulting chip is notoriously difficult to program, which remains an ongoing challenge for developers.

Development of the TIA was led by Jay Miner, who stayed on at Atari and expanded the TIA's design into the ANTIC and CTIA/GTIA chips for the Atari 400/800 computers. Miner later led the design of the custom chips for the Amiga Lorraine, which became the Commodore Amiga computer.

Early video games generally used two distinct types of graphics: the "players", objects controlled by the player or the computer (today known as sprites), and the "playfield", the background graphics normally drawn under the players. Players move constantly, while playfields are generally static, changing only at well-defined points such as level changes.

The conventional way to draw the playfield is to use a bitmap held in a framebuffer. Each memory location in the framebuffer holds a value that describes one or more pixels on the screen. The display circuitry reads these values out of the buffer and uses them to generate an analog signal for display on a CRT video monitor. The mapping of memory locations to screen pixels was usually constrained by the display hardware. On a conventional NTSC color television, maximum resolutions generally fell between 256 and 320 pixels per line, and 192 to 240 lines per screen.
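As a rough illustration of the framebuffer approach described above, the following C sketch models a 1-bit-per-pixel bitmap in which each byte holds eight horizontally adjacent pixels and the display circuitry would scan the buffer row by row. The dimensions and the helper names (`set_pixel`, `get_pixel`) are illustrative assumptions, not a description of any particular machine's hardware.

```c
#include <stdint.h>
#include <stddef.h>

/* Illustrative 1-bit-per-pixel framebuffer: each byte packs eight
 * horizontally adjacent pixels, scanned out row by row. */
#define WIDTH  320
#define HEIGHT 200

static uint8_t framebuffer[WIDTH * HEIGHT / 8];   /* 8000 bytes */

/* Set or clear the pixel at (x, y). */
static void set_pixel(int x, int y, int on)
{
    size_t index = (size_t)y * (WIDTH / 8) + (size_t)x / 8;
    uint8_t mask = (uint8_t)(0x80 >> (x % 8));
    if (on)
        framebuffer[index] |= mask;
    else
        framebuffer[index] &= (uint8_t)~mask;
}

/* Read one pixel back, as display circuitry would while scanning a line. */
static int get_pixel(int x, int y)
{
    size_t index = (size_t)y * (WIDTH / 8) + (size_t)x / 8;
    return (framebuffer[index] >> (7 - x % 8)) & 1;
}
```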

At the time the 2600 was being designed, RAM was extremely expensive, costing tens of thousands of dollars per megabyte. A typical 320-by-200-pixel display at even a single bit per pixel would require 8,000 bytes of memory for the framebuffer, which was not feasible for a platform intended to cost only a few hundred dollars. Even dramatic reductions in resolution would not bring the cost of memory down to reasonable levels. Instead, the design team decided to remove the memory-based framebuffer entirely.
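The figure of 8,000 bytes follows directly from the assumed resolution and depth; the short C snippet below simply works through that arithmetic (320 x 200 pixels x 1 bit, divided by 8 bits per byte).

```c
#include <stdio.h>

int main(void)
{
    /* Framebuffer size under the assumptions in the text:
     * 320 x 200 pixels at 1 bit per pixel = 64000 bits = 8000 bytes. */
    const long width = 320, height = 200, bits_per_pixel = 1;
    long bytes = width * height * bits_per_pixel / 8;
    printf("framebuffer size: %ld bytes\n", bytes);   /* prints 8000 */
    return 0;
}
```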

