Once upon a time, flat screens were a vision of the future—something you would only see in Star Trek or other works of science fiction. At the dawn of the computer age, almost every computer monitor used a cathode ray tube (CRT). So how did engineers manage to flatten that giant box sitting on every desk? Let’s start with the CRT, which was the main component in television sets at the time (and, later, in computer monitors).
Cathode rays were discovered as early as 1869 by German physicist Johann Hittorf, who was also the first to compute the electricity-carrying capacity of charged atoms and molecules. In 1926, Japanese physicist Kenjiro Takayanagi demonstrated a CRT television with a 40-line resolution. He improved on his own invention in 1927, increasing the resolution to 100 lines; it was not surpassed until 1931. While Takayanagi was not well known outside his native Japan, he played an important role in pioneering both television and computer monitors. In 1935, he built an all-electronic CRT television; televisions and computer monitors would rely mainly on CRTs for the next seven decades. The German company Telefunken managed to mass produce CRT television sets in the early 1930s as well.
Color television, which also made use of CRT screens, arrived as early as the 1940s. The first coast-to-coast color broadcast was made by NBC on January 1, 1954. More and more shows began to convert to color, but fewer than five percent of American households had a color TV as late as 1964. By 1972, color television sets had started outselling black-and-white sets, and it took until 1986 for all television stations in the United States to fully convert to color.
The TRS-80 and Commodore PET, both released in 1977, were early examples of computers paired with CRT monitors, though their displays were monochrome. The Apple II, also released in 1977, supported color, as did the Atari 800, released in 1979. IBM introduced the Color Graphics Adapter in 1981, which allowed four colors at a 320×200 resolution, or two colors at 640×200. IBM followed up with the Enhanced Graphics Adapter in 1984, which allowed 16 colors at a 640×350 resolution. By the end of the 1980s, CRT monitors with 1024×768 resolution were widely available and relatively affordable.
At the end of the 1990s, CRT monitors started to be replaced by liquid crystal displays (LCDs). Like most new technologies, LCD screens were extremely expensive when they were first released, but by 2003 they were outselling CRT monitors. As newer technologies emerged, CRT monitors fell by the wayside, mainly because they were bulky and difficult to transport (and LCD, LED, and OLED screens offered far higher image and video quality).
If you still have one of those giant boxes called a CRT monitor, you own one of the longest-lasting technology standards out there. Today’s technology is constantly changing, though, and a new type of monitor might arrive a lot sooner than you think.
Also published in GADGETS MAGAZINE October 2017 Issue
Words by Jose Alvarez