Long before sleek laptops and portable CPUs, computing machines were imposing to behold. Beyond their complicated hardware, they were too big and heavy to transport, which limited who could use them. These devices were also unaffordable for consumers and were designed mainly for commercial use, particularly business, military intelligence, research, and communication. Fast forward to 2021, and time and technology have simplified and transformed the way we use these productivity workhorses.
In earlier times, the idea of a small and affordable personal computer was quite unthinkable. For one, computers were targeted at a niche market of business entities, government agencies, and a handful of enthusiasts. They were also very expensive, as manufacturing the hardware was complex, resources were limited, and the technology was still maturing. Personal computers were not commercially available until 1977, when microcomputers with eight-bit microprocessors, such as the Tandy Radio Shack TRS-80, the Commodore Personal Electronic Transactor (PET), and the Apple II, arrived on the market. The IBM PC, launched in 1981, became the precursor of modern-day personal computers. It had a monitor capable of displaying color graphics or monochrome text, a wired tactile keyboard, and various expansion slots for additional memory, graphics, and peripherals. Internally, it was powered by a central processing unit (CPU) with upgradable random access memory (RAM).
Intel’s dynamic random access memory chip, which could store up to 1 kilobit of data, eventually became the standard memory across computers, since it held a significant amount of data at a lower price. The same company also introduced the Intel 4004, a four-bit microprocessor and one of the first single-chip microprocessors, on which current microprocessors are based.
Dubbed the first commercially produced microprocessor, it catapulted Intel into its position as a major microprocessor manufacturer to this day.
There were also home computers, another type of microcomputer that was more accessible to consumers. Back then, home computers let people program software, play video games, and accomplish productivity tasks, the same essentials that modern PCs offer consumers today.
Apple then created the Lisa, one of the first personal computers sold with a graphical user interface (GUI). This desktop computer sparked the development of an operating system with protected memory, a document-oriented workflow, and an on-screen interface for launching programs and selecting commands. Since users could lay out text and graphics on screen, the device also introduced the concept of desktop publishing. However, it was Microsoft’s MS-DOS that spearheaded the development of operating systems for x86-based personal computers. In 1985, Microsoft released Windows 1.0, a GUI shell for MS-DOS and the ancestor of the brand’s line of graphical operating systems, including Windows 10 and Windows 11.
NEC introduced the NEC UltraLite in 1989, one of the first notebook computers in the world. It offered the same computing power as desktop personal computers in a far more compact form factor.
By the 1990s, computers had become more portable and powerful, heralding the birth of the digital age. The Internet was born. Consumers were introduced to desktops and laptops with multimedia capabilities. Storing digital files became faster and more convenient, thanks to floppy disks and rewritable CDs. Color displays with higher resolutions, capable of running simple animations, replaced monochrome screens. Digital sound systems were even installed in the devices for a more immersive experience. Creative professionals began to explore online journalism, animation, photo editing, and video editing with the rise of DVD-ROMs as storage media. Ultimately, personal computers connected people for research, efficient communication, and productivity. The rest is history.
Pioneers of change
Some of the game-changers in the history of computers are worth mentioning. Apple accelerated the evolution of all-in-one desktops with the iMac G3, considered the first model in the iMac family. This game-changing machine abandoned legacy technologies such as floppy disks and serial ports and pioneered the use of the universal serial bus (USB). The tech company also introduced the iBook and the PowerBook, Apple’s entry-level and high-end laptops, respectively. The former was the first consumer laptop to offer Wi-Fi connectivity, while the latter was aimed at the professional market. Both models laid the groundwork for the MacBook Air and MacBook Pro.
Gaming peripherals brand Razer reimagined the way we play with the Razer Blade, billed as the first true gaming laptop, powered by an Intel Core i7 processor and an NVIDIA GeForce GT 555M graphics card. With Steam, gamers could download their games and other available software from a virtual library onto their local devices. Online live streaming of games also took off in the early 2010s with the launch of Twitch. This was a vital milestone in the development of eSports and in the thriving global community of professional gamers and gaming influencers.
In terms of software, physical installers became obsolete with cloud computing. Google introduced G Suite, its own lineup of services. Now known as Google Workspace, it is a collection of productivity and collaboration tools that lets people work and even learn remotely. Microsoft offers its own version for larger companies with Microsoft 365 Apps for enterprise.
As for entertainment, YouTube became the platform for content creation. Digital music streaming made cassette tapes, records, and CDs obsolete, but it gave record labels, big-name artists, and even independent musicians a platform to reach wider audiences. Building on the model popularized by Last.fm, more music streaming services, such as Spotify and YouTube Music, followed. Today’s video conferencing and messaging apps were heavily inspired by Yahoo Messenger and Skype, both popular in the 2000s. Tumblr became the platform for microblogging, while Friendster and MySpace introduced the concept of social networking websites where people could connect and interact online. Online advertising is now ubiquitous thanks to the integrations Google and Facebook built into their respective platforms.
Lastly, storage solutions evolved from floppy disks and CD-ROMs to hard drives and solid-state drives for faster data access. Now, cloud storage solutions do not even require a physical device. LCD screens have given way to LED-backlit and AMOLED displays with higher resolutions and refresh rates for smoother gaming and workflows. Peripherals and network connectivity have gone wireless.
As tech companies continue to adapt to consumers’ changing needs, content creation will likely become simpler and easier than ever. We’ve seen Twitter let users create live voice chat rooms in the form of Spaces. Australia’s very own graphic design platform, Canva, makes everyone’s life easier with its free and subscription tiers. It’s also highly possible that other social media networks will let users directly create, edit, and distribute their own videos and cross-post to platforms like TikTok.
Whether we like it or not, live streaming is here to stay and can be applied to online classes, event launches, company training programs, and more. Locally owned streaming service Kumu has been tapping local streamers to host various promotions within the platform, play and participate in live games, and interact with other users.
Offices and classrooms are also becoming more virtual due to the COVID-19 pandemic. Now, more people are accomplishing their tasks remotely. This means that more collaborative tools featuring AI content generation for an immersive experience may be in the works. The future of computing is boundless, beyond even our wildest imaginations.
Words by Jewel Sta. Ana
Also published in Gadgets Magazine September 2021 Issue