Seventeen years in the world of tech is practically an eternity. We’ve been at this long enough to know that technology has only one direction, and that’s forward. In this month’s CoverStory, we take a look at 17 years of development in some of the most relevant fields we’ve ever covered. There’s a lot of material to go through, so you’d best grab a drink, take a seat, and prepare to travel back in time to see where we’ve all been, and maybe get an idea about where we’re going.
Around the time we hadn’t yet begun to regret jamming to Creed’s “Higher,” Nokia was, without a doubt, king of the mobile phone hill. This was also the year Nokia’s legendary 3310 hit the market, setting the stage for tales of durability until then unheard of. A handy 133-gram candybar phone with a five-line monochrome display, the 3310, arguably the perfection of the basic phone, also had removable shells to suit your style needs. It has recently been resurrected and updated, proof enough that it’s indestructible.
Around the time people were still scouring Napster for music downloads, Siemens released the SL45, a mobile phone capable in its own right, but with an impressive trick up its sleeve. The SL45 was the first commercially available phone that could play music. Sure, you had to use the included headphones, thanks to a proprietary connector, and yeah, 32 MB isn’t a lot of storage, but at a time when the options were either a Discman or a MiniDisc player, being able to listen to music on a device you always carried anyway was a huge boon, foreshadowing things to come.
Sometime between 2001 and 2002, something big happened in the world of mobile phones: monochrome LCD screens started to give way to full-color displays. The phone to have was Nokia’s 7650, part of its fashion line of devices. Along with a slick, sliding form factor, a 4096-color display, and a 0.3 MP main camera, the 7650 had 4 MB of RAM and 16 MB of storage. It was interesting, capable, and gorgeous. Even now, you can expect to pay a pretty penny to get yourself a working specimen.
Along with broader availability of mobile data came the ability to work outside the office, and certain brands rose to fill the void created by this increased business mobility. While Blackberry didn’t quite have the same impact here as it did overseas, we had Palm. In 2003, they released what might have been their best device, the Treo 600. Running Palm OS, it was the logical combination of the personal digital assistant (still a thing back then) and the mobile phone. It kept your schedule and let you browse the Internet, send email, and open documents, all on top of mobile phone capabilities. If you think that sounds a lot like a current-generation mobile device, you wouldn’t be wrong. The Treo even had a stylus for handwriting input and precise taps on its resistive touchscreen. With 32 MB of RAM and 24 MB of expandable storage, this was a clear precursor to the smartphone of today.
The year Casey Kasem handed American Top 40 hosting duties to Ryan Seacrest and The Return of the King swept the Oscars, we got the most coveted mobile phone of its era, the Motorola RAZR V3. Correctly pronounced “razor,” it lived up to its name. In a world where mobile phones were mostly thick candybars, the RAZR was an impossibly thin, exceedingly gorgeous phone with a beautiful metal keypad and a useful secondary screen. It was so beautiful that despite a few shortcomings with the interface (Nokia was still the king here), it sold enough units to become the best-selling clamshell phone in history. I’ll be honest: if I had one now, I might actually still use it. It still cuts quite a profile to this day.
In 2005, just as Craig Ferguson was taking over duties on the Late Late Show, Sony Ericsson released the K750. The main gimmick of this particular SE mobile phone was a back panel designed to look like a digital camera: a sliding cover on the rear concealed the 2 MP main camera and a small selfie mirror. It also used the Memory Stick PRO standard for storing media, and could serve as a music player as well as a backup snapper. It was not a bad phone and, for its time, not a bad camera either, leading it to sell millions of units globally and keeping Sony Ericsson near the top of the mobile phone market.
At this point in time, Nokia was still more or less leading the mobile phone industry, consistently selling millions upon millions of devices to consumers all over the globe. 2006, however, saw another player start to fight its way into the game. Samsung, large in other industries but nowhere near the goliath Nokia or Sony Ericsson was, released the SGH-D900. This diminutive basic slider had 60 MB of storage (with the option for a microSD card), a 3.15 MP main camera, and little else. That wasn’t much at the time, but a convenient size, quad-band connectivity, a hands-free function, and MP3 ringtones made it reasonably popular. It’s eerie looking back at the company then and seeing where it has risen to now. This might well have been the start of that trend.
Nokia had started a new line of devices by this time, and the N95 was one of the stars of that show. Running the Symbian OS, the phone put a huge screen up front that slid upwards to reveal a keypad underneath, similar to the 7650 we saw earlier. A 332 MHz Texas Instruments OMAP processor, 160 MB of system memory, and microSD expandability made this phone look a lot more like the phones we have today, though a little larger and clunkier than the smartphone you have now. It was beautiful in its utility, powerful, and offered multimedia capabilities that weren’t widespread at the time—all ingredients for a winning device.
A phone you might have expected to see here is the original iPhone, which completely changed the game. We do, however, have a whole Apple CoverStory this month, so head on over there to see it!
The arrival of the iPhone and the change it brought to the phone industry were starting to ripple through the world. Quietly, in the background, another player was gaining ground. HTC, until not long before a mere OEM, had started to venture into devices of its own. The HTC Touch Diamond was one of these and, at the time, one of the best Windows Mobile devices available. It had a beautiful body and features in line with the best devices the market had at the time. A 528 MHz ARM processor, 192 MB of RAM, and 4 GB of storage gave it a lot of power. This was one of the phones I truly loved, and to this day, I remember it fondly.
Smartphones as we know them now were more and more becoming the norm. Feature phones would always have a place in the market, but the smartphone was to be king, and everyone knew it. Released in late 2008 or early 2009, depending on where you were, HTC’s Dream brought to the world what would become a giant in the mobile space: Android. The device, with its 528 MHz processor, 192 MB of RAM (does that sound familiar?), and up to 16 GB of removable storage, was the first commercially available device to run the Android OS. The playing field was set, and it wasn’t long before it was a two-horse race between Android and iOS.
As House M.D. and Grey’s Anatomy were winning awards, Samsung, once a quiet player content with making a name for itself through common slider phones and other more unique designs, launched the first in what would be a wildly successful line of mobile devices: the Galaxy S. More or less the Android iPhone, the Galaxy S was the flagship Samsung bet on. With solid specs, loads of connectivity, and a large screen, it was the start of the “Samsung Boom” that would keep the company more or less at the top of Android devices from the S2 to today’s S8. That was seven years ago now. Take a moment to drink that in.
Android was in full swing come 2011. Samsung continued its dominance of the Android market with the S2, with budget Android models following suit. Other brands had fallen away or allied themselves with Google, and while Nokia was still around, volumes were down significantly on its Symbian devices, owing to changed user expectations of a mobile phone, and its Windows Phone gambit was still one it was waiting on.
New players started to make their impact. Xiaomi, with a massive following in China, started to break out into the rest of the world, getting attention for its very competitively priced mobile devices and setting the stage for others such as OnePlus, Vivo, and Oppo to follow suit. Smartphones were the norm. Android, iOS, Windows Mobile, and Blackberry were all still around, though the power was clearly skewed toward the first two. Blackberry in particular was starting to have some problems; while it was still popular in the US thanks to its messenger service and deep compatibility with business infrastructure, it didn’t have quite the same presence elsewhere.
We could go on year by year, but we’re willing to bet the technology from these years is still quite fresh in your minds, so here’s a quick rundown to get us all caught up. Things have changed a lot, and not at all. Nokia has come and gone, and is set to come back again. Windows Mobile has all but flatlined, and the race is really only between Apple and Google. Google’s own mobiles have made waves, though more with “hardcore” users than the common consumer, while Samsung continues to lead the smartphone market, even after the issue of exploding Galaxy Note devices. Phablets were a thing for a short while, until every smartphone became a phablet, and tablets have been relegated to niche spaces. While edge-to-edge displays and bezel-less phones are all the rage right now, we’re still waiting for the next big thing to flip the smartphone world on its head. Some say wireless charging, others say dual cameras, yet others say VR. At this point, it seems like we’re caught up with the best technology has to offer, but we wait with bated breath to see what’s lined up next.
Seventeen years might be a long time, but we have so much more to go.
The story behind the tech behemoth isn’t all unicorns and rainbows. Like every success story, Apple’s is a history of struggles and conflicts that threatened to derail it from its path. Between an initial defeat, a plethora of failures, and a series of sales flops, Apple’s success once seemed very unlikely. But it prevailed through sheer will and persistence, and the fact that we now see Apple’s logo in all corners of the globe says it all. Here, we take a look at pivotal moments in Apple’s timeline.
Beating the odds: a humble yet rocky beginning
The beginnings of Apple would sound like a joke to many, and to some, a tech fairytale. The year was 1976; the day, April Fools’. Two college dropouts in their 20s, seemingly ahead of their time, alongside 40-year-old Ron Wayne, tested the first Apple I computer in the garage of Steve Jobs’ childhood home, thus sparking the creation of the Apple Computer Company.
The two youngsters, Jobs and Wozniak, were notorious for orchestrating pranks and tech hacks, one of which allowed them to make long-distance calls for free, with one prank that almost connected them to the Pope. Despite their playful behavior, the two were brilliantly gifted. Wayne, on the other hand, acted as the company’s arbiter. He was tasked with creating the company’s first logo and handling the business side of the trio’s first partnership agreement with The Byte Shop, which placed an initial order of 50 units. Feeling overshadowed by intellectual savants, and at the same time fearing financial loss, Wayne withdrew his 10 percent share just 12 days after the company was founded. In today’s economy, his share would amount to billions.
Nearing the year’s end, Mike Markkula, a chip industry veteran, collaborated with Jobs in writing a business plan. Markkula predicted sales of USD 500 million within the next 10 years.
In January 1977, Mike Markkula became Apple’s chairman, while Michael Scott, Markkula’s former co-worker from National Semiconductor, was invited to sit as Apple’s CEO. With Scott as CEO, Apple was able to build a solid corporate infrastructure. The successor to the USD 666 Apple I, the new Apple II, was then launched: the first personal computer in a plastic enclosure to feature color graphics. Made for the mass market, the Apple II had huge appeal, mainly due to its all-in-one package that included a power supply, color graphics, and a standard keyboard.
Ups and downs: a roller coaster ride
In 1980, the company’s employee count grew to about 1,000. Going public, Apple staged the largest IPO since Ford’s in 1956. With a valuation of around USD 1.8 billion, more than 30 of Apple’s employees became instant millionaires. However, stocks don’t tell the whole story: in the same year, Apple’s supposed flagship computer, the Apple III, flopped due to serious reliability issues.
Competition was always there, but it was never as strong as the threat IBM posed. In 1981, IBM introduced the PC, while Apple’s supposed next-generation computer, the Lisa, faced shipping delays. Although IBM’s PC sported less-than-impressive specs, it enjoyed a warm reception. Within two years, IBM’s market share had eclipsed Apple’s.
And if you thought Apple’s lows stopped there, you’d be wrong. That same year, Apple fired 40 employees during what became known as “Black Wednesday.”
The year 1983 came, and what seemed like a promising computer left many disappointed. A hefty USD 9,995 price tag, slow performance, and troublesome compatibility issues caused the launch of the Lisa to flop, with only 100,000 units sold.
With flop after flop, and a business that seemed to be going downhill, Jobs sought leadership from outside the company. “Do you want to sell sugar water for the rest of your life, or do you want to come with me and change the world?” With this pitch, Steve Jobs lured John Sculley, then president of Pepsi-Cola, to lead Apple as president and CEO. Convinced by Jobs’ proposal, Sculley joined.
A fixed focus on consumers and professionals
In 1984, Apple launched the Macintosh, and with it aired a dark commercial based on George Orwell’s novel 1984. Some interpreted the commercial as Apple’s way of urging the market not to conform to “Big Brother” (widely speculated to be IBM). The year after, in what many consider an unfortunate turn, Steve Jobs resigned from Apple following a power struggle with Sculley and the company’s board of directors. Apple moved on.
Directed towards professionals, 1986 proved to be the year of desktop publishing. Graphic artists, publishers, and printers were given the option to use Aldus PageMaker, the Mac Plus, and the LaserWriter Plus to render documents at a lower cost compared to traditional equipment.
Heading in another direction
In the years ahead, Apple tempered its aggression and looked toward partnerships. The saying “don’t build walls, build bridges” applied to Apple’s philosophy in its next steps. In 1991, IBM agreed on a partnership with Apple to create computers that utilized Motorola’s RISC-based processor, the PowerPC. In the same year, Apple introduced the PowerBook 100, its first hit portable computer.
Ever wonder how Apple’s first handheld device did? Surprisingly, it was a miss for the Cupertino-based company. In 1993, Apple’s first handheld device debuted: the Newton MessagePad. It failed miserably due to its lousy handwriting recognition. Sculley ended his tenure the same year.
In 1994, the first Power Macs shipped. Just a decade after the 8 MHz Mac 128K, Apple launched new PowerPC-based 60 MHz models that also maintained backward compatibility. In the same year, in a move similar to Apple’s easing up to IBM, the first Mac license was granted, allowing Apple to boost its market share by permitting third-party vendors to create Macs, a.k.a. clones.
1998 was the year the iMac resuscitated Apple from its ill fate. Designed by Jony Ive, and the first consumer product since Steve Jobs’ return to the company as interim CEO, the translucent iMac brought color and financial stability to the previously troubled company.
Banking on the success of the iMac, Apple launched the similarly designed iBook. Featuring different colors and a translucent design, the iBook looked and felt like a smaller, portable iMac. Following the booming Wi-Fi craze, the iBook also featured an optional AirPort card.
The king returns
In 2000, the king of Apple finally returned: Steve Jobs once again became CEO of Apple. From there on, almost every major release made a global mark, and Jobs disrupted the market with the products we know oh so well today. There’s 2001’s first iPod, with a 5 GB hard drive that held 1,000 MP3s; the 2007 iPhone that changed the world; the MacBook Air, the App Store, iTunes, the iPad—all of these came with the return of Jobs.
In 2011, Jobs passed away. Although we attribute much of Apple’s success to his brilliant mind, his legacy continues with Apple’s creations.
The father of aerobics, Dr. Kenneth Cooper, once said, “We do not stop exercising because we grow old—we grow old because we stop exercising.” Regular exercise is the single most important way to improve our quality of life; it is one of the secrets to living a long, healthy life, and we couldn’t agree more. The benefits of exercising go beyond achieving a fit body: it reduces your risk of acquiring various diseases, boosts your mood, improves your memory, and increases your body’s strength and flexibility, to name a few. Given this, we have gone to great lengths to make exercising accessible and well integrated into our everyday lives.
The truth is, however, most people shun the idea of exercising. Because, why would you go out of your way to bust moves when you can just use that time to binge shows on Netflix, read a book, or better yet, sleep? The good news is, thanks to brilliant minds and the advancement of technology, fitness is now just within arm’s reach—literally.
Working out in the living room
Visiting gym chains in the ’80s and ’90s was the easiest choice for anyone wanting a motivated workout. You got to spot-train and burn fat alongside gym rats clad in the most fabulous neon, skin-tight unitard-over-pants combinations, legwarmers, and forehead-embracing headbands. Since cardio was all the rage in this era, many also resorted to joining aerobics classes at nearby parks on weekends. But for people wary of leaving the comforts of home, fitness TV shows and videocassette tapes were a godsend. These allowed people to follow along with sweat-breaking moves at their own pace, without having to shell out for costly gym membership fees.
Fitness icons were also on the rise in this period. Tae Bo (taekwondo and boxing) and Jazzercise, anyone? Who wouldn’t want to emulate these people when you get pumped just by looking at them?
The fitness craze wouldn’t be complete without the many compact pieces of exercise equipment that took over home shopping networks. If you saw anything you liked, you just had to phone in and choose whichever payment option was most practical. These multi-use home trainers ranged from ab-sculpting machines like ab crunchers and “butterfly” electro-muscle stimulators, to limb-toning leg curl machines and pulley machines for the arms, to resistance tools for strengthening your body. These might have done wonders for some, but for a few, they were stowed away in the attic after first use.
Marriage of technology and exercise
People like to have fun, and technology was able to amplify this through video games. Combine this with exercise and you have a combination bound for success. This was exactly the case for the PlayStation version of Dance Dance Revolution (DDR) and Nintendo’s Wii Fit exercise games of the early 2000s. Beyond the novel experience, what made these games extra appealing was the fact that you got to lose unnecessary pounds while letting loose. The PlayStation version of DDR allowed players to play via controller or a dedicated dance pad. This tested the stamina and foot-eye coordination of many, as the gameplay requires players to stomp on arrows laid out on the pad matching the arrows that scroll up the screen. The speed of the flashing arrows depends on the music chosen by the player, and the game calculates your rank based on arrows caught and missed; fail to stomp the right arrows consecutively, and you’re booted from the round. Nintendo, for its part, made use of an add-on apparatus for its Wii Fit exercise games: an aptly sized plastic slab named the Balance Board. Through the game and its peripheral, you can do physical exercises such as yoga, strength training, aerobics, and more, plus calculate your body mass index (BMI). These two pioneered video games that encourage fitness, and development in this space has seen more types of exercise adapted into games in recent years.
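The BMI readout that Wii Fit (and plenty of fitness apps since) gives you is just the standard formula: weight in kilograms divided by the square of height in meters. A minimal sketch in Python (the function name and sample figures are ours, not Nintendo’s):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

# A 68 kg person standing 1.7 m tall:
print(round(bmi(68, 1.7), 1))  # 23.5
```

A result of 23.5 sits inside the WHO “normal” band of 18.5 to 24.9, which is the same banding these games and apps typically report against.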
Come the current decade, the accessibility of the internet, along with the proliferation of smartphones, was the cue for fitness shows and workout DVDs to take a bow. New channels through which fitness personalities and enthusiasts can share their passion have sprung up like mushrooms, making it easy for anyone with an internet-connected device to get into shape without paying subscription fees.
YouTube is one of many well-known platforms where you can search for easy-to-follow exercise videos, while fitness-centric apps can be downloaded for free via the Apple App Store and Google Play Store. These apps can usually register the steps you take, keep track of your sets and reps, count calories, and feature exercise demos whose intensity increases as you progress by the day.
For those who need a group to stay motivated, buffet-style fitness subscription apps like GuavaPass and KFit are the answer. These apps, which came about just recently, charge you for a month of unlimited bookings at their many partner fitness establishments, which offer physical activities such as yoga, pilates, kickboxing, CrossFit, circuit training, dance, and boxing, among others. What’s more, you can still use your account when you go out of the country, as long as the apps are supported where you’re travelling.
Fitness trackers: optimizing even the most basic forms of exercises
The most basic form of exercise is walking: no frills involved, just you and your feet going places. Let’s face it, not everyone’s keen to go out of their way and do strenuous physical activity. To encourage humankind to get on its feet and start walking, fitness trackers were born. This tech you wear on your wrist—and sometimes find integrated into fitness wear such as shoes—helps optimize your daily exercise routine by analyzing, recording, and giving you access to the steps you’ve taken. Having this information conveniently stored on your paired smartphone and updated as you move succeeds in motivating you to beat your previous record. There’s something about seeing you’ve successfully taken more than 10,000 steps in a day; this type of feedback gives a sense of fulfillment and motivates you to push further.
Wearable tech didn’t just pop up yesterday; it’s the result of continuous development in tech and the fusion of one device with another. Pedometers existed way back, but consumer-grade movement-tracking devices only came to light in the 2000s, and were made popular by Fitbit in 2009. Fitness trackers don’t solely record your steps; they also integrate technology able to monitor your sleeping habits, heart rate, brain activity, and whatnot. Currently, smartwatches also double as fitness trackers.
As we move forward, these devices won’t just serve the purpose of collecting data. In time, they’ll be able to provide valuable insights that will let you take action and improve your quality of life further.
There has never been a better age for the technology of cinema. Considering the leaps and bounds being made in CGI and SFX, actors may find themselves out of work, as we may soon replace them with computer-generated renditions that can do their job better. Additionally, advances in camera technology have made filmmaking an accessible artistic medium; anyone can make a movie nowadays, and sometimes all you need is a smartphone.
Films and television shows can take us to infinite versions of futuristic, dystopian worlds, but now we face a new conundrum: how can we make compelling stories that portray ordinary, modern life? The classic “hiccups” that kickstart some of our favorite 90s movies are easily fixed by a number of available apps today. Searching for your best friend in the middle of Las Vegas? Pin-drop his location. Lost the phone number of a beautiful stranger who may be your soulmate? As long as you know their full name, a quick Google search will bring up their Facebook profile. Or better yet, nowadays people simply enter their numbers directly into your phone, eliminating the possibility of losing it in the first place.
In short, this is not a piece about the incredible advances in cinematic technology, but rather an analysis on how the convenience of technology in our everyday life presents a unique challenge to our generation’s filmmakers. And the problem of how to portray text messaging on screen may be the greatest unsolved puzzle so far.
In a video essay titled A Brief Look at Texting and the Internet in Film, creator and narrator Tony Zhou pinpoints good and bad examples of how to cinematically portray texting. For most of the early 2000s, the standard practice was to simply insert a brief shot of the physical phone screen, but this was painfully boring to watch. Some movies had characters read messages aloud or avoided using phones altogether, but now we live in an era where digital communication is an integral and inescapable part of daily life, so cinema and TV must adapt.
On-screen text bubbles
In recent years, film and television have adopted a new technique: the on-screen text. This method first emerged in East Asian films like Take Care of My Cat (2001) and All About Lily Chou-Chou (2001), and in the occasional American teen movie during the 2000s, but didn’t reach international popularity until BBC’s Sherlock premiered in 2010.
On a practical level, this approach saves time and money: the text is easier for viewers to read and all you need is a basic knowledge of Adobe After Effects to create it. However, it’s not always the most visually appealing option.
“The bubble is the first thing that becomes outdated,” says Zhou. Considering how often Apple changes its interface, crafting text bubbles to imitate a certain model can trap a show or movie in a specific time period. In addition, gigantic colored text blocks are distracting, and come off as campy and unrefined.
Zhou stresses three key pieces of advice, “cheap, efficient, elegant,” and highlights the minimalist approach of having thin, white text superimposed on a plain stretch of space—a blank wall for instance. Directors have also taken to showing the process of a character typing, deleting, and editing messages, revealing bits of his or her personality at the same time. This gentle aesthetic can, at times, add a little more movement to a shot without distracting the viewer. When done well, the text-on-screen approach allows a scene to play uninterrupted, and introduces information more efficiently. However, it’s not a universally accepted solution.
The problem with realism
As a sci-fi anthology series, Black Mirror, which first aired in the UK in 2011 but has since moved to Netflix, is the best example of a show that could—but doesn’t—utilize the text-on-screen aesthetic as often as you would expect. Each episode of Black Mirror depicts a different version of the not-so-distant future, where a specific technological advance has drastically changed everyday life. At times, this calls for some help from the graphics department, but only in cases where the episode involves some type of implanted-brain-melding device that affects how people see the world (for example, “Nosedive” from season three). Otherwise, if characters within the episode’s universe can’t see suspended texts in the air in front of them, neither can the audience.
Part of the reason why on-screen text works so well in Sherlock is because the show already utilizes a surreal visual style of cinematography. With its use of rapid-fire cuts, slow-motion pans, and warped perspective shots already in play, adding on-screen text doesn’t feel like a stretch of the imagination. But for shows with less dynamic cinematography and editing, the appearance of a floating text bubble is a jarring reminder that the viewer is watching a work of fiction. Because of this, there are current shows that prefer the shot-reverse-shot approach, because while it’s less dynamic, the technique keeps the audience grounded in reality.
Embracing the virtual world
Filmmakers today are in a beautiful age of experimentation, because the challenge of creating a cinematic representation of our digital communication is one that cannot be solved with money. Anyone can answer the mystery of cinematic texting; it doesn’t have to be a major production studio. Already, new visual styles have emerged, such as the “desktop film,” where the action takes place solely on a computer screen. When done well, especially in short form, it shows the potential to become the next evolutionary step of the found-footage genre. In 2014, Unfriended became the first feature-length film to take place entirely through video chat on a computer screen. To be fair, it wasn’t a great movie, but then again, found-footage films aren’t exactly considered Academy-worthy gold either.
Still, this is the first of many milestones. As online communication becomes an embedded part of our everyday life, our entertainment must adapt with the times. No more static phone-screens. No more ugly blocks of floating text. A fresh generation of filmmakers has to imagine a new way to visualize the digitally attuned world we live in. And once they figure it out, no doubt there will be new, life-changing pieces of technology that will pose more creative challenges for them to solve.
It’s fascinating to see—considering the lengthy collective history of human invention—the pace at which computing has evolved in the last 17 years. Through our existence as a magazine, we’ve documented in part the transition of computers from the clunky, whirring, beige-white, plastic, literal “desk-tops” to the sleek, shiny, aluminum pieces of modern art we have today. Below I chronicle the highlights of this span of time.
The 2000s opened fresh from the Y2K bug scare, which saw businesses small and large, as well as consumer software developers, conduct widespread (and much needed) updates to their architecture and security. While not necessarily related to that apparently unwarranted fear of the new millennium, the decade also saw a refinement of the user experience that coincided with the improvement of hardware capabilities.
Empire strikes back
Notwithstanding the mistakes that Microsoft made with its Windows operating systems, and there are many—*cough* ME *cough* 2000 *hacks lung* Vista—the company managed to influence how graphical interfaces should and would look with its greatest triumph: XP. Launched in 2001, it was one of the company’s most popular operating systems, second only to Windows 10, and it currently has the longest lifespan of any Windows release, with the last full update dating back to 2008 and support ending just three years ago.
Windows XP was a far cry from the buggy and frighteningly open-ended Windows 98 and gave inexperienced would-be users a visually pleasing, feature-packed yet easy-to-use, and fully customizable operating system. XP solidified Microsoft’s foothold on the world of computing for the first half of the decade, but soon after would be challenged by its rival Apple.
This success was followed by the largely underwhelming Vista and the polarizing Windows 8. To its credit, Windows 8 brought the experience of mobile computing to the more powerful desktop. While the original release was riddled with bugs, Microsoft was quick to follow up with the much better 8.1 and the current Windows 10, which toned down the tiles and improved ease of use and intuitiveness. I, however, have nothing to say about Vista (cue sad violin).
Black is the new beige
“Rig: Pentium III, 40 GB HDD, 128 MB RAM, NVIDIA GeForce 2 GTS with 32 MB VRAM and 56K internet connection. I’m a gamer, sweaty, a rebel. Deal with it.”
The above statement would have been as seldom heard then as it would be now. This is because those specs were top of the line in the early 2000s, and a rig like that would have cost anything upwards of USD 2,000 in Clinton-era dollars. They were that expensive.
Today, even the cheapest 2016-2017 smartphone could out-compute the most expensive PCs of those days, in keeping with Moore’s Law. This goes for every aspect of the computer, including the processor, storage, memory, and GPU. And as these parts become more and more powerful, getting a well-performing PC also becomes a less expensive affair.
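Moore’s observation is easy to put into rough numbers. Here is a back-of-the-envelope sketch in Python—the two-year doubling period and the 17-year span are our own illustrative inputs, not figures from any chipmaker:

```python
# Moore's law, loosely stated: transistor counts double roughly every
# two years. Over a 17-year span, that compounds dramatically.
def moore_growth(years, doubling_period=2.0):
    """Growth multiple after `years`, given a doubling period in years."""
    return 2 ** (years / doubling_period)

factor = moore_growth(17)
print(f"A 2017 chip holds roughly {factor:.0f}x the transistors of a 2000 one")
```

By that rough math, transistor budgets grew by a factor of a few hundred over the life of the magazine—which is why a bargain handset can now embarrass a turn-of-the-millennium flagship tower.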
In terms of accessories and peripherals, the past 17 years have also seen massive improvement, with keyboards and mice becoming ever more accurate and agile. Cathode ray tube monitors gave way to LCD screens, and today’s LED-backlit displays can render a once unimaginable number of pixels, giving us super clear images akin to any contemporary entertainment system. Flash drives were introduced in 2000 and started out with capacities measured in megabytes; and while the physical size of flash drives has barely changed since then, their capacities have certainly grown, with flagship models now available in the terabytes. With the evolution of CPUs and GPUs, developers were able to create new and exciting peripherals that expanded the user experience into virtual, mixed, and augmented reality.
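To put that pixel jump in perspective, here is a quick comparison between a typical early-2000s desktop resolution and today’s 4K UHD—the resolutions are common display standards, but the side-by-side is our own illustration:

```python
# Compare the pixel count of an early-2000s 800x600 display with 4K UHD.
def pixel_count(width, height):
    """Total pixels on a display of the given resolution."""
    return width * height

svga = pixel_count(800, 600)     # 480,000 pixels
uhd = pixel_count(3840, 2160)    # 8,294,400 pixels
print(f"4K UHD packs {uhd / svga:.0f}x the pixels of an 800x600 screen")
```

Fittingly, that works out to roughly a seventeen-fold jump in pixels over seventeen years.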
Software, the Internet and the Cloud
Remember when MS Paint and Solitaire were enough to make us happy with owning a PC? Pretending to be Picassos, randomly creating triangles and coloring them in, and losing productivity just to watch that deck of cards cascade across your 800 x 600 resolution screen. Those were the days.
Now, the tools of creativity, distraction, and everything in between have so greatly improved that most people would not be able to remember how they spent their pre-Facebook days on the Internet.
As hardware became stronger and internet connections became faster, program functionality and interface quality progressed drastically. Take MS Paint, for example: from the flat canvas it was before, better processors allowed the program to gain realistic textured brushes that mimic how paint flows on paper, eventually leading to its 2017 successor, Paint 3D, which turns the square canvas into a cube.
Video games made full use of computer processing and graphics processing capabilities with video game graphics and physics engines becoming more realistic, gameplay mechanics more complex, and worlds and plots more expansive. No more of those polygonal whatsits we used to describe as “life-like.”
Since the introduction of peer-to-peer sharing software like BitTorrent, LimeWire, and Napster, and the development of public and private cloud infrastructure, users have become less reliant on physical containers for their files, saying goodbye to floppies and optical discs. This led developers to incorporate internet and cloud connectivity into their software, giving birth to fully browser-based programs and traditionally installed programs with cloud sync, such as Adobe CC.
Social networking used to rely on clients like mIRC, AOL Instant Messenger, and Yahoo! Messenger, but with Flash, Java, and HTML constantly improving, these means of communication were overshadowed by websites like Friendster, Bebo, MySpace, and Multiply, and eventually Facebook and Twitter.
Uncle Ben was right
“With great power comes great responsibility.”
As cliché as it is, our increasing reliance on the power of computers brings a host of new problems, as well as evolved old ones, that deserve our utmost concern. Cybersecurity is the main issue today: cyber-attacks that were once born in the dark bedrooms of IT nerds have become a multi-billion-dollar threat aimed at the omnishambles that is the structure of the World Wide Web.
The last 17 years for computing were certainly interesting, and the next 17 are set to be even better—and we’re excited to witness them.
Words by Gadgets Team
Also published in GADGETS MAGAZINE August 2017 issue