25 moments in tech that defined the past 25 years


From the obvious (Steve Jobs unveils the iPhone) to the obscure (AT&T stops charging an hourly rate for internet access), these events were landmarks.


Sometimes, it’s obvious from the get-go that a moment in tech history is . . . well, historic. Other times, it’s clear only in the fullness of time. Yet another type of historic moment flies largely under the radar, shaping our lives more than most people ever realize.

As Fast Company celebrates our 25th anniversary, we’ve compiled a list of 25 moments that have defined the tech industry since our first issue hit the stands with a cover date of November 1995. (These calls are tough to make, so we also picked 10 runners-up.) For better or worse—and sometimes both at the same time—these events have had lasting impact. If there’s some alternate universe where they never happened, it’s a different place indeed.


Enacted as part of the Telecommunications Act of 1996, Section 230 of the Communications Decency Act says that online services can’t be sued for their users’ content. While it predated YouTube and Facebook by nearly a decade, it ultimately allowed those sites to flourish by protecting them against civil complaints such as defamation. But they might not enjoy that immunity forever. Aggrieved by the idea of anti-conservative bias, the Trump administration is now trying to dismantle Section 230’s protections through an executive order. Whether this move succeeds could affect not just the likes of Facebook, Google, and Twitter, but whatever companies come next.


In the early days of online services and ISPs, you paid several dollars an hour for access, making spending too much time in cyberspace a pricey habit. That didn’t change in a big way until 1996. As explained in an excellent article by Tedium’s Ernie Smith, AT&T gave its new WorldNet internet service a flat rate of $20 a month ($25 if you weren’t otherwise an AT&T customer). The move was meant as a shot across the bow of other major service providers—and by October, even the biggest kahuna of them all, America Online, had to match it. All of a sudden, people were free to binge themselves silly on internet content—and as you may have noticed, we still do.


In the 1990s, the world discovered the power of the internet, the most open computer network ever devised. Its untrammeled nature soon alarmed the Chinese government, which believed in suppressing dissent rather than encouraging free expression. In 1997, it enacted a law forbidding people from using the net to “injure the interests of the state or society.” It then began a decades-long effort to erect “the Great Firewall of China,” which filters out access to information the government deems harmful to its interests. Today, the firewall is more effective than ever, and everything from Facebook to Wikipedia is banned, denying the world’s most populous country access to the promise of the internet. The government has further leveraged its control of online activity through measures such as a nationwide “social credit system” that monitors users and exacts real-world penalties for unacceptable behavior.


Google’s search engine was a technical breakthrough, but it wasn’t a viable business until the company embraced pay-per-click ads. And for that innovation, it has GoTo.com to thank. As Will Oremus wrote in 2013, GoTo.com founder Bill Gross hatched the idea of having advertisers bid on top placement in search results, then charging them when users clicked through. Google then aped the approach in its own search engine with AdWords, which added quality scores to further discourage irrelevant ads. The company eventually brought the bidding concept to third-party sites with AdSense. Though GoTo (later known as Overture) didn’t survive, its concept is still a cash cow for Google—just ask the Department of Justice and the state attorneys general who say that it helped turn the company into a monopoly.
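The difference between GoTo’s pure bidding and Google’s refinement can be sketched in a few lines. This is an illustrative toy, not Google’s actual AdWords formula: the function name, bids, and quality values below are invented, and real ad auctions involve far more signals.

```python
# Toy sketch of pay-per-click ad ranking: GoTo.com ranked purely by bid,
# while weighting each bid by a quality score (as AdWords did) lets a
# relevant, cheaper ad outrank an irrelevant big spender.

def rank_ads(ads):
    """Order ads by bid x quality, highest combined score first."""
    return sorted(ads, key=lambda ad: ad["bid"] * ad["quality"], reverse=True)

ads = [
    {"name": "spammy",   "bid": 2.00, "quality": 0.2},  # high bid, off-topic
    {"name": "relevant", "bid": 1.00, "quality": 0.9},  # lower bid, on-topic
]

# A pure-bid auction would have put "spammy" on top (2.00 > 1.00);
# quality weighting promotes the relevant ad (0.9 beats 0.4).
top = rank_ads(ads)[0]
assert top["name"] == "relevant"

# In both models, advertisers pay only when a user clicks, not per view.
```

The key design choice is that relevance becomes a ranking input, which discourages advertisers from simply outspending everyone with irrelevant ads.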


In June 1999—more than two years before the iPod’s arrival—digital music was just beginning to catch on, and most of it consisted of MP3s that people ripped from their own CDs. Then a teenager named Shawn Fanning created Napster, a peer-to-peer network that let users share all their music with every other user over the internet. It was a revelation—and totally illegal. Lawsuits shut Napster down a little more than two years after it launched. Despite fears that consumers would never willingly pay for music again, later for-pay services, such as iTunes and Spotify, succeeded. In retrospect, Napster’s legacy has less to do with the wanton piracy it encouraged than the way it proved that unlimited songs delivered as a service were the future of music.


What was the best acquisition in tech history? Sorry, it wasn’t Google buying YouTube or Facebook snapping up Instagram. Those transactions were blips compared to Apple’s December 1996 agreement to acquire NeXT, the company Steve Jobs had founded after being ousted from Apple in 1985. At the time of the deal, Apple was on the brink of failure, or at least irrelevancy. Jobs (who became interim CEO in 1997 and dropped the “interim” in 2000) made the company cool and financially solvent again. He also provided it with NeXT’s operating system software, which became the basis of the Mac’s OS X—and then, in 2007, the software for the history-making iPhone. Without Jobs’s genius and NeXT’s intellectual property, Apple might be nothing more than a wistful memory.


When Borland and Starfish Software founder Philippe Kahn wanted to share photos of his newborn daughter in June 1997, there was no such thing as a camera phone. So he MacGyvered a wireless-photography rig of his own, using his Casio digital camera, Motorola phone, and laptop. The setup let him send snapshots from the hospital to friends and family around the world. Then it led him and his wife, Sonia Lee, to found LightSurf, a company that helped companies such as Sharp build the earliest cellphones with built-in cameras and wireless infrastructure for transmitting images. Two decades later, instant photo sharing à la Kahn’s 1997 experiment has become such a fundamental mode of human communication that it’s hard to remember it had to be invented.


Early programmers didn’t have much storage space to work with, so they saved a couple of bytes by leaving the “19” off dates—“78,” for example, instead of “1978.” That meant that software couldn’t distinguish the year 2000 from 1900. As the new century approached, this became a problem—especially for mainframe computers deployed for banking, utilities, and any other task critical to everyday life. Fear of Y2K-bug disaster spawned a cottage industry of books devoted to predicting the worst, from weeks-long power outages to mass starvation. But by the time the calendar rolled over to January 1, 2000, the industry had done such a competent job of updating software that any mishaps were minor. People went back to being largely optimistic about tech . . . well, at least until recent years.
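The bug is easy to reproduce in a few lines. This minimal sketch (the function names and pivot value are illustrative, not drawn from any particular legacy system) shows why two-digit arithmetic breaks at the century boundary, along with “windowing,” one common remediation technique:

```python
# The Y2K two-digit-year bug in miniature: storing only "78" instead of
# "1978" means the year 2000 ("00") sorts and subtracts as if it were 1900.

def years_between(start_yy: int, end_yy: int) -> int:
    """Naive two-digit-year arithmetic, as legacy software did it."""
    return end_yy - start_yy

# An account opened in 1978 looks 21 years old in 1999 ("99")...
assert years_between(78, 99) == 21
# ...but in 2000 ("00") it suddenly looks negative 78 years old.
assert years_between(78, 0) == -78

# "Windowing" was one common fix: interpret two-digit years relative to
# a pivot instead of widening every stored date field.
def expand_year(yy: int, pivot: int = 50) -> int:
    """Treat years below the pivot as 20xx, the rest as 19xx."""
    return 2000 + yy if yy < pivot else 1900 + yy

assert expand_year(78) == 1978
assert expand_year(0) == 2000
assert expand_year(0) - expand_year(78) == 22  # correct interval again
```

Windowing was cheaper than rewriting every record, which is partly why so much remediation finished in time; its tradeoff is that the pivot just defers the ambiguity to a later date.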


When Ghyslain Raza recorded himself wielding a golf ball retriever like a lightsaber in 2002, he didn’t expect the video to become public. Classmates at his high school in Québec had other plans, uploading it to the internet and turning it into one of the web’s first viral videos alongside “Numa Numa” and “All your base are belong to us.” But “Star Wars Kid” was far from a positive experience for Raza, who later told MacLean’s that he endured a stream of harassment both online and off, lost friends, and had to change schools. The incident foreshadowed a dark side of viral fame that YouTube would make much more common just a few years later.


The fervor for all things internet—profitability, significant revenues, or even viable business model be damned—heated up throughout 1999 and into early 2000, and Pets.com seemed to embody every noxious trend of the era. Yet even with free shipping for 40-pound bags of dog food and a sock-puppet mascot that appeared everywhere from the Macy’s Thanksgiving Day Parade to the Super Bowl, the company could garner only $5.8 million in net sales, racking up a $62 million operating loss in 10 and a half months. So, when Pets.com went public in February 2000, the market could not make this dog with fleas stand up and bark: It opened at $11 and closed its first day of trading at $11 and 1/16. (For comparison: In the first quarter of 2000, the average IPO almost doubled in value upon its debut.) The Nasdaq would peak a few weeks later and crash in April. Pets.com declared bankruptcy in November 2000 and shut down. But its legacy as the dot-com bubble popper lives on to this day.


At first, they were called weblogs, soon shortened to simply “blogs.” Published in a reverse-chronological feed and often both newsy and deeply personal, the journal-like sites started to make their mark on the web in 1997. But their profile was dramatically raised by the terrorist attacks of September 11, 2001. Bloggers such as Josh Marshall (Talking Points Memo), Glenn Reynolds (Instapundit), Andrew Sullivan (Daily Dish), and Dave Winer (Scripting News) aggregated links, added opinion and (sometimes) reporting, and posted at a pace that old-school news outlets couldn’t match. In retrospect, the blog’s golden age was brief—but it was an early sign that individuals, not big media, would be the dominant content creators of the 21st century.


When Google unveiled Gmail on April 1, 2004, many people thought it was an April Fools’ prank, which was an entirely reasonable take. It claimed to offer 1 GB of storage—which, at 500 times the capacity of Microsoft’s Hotmail, did not sound plausible. But Google was serious. With its gigantic capacity, instant search, and approachable interface, Gmail competed not only with freemail services such as Hotmail but also Outlook and other full-blown desktop email apps. It was a big moment for the web. And it marked the beginning of the era when Google went way beyond search to occupy more and more of our time with useful services of all sorts.


After soundly defeating Netscape in the original browser wars, Microsoft’s Internet Explorer went on to control well over 90% of the browser market—which meant that almost everyone used IE whether they liked it or not. Then Mozilla’s Firefox—a browser built on the open-sourced bones of the old Netscape—appeared in 2004. It was simple, secure, and innovative, during an era when IE 6 was a bloated, calcifying mess. Firefox eventually captured about a third of the market, initiating browser war 2.0 and setting the stage for Google’s Chrome, which finished the job of pummeling IE into irrelevance. If Firefox masterminds Dave Hyatt, Joe Hewitt, and Blake Ross hadn’t been so gutsy and visionary, we’d all be worse off today.


Paul Graham, who sold his e-commerce software startup to Yahoo, started blogging about startups in 2001. But it wasn’t until 2005 that he put his ideas into practice by starting Y Combinator (an obscure lambda calculus reference). That summer, he and his girlfriend, Jessica Livingston (now his wife), along with two others, launched an experiment. They invited founders to submit company proposals, picked the most promising ones, and gave them $6,000 to scrape by while they bootstrapped themselves. The 2,000 startups YC has backed include Reddit, Airbnb, Dropbox, Stripe, and other seminal successes. Today’s ecosystem of angel investors, seed-stage funds, and pre-seed funds likely would not exist if not for YC inspiring others to find gaps in the world of venture capital and fill them.


As far as major tech moments go, Amazon’s 2006 launch of Elastic Compute Cloud flew under the radar. But it marked an inflection point for Amazon Web Services, which had originally launched in 2002 with a handful of web-based tools. AWS has arguably become just as important to modern computing as the iPhone. Services such as Dropbox, Netflix streaming, Lyft, and Apple’s iCloud might not have existed—or, at least, not have worked nearly as well—without the on-demand cloud computing services that Amazon provided. Rivals such as Microsoft Azure and Google Cloud are still scrambling to catch up. And AWS became an enormously profitable enterprise that proved that Amazon could succeed in businesses far afield of its original online department store.


For Facebook, no product launch was as important as News Feed, which arrived in 2006 as a central hub for friend activity and has ballooned into one of the internet’s biggest distributors of information (and misinformation), and an influence on countless other services. Even in its earliest form, some users hated the News Feed for its drastic design changes and for its perceived creepiness, as all their activity was suddenly on full display. But CEO Mark Zuckerberg’s response to the backlash couldn’t have been more Facebook-like, expressing sympathy for users’ concerns while insinuating that the company knew what was best for them. “This may sound silly, but I want to thank all of you who have written in and created groups and protested,” he wrote. “And I am also glad that News Feed highlighted all these groups so people could find them and share their opinions with each other as well.”


As any BlackBerry devotee will helpfully point out, the iPhone was hardly the first smartphone, or even the first to change plenty of lives. But its introduction—a six-month extravaganza that began with its unveiling on January 9, 2007, and lasted until it went on sale the following June 29—was the single event that best represents Peak Gadget. The first iPhone packed so many innovations into such a lustworthy package that it largely lived up to the off-the-charts hype. Even if you were smitten at the time, however, you might not have predicted that the iPhone would matter so much 13 years later. No piece of consumer electronics released since has had remotely the same impact—and it’s an open question whether any future one will match it.


Although the iPhone itself was a breakthrough in personal computing when it launched in 2007, the real revolution didn’t start until Apple launched the App Store a year later. Suddenly, a safe and convenient source of touchscreen-friendly software—from games to utilities to fart apps—was just a tap away, and developers lined up to make apps in hopes of becoming the next runaway hit. A vibrant app marketplace soon became table stakes for any mobile platform; latecomers such as BlackBerry 10 and Microsoft’s Windows Phone 7 tried desperately to play catch-up but couldn’t. That virtuous cycle still benefits Apple today, but as developers begin revolting against the walled garden approach, it’s also become the company’s biggest source of antitrust scrutiny.


Flinging cartoon birds into towers full of nasty pigs might’ve seemed like a silly endeavor in 2009, but it quickly proved that smartphone gaming was a big business. Rovio’s original Angry Birds game racked up 10 million paid downloads within a year—impressive for Apple’s still-nascent App Store—and more importantly, established its disgruntled protagonists as cultural icons who spread to everything from other gaming platforms to the silver screen. Angry Birds demonstrated how games could become hits in the age of the smartphone, even without bloated budgets and multiyear development cycles.


For its first four years as a streaming video service, Netflix relied on licensing a pretty darn random assortment of movies and TV shows that already existed. That changed in 2011 when Netflix outbid HBO for the rights to House of Cards. Netflix downplayed the move at the time. Still, the implications were obvious from the start: Instead of relying on cable’s leftovers, Netflix would host its own hit shows that weren’t available elsewhere. When House of Cards debuted to critical acclaim a couple of years later, it kicked off the streaming wars in earnest. Now it’s hard to remember that there was a time when multiple streaming giants weren’t vying for big-budget shows and movies starring some of Hollywood’s biggest names.


In 2014, groups of online trolls launched a series of harassment campaigns against female game developers and journalists. The months-long effort, called Gamergate, ostensibly opposed unethical game journalism and its beneficiaries, under the logic that the people it attacked were trying to undermine longstanding aspects of gamer culture. But as Deadspin’s Kyle Wagner astutely wrote at the time, the whole ordeal doubled as a battleground for conservative grievances such as political correctness and social justice, especially once right-wing figures like Christina Hoff Sommers and Milo Yiannopoulos glommed on. It wasn’t just the resulting harassment that stood out, though. Gamergate’s proponents and those speaking out against it could not even agree on the most basic set of facts, like what the campaign was about, who was being harmed, and what counted as unethical behavior. That failure to communicate now colors online conversations of all kinds.


The Amazon Echo speaker was easy to write off when it arrived in 2014. Coming soon after Amazon’s Fire Phone flamed out, its combination of music, news, and basic information didn’t seem much different from what Apple’s Siri offered on iPhones, and what Google voice search provided on Android. But voice-first computing has its advantages, and in about five years, more than 100 million devices had been sold with Amazon’s Alexa voice assistant built in. Along the way, it’s turned the smart home from nerdy novelty into a mainstream concept while prompting Apple and Google to release smart speakers of their own. It’s also proved that the notion of Star Trek’s ambient “Computer” isn’t as futuristic as it once seemed.


With its dark combination of privacy invasions and political intrigue, the Cambridge Analytica affair—leaked by whistleblower Christopher Wylie and detailed by The New York Times and The Guardian in March 2018—turned #DeleteFacebook into a mantra for many. The British political consulting company gained access to a vast repository of data collected by a Facebook personality quiz purportedly designed for academic purposes; Facebook said data on as many as 87 million users may have been exposed. Cambridge Analytica, whose clients included the 2016 Trump campaign and pro-Brexit forces, used the trove to help microtarget ads. Some experts say that the firm grossly exaggerated the power of its data and that there’s no evidence it played a meaningful role in the Brexit and Trump victories. Regardless, Facebook has since tightened the access third-party apps and services have to user data. But cynicism about the company’s motives has colored public reaction to other revelations ever since.


Historically, people employed by big Silicon Valley tech firms have seen their work as idealistic rather than raw capitalism. So it’s understandable that many Google employees were stunned by an October 2018 article by The New York Times’s Daisuke Wakabayashi and Katie Benner revealing instances of the company failing to take charges of sexual misconduct among its upper ranks seriously, and even rewarding some execs accused of misbehavior with millions of dollars in severance. But what 20,000 Google staffers did in response was unexpected: They staged a walkout. The rare instance of public dissent has led to additional acts of tech-worker activism, including further Google protests over its response to workplace organizing. More recently, multiple Facebook employees have been tweeting their unhappiness over the company’s handling of incendiary posts by Donald Trump.


More than a half-century after Bell introduced the Picturephone at the 1964 World’s Fair, video calling finally became an essential way to conduct business. It only took a global pandemic for people to get comfortable with the idea. Zoom alone says it now has 300 million daily meeting participants—up from 10 million before the coronavirus hit—and its success has forced rivals like Google, Microsoft, and Facebook to reckon with years of video chat neglect. With businesses starting to embrace remote work as a permanent option for employees, video calls will likely stick around even after the pandemic winds down.


The PalmPilot makes PDAs cool (1996): More than a decade before the first iPhone, Palm’s handheld was sexy, useful, and capable of running a bevy of great third-party apps. When the Palm-powered Treo came along, you could even make phone calls.

MapQuest goes online (1996): A website that lets you get custom driving directions between any two points? For free? Even when you had to print them out at home, that changed everything.

TiVo and ReplayTV release the first DVRs (1999): Long before Netflix and Amazon streamed their first shows, these two digital video pioneers—both of which announced a DVR at CES in 1999—made binge watching irresistible.

AOL and Time Warner agree to merge (2000): The combination of the country’s biggest online service provider and a media giant was supposed to unlock synergies and dominate a new era of communications. Instead, it became the benchmark for every disastrous merger that followed.

Wikipedia publishes its first article (2000): With all due respect to the editors of the Encyclopædia Britannica, the web’s crowdsourced encyclopedia quickly became a better, vastly more comprehensive repository of knowledge than any of its dead-tree ancestors.

Apple becomes a chipmaker (2008): When the company quietly purchased a processor startup called P.A. Semi in 2008, it started down a path that led to it designing its own bespoke, high-performance CPUs for iPhones, iPads, and—starting soon—Macs.

Minecraft turns gamers into builders (2009): Markus “Notch” Persson’s block-building game turned out to be a platform for creativity of all kinds—and is still going strong 11 years later under Microsoft ownership.

Facebook buys Instagram (2012): In retrospect, Facebook paying a mere $1 billion to take out its most formidable rival was a steal. And instead of withering away under new ownership, Instagram thrived.

Sony Pictures gets hacked (2014): In a nightmarish episode of corporate intrigue, the movie studio had bushels of embarrassing internal correspondence stolen by hackers who demanded that it cancel the North Korea-mocking comedy The Interview.

Equifax gets breached (2017): The credit-bureau giant’s failure to patch a server led to a leak of data on more than 150 million consumers. Its response to its faux pas and the penalties it paid didn’t make anyone feel better.