This Day In Tech: Events That Shaped the Wired World

March 26, 1999: ‘Melissa’ Wreaks Havoc on Net

1999: The “Melissa” worm makes a sudden appearance, screwing up e-mail systems by clogging them with infected messages sent out by the worm itself. It is the first successful mass-mailing worm.

Melissa was first distributed in alt.sex, a Usenet discussion group, hidden inside a file that contained the passwords to 80 pornographic websites. The worm then e-mailed itself to multitudes worldwide, spreading mainly through Microsoft Word 97 and Word 98 documents and using the Outlook e-mail client to send copies to victims’ contacts.

While the actual damage wrought by Melissa was minimal, the corporate herd mentality caused people to panic like startled wildebeests, leading many companies to shut down their internet connections with the outside world.

The worm’s author, David L. Smith, named his creation, appropriately enough, after a lap dancer he had met in Florida. Smith used the alias Kwyjibo, but investigators linked it to the virus writers VicodinES and Alt-F11, and he was eventually arrested.

Faced with a 10-year prison term, he served only 20 months and was fined $5,000.

Source: Sophos, Answers.com

This article first appeared on Wired.com March 26, 2007.

March 25, 1995: First Wiki Makes Fast Work of Collaboration

1995: The collaborative internet takes a giant leap forward with WikiWikiWeb, the first site that actually invites people to hack it.

User-generated content and open source reporting are now standards of digital civilization. But for the internet’s first dozen years or so, even the eggheads who had invented the medium as a way of collaborating reliably over distances hadn’t thought of creating databases that anyone could contribute to, and where anyone could edit other people’s work.

The state of the art in 1995 was the listserv — still very serviceable groupware, but limited by chronological indexing where, especially in a long discussion, the context could get completely buried.

With a wiki, you can jump in at the exact context — insert a sentence here — so the next reader doesn’t have to assemble random bits into a cohesive whole. With a wiki, contributors alter a “primary” document, like so many chefs perfecting the broth.

It all seems so obvious now, with cloud-based documents that multiple people can edit simultaneously.

But the wiki’s beginnings were humble: Like many things internet-related, the first was a practical application by one person trying to solve his own work-related problem. In this case, that person was Ward Cunningham, and the problem was how to better collaborate with a bunch of other programmers.

On this day in 1995, Cunningham installed the WikiWikiWeb on a $300 computer someone gave him and connected it to the mostly barren landscape that was then the internet, using a 14.4-Kbps dial-up modem.

“Think of it as a moderated list where anyone can be moderator and everything is archived. It’s not quite a chat, still, conversation is possible,” he wrote to his collaborators at the time.

Cunningham also coined the word “wiki,” which has nothing to do with computers. As he explained to the American Heritage Dictionary:

Wiki wiki is the first Hawai’ian term I learned on my first visit to the islands. The airport counter agent directed me to take the wiki wiki bus between terminals. I said what? He explained that wiki wiki meant quick. I was to find the quick bus. I did pick up a book about the language before my return home. I learned many things from this but wiki wiki is the word that sticks the most.

Wikis are everywhere now, empowering a collective as big as the planet to improve any idea, make any suggestion, find the flaw in any plan. Wired.com hosts its own How-To Wiki — where you can learn how to run an ultramarathon or make your gadgets tweet.

The sixth-most trafficked site in the world is Wikipedia, which has more than 3.25 million articles in English and millions more in scores of other languages — all written, edited and maintained by nobody in particular.

In a trusted community like the one WikiWikiWeb served, it’s not a great leap of faith to allow colleagues to update and edit web pages the group depends on.

What nobody could have predicted was that, in general, opening up the books and shelves of a reference library to anyone on the internet would not be an unmitigated disaster. While Wikipedia has been widely criticized as fundamentally unreliable — at any given moment an entry could be vandalized, and many schools and media outlets ban it as a primary source — its impact and utility are difficult to dismiss.

Many more people consume wikis than feed them, which may partly explain why enough damage isn’t done to these knowledge bases to make them unusable. And, in fact, Wikipedia’s core community does exert controls on contributors, uses software to aid in flagging vandalism, and acts very quickly to remove defacement of an entry.

But the notion of letting some stranger into your office to write — or erase — something on your whiteboard is an “only on the internet” phenomenon.

While the rules to live by are obvious, Cunningham himself has a few, which he still maintains on his WikiWikiWeb pages. They include:

- Write only factual information.
- Give concrete advice, rather than abstract.
- Respect the freedom you have been given.
- Be concise and stay on topic for the page.
- Use language you’d be comfortable reading out loud — “use” versus “utilize” — and keep it simple. Simple language often communicates better.
- Check for spelling and grammar errors — errors detract from the content.
- Edit only when you think a page is lacking — don’t just sign your name at the bottom of every page.
- Delete only if doing so adds value.
- Don't say things that are likely to make others mad. Practice civility and understatement.
- Above all, be good, and play nice!

Funny how these principles apply to so many things.

Source: Various

Image: WikiWikiWeb screenshot, circa 1994.

March 24, 2001: Apple Unleashes Mac OS X

2001: Apple gives birth to Mac OS X — the beating heart of today’s Macs, iPhones and, soon, the iPad.

Fired and then rehired by his own company, Steve Jobs drove a near-broke Apple Computer to profitability with the success of the iMac in 1998. But arguably the reacquisition of Jobs would prove even more valuable to the Cupertino, California, corporation in 2001, when Apple introduced its cutting-edge operating system Mac OS X.

Mac OS X was built on Unix-based technologies developed by NeXT, a company Jobs founded in 1985 during his 11-year exile from Apple. With NeXT, Jobs’ goal then was to make a Mac-like computer for education that would put Apple out of business.

Fortunately for Jobs, Apple nearly did that to itself. During Jobs’ absence, Apple’s stock fell 68 percent and the company neared bankruptcy. Over those years, Apple had promised Mac users a new operating system again and again, and failed to deliver each time. In 1996, Apple killed its operating system project codenamed Copland, and soon the floundering corporation announced it was purchasing NeXT to build a new Mac OS. Of course, that meant rehiring Apple’s ousted leader. Jobs soon retook the helm as Apple’s CEO.

Finally, on March 24, 2001, Apple released its new operating system Mac OS X with a retail price of $130. The X, enthusiasts have neurotically noted, stands for “10” to represent its version number, and is thus not to be pronounced “ex.”

The OS promised improved stability and delivered a new “Aqua” user interface along with backward compatibility for the earlier Mac OS 9. Like most first-generation products, Mac OS X was rough: Many features were missing, and it suffered from a number of compatibility issues. For example, DVD playback and CD burning were not supported, and many pieces of external hardware were incompatible with the system.

Still, Mac OS X was an important step for Apple. John Siracusa, Ars Technica’s Apple specialist, summed up the significance of Mac OS X when he reviewed the operating system in 2001:

To say that Mac OS X has been eagerly awaited by Mac users is an understatement. Apple has been trying to produce a successor to the classic Mac OS for almost 15 years. It’s a tragicomic litany of code names: Pink, Taligent, Copland, Rhapsody. In the early days (the Pink project was launched in 1987), Mac users paid little attention to these efforts, confident that their current OS was the most advanced in the personal computer market. But as the years passed and competing operating systems evolved, both by adopting Mac-like GUIs and by advancing their core OS features, Mac users — as well as Apple itself — became skittish.

Siracusa had noted that the “success of Mac OS X is still an open question.” Today, the operating system’s success is indisputable. Mac OS X has been refined over the years to eliminate its early flaws. Now in its seventh version (Snow Leopard), Mac OS X still powers Apple’s latest Macs, which have helped Apple brave the economic recession. And perhaps even more importantly going forward, specialized versions of Mac OS X are driving the iPhone, the iPod Touch and the upcoming iPad. Apple’s annual revenues are now beyond $50 billion, according to Jobs.

Sources: Ars Technica, Various

Photo: Steve Jobs introduces the Mac OS X public beta.
m.p.3./Flickr

March 23, 1857: Mr. Otis Gives You a Lift

1857: Attention shoppers: The first commercial elevator goes safely up and down in a New York City department store. Like air conditioning and public transportation, elevators are supposed to make the working life a little easier. Maybe they do. But there’s no doubt they provide the necessary condition for filling cities with skyscrapers.

You may, after all this time, still not know what to do with yourself during the inexplicably long seconds you share in vertical captivity with strangers and other people you’d rather not acknowledge. But this dilemma almost certainly did not concern Elisha Graves Otis in 1853 when he founded Otis Elevator, the company that would dominate the elevator business for more than a century and a half — and counting.

The secret of Otis’ success wasn’t so much that he could make a platform go up and down, which (patent trolls note) isn’t really much of an engineering achievement. There were already steam and hydraulic elevators in use here and there for a couple of years before Otis stepped up. No: Otis’ achievement was that he convinced people he could make an elevator that would go not only up, but also down without going into a free fall.

Otis set up business in Yonkers, New York, an emerging industry town about 15 miles north of Times Square. He sold only three elevators in 1853 — for $300 each — and none in the first few months of the following year. So the entrepreneur decided to make a dramatic demonstration at the New York Crystal Palace, a grand exhibition hall built for the 1853 World’s Fair.

The company recounts this milestone in its history:

Perched on a hoisting platform high above the crowd at New York’s Crystal Palace, a pragmatic mechanic shocked the crowd when he dramatically cut the only rope suspending the platform on which he was standing. The platform dropped a few inches, but then came to a stop. His revolutionary new safety brake had worked, stopping the platform from crashing to the ground. “All safe, gentlemen!” the man proclaimed.

Otis’ demonstration had the desired effect. He sold seven elevators that year, and 15 the next. When Otis died only seven years later, his company, now run by his sons, was well on its way. By 1873 there were 2,000 Otis elevators in use. The company expanded to Europe and Russia. In rapid succession it got the commissions for the Eiffel Tower, the Empire State Building, the Flatiron Building and the original Woolworth Building — in its day, the world’s tallest. In 1967, Otis Elevator installed all 255 elevators and 71 escalators in the World Trade Center.

But the very first commercial installation was on March 23, 1857, at a five-story department store at Broadway and Broome Street in what is now New York City’s SoHo district.

The elevator’s wide adoption had a dramatic effect on how we work and live. Before, most buildings were built only a few stories high, since climbing stairs is a tiring, high-impact activity. With elevators, the sky became the limit. Offices, and later homes, on higher floors commanded the highest prices, for the view and the respite from street noise. The world-famous New York City skyline? Impossible without the elevator.

Elevators also created new jobs and helped empower the United States’ most oppressed citizens. You may not see them much anymore, but there were once tens of thousands of elevator operators, most of whom were black. Indeed, the first elevator operator’s union was formed in 1917 by none other than legendary labor organizer and civil rights leader A. Philip Randolph — an elevator operator who went on to create the game-changing Brotherhood of Sleeping Car Porters.

In the elevator’s earliest days, the job of operator required the skill and touch of a barista: An operator ran the lift with a sliding lever that raised, lowered and stopped it. Later on, elevators became fully automated, with buttons anyone could push and electronics that knew where each floor was. Now there are few manual elevators still in operation — but their age and safety records are testaments to Otis’ early work.

One thing hasn’t changed: Riding in elevators may be the most boring few seconds of daily life, next to waiting for the microwave to “ding.” And for the tiny fraction of our lives we spend in them, being there presents an inordinate number of etiquette challenges. And, for the most part, we don’t even have elevator music to distract us anymore.

One thing you almost certainly don’t have to worry about, though, is the risk of serious injury or death. Out of millions of elevators in the world, only 20 to 30 elevator-related deaths are reported every year. Those fatalities tend to happen when someone steps into an elevator shaft expecting the car to be there, or from the extreme(ly stupid) sport of elevator surfing — not because an elevator hurtles out of control.

So it does seem that they are as safe as Otis knew they were when he cut the cord on himself in 1854.

Source: Various

March 22, 1995: Longest Human Space Adventure Ends

1995: Cosmonaut Valeri Polyakov returns to Earth from the longest-ever stay in space by a human. He spent just over 437 days in the Mir space station.

Thanks to a strenuous workout regimen, he returned to Earth looking “big and strong” and “like he could wrestle a bear,” in the words of NASA astronaut Norman Thagard.

Polyakov, a medical doctor, said that he volunteered for the extra-long mission to prove that the human body could survive microgravity long enough to make a trip to Mars. As such, he took pains to show that he was no worse for the zero-g wear when he got back onto terra firma.

“[W]hen his capsule landed in Kazakhstan he walked from it to a nearby chair, a tremendous achievement,” Philip Baker wrote in his book The Story of Manned Space Stations. “He also stole a cigarette from a friend nearby, but could hardly be blamed for that. He sipped a small brandy and inwardly celebrated his mission. His record still stands, and it is unlikely to be broken until man ventures to Mars.”

Reportedly, his first statement back on Earth was to tell a fellow cosmonaut, “We can fly to Mars.”

Polyakov’s mission did not get off to an auspicious start. When the cosmonauts who dropped him off did a flyby to take pictures of Mir, they grazed the space station with their craft. Luckily, no major damage was done.

The rest of Polyakov’s mission wasn’t that eventful. After a rough first three weeks, his mental performance bounced back to his Earth-bound norms.

At the time, Polyakov held the record for most cumulative time in space, but he has since been surpassed by Sergei Krikalyov.

Source: Various

Photo: NASA

March 19, 1979: House Proceedings Air Live on C-SPAN

1979: Tennessee congressman Al Gore stands before his colleagues on the floor of the House of Representatives and gives a speech about the democratic virtues of television: “The marriage of this medium and of our open debate have the potential, Mr. Speaker, to revitalize representative democracy.” Kicking off an otherwise business-as-usual congressional session, Gore becomes the first politician to share his thoughts with the nation live on a C-SPAN broadcast.

A year earlier, former Naval officer Brian Lamb pitched his C-SPAN concept to cable industry officials at the Cable Satellite Access Entity. The plan called for the creation of a nonprofit public-service TV channel.

Lamb, who would become CEO of C-SPAN, had perfect timing: Congress was in the mood for transparency. Still reeling from the Watergate scandal four years earlier, Congress in October 1978 overwhelmingly approved H. Res. 866 by a vote of 342 to 44. The resolution authorized C-SPAN to broadcast proceedings live from the House of Representatives floor to a potential audience of 3.5 million households.

To set up the network, C-SPAN spent about $500,000 to hire four staffers and set up a satellite feed in Richmond, Virginia. It had one telephone line for calls from both creditors and viewers, and it broadcast only during business hours because the satellite it shared aired sporting events at night.

Technologically, most of the heavy lifting had already been accomplished a few months earlier by Speaker Thomas “Tip” O’Neill of Massachusetts, who championed the installation of six cameras and the television studio in the basement of the Capitol at a cost of roughly $1.2 million.

Both the Public Broadcasting Service and C-SPAN picked up the House feed and broadcast the proceedings to the public.

Once Gore finished his “welcome C-SPAN” address, praising the concept as “a solution for the lack of confidence in government,” the initial C-SPAN session carried little resonance for the public at large. For two hours and 20 minutes, Reps. Millicent Fenwick (R-New Jersey), Richard Bolling (D-Missouri), Abner Mikva (D-Illinois) and Thomas Foley (D-Washington) discussed the restructuring of House committees.

Live broadcasts of Senate floor proceedings began in 1986.

In the 4,365 (and counting) programs logged since C-SPAN’s inception, the cable channel has pulled back the curtain on tense congressional grilling sessions, including the recent Toyota Hearings, in which politicians and businessmen were held accountable for their actions.

Sources: Various

Photo: C-SPAN’s Brian Lamb (right) interviews former Oklahoma Rep. Dave McCurdy on an early C-SPAN set. - Associated Press

March 18, 1987: Woodstock for Physicists

1987: Thousands of physicists crowd a ballroom at the New York Hilton for a hastily arranged marathon session on high-temperature superconductivity. The event generates so much excitement that it is later referred to as the “Woodstock of Physics.”

Discovered in 1911, superconductivity is a phenomenon in which certain materials, at very low temperatures, become essentially transparent to electricity: Their resistance drops to zero and electrons can flow freely, with perfect efficiency.

However, for most of the 20th century, superconductivity was only observed at extremely low temperatures, just a few degrees above absolute zero. From 1973 on, physicists had not been able to induce the phenomenon at temperatures higher than 23 degrees Kelvin.

Then, in 1986, a number of researchers achieved breakthroughs with new materials. K. Alex Müller and J. Georg Bednorz at IBM’s Zürich Research Center discovered a new class of ceramics, known as perovskites, that became superconductive at 30 K. “I celebrated this with one or two beers,” Dr. Bednorz told The New York Times later.

Other researchers in Tokyo and Beijing confirmed the Swiss results, and then in February 1987, American physicist Paul C.W. Chu demonstrated superconductivity at 93 K (about minus 292 degrees Fahrenheit), or 16 degrees above the boiling point of nitrogen.
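
As a quick sanity check of those figures (using the standard kelvin-to-Fahrenheit conversion and nitrogen’s boiling point of roughly 77 K):

$$ T_{^{\circ}\mathrm{F}} = \tfrac{9}{5}\,T_{\mathrm{K}} - 459.67, \qquad \tfrac{9}{5}(93) - 459.67 \approx -292\ {}^{\circ}\mathrm{F}, \qquad 93\ \mathrm{K} - 77\ \mathrm{K} \approx 16\ \mathrm{K}\ \text{above nitrogen's boiling point.} $$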

While still very cold, the finding was a significant breakthrough, because it meant that relatively inexpensive liquid nitrogen — which is cheaper than beer — could be used to cool superconductors. It opened up the possibility that these unusual materials might find practical applications, such as powerful magnets for mag-lev trains or superefficient power transmission lines.

Chu’s paper was published in the March 2 issue of Physical Review Letters. It was just a few weeks before the annual meeting of the American Physical Society, and interest in the subject was so high that organizers hastily threw together a session on the topic, starting at 7:30 p.m. on March 18. Eager physicists lined up as early as 5:30 p.m. for the session, and eventually more than 1,800 physicists crammed into a ballroom meant for 1,100. An overflow crowd of 2,000 more watched the proceedings on television monitors.

The session, with 51 presenters, went until 3:30 in the morning.

“It was an electrifying event,” deadpanned Philip F. Schewe, a science writer who was there, according to the Times.

And indeed, high-temperature superconductors captured New York City’s and the nation’s imagination for a short while. Visions of levitating trains and supercolliders danced in the public’s mind. Scientists found that their APS badges got them into Chelsea nightclubs for free. Newspapers wrote about superconductors. Funding for superconductors surged.

(The Superconducting Supercollider, or SSC, kicked off just the year before with a $200 million congressional allocation. It would be killed in 1993, and the SSC is incomplete and abandoned today, through no fault of the superconductors.)

But most importantly, the breakthrough opened new avenues of inquiry into a phenomenon that is still a bit of a mystery.

The discovery “brought a flash of sunlight on one of the fields … that many of us had thought was rather mature and fairly well-understood,” physicist Douglas Finnemore wrote 20 years later. “It opened a new mindset that materials with complex chemical bonding can lead to totally new phenomena.”

Sources: Wikipedia, American Institute of Physics, The New York Times

Photo: Physicists pack the ballroom for a presentation on high-temperature superconductivity, at the American Physical Society meeting in New York in 1987. Courtesy of the American Institute of Physics

March 17, 1953: The Black Box Is Born

1953: After several high-profile crashes of de Havilland Comet airliners go unsolved, Australian researcher David Warren invents a device to record cockpit noise and instruments during flight.

During the first half of aviation’s history, crashes rarely came with any answers. Even if an eyewitness saw an airplane crash, little was known of the cause or what pilots might have been aware of before the crash.

In the early 1950s, the world’s first jet-powered airliner, the de Havilland Comet, crashed several times. Warren, a researcher at the Aeronautical Research Laboratories in Melbourne, Australia, believed that if the pilots’ voices and instrument readings could be recorded, the information could help determine the cause of a crash — and help prevent future ones. His device was called a “Flight Memory Unit.”

By 1957, the first prototypes of the device were produced. Early versions could record up to four hours of voice and instrument data on a steel foil. Warren believed the device would be popular and help solve the mysteries behind aviation crashes, but it was initially rejected by the Australian aviation community over privacy concerns.

Eventually, British officials accepted the idea of a flight data recorder, and Warren began producing FDRs in crash- and fire-proof containers and selling them to airlines around the world. After a 1960 crash in Queensland whose cause could not be determined, the Australian government required that all commercial airplanes carry a recorder. The country became the first to require the use of the devices.

Early recorders logged basic flight conditions such as heading, altitude, airspeed, vertical accelerations and time. Today’s FDRs can record many more parameters including throttle and flight-control positions. Analyzing so many parameters allows investigators to recreate most of the pilot-controlled activity in the moments leading up to a crash. In recent years, digital reproductions of flights using FDR data have been valuable in recreating accidents and analyzing both the problems leading to the crash and the pilots’ response.

Modern FDRs, aka “black boxes,” are actually bright orange. They must survive several tests, including fire and piercing, and withstand the pressure of being submerged 20,000 feet below the ocean’s surface. Perhaps most impressive is their ability to withstand a 3,400-g crash-impact test. To aid in recovery, a locator beacon emits a signal for up to 30 days.

While early designs recorded the information onto a steel foil, modern FDRs use solid-state memory that can be downloaded almost instantly. This data can also be checked during routine maintenance inspections to monitor the performance of aircraft.

Future improvements to flight recorders include the possibility of transmitting flight data in real time to ground stations, which would eliminate the need to physically find the flight data recorder. Interest in this kind of in-flight transmission of data gained momentum after Air France flight 447 disappeared over the Atlantic in 2009 and a flight data recorder could not be found.

Source: Various

Photo: Officials transfer the TWA Flight 800 flight data recorder from saltwater into freshwater on July 25, 1996, at the Coast Guard station in East Moriches, New York.
Associated Press/US Coast Guard

March 16, 1802: Army Engineers Get New Foundation

1802: An act of Congress establishes the Army Corps of Engineers. The corps will help shape the nation, literally.

General George Washington appointed the first U.S. Army engineers June 16, 1775, the day before he actually received his commission from the Continental Congress “to be General and Commander in chief, of the army of the United Colonies.” Colonel Richard Gridley served as the army’s first chief engineer.

Congress waited until 1779 to establish a separate Corps of Engineers. Army engineers, with the help of some French officers, made significant contributions to key battles of the Revolutionary War, right up to the final victory at Yorktown, Virginia.

The new Congress of the Confederation was reluctant to maintain a large, standing army, and the engineers mustered out after the war ended. After the U.S. Constitution replaced the Articles of Confederation, Congress organized a combined Corps of Artillerists and Engineers in 1794.

The current Corps of Engineers traces its history directly back to the second foundation in 1802. Congress also directed on March 16, 1802, that the Army create a new military academy 50 miles up the Hudson from New York City at a location called West Point.

One of the Corps’ first jobs, in fact, was to build the U.S. Military Academy. West Point’s first superintendent, Jonathan Williams, became chief engineer of the corps. From its founding until 1866, every superintendent of the academy was an engineer officer.

For its first half-century the academy was the nation’s foremost engineering school, and the only one until the Rensselaer School was founded still farther up the Hudson at Troy, New York, in 1824. (The University of Virginia’s School of Engineering and Applied Science arrived in 1836, and MIT didn’t show up until 1861.)

Besides the obvious function of building fortifications for the rapidly expanding nation, the corps picked up civilian duties and responsibilities almost from the start. It built lighthouses, constructed jetties and piers for harbors, and charted navigation channels. (You could argue that these projects had defensive as well as commercial intent, but it was the Army building them, not the Navy.)

Inland, the Corps of Engineers mapped large swaths of the West. The corps became the primary federal flood-control agency in the 20th century, integrating and building out thousands of miles of levees in the Mississippi River system. Its dams and lakes also became a major provider of hydroelectric energy and recreation.

Some Corps of Engineers projects have been criticized as pork-barrel giveaways to one region or another. Various critics have questioned how well some of the projects work, or if on balance they’re causing more harm than good. Environmentalists in particular have decried dams and levees for confounding the continent’s natural drainage system, and have made a case that coastal erosion-control structures are worthless or worse.

Congress has attempted several times to reform the corps and make it more responsive to these concerns. But the shifting sands of politics and the military do not always move as fast as our evolving views of the natural world.

Source: Army Corps of Engineers, others

Photo: The histories of West Point and the Army Corps of Engineers have been intertwined since their beginning.

This article originally appeared on Wired.com March 16, 2009.

March 15, 1985: Dot-Com Revolution Starts With a Whimper

1985: Symbolics, a Massachusetts computer company, registers symbolics.com, the internet’s first domain name. The market for these unique addresses would not heat up for years, but this click heard ’round the world would eventually provide just about anyone a place in cyberspace to call their own.

Owning your own domain is nothing to brag about anymore, while trying to get one that resembles your name or something personally meaningful has become an exercise in futility. But a quarter of a century ago, when Symbolics took the first step, there was barely an internet — it was years before the world wide web and graphical web browsers.

In those early days, even before AOL, the internet was a noncommercial medium that only eggheads and propellerheads used. It was more of a military and academic tool than today’s vast playground, time suck and, for some, golden goose now central to everyone’s waking moments.

Back in 1985, nobody thought to register, say, sex.com, the most expensive domain ever. It was bought from Network Solutions in 1994 and changed hands in 2006 for a reputed $14 million, and it goes on the auction block later this week to the highest bidder, starting at $1 million.

The entire cybersquatting era was a decade away, as was the rush to acquire a personal domain to customize and control e-mail and to make blogs memorable in name, if not in content.

Nobody seemed in a terrible hurry to get a domain; only five were registered in all of 1985. As you’d expect, the first 100 are packed with computer companies. Apple registered its namesake, the 64th domain, on Feb. 19, 1987. Microsoft waited until 1991 to register its own.

IBM and Sun registered on the same March day in 1986, the same year Intel and AMD joined the cool crowd. That was 14 months ahead of even Cisco Systems, whose tag line in the future would be: “Empowering the Internet generation.”

No, none of these obvious suspects were first, or particularly early adopters. In fairness, Symbolics was not exactly chopped liver; it was a member of the legendary Route 128 corridor of high-tech firms that fueled the Massachusetts Miracle (no, not the election of Scott Brown). That remarkable stretch of economic power catapulted Massachusetts Gov. Michael Dukakis into a dismal 1988 Democratic presidential candidacy, and then exile to obscurity — which is similar to Symbolics’ trajectory.

But the company’s place in history is well-deserved. Symbolics was conceived at the MIT Artificial Intelligence lab, the renowned academic incubator. One employee, a former member of the lab, created the LISP machine — the world’s first workstation, before that term was even invented.

Symbolics was best-known for developing what was thought at the time to be the best computing platform for developing AI software. This was during a lush, Darpa-funded renaissance for the sexy-sounding, yet broadly-defined technology. Others know Symbolics for its software, which, among other things, was used to create some scenes in Star Trek III: The Search for Spock.

By 1985 Symbolics was marketing its fifth-generation 3600 series of LISP workstations and, in an era bedazzled by the prospects for AI, was riding high. Things then turned south. Born between two AI winters, Symbolics went into a freefall: Founders were fired, buyers panicked, real estate investments turned bad and the inexorable march of the PC trampled it into near oblivion.

Symbolics still exists, but in a very diminished capacity — and at an entirely new (and less snazzy) address: symbolics-dks.com.

Last August, symbolics.com changed hands for the first time, bought by a domain-aggregation company whose owner was five years old when the address was first registered. The site now hosts the personal blog of Aron Meystedt, who owns both XF.com and his trophy domain.

Everyone and his sister now owns a domain, or has personalized space on someone else’s. Facebook alone has more than 400 million members, and each one of them can have a vanity address.

Even if you don’t have or share one, chances are you work somewhere with a storefront on the internet that has a “.com” after its name.

But on this day in 1985, very few people could make even that claim.

Source: Various

Photo: Symbolics 3640 LISP machine
Michael L. Umbricht and Carl R. Friend/Retro-Computing Society of RI
