July 23, 1996: Stand By … High Definition TV Is on the Air

1996: WRAL-HD becomes the first U.S. television station to broadcast a high-definition signal.

This milestone, witnessed by a mere handful of invited guests, was the culmination of a 20-year global initiative to improve the over-the-air TV signal that hadn’t changed in four decades. The broadcast was a success, but the stunning audio and video clarity of HD would not become a universal way of life until 2009 — and then only because of a government mandate.

Television may no longer have the impact on our collective consciousness that it did when you had a choice of maybe three channels and there was no internet. That medium, the only one to seriously challenge TV’s half-century supremacy, provides not only new ways to watch TV but also disruptive competition from amateurs (gifted and otherwise) using such democratizing platforms as YouTube.

Still, considering the extent to which TV still permeates our lives, it is really rather remarkable how little about it has actually changed: A TV set is still just a single-purpose appliance that shows scheduled programming in the privacy of your own home, for free (despite persistent and questionable efforts to add phone calls and web browsing and e-mail to the platform).

TV is great because it’s one of the original literally plug-and-play devices. And because no matter what time it is, there it is, waiting for you, in the words of legendary broadcaster Tom Snyder, to “fire up a colortini, sit back, relax and watch the pictures, now, as they fly through the air.”

Oh sure, there have been big advances: the remote control, programs done entirely in “living” color, affordable flatscreens, TiVo, SlingBox, cable and satellite delivery, hundreds of channels, receivers so large they fill a living-room wall and so small they fit in your pocket.

But there has been nothing like the four-year national initiative to retool the nation’s entire broadcast infrastructure, which culminated in 2009 with the rollout of universal digital television. For all of the preparation, it was a messy rush to the finish line because a handful of households, despite four years’ warning, still weren’t getting with the program.

There was the same kind of frenzy at the starting line, but for entirely different reasons. After working for a few years with an FCC-created industry consortium known as the Grand Alliance, the agency granted the first-ever HD license June 19, 1996, to WRAL-TV, the CBS affiliate for North Carolina’s Raleigh-Durham-Fayetteville market. For the next 34 days, technicians worked at a fever pitch to upgrade the station for the nation’s first HD broadcast.

Working day and night, “an army of engineers and equipment experts” installed an HD transmitter in five weeks, WRAL says in its giddy account of the time — half the time it should have taken. And on this day in 1996, their hard work was rewarded.

“Television history was made as WRAL shared the first public demonstrations of the new high-definition technology in the nation,” WRAL recalls. “Over 200 members of the media and the television industry watched their first HDTV show at the WRAL studios” and at an experimental station in Washington, D.C.

But, of course, nobody else.

This was because HDTV sets would not be in stores until 1998. When they did hit the market they would cost $1,000 or so more than analog sets — a hefty premium for a TV receiver, especially considering that there was precious little HD programming to receive.

Given the impediments, even HD proponents had their doubts about the future of the technology. “I’m not so sure that I see a way to reach the greatest number of masses with HDTV,” Dale Cripps, publisher of the HDTV Newsletter, told the Raleigh News and Observer in a story about the WRAL launch. “There’s obviously a price barrier.”

What’s the big deal about HD? If you don’t already have one, visit the wall of TV sets at Best Buy or Costco. The picture quality is hard to convey in words. Perhaps equally important is HD’s different “aspect ratio,” which makes the screen wider than that of analog sets. This means you see more of the soccer pitch, all of Rachael Ray’s kitchen and widescreen movies as they were meant to be seen — without the black “letterbox” bars above and below the picture that some people despise.
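The geometry behind that wider picture is simple arithmetic. Here is a minimal sketch comparing the old 4:3 shape with HD’s 16:9; the 40-inch width and the 1.85:1 film ratio are illustrative assumptions, not figures from the article.

```python
# Compare the analog 4:3 picture shape with HD's 16:9 at the same screen width.
# The width and the film ratio are illustrative, not tied to any particular set.

def picture_height(width: float, aspect_w: float, aspect_h: float) -> float:
    """Height of a picture of the given width at an aspect_w:aspect_h ratio."""
    return width * aspect_h / aspect_w

screen_width = 40.0                                  # inches, an arbitrary example

analog_height = picture_height(screen_width, 4, 3)   # 30.0 in
hd_height = picture_height(screen_width, 16, 9)      # 22.5 in

# A 1.85:1 widescreen film shown at the same width is shorter still; whatever
# vertical space is left over becomes the black letterbox bars.
film_height = screen_width / 1.85                    # ~21.6 in

print(f"Bars on a 4:3 set:  {(analog_height - film_height) / 2:.1f} in top and bottom")
print(f"Bars on a 16:9 set: {(hd_height - film_height) / 2:.1f} in top and bottom")
```

At the same width, the 16:9 set barely letterboxes a 1.85:1 film, while the 4:3 set loses several inches of picture at the top and bottom.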

They say TV adds 10 pounds to a person’s onscreen weight. If that’s true, HD adds warts and all. This phenomenon was a real novelty in the earliest days, as The New York Times reported in a March 3, 1997, article about WHD, the HD arm of NBC affiliate WRC-TV in Washington, D.C., that was the nation’s second HD station.

“For 50 years, television stars, both men and women, have applied heavy powder and thick, pancake makeup to cover wrinkles, 5 o’clock shadows and other facial imperfections,” the Times reported. “Though the makeup is far from subtle, on TV it looks just fine. Even for problems that makeup cannot easily hide, television’s low resolution usually smooths the rough edges. Not so with high-definition TV.”

High-definition television broadcasts are the law of the land now. Which means, of course, that it’s time to change everything again.

A scant few months after the digital switchover, a number of manufacturers and broadcasters are pushing 3-D TV, which would require a whole new upgrade cycle. No government mandate for this, so only the market will determine whether 3-D TV is the next great thing. We’ll see.

Source: Various

Image: Mountain Hermit/Flickr

July 22, 1933: Wiley Post Flies Around the World Alone

1933: Pilot Wiley Post returns to Floyd Bennett Field in Brooklyn, New York, 7 days, 18 hours, 49 minutes after leaving. Aided by new technology, his flight is the first solo circumnavigation by air, and it’s also the fastest-ever around-the-world trip.

Born in Texas, Post wanted to be a pilot after seeing his first airplane at a county fair at the age of 15. He got his break at age 24, when a barnstormer let him fill in for his injured skydiver. Post performed several jumps, but always wanted to be the pilot, not the skydiver.

His dream was almost ruined while working in the oil fields to earn money for an airplane: He lost his left eye in an accident. Despite the lack of depth perception, Post was able to earn his pilot’s license and, with his workers’ compensation checks, bought his first airplane.

Post quickly advanced his flying skills and became the personal pilot for wealthy oilman F.C. Hall. His boss encouraged Post to use the plane when it wasn’t needed for business, and the now-32-year-old pilot promptly went out and won a prestigious air race from Los Angeles to Chicago.

With the success, Hall allowed Post to use the sleek Lockheed Vega aircraft, named Winnie Mae after Hall’s daughter, to pursue any air records he wished.

Post wasted no time, and in 1931 he and navigator Harold Gatty broke the around-the-world record that had been held by an airship, the Graf Zeppelin. Their 15,000-mile flight lasted 8 days, 15 hours, 51 minutes and included 13 refueling stops. The Winnie Mae had slashed more than 11 days off the previous record.

After several people suggested Gatty was the brains behind the effort, Post set out to disprove his critics the very next year by making the trip solo. He equipped the Vega with two significant pieces of new technology: a primitive autopilot from Sperry Gyroscope and a radio direction finder for navigation. The trip would be the first significant flight where the new navigation technology would replace the human navigator.

The early autopilot proved to be problematic at times, though it did help the solo pilot stay on his desired course. Aided by the radio direction finder that allowed Post to navigate to any radio station’s transmitter, the Winnie Mae stayed on record pace through the early part of the flight.

After several unscheduled stops in the Soviet Union and the need to fix a bent propeller, Post was able to make it back to North America still ahead of schedule. Fighting fatigue in the final hours, Post developed a very simple piece of technology to keep from falling asleep in the cockpit. The former mechanic tied one end of a string to a wrench and the other end to a finger.

He would simply hold the wrench while he flew. If he fell asleep, the wrench would fall, tugging on his finger and waking him up.

The klugey wrench alarm worked, and as the clock approached midnight, Post landed back at Floyd Bennett Field in front of thousands of spectators who had come to greet him. He credited the autopilot and radio direction finder for making the record-setting flight possible. He had beaten his previous record by 21 hours.

Post would later go on to develop a pressure suit allowing him to set more records by flying at altitudes as high as 40,000 feet.

In 1935, the record-setting pilot set off on a flight with his good friend Will Rogers. The famous humorist had hired Post to fly him around Alaska in search of new material for his newspaper column.

Post ended up settling for some pontoon floats that were too big for the modified Lockheed they were flying on the trip. Post and Rogers took off from a lake in northern Alaska on Aug. 15, and shortly after takeoff the engine quit.

The airplane was difficult to control with the oversize floats, and it crashed into a lake, killing both on board. Rogers was 55. Post was 36.

Source: Various

Photo: Wiley Post and Harold Gatty in Germany, 1931/Wikipedia

July 21, 1911: Media Messenger McLuhan Born

Marshall McLuhan, Canadian philosopher, holds a book as he leans over a chair in 1966.
Photo: Bettmann/Corbis

1911: Media theorist Marshall McLuhan escapes the medium of the womb to become a founding messenger of the electronic future. His scholarly analyses, like The Mechanical Bride, The Gutenberg Galaxy, Understanding Media and The Medium Is the Massage, encode pop culture and postmodernism’s cultural and economic dominance from the 20th century onward.

(Surfing his wave, Wired magazine adopted Marshall McLuhan as its patron saint in its debut 1993 issue.)

McLuhan was born in Edmonton, Alberta. His mother eventually became an actress, and his father sold real estate and served in the Canadian army in World War I. Young McLuhan merged his initial interest in engineering with a love of literature that taught him to unlock the power of language.

His fascination with the machinery of wordplay won him two bachelor’s and two master’s degrees in English — from the University of Manitoba and England’s Cambridge University, respectively — as well as a Ph.D. from Cambridge. His dissertation examined 16th-century English pamphleteer and satirist Thomas Nashe.

By the time McLuhan left academia’s student ranks in December 1943, the post-war dawn of boundlessly influential mass media had nearly arrived, and McLuhan had married aspiring actress Corinne Keller Lewis. Before regular network broadcasting flourished in the ’50s, the productive McLuhan had taught at University of Wisconsin, St. Louis University, Ontario’s Assumption University and the University of Toronto.

He’d also published his first analytic compilation, The Mechanical Bride: Folklore of Industrial Man, which both dissected and skewered advertising, in 1951.

But it was his subsequent Ford Foundation–funded lectures on communication and culture at the University of Toronto that galvanized critical and commercial inquiry into marketing, technology and perception, and laid the groundwork for his own future fame.

To keep McLuhan from being spirited away by mounting academic competitors, the University of Toronto made him a full professor in 1952, created the Centre for Culture and Technology in 1963, and smartly sewed him up until 1979. That year, he suffered a stroke. The university tried to shutter his research center, but popular protests from students and other fans helped to keep it open.

Those fans included director Woody Allen, who in 1977 inserted McLuhan into the Oscar-winning film Annie Hall to berate a prattling New Yorker with the immortal line: “You know nothing of my work.”

McLuhan’s work has become ever more important in an internetworked 21st century, where his legendary aphorisms like “the medium is the message” are provided with no shortage of persuasive case studies. The phrase first appeared in McLuhan’s landmark 1964 study Understanding Media: The Extensions of Man, which explored the ways media affect and accelerate local and global cultures, regardless of their content.

Like his 1962 work The Gutenberg Galaxy: The Making of Typographic Man, Understanding Media charted the ways in which technological innovations, from the printing press to electronic communication, have reoriented personal identity and social organization.

It is not the message but the “medium that shapes and controls the scale and form of human association and action,” he explained in Understanding Media. Or, as he presciently argued in The Gutenberg Galaxy, long before Google, Facebook, Twitter and terrorism competed for humanity’s psychic and economic energy:

The world has become a computer, an electronic brain…. And as our senses have gone outside us, Big Brother goes inside. So, unless aware of this dynamic, we shall at once move into a phase of panic terrors, exactly befitting a small world of tribal drums, total interdependence, and superimposed co-existence. Terror is the normal state of any oral society, for in it everything affects everything all the time…. In our long striving to recover for the Western world a unity of sensibility and of thought and feeling we have no more been prepared to accept the tribal consequences of such unity than we were ready for the fragmentation of the human psyche by print culture.

Along with anticipating our simultaneous connectivity and depersonalization through communication technology, McLuhan also coined much of the terminology we take as digital-age scripture. Besides “The medium is the message,” technocultural standards like “global village” and perhaps even “surfing” were popularized by McLuhan, who like any brilliant cultural remixer lifted them from source texts like James Joyce’s Finnegans Wake, among others.

By the time McLuhan died on the last day of 1980, he had accrued more medals, citations, honorary degrees, controversy and pop-culture shout-outs than almost any other academic on Earth. Genesis sang about him in its acclaimed 1974 concept album The Lamb Lies Down on Broadway, while his work influenced cultural critics like Jean Baudrillard, artists like Andy Warhol and even politicians like Jerry Brown. (Brown, it might be noted, is currently running for governor of California, home to both Hollywood and Silicon Valley, where many of the digital age’s most important and distracting innovations are created.)

As the years pass, McLuhan’s circle of influence has a tendency to widen that way. During his life, it extended to marketers and multinationals. He consulted with and gave speeches to the corporate megaminds at AT&T and IBM, and became a media celebrity commanding explication in Newsweek, Harper’s, Life and even Playboy. Columbia Records released a theoretically ambitious but often impenetrable audio version of his 1967 bestseller The Medium Is the Massage: An Inventory of Effects.

That benchmark work’s central thesis — that all media are extensions of our senses — has achieved new cultural significance in our era of light-speed social networking, push-button wars and so-called reality television. It cemented McLuhan’s technocultural legacy, and reinscribed humanity’s rules of engagement, online and off.

“All media work us over completely,” McLuhan explained. “They are so pervasive in their personal, political, economic, aesthetic, psychological, moral, ethical and social consequences that they leave no part of us untouched, unaffected, unaltered.”

Source: Various

July 19, 1989: Human Heroics Overcome Aircraft Failure in Sioux City

1989: A catastrophic hydraulic-system failure forces a plane to make an emergency landing in Sioux City, Iowa. The right wing catches on the tarmac, sending the plane careening down the runway in a ball of flame and twisting wreckage. But thanks to the skill of the flight team, the rescue personnel on the ground and several lucky coincidences, over half the passengers survive.

United Flight 232, a McDonnell Douglas DC-10, took off from Stapleton Airport in Denver at 2:09 p.m., bound for Chicago. At 3:16, when the DC-10 was making a slight right turn at 37,000 feet, the fan disk on the tail-mounted No. 2 engine disintegrated. Shrapnel and debris ripped through the tail section, puncturing the horizontal stabilizer and severing all three hydraulic lines, allowing fluid to bleed out.

The ailerons, stabilizers, flaps, rudder — really any component that controlled the movement of the aircraft — were now completely unresponsive. Prior to this accident, it was thought that a complete hydraulic system failure aboard a DC-10 was impossible, because two backup systems were placed far apart within the fuselage.

One of these could take over in the event one or even two of the other systems became incapacitated. However, all three hydraulic lines were routed through a single 10-inch-wide opening in the tail — right where shrapnel from the explosion penetrated and severed all three lines.
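A quick back-of-the-envelope sketch shows why that shared routing point mattered so much; the failure probabilities below are invented purely for illustration and are not drawn from any DC-10 reliability data.

```python
# Why triple redundancy was not enough: independent failures multiply into a
# vanishingly small number, but a shared routing point is a single common-mode
# failure that bypasses the redundancy entirely. Probabilities are illustrative.

p_one_system = 1e-5                    # assumed chance one hydraulic system fails on a flight
p_all_three_independently = p_one_system ** 3
print(f"All three failing independently: {p_all_three_independently:.0e}")   # 1e-15

p_shared_passage_severed = 1e-7        # assumed chance the common 10-inch passage is cut
print(f"Common-mode failure:             {p_shared_passage_severed:.0e}")    # dominates by 8 orders of magnitude
```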

The moment the accident occurred, Capt. Alfred C. Haynes, First Officer William Records, and Flight Engineer Dudley Dvorak felt a jolt and saw warning lights indicating the No. 2 engine was malfunctioning. It was at this point Records noticed that the plane was off course and moved to correct.

When he attempted to maneuver the jet, he found the controls had absolutely no response. The 120-ton aircraft, loaded with 296 people, was hurtling along more than 6 miles above the ground, and there was no way to control it.

Then things got worse.

The initial failure of the No. 2 engine and subsequent explosion caused the plane to oscillate and lose altitude. Captain Haynes knew the situation was extremely dire: If the DC-10 did not land soon, it would eventually go completely out of control and crash. He contacted the nearby Sioux City Gateway Airport to coordinate an emergency landing.

But how to do it?

As luck would have it, Dennis E. Fitch, a United DC-10 flight instructor, was aboard. He offered to help the crew, who had found that the throttles for the two intact engines still responded. Using differential thrust (decreasing power to one engine while increasing it to the other), they discovered they could crudely maneuver the plane as well as decrease altitude.
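The physics behind that workaround is that thrust applied off the aircraft’s centerline produces a yawing moment. The sketch below uses made-up numbers (the engine arm and thrust values are not DC-10 figures), but it captures the principle the crew exploited.

```python
# Minimal sketch of how differential thrust produces a turning (yaw) moment.
# All numbers are illustrative, not DC-10 specifications.

ENGINE_ARM_M = 16.0   # assumed lateral distance of each wing engine from the centerline, meters

def yaw_moment(left_thrust_n: float, right_thrust_n: float, arm_m: float = ENGINE_ARM_M) -> float:
    """Net yawing moment in N*m; a positive value swings the nose to the right."""
    # More thrust on the left engine pushes the nose right, and vice versa.
    return (left_thrust_n - right_thrust_n) * arm_m

print(yaw_moment(120_000, 120_000))   # equal throttles: 0.0, the aircraft tracks straight
print(yaw_moment(140_000, 100_000))   # left engine up, right engine back: 640000.0 N*m nose-right
```

That crude nose-left or nose-right authority was essentially all the steering the crew had left.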

Fitch also manually lowered the landing gear, hoping this action would pump some remaining hydraulic fluid back into the system and restore a small amount of control. It didn’t work. The crew contacted United’s San Francisco maintenance base, but given the cataclysmic nature of the accident, the engineers there could offer little help.

Haynes was a seasoned pilot with more than 30 years of flight experience at the time of the incident. He knew how deadly the situation looked, but continued to keep his composure, even joking with air traffic controllers during the emergency.

The crew scrambled to prep for a rough landing. They dumped fuel and made a series of right-handed turns (it was easier for the crew to steer the plane in this direction) to line up with the relatively short 6,600-foot runway 22 at Sioux City.

Meanwhile, emergency crews gathered on the runway. Because the accident happened during shift changes at the regional trauma center and the regional burn center in Sioux City, there was nearly double the normal roster of medical personnel available to treat victims.

In addition, the Iowa National Guard happened to be on duty at the Sioux City Gateway Airport. That added another 285 people trained in triage and evacuation to help out.

As Flight 232 began its final descent about 45 minutes after the trouble began, Fitch manipulated the throttle to bring the aircraft down. But because of the loss of all hydraulics, the crew could not control the wing flaps to reduce airspeed effectively.

The DC-10 approached the runway traveling about 276 miles per hour. Safe landing speed for a DC-10 is about 160 miles per hour.
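To see why that overspeed was so dangerous, note that kinetic energy grows with the square of speed. A quick comparison using the two speeds above (mass drops out of the ratio) follows.

```python
# Kinetic energy scales with the square of speed, so the energy ratio depends
# only on the speed ratio; the aircraft's mass cancels out of the comparison.
approach_mph = 276   # touchdown speed from the paragraph above
normal_mph = 160     # typical DC-10 landing speed from the paragraph above

print(f"Roughly {(approach_mph / normal_mph) ** 2:.1f}x the energy of a normal landing")   # ~3.0x
```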

The plane was leaning to its right as it landed. The right wingtip struck the runway first, rupturing the fuel tank, which immediately ignited.

The tail section sheared off at impact, while the rest of the plane skittered across the runway, tearing the fuselage apart and ripping the right landing gear off. The right wing was torn completely away and tumbled end-over-end in a brilliant exploding fireball. Some eyewitnesses perceived this as the plane “cartwheeling” down the runway, but this was ultimately determined to be a misinterpretation.

There were a total of 296 people on board Flight 232. Of these, 112 died. Most deaths were caused by blunt-impact force from the crash itself, while others came from smoke inhalation that occurred after the wings caught fire. Survivors tended to be seated behind first class but ahead of the wings.

Strangely, some passengers walked away from the wreckage without a scratch. Others suffered moderate-to-severe injuries, with one passenger dying from his wounds a month later.

The entire flight crew, including Fitch, survived as well. The cockpit was torn off from the rest of the fuselage and ended up nose down and partially buried in a cornfield. It took rescuers 30 minutes to extract the badly injured pilots.

National Transportation Safety Board investigators determined the accident was triggered by a faulty maintenance technique used by United Airlines. It turned out the fan disk had a significant crack in it that escaped detection because of human error.

United Airlines reworked the maintenance technique to detect flaws and cracks in fan blades. Additionally, the hydraulic systems of newer aircraft were fitted with fuses that would seal off the punctured section and prevent a total loss of fluid. These fuses were retrofitted to some DC-10s to prevent such an incident occurring again.

The NTSB report commended the flight crew, saying they “greatly exceeded reasonable expectation.” Had it not been for the quick actions of Haynes and the resourcefulness of his crew, the loss of life could have been much, much greater. When technology fails because of human error, it takes the very best in human ingenuity to overcome it.

Source: Various

Image: The United Flight 232 aircraft makes its final descent with damage to its horizontal stabilizer and fuselage tail cone.

July 16, 1965: Mont Blanc Tunnel Opens

1965: After 19 years of planning and construction, the Mont Blanc Tunnel officially opens. The new tunnel stretches 7 miles, linking the French town of Chamonix and the Italian town of Courmayeur. Buried 1.5 miles under the Alps’ highest peak, it becomes the world’s deepest road tunnel beneath rock and gains infamy after a deadly 1999 fire.

Until the opening of the tunnel, road traffic in the Alps between France and Italy wended its way over hairpin turns and sharp grades, with mountain passes closed the majority of the year because of snow. Italian construction teams began drilling a tunnel into Mont Blanc (or Monte Bianco on their side) to build a year-round route in 1946. The next year, France and Italy signed an agreement to build the tunnel together.

Construction, however, did not begin in earnest until May 30, 1959, with the help of an 82-ton tunnel-boring machine. Tunneling began at 4,091 feet on the French side and at 4,530 feet on the Italian side.

It took 783 tons of explosives to complete the drilling. The French and Italian teams met Aug. 4, 1962, with a discrepancy of only 5.12 inches between the two sides.
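To put that 5.12-inch discrepancy in perspective, here is a quick calculation against the tunnel’s roughly 7-mile length mentioned above (the figure is rounded; the exact bore length differs slightly).

```python
# How small was a 5.12-inch misalignment over a roughly 7-mile bore?
tunnel_length_in = 7 * 5280 * 12        # ~7 miles expressed in inches
misalignment_in = 5.12

print(f"About 1 part in {tunnel_length_in / misalignment_in:,.0f}")   # roughly 1 in 87,000
```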

When it opened in a ceremony featuring Presidents Charles de Gaulle of France and Giuseppe Saragat of Italy, the Mont Blanc Tunnel became the world’s longest highway tunnel, more than three times longer than the previous record holder, Liverpool’s Mersey Tunnel.

A one-way toll cost a mere $1.50. That’s the inflation-adjusted equivalent of $10 today. Today’s actual one-way toll, however, runs $44.
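The inflation adjustment is simple to reproduce; the consumer-price ratio of roughly 7 between 1965 and the late 2000s used below is an assumed round figure, not an official statistic.

```python
# Rough inflation adjustment for the original 1965 toll.
toll_1965 = 1.50
assumed_cpi_ratio = 7.0    # approximate price-level change, 1965 to the late 2000s

print(f"${toll_1965 * assumed_cpi_ratio:.2f} in today's dollars")   # about $10.50
```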

The tunnel operated without a major incident for 34 years, with management of the roadway divided between the French ATMB (Autoroutes et Tunnels du Mont-Blanc) and Italian SITMB (Società Italiana per azioni per il Traforo del Monte Bianco). In 1990, the agencies installed fireproof shelters and advanced video-surveillance cameras in the tunnel, and upgraded existing safety equipment.

At 10:46 a.m. on March 24, 1999, a refrigerated Volvo FH12 tractor trailer filled with flour and margarine and piloted by Belgian truck driver Gilbert Degraves entered the French side of the tunnel bound for Italy. Six minutes later, oncoming drivers began flashing their headlights at Degraves, who noticed white smoke pouring from the truck’s cab. Degraves pulled over and attempted to fight the fire, but was forced back when the truck burst into flames. Though Degraves escaped, 50 others were trapped in the tunnel with the burning truck.

Over the next 10 minutes, temperatures in the tunnel climbed to a staggering 1,800 degrees Fahrenheit. Since the V-shaped tunnel acted like a chimney, it was soon filled with toxic smoke that instantly killed all those who breathed it in. Without oxygen, vehicle engines stalled and drivers were trapped in the tunnel. Some who escaped sought refuge in the fireproof shelters built into the walls of the tunnel, but even the shelters could not withstand the heat of this fire.

Thanks to the heroic efforts of firefighting teams and of Pierlucio Tinazzi — a security guard who perished in the fire after evacuating 10 survivors on his motorcycle — 12 of the 50 people trapped in the tunnel survived. It took five days for temperatures to cool down enough for engineering teams to begin removing debris.

After three years, $481 million in reconstruction and safety upgrades — and the restructuring of the separate French and Italian companies into a single management entity — the Mont Blanc Tunnel reopened to traffic March 9, 2002. A court found in 2005 that the fire could have been prevented with better management and safety precautions and sentenced 13 defendants to jail time or fines.

Today, international truck traffic has returned to the tunnel. The associated air pollution is a cause of concern for some residents of nearby communities who formed the Association pour le Respect du Site du Mont-Blanc to ban trucks from the tunnel. While pollution can become unpleasant in the valleys surrounding the tunnel entrances, there are no plans to close the tunnel to trucks any time soon.

Source: Various

Photo: Vehicles leave Italy through this entrance to the Mont Blanc Tunnel.
bramhall/Flickr

July 15, 1954: Boeing 707 Makes First Flight

1954: The Boeing 367-80 makes its first flight from Renton Field southeast of Seattle. The jet-powered airliner will become the Boeing 707 and usher in the jet age for passenger travel.

Boeing was not the first company to produce a jet-powered airliner. But just as Ford’s Model T popularized the automobile despite being a latecomer in the car world, the Boeing 707 would be the airplane to popularize jet travel.

Nearly five years before the prototype of the 707 first flew, the British-made de Havilland Comet completed its first flight. The jet was popular thanks to its high speed, but a string of accidents involving the Comet in 1954 forced the company to take the airliners out of service to fix some design flaws.

By that time, Boeing was already nearly two years into the development of its own jet airliner. The company flew its first large jet-powered aircraft, the B-47 bomber, in 1947. With the success of the speedy bomber, the company started looking into building a passenger aircraft that could take advantage of the quickly evolving jet-engine technology.

Boeing engineers started work in 1952 on an airplane that would be jointly developed as both a midair refueler for the Air Force, and a passenger-carrying jet for the airlines. The Air Force was the first customer for the airplane. With the accidents of the de Havilland Comet still fresh in people’s minds, commercial airlines continued to rely on piston-powered propeller aircraft such as the Douglas DC-6 and Lockheed Constellation to carry passengers in safe, well-known designs.

The new model 367-80 was simply known as the “Dash 80,” and development continued despite the cool reception from the airlines. By contrast, hundreds of orders were on the books for Boeing’s new 787 before that airplane ever flew. But back in the early ’50s, Boeing had to continue developing the passenger version of the Dash 80 on its own. The company was confident that an airplane flying twice as fast as the propeller airliners of the time would eventually bring in the business to make the investment worthwhile.

A year into flight-testing, Boeing invited representatives from the airline industry and aviation community to Seattle to attend the annual hydroplane races on Lake Washington during the summer of 1955. The Dash 80 was scheduled to make a simple flyby to impress the crowds.

But a simple flyby apparently wasn’t enough for Boeing test pilot Alvin “Tex” Johnston. As he approached the lake-shore crowd at low altitude, Johnston gently pulled up on the controls and performed a graceful roll in the airplane. The crowd was in awe as the four-engine airliner completed the maneuver, something usually seen only in airshows performed by aerobatic pilots.

Boeing president Bill Allen reprimanded Johnston, but the pilot pointed out the roll was a simple 1-g maneuver and the airplane was never pushed beyond its limits. The pilot continued working for Boeing for many more years.

July 14, 1965: Mariner 4 Brings Mars Up Close and Cardinal

1965: After a few million years of watching Mars from afar, humanity meets the red planet — not quite in person, but through the eyes of NASA’s Mariner 4 spacecraft.

The quarter-ton space camera flew past Mars eight months after being shot from Earth on an Atlas rocket, having traveled 325 million miles. It flew within 6,000 miles of the planet’s surface, snapping 22 digital photographs before continuing into space. They were the first close-ups ever taken of another planet, and it was only appropriate that the subject was Mars, a source of fascination since the beginning of recorded history.

There were, alas, none of the canals seen by astronomers in the late 19th and early 20th centuries, nor evidence of the senders of messages heard by Nikola Tesla or Guglielmo Marconi. Indeed, the hazy images of a barren, crater-strewn landscape ended speculation that Mars might plausibly be inhabited by higher life forms. But those low-resolution — 0.04 megapixel — images stirred the soul in different ways, and they paved the way for future photo shoots that would reveal a planet every bit as fantastic as imagined.
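The 0.04-megapixel figure follows from the camera’s commonly cited frame format of 200 by 200 pixels at 6 bits per pixel; treat those specifics as background reference rather than figures stated in the article.

```python
# Mariner 4 frame size, using the commonly cited 200 x 200-pixel, 6-bit format.
width_px, height_px, bits_per_px = 200, 200, 6

pixels = width_px * height_px
print(f"{pixels:,} pixels = {pixels / 1_000_000} megapixels")   # 40,000 pixels = 0.04 MP
print(f"{pixels * bits_per_px:,} bits per picture")             # 240,000 bits per frame
```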

After leaving Mars, Mariner 4 journeyed to the far side of the sun, and finally returned to Earth’s vicinity in 1967. Long after it was expected to break down, the spacecraft continued to send information about cosmic dust, celestial dynamics and solar plasma. After being put through a series of operations tests, Mariner 4 was shut down Dec. 20, 1967.

Images: NASA

July 13, 1977: Massive Blackout Plunges New York Into Rioting

1977: Lightning strikes a Consolidated Edison substation along the Hudson River, tripping two circuit breakers and setting off a chain of events that results in a massive power failure. The entire city of New York is blacked out, parts of it for more than 24 hours.

A number of fail-safes in the system should have prevented such a catastrophic failure. But bad luck (multiple lightning strikes that seemed to find Con Ed utilities), operator mistakes and some substandard facilities maintenance triggered a chain reaction throughout the system that eventually crippled the biggest generator serving New York City. That failure plunged the Big Apple into darkness at around 9:30 p.m.

The blackout couldn’t have come at a worse time for a city that was already down on its luck. When the lights went out, New York was in the midst of a financial crisis and teetering at the edge of bankruptcy. The rioting and looting that followed the blackout marked one of the lowest points in New York history.

In all, 1,616 stores were either looted or damaged during the blackout. More than a thousand fires were set, 14 of them resulting in multiple alarms. And in the biggest mass arrest in city history, 3,776 people were thrown in the jug. The jails were so overcrowded that the overflow had to be held in precinct basements and other makeshift jails.

A congressional study later put the damage caused by looting and vandalism at $300 million.

Source: Wikipedia

Photo: A man looks through the shattered windows of a looted jewelry store on Utica Avenue in Brooklyn during the blackout that struck New York City in July 1977.
Associated Press

This article first appeared on Wired.com July 13, 2007.

July 12, 1960: Etch a Sketch? Let Us Draw You a Picture

1960: The Etch a Sketch goes on sale.

The technology behind this children’s toy is both simple and complex. Simple, in that an internal stylus is used, manipulated by turning horizontal and vertical knobs to “etch a sketch” onto a glass window coated with aluminum powder.

Complex, because the Etch a Sketch employs a fairly sophisticated pulley system that operates the orthogonal rails that move the stylus around when the knobs are turned. The stylus etches a black line into the powder-coated window to create the drawing.

Along with the aluminum powder, the guts of the toy include a lot of tiny styrene beads that help the powder flow evenly when the sketch is being erased (by shaking), recoating the screen for the next drawing. As for how the aluminum powder sticks to the window, well, it pretty much sticks to everything.
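The knob-and-rail arrangement is easy to mimic in software. The toy model below is entirely hypothetical (it has nothing to do with Ohio Art’s actual gearing); it simply treats each knob click as a one-unit move of the stylus along its rail and a shake as a full reset, which is the essence of the mechanism described above.

```python
# Toy model of the Etch a Sketch mechanism: two knobs drive a stylus along
# orthogonal rails, and shaking recoats the screen, erasing the drawing.
# Scale and gearing are invented for illustration.

class EtchASketch:
    def __init__(self, width: int = 40, height: int = 24):
        self.width, self.height = width, height
        self.x, self.y = 0, 0
        self.etched = {(0, 0)}          # points scraped clear of aluminum powder

    def turn_horizontal_knob(self, clicks: int) -> None:
        """Positive clicks move the stylus right, negative move it left."""
        for _ in range(abs(clicks)):
            self.x = min(max(self.x + (1 if clicks > 0 else -1), 0), self.width - 1)
            self.etched.add((self.x, self.y))

    def turn_vertical_knob(self, clicks: int) -> None:
        """Positive clicks move the stylus up, negative move it down."""
        for _ in range(abs(clicks)):
            self.y = min(max(self.y + (1 if clicks > 0 else -1), 0), self.height - 1)
            self.etched.add((self.x, self.y))

    def shake(self) -> None:
        """Shaking redistributes the powder and wipes the picture."""
        self.etched = {(self.x, self.y)}

toy = EtchASketch()
toy.turn_horizontal_knob(10)   # draw a horizontal line
toy.turn_vertical_knob(5)      # then a vertical one
print(len(toy.etched))         # 16 etched points
toy.shake()
print(len(toy.etched))         # back to 1
```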

Arthur Granjean, a Frenchman, was the Etch a Sketch’s inventor (he called it L’Ecran Magique, or “The Magic Screen”). After failing to get some of the bigger toy companies to bite, he sold his invention to the Ohio Art Company, which has manufactured it ever since.

Although the traditional Etch a Sketch comes in a red plastic housing, it is now available in several colors.

Source: Howstuffworks.com

Image: Cal Ripken Jr./George Vlosich III

This article first appeared on Wired.com July 12, 2007.

July 9–10, 1856: Visionary Tesla Born at Midnight

1856: Scientific genius and visionary inventor Nikola Tesla is born at the stroke of midnight in the unassuming village of Smiljan, in what’s now Croatia. He wastes little time in revolutionizing the world through foundational developments in electromagnetism, electrical current, wireless power and communications, weaponry, robotics, computer science, mass media and much more.

“Tesla is like a character out of a science-fiction novel, the quintessential mad genius,” journalist Tom McNichol told Wired.com by e-mail. McNichol authored AC/DC: The Savage Tale of the First Standards War, a deadly serious but sometimes hilarious chronicle of Tesla and Edison’s battle for electrical supremacy. “Whether he was more mad than genius depends on who you’re talking to.”

That’s a massive crowd, as Tesla’s influence on modern (and postmodern) life is practically immeasurable.

Nikola Tesla geeked out early on electricity at Austria’s Graz University of Technology and Prague’s Charles University, before landing an electrical engineering gig at Budapest’s national telephone company in 1881. Shortly after becoming its prize engineer and reportedly inventing either a telephone repeater or the first loudspeaker, Tesla job-hopped to the Continental Edison Company in Paris. From there, he eventually migrated to the United States to join his scientific contemporary, and lifelong nemesis, Thomas Edison.

By the time the synesthetic Tesla got to New York, he and his alleged photographic memory had already privately built a successful prototype of the AC induction motor, based on his groundbreaking concept of the rotating magnetic field. By the time he angrily left Edison’s craven employ after a salary dispute in 1886, Tesla had markedly upgraded the company’s inefficient direct-current motors and generators, and was quickly perfecting the polyphase system for distributing alternating-current electrical power.
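The rotating-magnetic-field idea can be shown numerically: feed two coils set at right angles with currents 90 degrees out of phase, and the combined field keeps a constant strength while its direction sweeps steadily around, dragging an induction motor’s rotor along with it. The sketch below uses normalized units and a two-phase example for simplicity; Tesla’s commercial systems were polyphase.

```python
import math

# Two perpendicular coils driven 90 degrees out of phase produce a field whose
# magnitude stays constant while its direction rotates: Tesla's rotating field.
# Units are normalized; this is an illustration, not a motor design.

def field(t: float, omega: float = 2 * math.pi * 60):
    bx = math.cos(omega * t)   # contribution from coil 1
    by = math.sin(omega * t)   # contribution from coil 2, shifted a quarter cycle
    return bx, by

for t in (0.0, 0.002, 0.004, 0.006):
    bx, by = field(t)
    print(f"t={t * 1000:3.0f} ms  |B|={math.hypot(bx, by):.3f}  direction={math.degrees(math.atan2(by, bx)):6.1f} deg")
# The magnitude prints as 1.000 every time; only the direction advances.
```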

But their adversarial battles would continue through the turn of the 20th century and beyond, all the way to Edison’s eventual deathbed, where he confessed that his greatest mistake was sticking with direct current and ignoring Tesla — and the overwhelming scientific evidence — on alternating current, which galvanized the 20th century.