Sept. 10, 1941: Stephen Jay Gould Born

1941: Stephen Jay Gould, who will become a famous evolutionary theorist and popular science writer, is born in New York City.

As a 5-year-old, Gould became fascinated by paleontology during a visit to the American Museum of Natural History with his father. “I dreamed of becoming a scientist, in general, and a paleontologist, in particular, ever since the Tyrannosaurus skeleton awed and scared me,” he later wrote.

Gould became a professor of zoology and geology at Harvard University. And over the course of his career, he sparked controversy and forced academics to rethink entrenched ideas about the nature and history of life and evolution.

The most influential and still hotly debated of his theories is that of punctuated equilibrium. First developed while he was a paleontology doctoral student at Columbia University with fellow student Niles Eldredge, the theory proposes that the history of life is punctuated by periods of rapid evolution and speciation, with relatively little evolutionary change, or equilibrium, in between.

The theory stood in direct opposition to the Darwinian idea that evolution occurs gradually and consistently over time.

“Life is a copiously branching bush, continually pruned by the grim reaper of extinction, not a ladder of predictable progress,” he wrote.

Outside of academia, Gould was well known by the American public for his clear, entertaining and prolific popular-science writing. He was featured on the cover of Newsweek, and appeared as a character in an episode of The Simpsons (pictured right).

He wrote more than 20 best-selling books on the history of life and geology, and penned 300 consecutive monthly essay columns, “This View of Life,” for Natural History magazine. He often drew on analogies from baseball (he was a Yankee fan his whole life), entertainment, art and history to explain scientific concepts.

An example of this can be found in The Panda’s Thumb, for which he won a 1981 American Book Award:

Fifteen eggs, including but a single male, develop within the mother’s body. The male emerges within his mother’s shell, copulates with all his sisters and dies before birth.

It may not sound like much of a life, but the male Acarophenax does as much for its evolutionary continuity as Abraham did in fathering children into his 10th decade.

Stephen Jay Gould had his first bout with cancer in 1982. He survived after a four-year battle, after which he penned a well-known essay titled “The Median Isn’t the Message.” The essay describes the hope he felt after discovering that the median survival time for his cancer was eight months, because that meant half of the people with the cancer would live longer, and some much longer than that.
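His reasoning is easy to reproduce. A toy simulation (illustrative numbers only, not Gould’s data) shows how a right-skewed survival curve with a median near eight months still leaves half of patients beyond it, and a long tail living far longer:

```python
# Toy illustration of Gould's point: for a right-skewed distribution,
# the median understates how long the luckier half may live.
import random
import statistics

random.seed(42)

# Hypothetical survival times in months, drawn from a log-normal
# distribution whose median is exp(mu) = 8 (an assumed shape, not real data).
mu, sigma = 2.079, 1.0
survival = sorted(random.lognormvariate(mu, sigma) for _ in range(100_000))

print(f"median   ~ {statistics.median(survival):5.1f} months")  # ~8: half live longer
print(f"mean     ~ {statistics.fmean(survival):5.1f} months")   # pulled up by the long tail
print(f"90th pct ~ {survival[int(0.9 * len(survival))]:5.1f} months")  # some live far longer
```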

As an academic, Gould was also a prolific writer, publishing more than a thousand scientific papers. His final book, The Structure of Evolutionary Theory, described as his magnum opus, is a lengthy 1,443-page read.

Gould was diagnosed with a second, unrelated cancer later in life. He died in 2002 at age 60.

Source: Unofficial Stephen Jay Gould Archive, The New York Times

Images: 1) Wally McNamee/Corbis
2) Fox Network


Sept. 9, 1926: Radio Sets Up a National Broadcasting Craze

1926: The National Broadcasting Company is established. The network would dominate radio during that medium’s Golden Age and become the foundation of a massive media empire that to this day just keeps growing.

During the Radio Days, NBC was the most successful in the game, but it was far from the earliest successful player. That distinction belongs to AT&T, at the time the largest company in the world. AT&T built a station in New York City with the call letters WEAF, but more to the point it had a monopoly on the telephone lines needed to extend the reach of a station with quality audio — the crux of a network.

AT&T’s interest in radio was simple. The conglomerate’s Western Electric division made radio components. Its Bell System — the phone company — was developing wired and wireless, short- and long-range communications. WEAF was a sandbox, a place to experiment.

But it also proved very popular with the public, demonstrating there was an appetite for this thing that brought news, music, dramatic fare and sketch comedy into the family salon.

By 1925, however, AT&T decided that the telephone was a better fit for its future. Meanwhile, the Radio Corporation of America was itching to get into the business in a big way.

The U.S. Navy had essentially controlled radio technology for years as a matter of national defense during and, for a while, after World War I. In 1919 it turned over to RCA the American Marconi radio stations it had appropriated during the war, making the company an instant radio giant.

RCA had ambitions to tie up its patchwork of local stations into a national network, but it was hamstrung by the relatively poor audio quality available to it, leasing telegraph lines from Western Union. It was the best technology available, because AT&T wouldn’t allow anyone else to use its vastly superior telephone lines.

Such is the cauldron in which deals are cooked. AT&T sold its WEAF station (and another in Washington, D.C.) to RCA for $1 million in a deal that allowed RCA to lease AT&T’s phone lines — a huge audio upgrade. RCA’s new division, the National Broadcasting Company, was formed on this day in 1926 and officially launched with programming on Nov. 15.

NBC would flourish during the ’30s, ’40s and ’50s when radio was king — partly because it was able to control the cost of talent in a sort of studio-system way, but also because there was a huge appetite for this magical technology that brought the world into your living room.

Everything was live, of course, and most of the entertainers are long forgotten — but for nostalgia buffs they included Al Jolson, Jack Benny, Bob Hope, Fred Allen, Burns and Allen, and Edgar Bergen, a ventriloquist (think about it). Some of the programming simply couldn’t be done anymore, even in the age of South Park and Family Guy. One of NBC’s first hits was Amos ‘n’ Andy, a continuing story of two black guys performed by two white guys using the tone, inflections and patter of minstrel shows to conjure mental images of their characters.

The NBC radio network was so big, it was actually two: NBC Red, the flagship network, with established shows and advertisers, and NBC Blue, which had ‘sustaining shows’ — those without regular sponsors, like news and cultural programs. (In the early years, the NBC Orange Network carried Red Network programming on the West Coast, and the NBC Gold Network carried Blue Network programming there.)

As the network grew, the need for individual stations to identify themselves (and eventually cut away to air their own local ads) became more complicated. Initially, an announcer would simply read the call letters of all the affiliates at the end of a program, there being so few. But in due course NBC needed a way to alert everyone simultaneously and instantly when it was time for a station break. This was the birth of the three-tone chime NBC still uses.

Various incarnations were used for the better part of two years. First was a sequence of seven tones – G-C-G-E-G-C-E. Too difficult to execute perfectly, live. This was shortened to four — G-G-G-E. Eventually, NBC settled on the iconic G3, E4 and C4 — though apparently not as an homage to one of NBC’s owners, the General Electric Company. It would become the nation’s first audio trademark.

Both AT&T and RCA thrived — each of their bets on the future ratified by time. And both were forced to divest as a result of their success. AT&T agreed in 1982 to break itself into seven Baby Bells. In the early 1940s the FCC forced NBC to drop either its Red or Blue network. RCA tried some fancy footwork — dividing NBC into two companies, NBC (née Red) and Blue Network Company — but a 1943 Supreme Court decision did not go its way. So RCA sold Blue Network for $8 million, and in 1945 the Blue became the American Broadcasting Company (ABC).

And what of radio? The halcyon days of terrestrial radio are over. Satellite radio seems so moribund that its biggest star, Howard Stern, is making noises that he’ll leave Sirius XM radio when his $500 million contract runs out next year. The action (if not money) in radio today is online, in elegant apps like the one developed by NPR and a click away on the web from nearly every station there is.

Personally, this reporter doesn’t think radio (or newspapers) will ever become extinct, because a new medium seldom kills an old one. And these days the opportunities to reinvent are plentiful, and unpredictable. Radio conditioned us for podcasts, and now podcasters are trying to replicate the immediacy of radio.

And how ironic is it that commercially viable radio — the necessary precursor to television, the 20th century’s defining medium — turned on the need to access a private phone network owned by AT&T?

Source: Various

Image: Wikipedia


Sept. 8, 1930: Scotch Tape Starts Sticking

1930: 3M begins marketing the first waterproof, transparent, pressure-sensitive tape after employee Richard Drew figures out how to coat strips of cellophane with adhesive.

Initially sold by the St. Paul, Minnesota, company as a moisture-proof seal for bakers, grocers and meatpackers, the product quickly got repurposed during the Depression by cash-strapped consumers who used the tape as a cheap home-repair tool.

“Cellophane Tape” picked up the “Scotch” tag, according to legend, when a St. Paul car dealer became annoyed because the cellulose ribbons originally only had adhesive on the borders. Slagging 3M (known in those days as the Minnesota Mining & Manufacturing Co.) for being stingy, he invoked Scotland’s penny-pinching reputation and dubbed the product “Scotch tape.”

The name stuck.

In 1939, 3M introduced its so-called “snail” dispenser, which remains in use today. Less durable was the company’s kilt-wearing mascot “Scotty McTape.” Introduced in 1944, the logo became a fixture in the ’50s, when Scotch tape, heavily advertised on TV, dominated its market sector so thoroughly that it became a brand name on par with Kleenex and Coke.

3M had rolled out so many variations of the basic product by 1978 that Saturday Night Live spoofed it with a skit about a store that sells nothing but Scotch tape.

Outside the pop-culture realm, the tape attached itself to scientific research. Russian experimenters demonstrated in 1953 that if they peeled a roll of Scotch tape in a vacuum, the resulting triboluminescence produced X-rays.

American scientists proved in 2008 that the tape’s triboluminescent radiation was strong enough to leave an X-ray image of a finger on photographic paper.

Highbrow recognition came in 2004 when New York’s Museum of Modern Art exhibited Scotch tape as one of its “indispensable masterpieces of design.”

Sales show no sign of winding down. 3M reports that enough tape is sold annually to circle the globe 165 times.

Source: Various

Image: via Wikipedia


Sept. 7, 1948: Where the Rubber Is the Road

1948: A mile-long stretch of Exchange Street in Akron, Ohio, is the first in the United States to be paved with a rubber-asphalt compound.

Rubber was everywhere in postwar Akron. As the home of B.F. Goodrich, Goodyear, Firestone and General Tire, Akron called itself the “Rubber Capital of the World,” and the fortunes of the city were tied to the synthetic-rubber industry.

As early as the 1840s, scientists added natural rubber to pavement (.pdf) to create surfaces that resisted cracks and better repelled water. Goodyear President Paul Litchfield was so impressed by the rubberized roadways he’d seen on a visit to the Netherlands that he donated synthetic rubber for a real-world test of rubber roads in Akron, the first such test on U.S. soil.

The rubberized asphalt was put down along a stretch of West Exchange Street, a main Akron thoroughfare. The rubber road opened to the public Sept. 7, 1948, complete with a sign at its terminus that read, “Here ends the first rubber street in America.”

In reality, the road surface only contained between 5 and 7 percent rubber. The rest, as always, was asphalt.

Rubber companies immediately jumped on the rubber-road bandwagon, with dry-powder or latex rubber additives sold under brand names such as Rub-R-Road and Pliopave. Roads from Ohio to Virginia got the rubber treatment at an added cost of $7.25 per cubic yard (about $60 in today’s moolah).

Engineers eventually questioned the benefits of rubberized roads. At the time, pure asphalt was cheap, rubberized asphalt was more expensive, and studies didn’t show any clear advantages of roads paved with rubber.

West Exchange Street was torn up and repaved in 1959.

It was only a few years later, in 1965, that an engineer for the city of Phoenix, Arizona, named Charlie McDonald found a way to blend shredded “crumb” rubber from waste tires into asphalt. With an abundant supply of waste tires, rubber roads once again became popular, especially in warm climates where rubberized asphalt is more resistant to reflective and thermal cracking.

Rubberized asphalt remains most popular in Arizona, where rubberized Phoenix-area roads are touted as “quiet roads” that can reduce the decibel level of road noise up to 12 percent, sometimes negating the need for sound barriers.

While West Exchange Street is now conventional asphalt, Akron’s rubber road lives on nearby in a more-modern incarnation. The pedestrian walkway along the Ohio & Erie Canal (shown above) is made of crumb rubber and runs beneath the Exchange Street overpass.

Source: Various

Photo: City of Akron


Sept. 3, 1976: Viking 2 Lands on Mars

Viking 2 took this self-portrait. Photo: NASA

1976: Viking 2, the second mission to Mars, lands on the planet and begins transmitting pictures and soil analyses.

The Viking mission went to Mars to look for signs of life, to study the soil and atmosphere, and to take pictures. There were two launches of paired orbiters and landers, aboard Titan-Centaur rockets. Each orbiter took pictures of candidate landing sites before the final landing sequence began.

The Viking landers arrived on Mars within six weeks of one another in 1976.

Viking 2’s landing was more dramatic than NASA might have hoped: As the lander separated and began to descend, the orbiter’s stabilization system went awry, blacking out for almost an hour. The craft rolled so that its main antenna no longer pointed at Earth.

The landing was 31 seconds later than planned, but there were no untoward effects from the brief communications problem. Viking 2 landed at 6:38 p.m. EDT on Utopia Planitia, the largest impact crater on Mars. The site was chosen in part because satellite images suggested the presence of more moisture there than at Viking 1’s landing site.

The Viking landers were approximately 10 feet across and 7 feet tall, weighing roughly 1,270 pounds unfueled. In addition to the lander body, which was the platform for scientific experiments, the lander consisted of a bioshield, an aeroshell, a base cover and parachute system, and lander subsystems for communications, power subsystems, descent engines, etc.

The descent engines were designed to disperse exhaust as widely as possible, to disturb the landing site as minimally as possible. In the case of Viking 2, however, a radar miscalculation caused the engines to fire briefly just before landing, cracking the surface.

According to NASA, the onboard computer “had instructions stored in its memory that could control the lander’s first 22 days on Mars without any contact from Earth.” This was accomplished with “two general-purpose computer channels with plated-wire memories (.pdf), each with an 18,000-word storage capacity. One channel would be operational, while the other was in reserve.”

Viking 2 performed a whole raft of experiments: physical properties of the soil, atmospheric structure, biology, gas chromatography and mass spectroscopy, meteorology, seismology, radio science (location of the lander and also information about Mars’ motion), neutron mass spectroscopy, X-ray fluorescence spectroscopy and ionospheric properties. The Viking mission was able to map nearly all of Mars’ surface, and to understand the planet’s seasonal changes with new precision.

While early soil analyses from Viking 1 seemed to indicate either new chemical processes or new forms of life, these findings were ultimately not confirmed. Moreover, neither Viking lander found any significant amounts of water or ice.

However, this does not diminish the scientific legacy of the Viking mission. A detailed and precise accounting of the Martian soil, an iron-rich clay, was achieved, as well as significant information about Martian wind and climate.

The presence of nitrogen in the atmosphere was also documented. Viking 2’s seismometer may even have recorded a Mars-quake, and both landers observed dust storms — local and global.

For most people, the legacy of the Viking landers is in their pictures — the first surface photos of another planet. For example, here is the first color image of Utopia Planitia.

First color image of Utopia Planitia

And here’s Utopia Planitia covered in a thin layer of frost.

Frost on Utopia Planitia

(You can browse hundreds of pictures from the Viking mission at these NASA sites: The JPL Photojournal and NASA Images.)

The Viking spacecraft were rated for 90 days of performance, but both massively outperformed that expectation. Viking 2’s orbiter sprang a leak and was then shut down July 25, 1978, after more than 700 orbits. The lander’s batteries died April 11, 1980. (Viking 1’s lander worked until Nov. 13, 1982.) The Viking 2 lander is still visible on the surface of Mars — a souvenir of a program that NASA says cost approximately $1 billion.

There were disappointments, of course. Images from Viking 1 had given apopheniacs everywhere reason to believe that remnants of a lost Martian civilization might be found, but no luck.

Recent analyses of crater impacts have shown that if Viking 2 had dug its trench just 3 or 4 inches deeper, it would have discovered ice deposits, confirming the existence of large amounts of water on the planet decades earlier than would ultimately be the case. Some Viking scientists, such as Patricia Straat, have observed that the discovery of ice might well have validated test results consistent with the presence of life, although the weight of the evidence showed sterile soil.

Viking 2 has a special place in the hearts of Star Trek: The Next Generation fans, as its landing site is memorialized in the show as the Utopia Planitia Fleet Yards, a major Federation construction facility. There’s also a colony at the landing site.

Source: Various

Photos: NASA


Sept. 2, 1969: First U.S. ATM Starts Doling Out Dollars

1969: Six weeks after landing men on the moon, Americans take another giant leap for mankind with the nation’s first cash-spewing automated teller machine.

The machine, called the Docuteller, was installed in a wall of the Chemical Bank in Rockville Centre, New York. It marked the first time reusable, magnetically coded cards were used to withdraw cash.

A bank advertisement announcing the event touted, “On Sept. 2, our bank will open at 9:00 and never close again!”

Don Wetzel, an executive at Docutel, a Dallas company that developed automated baggage-handling equipment, is generally credited as coming up with the idea for the modern ATM while standing in a bank line. Previous automated bank machines had allowed customers to make deposits, pay bills or obtain automated cash — after purchasing a one-time voucher or card from a teller. The new device was the first in the United States to dispense cash using a mag-stripe card that didn’t require teller intervention.
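The Docuteller’s own card coding predates today’s standards, but the idea it introduced, machine-readable account data on a magnetic stripe, lives on in the ISO/IEC 7813 Track 2 layout that later bank cards settled on. Here is a minimal parsing sketch (the card number below is made up, and the format shown is the later standard, not Docutel’s original encoding):

```python
import re

# ISO/IEC 7813 Track 2: ';' start sentinel, primary account number (up to
# 19 digits), '=' separator, expiry (YYMM), service code, discretionary
# data, '?' end sentinel. The sample string is fabricated for illustration.
TRACK2 = ";4539578763621486=29051011000012345678?"

match = re.fullmatch(
    r";(?P<pan>\d{1,19})=(?P<exp>\d{4})(?P<svc>\d{3})(?P<disc>\d*)\?", TRACK2
)
if match:
    print("account number:", match["pan"])
    print("expires (YYMM):", match["exp"])
    print("service code:  ", match["svc"])
```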

For the time being, tellers had no need to fear for their jobs. At about $30,000 each ($178,000 in today’s buying power), the machines cost more than a teller’s annual salary.

And they could only dispense cash, not receive deposits or transfer money between accounts. Those features came with the 1971 version, called the Total Teller.

The ATM freed customers from the tyranny of banker’s hours, giving them access to dough 24/7 and even, much later, performing the function of currency converters — allowing Americans traveling abroad to obtain cash in local currencies.

Of course, the machines were good for banks, too, eventually letting them cut costs, reduce teller lines and, of course, charge outrageous user fees.

There were issues, though. Because the machines were offline, there was no way to check a customer’s balance to see if there was enough money to cover a withdrawal.

“Not only was it a technical problem to overcome, it was a problem in the minds of the banker to issue a card to somebody and not know whether he had the money in his account or not,” Wetzel said in a 1995 interview.

To overcome that barrier, there was a $150 daily limit for ATM withdrawals. Other obstacles included finding a manufacturer to put mag stripes on the back of the bank cards, and printing receipts that could be read by machine.

Then there was resistance from banks, which worried that customers would reject the machines, or that reducing face-to-face interaction would cost them opportunities to sell other bank services.

Customers embraced the new machines, however, which opened the way for other manufacturers to get in the game.

Diebold was one of the first companies to see the gold in the emerging ATM market. A maker of safes and vaults until then, the company decided to branch out in 1974 with the first installation of its TABS 500 ATM. By 1995, Diebold was producing more than half of all ATMs in the United States.

Today there are ATMs everywhere, including one at the McMurdo research station in Antarctica — but no sign of one, just yet, on the moon. And today’s ATMs go far beyond teller duty. Some even sell lottery tickets and postage stamps.

But along with the ubiquity of the machines came security issues.

The first ATMs were offline mechanical machines. Within a decade, with the rise of PCs, they became electronic devices. By the 1990s, ATMs were being connected to backend networks by modem, and their dominant operating system was Microsoft Windows. This, of course, opened a whole new wave of vulnerabilities.

Since then, hackers and scammers have kept banks on their toes, devising ever-more-sophisticated ways to steal cash through ATMs. Skimmers, until recently, were the dominant mode. The devices consist of components slipped over legitimate card readers that surreptitiously record data from the mag stripe of cards as customers insert them. A tiny camera captures the customer’s PIN as it’s entered on the keypad.

There has also been a spate of attacks using a default passcode that the maker of one ATM brand inexplicably printed in an operator’s manual easily found online.

Recently, however, hackers have found new ways to strip ATMs of their cash by installing malware on the machines. Last year, malicious software was discovered on 20 bank ATMs in Russia and Ukraine. The program was designed to attack ATMs made by Diebold and NCR that run Microsoft Windows XP software.

The attack requires someone to physically load the malware onto the machine — with a USB stick or cable, for example. Once this is done, attackers can insert a control card into the machine’s card reader to trigger the malware and give them control of the machine through a custom interface and the ATM’s keypad.

A thief could instruct the machine to eject whatever cash was inside it. A fully loaded bank ATM can hold up to $600,000. The malware also captures account numbers and PINs from the machine’s transaction application, then delivers them to the thief either printed in encrypted form on a receipt from the machine or uploaded to a storage device inserted in the card reader.

This year at the Black Hat security conference in Las Vegas, researcher Barnaby Jack took the hack one step further, demonstrating a way to “jackpot” one brand of ATM by installing malware remotely over its modem, using a vulnerability he found in the system.

Source: Various

Photo: An unidentified girl puts her computer punch card into the slot of an ATM money machine, outside a bank in central London in 1968.
Associated Press


Sept. 1, 1974: New York to London in Less Than 2 Hours

1974: On a flight to the Farnborough Air Show outside London, Maj. James Sullivan and Maj. Noel Widdifield fly the Lockheed SR-71 Blackbird from New York to London in 1 hour, 54 minutes, 56.4 seconds. The 1,806-mph flight still holds the transatlantic speed record between the two cities.

Developed during the middle of the Cold War, the Lockheed SR-71 was designed as a reconnaissance aircraft that could fly fast enough to avoid being shot down by Russian aircraft or missiles. Initially developed as the A-12 for the CIA, the aircraft evolved and adapted many times in its more than 30 years of flying.

Designed by the legendary Kelly Johnson and his Lockheed Skunk Works team, the SR-71 was built to fly at more than three times the speed of sound. The Skunk Works team faced a number of design challenges as the realities of such high-speed flight became clear.

Beyond the obvious sleek aerodynamics needed, one of the biggest challenges was developing an engine that could operate at such speeds. Rocket-powered aircraft such as the North American X-15 had flown faster than the SR-71, but a rocket engine doesn’t need to worry about ingesting air, mixing it with fuel and then igniting the mixture to create thrust.

The challenge with an air-breathing jet engine is that the air must be traveling slower than Mach 1 (the speed of sound) when it enters the engine. Other supersonic jets use relatively simple inlets to slow the air so the shock wave created by supersonic flow doesn’t reach the engine.

If supersonic air does reach a jet engine, the result is known as an “unstart,” and the engine stalls and needs to be restarted during flight. This could be a problem if you were flying over the Soviet Union trying to stay ahead of fighter jets chasing you.

Because of the speeds flown by the SR-71, a much-more-complex inlet was needed to control the airflow into the massive jet engines over a range of speeds. The spike-shaped cone located at the front of the air inlet could be moved back and forth to control where the supersonic shock wave would enter the engine.

By carefully monitoring the aircraft speed, atmospheric conditions and engine parameters, the pilot could adjust the spike along with a series of doors located along the outer walls of the inlet. By doing this, a shock wave could actually be positioned in such a way that it would act as a speed bump of sorts and slow down the incoming air to Mach 0.6, the ideal speed for air to enter the jet engine.

The result was air would enter the inlet at approximately 2,100 mph, and within 20 feet, it would slow down to a speed of 600 mph. Of course this didn’t always go according to plan, and unstarts still happened in the SR-71. Pilots describe the unpleasant event as a violent jerk to the side of the stalled engine and continued shaking and unwelcome noises until the engine could be restarted.

The result of all of this complex air-inlet management was an engine that could push the SR-71 faster than any other jet aircraft. The official top speed was Mach 3.2, though occasionally pilots inadvertently flew as fast as Mach 3.5. Typical speeds during a mission would be around Mach 3.0.

For the New York–to–London flight, Sullivan flew the SR-71 through an imaginary gate 80,000 feet above New York. Heading east, he flew 3,461.528 miles until passing through another imaginary gate over London. The trip lasted only 1 hour, 54 minutes, 56.4 seconds.
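A quick back-of-envelope check of those figures, using roughly 660 mph for the speed of sound in the cold stratosphere (an assumption on our part, not a number from the flight records):

```python
# Back-of-envelope checks on the published record-flight numbers.
distance_miles = 3_461.528
hours = 1 + 54 / 60 + 56.4 / 3600        # 1 hr, 54 min, 56.4 sec

print(f"average speed ~ {distance_miles / hours:.0f} mph")  # ~1,807 mph, in line with the 1,806-mph figure

# Assuming the speed of sound at cruise altitude is about 660 mph,
# the official Mach 3.2 top speed works out to roughly the 2,100 mph
# quoted for air entering the inlet.
print(f"Mach 3.2 ~ {3.2 * 660:.0f} mph")  # ~2,112 mph
```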

By comparison the Concorde typically flew from New York to London in around three hours, and a 747 makes the trip in about six hours. Of course the SR-71 did get a bit of a running start, but it also had to slow down over the Atlantic to refuel behind a special Boeing KC-135Q tanker.

After the Farnborough Air Show, where the SR-71 was on display outside the United States for the first time, the aircraft set another record on the way home. This time the spy plane flew from London to Los Angeles, a distance of 5,446.87 miles, in just 3 hours, 47 minutes, 39 seconds. That flight required two refueling slowdowns, as well as reduced-speed zones when flying over major U.S. cities.

An SR-71 also set the coast-to-coast record when it flew from Los Angeles to Washington, D.C., in 64 minutes, 20 seconds in 1990.

The last flight of the SR-71 took place Oct. 9, 1999.

Source: Flying the SR-71 Blackbird, by retired Col. Richard H. Graham, U.S. Air Force; others

Photo: Lockheed Martin


Aug. 31, 1920: News Radio Makes News

The radio staff of The Detroit News. Upper row: Edwin G. Boyes, Walter R. Hoffman and Keith Bernard, engineer-operators; Genevieve Champagne, secretary; E. Lloyd Tyson, assistant program director; Elton M. Plant, reporter. Lower row: Charles D. Kelley, department editor and supervisor; Howard E. Campbell, chief radio engineer; William F. Holliday, program director; G. Marshall Witchell, reporter. Photo courtesy earlyradiohistory.us

1920: A Detroit station airs what is believed to be the first radio news broadcast. The exact headlines of that day are of no historical significance, but with this local newscast a nascent medium finally conveys a message so compelling that it would soon capture the world’s imagination as only television and the internet would, many, many years later.

Radio had been around in a number of technical incarnations for decades, mostly for the enjoyment of hobbyists. Despite the general lack of public awareness — it was a technological contemporary of the telephone, mind you — radio was an obsession among an astonishingly large number of giant thinkers in the “Only One Name Is Necessary” club: Faraday, Maxwell, Hertz, Marconi, Tesla, Edison.

Radio’s commercial prospects were not yet fully appreciated, in part because wireless was considered primarily a “narrowcast” medium, a sandbox for the geeks of the day awed by the prospect of communicating over great distances over freely available spectrum. Radio communication was also standard aboard ships by the summer of 1920. Indeed, it was the unthinkable disaster that befell the unsinkable Titanic in 1912 which spurred widespread adoption of wireless at sea.

But on the cusp of the Roaring ’20s the notion that radio would be a mass medium and huge business was still a ways off. Stations in these loosely regulated early days broadcast in a metaphorical vacuum almost as large as the literal one that carried their sounds invisibly through the air.

Programming, such as it was, didn’t even have advertisers in the modern sense. Radio shows — all live, of course, and heavy on the music — were created and operated by radio-set manufacturers as a means of drumming up business, an early example of “software” driving sales of the “hardware” necessary to use it.

Also on the leading edge were newspapers, afraid that the immediacy of radio might someday render irrelevant their next-day coverage of … anything. (Why this history was not recalled later in the century when the internet actually was about to kill the newspaper business is anyone’s guess.)

In the case of what is now Detroit station WWJ, the strategy was all defense: The Scripps newspaper family sanctioned The Detroit News to start it up so the company could control a technology that, in other hands, it feared might kill its dominance in the market.

Scripps was motivated to invent news radio, but didn’t exactly know how. And the company even wanted to hedge this bet, just in case radio turned out to be a passing fad with which they didn’t want their good name associated. So, in what would become a cliché of the internet age, they hired a teenager to build and explain it to them.

Scripps even instructed underage radio pioneer Michael DeLisle Lyons to obtain government permission for the station in his own name (there were no formal licensing rules yet), even though it was conceived of, owned and operated by The Detroit News and assembled in the newspaper building itself.

Lyons got permission to broadcast on Aug. 20, 1920, and for the next 10 days the station — what else? — played music to work out the kinks. “These concerts were enjoyed by no one save such amateurs as happened to be listening in,” The Detroit News reported about itself.

After 10 days of concerts, almost nobody had heard the station, then called 8MK, that was poised to make history. And in an amusingly self-congratulatory and hyperbolic story about itself — delivered, alas, no sooner than the next day — The Detroit News summed up the momentous event a few years later:

Everything was found to be satisfactory, and on Aug. 31, which was primary election day, it was announced that the returns — local, state and congressional — would be sent to the public that night by means of the radio.

The News, on Wednesday, Sept. 1, 1920, carried the following announcement: "The sending of the election returns by The Detroit News' radiophone Tuesday night was fraught with romance and must go down in the history of man's conquest of the elements as a gigantic step in his progress.

In the four hours that the apparatus, set up in an out-of-the-way corner of The News Building, was hissing and whirring its message into space, few realized that a dream and a prediction had come true. The news of the world was being given forth through this invisible trumpet to the waiting crowds in the unseen market place."

History would prove Scripps correct, of course, in ways big and small.

Radio is still a force to be reckoned with — despite television and the internet. In 1947, The Detroit News would go on to launch Michigan’s first TV station, WWJ-TV, now WDIV-TV. The newspaper entered into a novel arrangement with its rival, The Detroit Free Press, in 1989 under which they share business operations but maintain separate editorial staff.

WWJ is still an all-news radio station, now owned and operated by CBS.

And, of course, you can listen to it live, on the internet, 24/7.

Source: Wikipedia, wwj.cbslocal.com and earlyradiohistory.us


Aug. 30, 1954: Ike Inks Nuke Law

1954: President Dwight D. Eisenhower, acknowledging the United States no longer holds a monopoly on nuclear power, signs the Atomic Energy Act of 1954.

The act is best known for ushering in a civilian nuclear-power program in the United States.

Today, there are 104 active nuclear reactors across the United States, generating about 20 percent of the nation’s electrical power.

Among other reasons, Eisenhower signed the legislation in reaction to the Soviet Union testing a thermonuclear device the year before. With that, there was no need for the government to keep all the ingredients of atomic power secret.

Westinghouse built the reactor for the nation’s first commercial nuclear power plant, at Shippingport, Pennsylvania.

The act declared a nuclear policy that “the development, use and control of atomic energy shall be directed so as to promote world peace, improve the general welfare, increase the standard of living and strengthen free competition in private enterprise.”

But how Eisenhower described the 568-page act (.pdf) in theory and how it played out in practice are two different things.


Aug. 27, 1874: He’s Ammoniac, Ammoniac at the Fore

1874: Carl Bosch, a chemist whose work would transform agriculture and industry — and eventually enable the Green Revolution — is born.

Bosch’s contribution to humanity was the development of the Haber-Bosch process, a technique for creating ammonia in large quantities.

Ammonia is an essential component of agricultural fertilizers, because it’s rich in nitrogen — which makes plants grow bigger. Bosch’s work led directly to a massive increase in agricultural productivity in the 20th century, and at least one professor has estimated that 40 percent of the world’s food (.pdf) can now be traced back to the process.

Coupled with the development of plant varieties better able to absorb nitrogen (spearheaded by Norman Borlaug in the 1960s), the Haber-Bosch process helped save many people from starvation. It also no doubt helped facilitate the population explosion of the past century.

And it won Bosch the Nobel Prize in chemistry in 1931.

Bosch worked for chemical manufacturer Badische Anilin- und Sodafabrik (the Baden Aniline and Soda Factory, or BASF) in Ludwigshafen, Germany. The company in 1908 acquired the rights to a process for synthesizing ammonia under high pressures that had been developed by Fritz Haber.

Haber’s process used osmium and uranium as catalysts, which were impractically expensive (not to mention radioactive, though that probably wasn’t a concern at the time). Bosch set to work on devising a more practical version of the process suitable for large-scale industrial production. He also developed safer high-pressure reaction vessels to contain the process.

The resulting process, known today as the Haber-Bosch process, involves cooking air and natural gas over an iron oxide catalyst under intense pressure and heat. Thanks to the widespread availability of cheap natural gas, the process is inexpensive and effective.

It transformed agriculture, which prior to the 20th century relied primarily on manure, not artificial fertilizers, to increase crop yields.

Now the world produces about 87 million tons of nitrogen-based fertilizers annually. This increase is primarily due to the Haber-Bosch process.

But the process is encountering major problems in an increasingly resource-constrained world.

That’s because the main reaction in the process is cooking N2 (the dinitrogen molecule, composed of two nitrogen atoms) and H2 (the dihydrogen molecule, composed of two hydrogen atoms) together at 500 degrees Celsius and 200 atmospheres of pressure. You need all that heat and pressure, because breaking apart an N2 molecule turns out to be incredibly difficult, due to the arrangement of electrons in the nitrogen atom’s outer shell.

The energy required to break the bond is 946 kilojoules of energy per mole of nitrogen, or twice the energy required to bust O2 (dioxygen) molecules.
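For reference, the overall reaction being forced along under all that heat and pressure is the standard ammonia synthesis (the enthalpy shown is a textbook value, not a figure from this article):

```latex
\mathrm{N_2} + 3\,\mathrm{H_2} \;\rightleftharpoons\; 2\,\mathrm{NH_3},
\qquad \Delta H^{\circ} \approx -92\ \text{kJ per mole of } \mathrm{N_2}
```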

Luckily, or so we thought, fossil fuels were cheap, widely available and incredibly energy-dense: A thousand cubic feet of natural gas contains a bit more than a gigajoule of energy.

That’s enough energy to convert about 30 kilograms (66 pounds) of nitrogen into 36 kilograms (79 pounds) of ammonia.
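A rough sanity check of that conversion, using textbook molar masses and the gigajoule estimate above (these are back-of-envelope assumptions, not figures from Bosch’s own records):

```python
# Back-of-envelope: mass of ammonia from 30 kg of nitrogen, and what
# roughly 1 GJ per batch implies per tonne of ammonia produced.
M_N = 14.0      # g/mol, nitrogen
M_NH3 = 17.0    # g/mol, ammonia (one N plus three H)

nitrogen_kg = 30
ammonia_kg = nitrogen_kg * M_NH3 / M_N
print(f"{nitrogen_kg} kg N -> {ammonia_kg:.0f} kg NH3")   # ~36 kg, as stated above

energy_gj = 1.05    # roughly a thousand cubic feet of natural gas
print(f"~{energy_gj / (ammonia_kg / 1000):.0f} GJ per tonne of NH3")  # ~29 GJ/t
```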

So, once the Haber-Bosch process showed it could be done, chemists across the world began to burn a lot of natural gas to get dinitrogen to react with hydrogen. And where do we get the hydrogen? Why, we use the natural gas for that too, naturally: It’s CH4 (methane) after all.

Taken together, there’s a lot of natural gas going into the production of nitrogen fertilizer.

And that’s why today, many people are starting to think that someone needs to invent a better way of making nitrogen fertilizer, one that doesn’t require so much fossil-fuel consumption.

Whoever comes up with a low-energy alternative to the Haber-Bosch process might even win a Nobel Prize.

Source: Various. Adapted in part from a Wired Science post by Alexis Madrigal, “How to Make Fertilizer Appear Out of Thin Air.”

Image: A worker makes a weld in the ammonia-synthesis system at a Tennessee Valley Authority plant near Muscle Shoals, Alabama, in June 1942.
Alfred T. Palmer/TVA/U.S. Office of War Information
