Original photo by cineclassico / Alamy Stock Photo

During Hollywood’s Golden Age in the 1930s and ’40s, MGM star Hedy Lamarr was considered one of the world’s most beautiful women. Her appearance was reportedly the model for Walt Disney’s Snow White, as well as Batman’s nemesis, Catwoman. Yet there’s much more to Lamarr’s life than her beauty. She was also an inventor, whose idea during World War II would have later implications for both the U.S. military and technology such as GPS and Bluetooth. Learn more about Lamarr with these eight fascinating facts.

A shot of Hedy Lamarr in the 1933 film Ecstasy.
Credit: ScreenProd / Photononstop/ Alamy Stock Photo

Lamarr Was Often Associated With a Racy Early Film

Born Hedwig Eva Maria Kiesler in Vienna, Austria, Lamarr starred in the Czech movie Ecstasy (1933) when she was a teenager. The film featured Lamarr swimming in the nude, as well as simulating what’s thought to be cinema’s first female orgasm. Given its content, Ecstasy was criticized by Pope Pius XI, and Adolf Hitler banned the movie due to Lamarr’s Jewish background. Even after Lamarr became a star in Hollywood, people often called her “Ecstasy Girl.” The title of her problematic autobiography, Ecstasy and Me: My Life as a Woman, was also inspired by this risqué film.

Hedy Lamarr posing with ex-husband W. Howard Lee.
Credit: ullstein bild Dtl via Getty Images

Lamarr Was Married Six Times

Lamarr never had much success in marriage, with six failed unions under her belt. Lamarr was not yet 20 when she wed her first husband, Friedrich Mandl, in 1933. Mandl was a munitions dealer who worked with Nazis, dined with people like Italian dictator Benito Mussolini, and was also extremely controlling of Lamarr. In fact, Mandl tried (unsuccessfully) to purchase all copies of Lamarr’s film Ecstasy. It took multiple escape attempts before Lamarr was able to get out of the marriage in 1937. In one telling, she says she had to drug her maid and disguise herself in the servant’s uniform to flee.

Lamarr posing for a photo shoot in a silk robe.
Credit: cineclassico / Alamy Stock Photo

Lamarr Negotiated Her Own Hollywood Contract

After the end of her first marriage, Lamarr wanted to go to Hollywood. While in London, she met Louis B. Mayer, the head of MGM Studios. Mayer, aware of the controversy surrounding Ecstasy, offered Lamarr a contract with a salary of only $125 per week, which she turned down. Still determined to go to Hollywood, Lamarr managed to board the ship Mayer was taking back to the United States. During the voyage, Lamarr charmed her fellow passengers, demonstrating the pull she could exert on audiences. By the time the ship had arrived stateside, Lamarr had a contract with MGM for $500 a week.

Lamarr models a long flowing dress whilst reclining on a crescent moon.
Credit: Clarence Sinclair Bull/ Moviepix via Getty Images

Lamarr’s Stage Name Was Inspired By a Dead Movie Star

Signing with MGM required Lamarr to change her last name from Kiesler, since German names were not in vogue by the late 1930s. Mayer was inspired by the deceased silent film star Barbara La Marr when creating the actress’ new last name. Although it was invented, Lamarr became attached to her new name. When the Mel Brooks comedy Blazing Saddles (1974) featured a character named “Hedley Lamarr,” Lamarr sued for the unauthorized use of her name and received a small settlement.

Lamarr working with science equipment.
Credit: TCD/Prod.DB/ Alamy Stock Photo

Lamarr’s Inventor Side Was Encouraged by Howard Hughes

When Lamarr was 5, she’d taken apart and then rebuilt a music box to discover how it worked. Her interest in understanding how things functioned, along with a desire to create her own inventions, continued even as she began to make her name in Hollywood. In this, Lamarr was supported by movie mogul and aerospace innovator Howard Hughes. Lamarr aided Hughes in return; by studying the anatomy of fish and birds, she came up with an idea for an airplane wing that he embraced as “genius.”

United States Army Signal Corps in France operating a field radio station.
Credit: Print Collector/ Hulton Archive via Getty Images

Lamarr’s World War II Invention Was Initially Dismissed

During World War II, Lamarr and modernist composer George Antheil came up with a “secret communication system” that used “frequency hopping” between radio signals to direct torpedoes without enemy interference. She and Antheil received a patent in August 1942 and offered their invention to the U.S. military. But the government wasn’t interested in the invention or Lamarr’s intelligence. Instead, the actress was informed that her beauty was the best way to help the war effort. Rather than reject the sexist suggestion outright, Lamarr went on to sell millions of dollars in war bonds. She also took shifts at the Hollywood Canteen, where soldiers could relax and spend time with movie stars.

Lamarr standing on her dining room table and measuring a lamp shade.
Credit: Gene Lester/ Archive Photos via Getty Images

Lamarr Invented Many Everyday Items

In addition to the frequency-hopping system, Lamarr had a slew of other inventions, including a light-up dog collar, improvements for a traffic signal, tablets to transform water into soft drinks, and a new Kleenex box.

Part of Lamarr's joint patent application grant for a frequency hopping spread spectrum.
Credit: Pictorial Press Ltd/ Alamy Stock Photo

Lamarr’s Frequency-Hopping System Was Used Globally, But She Didn’t Receive Credit

The frequency-hopping system that Lamarr and Antheil invented during World War II was adapted by the U.S. Navy and used during 1962’s Cuban Missile Crisis. Later it contributed to technological innovations such as Bluetooth and GPS. Yet Lamarr’s contribution was ignored. She expressed her feelings about this in a 1990 interview: “I can’t understand why there’s no acknowledgment when it’s used all over the world.” Lamarr was slightly mollified when she was recognized by the Electronic Frontier Foundation with a Pioneer Award in 1997.

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.

Original photo by Entertainment Pictures/ Alamy Stock Photo

More than 50 years after it premiered on June 30, 1971, Willy Wonka and the Chocolate Factory continues to treat kids and adults alike to a deliciously bizarre viewing experience. Here are a handful of facts you might not know about this candylicious classic.

A view of Wonka Bar.
Credit: Pictorial Press Ltd/ Alamy Stock Photo

The Movie Was Part of a Massive Advertising Scheme

At the time Willy Wonka was being developed, Quaker Oats was tinkering with a new chocolate bar. After discussions with the production company, Quaker realized that the movie could serve as a massive marketing machine for a new line of sweets. They went all in, funding the whole project. Quaker’s involvement is why the movie was called Willy Wonka and the Chocolate Factory instead of Charlie and the Chocolate Factory (the title of the 1964 book by Roald Dahl); because of the $2.9 million investment, Quaker wanted the brand name right in the movie title. Ironically, Quaker ran into some issues with the chocolate formula they were developing, so their Wonka Bar wasn’t released until four years after the movie came out. The only Wonka-related products Quaker had on the market around the time the movie was released were Peanut Butter Oompas and the Peanut Butter Super Skrunch.

Film still of Gene Wilder in a top hat as Willy Wonka, with the cast.
Credit: Pictorial Press Ltd/ Alamy Stock Photo

The Movie Wasn’t a Huge Hit

It’s hard to imagine Wonka as anything but a runaway success these days, but the initial response to it was a bit mediocre. After earning just $4 million at the box office (compared to a budget just under $3 million) and seeing little public interest in the film in the years that followed, Paramount failed to renew the distribution rights and Warner Bros. scooped them up in 1977. Warner Bros. knew just what to do with them — once they brought the movie to TV and VHS, the film gained a new audience and went on to become a cult classic.

Author Roald Dahl.
Credit: Library of Congress/ Corbis Historical via Getty Images

Author Roald Dahl Was Not a Fan

It’s not an uncommon phenomenon for authors to be disappointed in the movie adaptations of their books. Roald Dahl, for one, called Wonka “crummy.” He didn’t care for the music, the director, or the casting choice of Gene Wilder. “I think he felt Wonka was a very British eccentric,” Dahl’s friend and biographer Donald Sturrock has said. “Gene Wilder was rather too soft … His voice is very light and he’s got that rather cherubic, sweet face. I think [Dahl] felt … there was something wrong with [Wonka’s] soul in the movie – it just wasn’t how he imagined the lines being spoken.” It’s said that Dahl eventually grew to “tolerate” the movie, but never liked it. In turn, Gene Wilder wasn’t a fan of Tim Burton’s 2005 remake of the movie, calling it “an insult.” He later clarified, “Johnny Depp, I think, is a good actor, but I don’t care for that director. He’s a talented man, but I don’t care for him doing stuff like he did.”

This photo shows director Mel Stuart shooting a film.
Credit: Sueddeutsche Zeitung Photo/ Alamy Stock Photo

Director Mel Stuart Was a Documentarian

Stuart was well-known in cinema circles for films like 1963’s The Making of the President (the story of the 1960 U.S. presidential election) and 1968’s The Rise and Fall of the Third Reich. So, how did he end up directing one of the most beloved children’s movies of all time? At the insistence of a child, of course. Roald Dahl’s Charlie and the Chocolate Factory was a favorite of Stuart’s daughter Madeline, and she told him what a great movie it would make. She eventually even landed a small speaking role in the film. “I’m very proud of that movie,” Madeline said when her father died in 2012. “I think it’s absolutely brilliant and charming and a bit dark and very funny — and all those things describe my father.”

The chocolate room of candy-maker supreme, Willy Wonka.
Credit: Mirrorpix via Getty Images

Much of the Chocolate Factory Set Really Was Edible

According to Gene Wilder, about a third of the candy factory floor set truly could have been eaten, including the chocolate river. But although the river certainly looked dreamy, none of the actors were too tempted to eat it, what with all the people walking through it during filming. One element that wasn’t edible? The yellow tea cup flower Wilder sinks his teeth into during the song “Pure Imagination.” It was wax, which Wilder had to chew on until the take was complete.

The Fake-Out Stumble Was Done at Gene Wilder’s Insistence

Many actors wanted the part of Willy Wonka, but director Mel Stuart desperately wanted Wilder. After reading the script, Wilder agreed to the role — but only if he could orchestrate Wonka’s grand entrance. “I will do it if I can come out, and all the crowd quiets down, and I am using a cane,” Wilder told Larry King in 2002, recalling his conversation with Stuart. “And I walk slowly and you can hear a pin drop. And my cane gets stuck in a brick. And I fall forward onto my face and do a forward somersault and jump up, and they all start to applaud.”

Stuart agreed, but didn’t quite understand the motivation behind the grand deception. “I said, ‘because no one will know from that point on whether I am lying or telling the truth,’” Wilder explained.

Actor Gene Wilder as Willy Wonka on the set of the film 'Willy Wonka & the Chocolate Factory'.
Credit: Silver Screen Collection/ Moviepix via Getty Images

Gene Wilder Was Very Specific About His Wardrobe

Wonka’s entrance wasn’t the only part of his iconic character that Wilder vividly envisioned. He also had very specific thoughts on Wonka’s wardrobe, which he revealed in a letter to director Mel Stuart after seeing the initial costume sketches. Some of the highlights:

“Slime green trousers are icky. But sand colored trousers are just as unobtrusive for your camera, but tasteful.”

“The hat is terrific, but making it 2 inches shorter would make it more special.”

“Also a light blue felt hat-band to match with the same light blue fluffy bow tie shows a man who knows how to compliment his blue eyes.”

“To match the shoes with the jacket is fey. To match the shoes with the hat is taste.”

Perhaps most revealing about how Wilder viewed his portrayal are his views on keeping the costume timeless: “I don’t think of Willy as an eccentric who holds on to his 1912 Dandy’s Sunday suit and wears it in 1970, but rather as just an eccentric — where there’s no telling what he’ll do or where he ever found his get-up — except that it strangely fits him: Part of this world, part of another. A vain man who knows colors that suit him, yet, with all the oddity, has strangely good taste. Something mysterious, yet undefined.”

Violet Beauregarde, played by Denise Nickerson, blows up like a blueberry in a scene.
Credit: Michael Ochs Archives/ Moviepix via Getty Images

Violet Really Did Turn Violet

Life imitated art when Violet Beauregarde actress Denise Nickerson couldn’t seem to ditch her blueberry hue. Two days after shooting the famous scene where she goes full berry, Nickerson was sitting in math class when a friend looked at her, alarmed. “You’re turning blue,” she said. The blue makeup had been so thoroughly applied that it was resurfacing through her pores and took another 36 hours to disappear again. “Needless to say, I didn’t get asked out for a date in that school,” Nickerson later said. “They thought, ‘If I take her out, she could turn polka dots!’”

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.

Original photo by Patti McConville/ Alamy Stock Photo

It’s difficult to believe that it’s been more than 25 years since Murder, She Wrote ended its 12-year run on May 19, 1996, and mystery writer and amateur sleuth Jessica Fletcher (mostly) disappeared from our lives. A quarter of a century later, the fictional village of Cabot Cove, Maine, still holds a special place in many of our hearts. Let’s revisit it below with some fun facts about the show, but watch your back, and remember — everyone’s a suspect.

Angela Lansbury holds a book in publicity portrait for the television series.
Credit: Archive Photos/ Moviepix via Getty Images

Angela Lansbury Wasn’t the First Choice for Jessica Fletcher

It’s nearly impossible to imagine anyone but Angela Lansbury playing Jessica Fletcher, but she wasn’t a shoo-in for the job. Doris Day turned it down; Jean Stapleton (aka Edith Bunker) also declined, partly because she didn’t feel ready to jump into another series so soon after wrapping up the 1970s sitcom All in the Family. “Every time I saw Angela during those years, she’d say, ‘Thank you, Jean,’” Stapleton once said.

Angela Lansbury and Bea Arthur.
Credit: Ron Galella, Ltd./ Ron Galella Collection via Getty Images

Jessica’s Middle Name Was a Nod to Lansbury’s Real-Life BFF

Before landing Murder, She Wrote, Angela Lansbury was perhaps best known for her Broadway prowess. After costarring with Bea Arthur in the musical Mame in 1966, the two actresses became very close. Lansbury said that Arthur was “a rare and unique performer and a dear, dear friend.” Jessica Fletcher’s middle name, Beatrice, pays homage to their real-life friendship.

Emmy Statue is seen in front of the Television Academy during the red carpet.
Credit: ANGELA WEISS/ AFP via Getty Images

Lansbury Never Won an Emmy for “Murder, She Wrote”

The extraordinarily talented Lansbury — who died October 11, 2022, at the age of 96 — was nominated for a total of 18 Emmys, including one for every season of Murder. She never won, though, making the elusive Emmy the only part of the EGOT (the Emmy, Grammy, Oscar, and Tony Awards) she doesn’t have. “It bothers the hell out of me,” she once said. (She was, however, honored in the Emmy Hall of Fame in 1996; her 2013 Oscar is also honorary.)

Actress Octavia Spencer poses for a portrait.
Credit: Larry Busacca/ Getty Images Entertainment via Getty Images

A Remake Starring Octavia Spencer Almost Happened

In 2013, Octavia Spencer was slated to take the lead role in a reboot that would have aired on NBC. Described as a “light and contemporary take,” it focused on a hospital administrator who publishes her first mystery novel and then starts solving crimes. Lansbury was skeptical, saying, “I think it’s a mistake to call it Murder, She Wrote, because Murder, She Wrote will always be about Cabot Cove and this wonderful little group of people who told those lovely stories and enjoyed a piece of that place, and also enjoyed Jessica Fletcher, who is a rare and very individual kind of person. … So I’m sorry that they have to use the title Murder, She Wrote, even though they have access to it and it’s their right.” The reboot was cancelled in 2014.

Actor Tom Selleck and actress Angela Lansbury.
Credit: Ron Galella/ Ron Galella Collection via Getty Images

There Was a Very Special Crossover Episode

Any criminal would be a fool to try to pull something off with both Thomas Magnum and Jessica Fletcher on the case, but that’s exactly what happened in this third-season television event featuring Jessica in Hawaii. The first part (“Novel Connection”) aired during Magnum P.I.’s time slot, with the story concluding in an episode called “Magnum on Ice” during Murder’s Sunday-night airing.

Close up of the character Jessica Fletcher.
Credit: RGR Collection/ Alamy Stock Photo

Lansbury Identified With Fletcher the Most Out of All Her Roles

“The closest I came to playing myself … was really as Jessica Fletcher,” Lansbury told Parade magazine in 2018. “Obviously, if I had been able to do that earlier in my career, I would have had a different career really.”

However, in 1985 — a year after the show began — she also told The New York Times: “Jessica has extreme sincerity, compassion, extraordinary intuition. I’m not like her. My imagination runs riot. I’m not a pragmatist. Jessica is.”

The CBS headquarters in NYC.
Credit: Andrew Burton/ Getty Images News via Getty Images

CBS Pulled the Plug on “Murder” Rather Unceremoniously

In 1995, the network switched Murder from its long-standing Sunday-night spot to run directly against NBC’s Friends in the Thursday-night “Must See TV” lineup. The move dropped Lansbury and co. from the top 20 in the Nielsen ratings to No. 67, and CBS bid them adieu. “Murder most foul,” the Washington Post declared of the schedule change and cancellation.

TV series "Murder, She Wrote" DVDs.
Credit: Patti McConville/ Alamy Stock Photo

The Final Episode Made a Point

Titled “Death by Demographics,” the finale was set at a San Francisco radio station being forced to switch its format from classical to rock. “You realize we’re going to lose our entire audience,” the son of the station owner says. “Yes, and replace it with 12-to-18-year-olds, the ones who spend serious money on new products and new ideas, and the ones that advertisers pay big bucks to reach,” responds a producer. The implication was clear — Murder had been killed because it couldn’t reach the youth market when it went up against Friends.

It wasn’t the only time during the season that the Murder writers thumbed their noses at CBS. There was also an episode called “Murder Among Friends,” in which the producer of a hit TV show called Buds is killed right before a big cast change.

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.

Original photo by Archive Images/ Alamy Stock Photo

We’ve all seen our share of talented children — the ambidextrous baseball pitchers, the ones who knock out “Für Elise” on the piano with surprising ease, or those who impress with a recitation of obscure facts from their favorite subjects.

Chances are, we’re witnessing something promising but hardly unusual; adept kids emerge in every generation. However, once in a blue moon, a youngster unleashes such a mind-blowing show of talent that global recognition becomes a distinct possibility. Here are eight such prodigies who quickly dispensed with the training wheels before zooming to the top of their respective fields.

Leopold Mozart playing the violin with his seven year old son Wolfgang Amadeus Mozart.
Credit: Rischgitz/ Hulton Archive via Getty Images

Wolfgang Amadeus Mozart

Intrigued by the harpsichord at age 3, Austrian Wolfgang Amadeus Mozart accelerated through lessons and delivered his first musical composition in 1761 at age 5. That was enough for his father, who sent young Mozart and his older sister — also a gifted musician — on a tour of European cities over the next decade. Mozart thrived despite the grueling traveling conditions, dashing off his first symphony at age 8 and his first operas not long after. At age 14, he transcribed Gregorio Allegri’s “Miserere” from memory after hearing it performed at the Sistine Chapel, and returned a few weeks later to make only minor corrections to his notes. Mozart, of course, went on to become one of the greatest composers of the classical period, and the early realization of his abilities allowed him the time to create more than 600 works despite an early death at age 35.

Shirley Temple as a child star wearing accordion pleated dress.
Credit: Bettmann via Getty Images

Shirley Temple

Few child stars in history have achieved as much renown as Shirley Temple. When she was 4 years old, Temple was already lighting up the screen in a series of film shorts called Baby Burlesks (1932). By age 7, she had already appeared in more than 10 feature films and earned a special juvenile Academy Award, and that was before she became Hollywood’s No. 1 box office draw for four years running. Temple eventually aged out of her bread-and-butter roles as America’s dimple-cheeked sweetheart, and her film career was over by the time she legally became an adult. Fortunately, she avoided the tragedies that plagued many of the child stars who followed in her footsteps by launching a successful second act as a prominent diplomat. Temple, who eventually went by her married name, Shirley Temple Black, was a delegate to the U.N. General Assembly from 1969 to 1970, served as U.S. ambassador to Ghana from 1974 to 1976, was the chief of protocol for President Gerald Ford, and served as ambassador to Czechoslovakia from 1989 to 1992, among other diplomatic roles.

13-year-old Bobby Fischer of Brooklyn, as the youngest member of the Manhattan Chess Club.
Credit: Bettmann via Getty Images

Bobby Fischer

Born in Chicago, Illinois, in 1943, Bobby Fischer began playing chess at age 6 after his big sister purchased a $1 set. His talent had blossomed by age 13, when Fischer defeated former U.S. Open champion Donald Byrne in the “game of the century.” He went on to become the youngest U.S. champion at age 14, the game’s youngest grandmaster at age 15, and the first American to claim the world championship. Unfortunately, after these early successes, an increasingly erratic Fischer became better known for his bigoted rants and troubles with the law, though his place in history is secure thanks to the early show of brilliance that popularized the insular game of kings.

Portrait of Sor Juana Inés de la Cruz.
Credit: Art Collection 4/ Alamy Stock Photo

Sor Juana Inés de la Cruz

There weren’t many pathways to success for girls born to unwed parents in 17th-century Mexico, but Sor Juana Inés de la Cruz managed to transcend her origins with a dazzling mind and a deft pen. Largely self-taught, she wrote her first dramatic poem at age 8, studied the Greek classics, and was instructing children in Latin by age 13. A few years later, she joined the court of the Viceroy Marquis de Mancera, where she famously wowed a panel of professors with her expertise in numerous subjects. Sor Juana then entered a convent, where she enjoyed the freedom to pen numerous plays, poems, and carols, as well as the proto-feminist manifesto Respuesta a sor Filotea de la Cruz. A clash with authority figures forced her to abandon her creative pursuits shortly before her death in 1695, but she endures as one of the most important literary figures of the New Spanish Baroque.

John Stuart Mill, British philosopher and social reformer.
Credit: Print Collector/ Hulton Archive via Getty Images

John Stuart Mill

English philosopher John Stuart Mill’s legacy as one of the great writers and thinkers of the 19th century was forged by a childhood devoted to academia. Undertaking a rigorous curriculum, Mill was studying ancient Greek by age 3, wrote a history of ancient Rome by age 6, and had mastered Latin by age 8. The training left him positioned to aid his philosopher father’s intellectual pursuits, but it also produced an inner turmoil that manifested in a nervous breakdown and a period of depression in his early 20s. It wasn’t until he started reading poetry that Mill began to understand the feelings that had been repressed since childhood, paving the way for his groundbreaking works on utilitarianism, intellectual freedom, capitalism, and gender equality.

Jascha Heifetz, Child Prodigy Violinist.
Credit: Buyenlarge/ Archive Photos via Getty Images

Jascha Heifetz

In 1903, at just 2 years old, Jascha Heifetz began learning the violin and rapidly developed fluency with the instrument that would carry him from his native Russia to all corners of the world. He made his formal public debut at age 8, performed before a reported 8,000 people at age 10, and played with the Berlin Philharmonic as an 11-year-old. A seasoned pro by his teenage years, Heifetz made his long-awaited Carnegie Hall debut at 16 and launched a prolific recording career shortly afterward. Heifetz was also a gifted pianist, and he enjoyed success as a Tin Pan Alley composer under the pseudonym of Jim Hoyl, though he remained most beloved for the violin wizardry that was apparent from the very beginning.

Dr. Von Neumann receives Freedom Medal.
Credit: Bettmann via Getty Images

John von Neumann

While not nearly as well-remembered as fellow European émigré and scholar Albert Einstein, John von Neumann was also a certifiable genius who made an enormous imprint on the world around him. Born in Budapest, Hungary, in 1903, von Neumann showed a turbocharged intellect by the early stages of grade school. He could converse in ancient Greek and multiply two eight-digit numbers in his head by age 6, and within two years he was already learning calculus. His father tried to dissuade him from a career in mathematics over fears that it wouldn’t provide a sustainable living, but von Neumann not only proved he could make a comfortable living in the field, he also showed his training could be applied to the development of game theory, personal computers, weather forecasting, and other real-world applications.

Willie Mosconi, of Philadelphia, youngest and the fastest pocket billiards player of all time.
Credit: Bettmann via Getty Images

Willie Mosconi

Billiards legend Willie Mosconi got his start playing the game in his father’s Philadelphia pool hall, even as his father tried to steer him toward a stage career. After the boy kept sneaking in to practice with a potato and broom handle, a resigned papa figured he could make the most of his son’s determination. In 1919, at age 6, Mosconi more than held his own in a match against world champion Ralph Greenleaf, and at age 11, he became the juvenile champ. From there, there was no slowing the man The New York Times called the Babe Ruth of his sport, who once sank a record 526 shots in a row and won the world billiards title 13 times over 15 years.

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.

Original photo by New Africa/ Shutterstock

How often do you really think about wind — other than how to get out of it? One of the most basic atmospheric forces on Earth, wind powers farms, carves landscapes, carries ships, and holds up kites. But what causes it? Why does it sometimes blow in different directions? What’s a trade wind, anyway? From the early origins of wind power to record-breaking wind speeds, these eight facts about wind might just blow you over.

A scarf blowing in the wind.
Credit: aldomurillo/ iStock

Wind Is Caused by Uneven Heating on the Earth

Ever wonder where the breeze on the beach comes from? It happens because during the day, the water warms up more slowly than the land next to it. This uneven heating causes changes in atmospheric pressure; warm air expands and rises, and the cool air from over the water blows in to take its place. This is a smaller-scale example of how wind works throughout the world — on a grander scale, the difference in temperature between the equator and the North and South poles creates large, powerfully windy bands across the Earth.

A night photo of wind at the beach.
Credit: Anna_Anikina/ iStock

Some Winds Reverse Course at Night

When wind rushes between water and land, the pattern is different depending on what time it is. During the day, the wind rushes inland — but at night, the land cools faster than the water, causing the wind to head back in the direction of the water. Pay attention to this phenomenon on your next long walk on the beach.

Heavy rain clouds over the ocean, doldrums.
Credit: daguimagery/ Shutterstock

There Are Five Major Wind Zones on Earth

Prevailing winds, such as trade winds, blow consistently in one direction. There are five major wind zones on the planet, each with its own pattern of prevailing winds: polar easterlies, westerlies, horse latitudes, trade winds, and the doldrums.

Polar easterlies are winds that blow from the east around the North and South poles. Westerlies blow in the opposite direction at midlatitudes, around the middle points between the poles and the equator; they’re strongest at around 40 to 50 degrees latitude in the Southern Hemisphere, blowing past New Zealand and the lower edges of Australia and South America. Horse latitudes, at about 30 degrees on either side of the equator, are warm areas with calm winds. Trade winds are incredibly predictable, powerful easterly winds that run through the tropics, named because of how vital they’ve been to seafaring, including trading ships, throughout history.

The doldrums, also known as the intertropical convergence zone, is a calm area where two bands of trade winds meet. The winds here are weak, and ships have been known to get stuck there.

Beautiful tall ship sailing deep blue waters.
Credit: JamesBrey/ iStock

Wind Energy Is Ancient Tech

Wind energy doesn’t just refer to turbine-generated wind power — it also refers to the sails of ships and the windmills that pump water or mill grain. Thousands of years ago, wind energy was propelling boats along the Nile River; ancient Egyptian art shows images of sailboats as early as 3300 BCE. Even before that, sails made from animal hide probably powered single-log rafts.

Panoramic view of a wind farm, with high wind turbines.
Credit: Vladimka production/ Shutterstock

The First Windmills Were in Asia

Windmills may conjure images of rural Europe, but the earliest windmills were water pumps in ancient China and grain mills in ancient Persia around 200 BCE. Windmills were in heavy use in the Middle East by the 11th century CE, when traders brought the technology north to Europe. The iconic windmills of the Netherlands started cropping up around 1200 CE.

An anemometer used for measuring the speed of wind.
Credit: Arthorn Saklang/ Shutterstock

The Fastest Recorded Wind Was 253 MPH

In 1996, during Tropical Cyclone Olivia, an Australian wind meter recorded a gust of a whopping 253 miles per hour. Oddly, it came during a Category 4 storm, not the more severe Category 5, although with sustained 140-mile-per-hour winds, the weather wasn’t exactly calm. The meter, which was extremely heavy-duty, was determined to be working properly, and experts believe the measurement was caused by a powerful mesovortex, a violent swirl of wind, passing directly over it; the odds are comparable to a tornado scoring a direct hit on a weather station. The previous record-holder was a 231-mile-per-hour gust in New Hampshire. Notably, these records are only for land-based anemometers, not Doppler-estimated speeds from tornadoes or measurements taken by dropsondes, which drop from aircraft to gauge conditions inside storms.

The wind raises the dust in the Sahara Desert.
Credit: Vova Shevchuk/ Shutterstock

Wind Carries Dust From the Sahara Desert All Over the World

The Sahara Desert is unfathomably massive, covering 3.3 million square miles in northern Africa — but its impact spreads even farther. Pushed by powerful trade winds, dust from the desert can travel halfway around the world to Texas and Florida (among other states), usually arriving in the summertime. It comes in quantities large enough to cause health problems, especially in people with asthma or other respiratory conditions.

Sand dunes on Libyan Desert, part of Sahara Desert.
Credit: hadynyah/ iStock

Wind-Created Geographical Features Are Called Aeolian Landforms

The best-known landscape features caused by wind are dunes, mounds of sand that are critical to the ecosystems of coastal areas. But dunes are only one example of aeolian landforms (named for the Greek god of wind, Aeolus). Some are soft, like loess, collections of yellow or tan sediment usually deposited by wind, such as the notable loess deposits along the Missouri River in Iowa. Others are more dramatic, like ventifacts, rocks carved by the wind into striking shapes and structures.

Sarah Anne Lloyd
Writer

Sarah Anne Lloyd is a freelance writer whose work covers a bit of everything, including politics, design, the environment, and yoga. Her work has appeared in Curbed, the Seattle Times, the Stranger, the Verge, and others.

Original photo by shaunl/ iStock

The modern world runs on electricity. This subatomic movement of electrons between atoms crisscrosses along millions of miles of transmission and distribution lines throughout the U.S., powering everything from our cars to our toasters. Although humans have been witness to electricity’s awesome power in the form of lightning since time immemorial, scientists have only truly probed the nature of electricity in the last few centuries — and completely transformed society in the process. These eight facts explore the history of electricity, how it’s integrated into our everyday lives, and what the future of this vital resource holds.

Thales of Miletus (C624-546 BC).
Credit: Universal History Archive/ Universal Images Group via Getty Images

Human Study of Electricity Goes Back to the Seventh Century BCE

Although electricity powers many modern inventions, the concept of electricity is an old one. The Greek thinker Thales of Miletus experimented with static electricity around 600 BCE by rubbing amber (known in Greek as elektron) against hemp or cat fur and noting that it then attracted light materials such as dust. Today, scientists call this phenomenon of friction causing a differential charge between objects the “triboelectric effect.” After Thales’ time, electricity remained largely a curiosity until electrostatic experiments made a comeback in the 17th century, when Isaac Newton’s onetime lab assistant Francis Hauksbee created an “electrostatic engine” and reignited interest in a hair-raising marvel that had fascinated the Greeks so many centuries before.

Battery of 18 Leyden jars.
Credit: Science & Society Picture Library via Getty Images

Benjamin Franklin Coined the Term “Battery”

Benjamin Franklin is a towering figure in American history, but his revolutionary efforts are rivaled by his extensive contributions to the study of electricity. Of course, Franklin’s most “electrifying” episode is his famous kite-flying adventure (though the kite was never struck by lightning, and only picked up ambient electric charge from a passing storm; some historians have even questioned whether the experiment ever actually happened). Franklin’s interest in electricity far exceeded this well-known experiment, however. He helped develop the idea of positive and negative charge as it relates to “electric fire,” as he called it. He also coined the term “battery” to describe a group of connected Leyden jars, a kind of 18th-century proto-capacitor (true batteries didn’t arrive until the early 19th century). Just as pieces of military artillery function together as a battery, these individual Leyden jars, working together, could store a greater electric charge.

Power towers along U.S. Route 50.
Credit: mark higgins/ Shutterstock

The U.S. Electric Grid Is the Largest Machine in the World

The world is full of some truly massive machines (ever laid eyes on NASA’s rocket-ferrying crawler-transporter?), but none comes even close to the largest machine in the world — the U.S. electric grid. With the creation of the first U.S. power station on September 4, 1882 (Thomas Edison’s Pearl Street Station in New York, which initially powered only 400 lamps and served a measly 82 customers), the country quickly electrified, and stations popped up throughout cities and suburbs. But it wasn’t until 1967 that these power stations became truly interconnected. Today, the U.S. electric grid actually contains three self-contained grids — the eastern, western, and Texas interconnections — composed of 7,300 power stations that service more than 100 million American homes. That’s a pretty impressive expansion from Edison’s initial 400 lamps.

View of an electric wall in a home.
Credit: Anastasija Vujic/ Shutterstock

The Electricity In Your Home Was Generated Milliseconds Ago

Although the U.S. grid has incredible power generation capacity, its ability to store that energy is less robust. While engineers are designing ways to store energy (especially renewable energy for when the sun doesn’t shine or the wind doesn’t blow), most electricity is generated on demand, meaning the power flooding your home was likely generated many miles away only milliseconds ago.

In many cases, electricity isn’t generated directly, but is instead a secondary energy source. Primary energy sources include things like coal, natural gas, wind, and nuclear fission, which either create steam that spins turbines or, in the case of wind, spin turbines directly. These turbines rely on the principle of electromagnetic induction, which transforms kinetic energy into electrical energy. So more likely than not, the light filling your room or powering your television was steam only milliseconds ago.

The peripheral nervous system connects the body to the brain.
Credit: Graphic_BKK1979/ iStock

The Human Body Contains Electricity

Although electricity makes the artifice of our technological world possible, it’s also an important biological process. For example, electric fish (think electric eels) motivated early electrical pioneers such as the Italian scientists Alessandro Volta and Luigi Galvani to investigate both biological electricity and the means to create it artificially. In the human body, electricity is the main ingredient of the nervous system, which sends electrical impulses throughout the body at speeds between 156 and 270 miles per hour. Our cells are built to transfer electricity, and it’s vital to our health that they do so, as our heartbeats require electrical impulses to time them correctly. In fact, all living things produce an electric field.

Lightning strikes the ground in a summer storm.
Credit: James Whitlock/ Shutterstock

There Are Roughly 8 Million Lightning Strikes on Earth Every Day

Humanity’s very first shocking run-in with electricity almost certainly came from observing lightning during a storm. Lightning forms when the attraction between the negative charge in the bottom of a cumulonimbus cloud and the positive charge on the ground becomes so great that they connect in an explosive display of electricity. Although people typically think of lightning striking the ground, these impressive arcs of light can also dance within a cloud or between two clouds. Lightning strikes can travel around 270,000 miles per hour while briefly superheating the surrounding air to a blistering 50,000 degrees Fahrenheit — five times hotter than the surface of the sun. Although some 40 million lightning strikes hit the U.S. every year (and roughly 8 million lightning strikes happen around the world per day), the chances of getting struck by lightning are roughly one in a million.

An electric motor cab and driver in London.
Credit: Heritage Images/ Hulton Archive via Getty Images

The First Electric Car Was Invented in the 1870s

Electric cars aren’t some new fad. In fact, the first EVs were invented at the dawn of the automobile itself in the mid-1800s. Most point to Scottish inventor Robert Anderson as the first person to build an electric car, in 1832, though its crude design made it an impractical replacement for the reliable horse and buggy. By the end of the 19th century, U.S. inventors including Iowa’s William Morrison had begun developing more reliable electric cars, and about one-third of all vehicles on the road in that era were electric. Although electric vehicles had distinct advantages over gasoline cars, as they were quiet and pollutant-free, the discovery of oil in Texas spelled the end of the U.S.’s short-lived electric car era. It would be decades later, amid a climate crisis largely perpetuated by those early EVs’ gas-guzzling competitors, that electric cars would once again rise in popularity.

The surface of a star, like our sun, burning through nuclear fusion.
Credit: Broadcast Media/ Shutterstock

Electricity Could One Day Come From Artificial Suns

Glimpsing the future of electricity on Earth also means doing a little bit of stargazing. For nearly a century, scientists have investigated the energy-producing physics that power the sun and all other stars in the known universe. This process, known as nuclear fusion (not to be confused with fission, which powers current nuclear reactors), occurs when two light atomic nuclei, such as those of hydrogen, fuse together under immense heat and pressure, producing tremendous amounts of energy. In December 2022, scientists for the first time produced more energy from a fusion system than they put into it, a nuclear milestone known as “ignition.” An international collaboration of universities and governments also hopes to power up the International Thermonuclear Experimental Reactor (ITER) megaproject, which will essentially create an artificial, Earth-bound sun as a step toward generating electricity from fusion.

If only Thales could see us now.

Darren Orf
Writer

Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.

Original photo by Allstar Picture Library Limited./ Alamy Stock Photo

Sunday, Monday, happy days! It turns out any day of the week is just fine to tune in to this small-screen favorite of the 1970s and early ’80s, which introduced viewers to the coming-of-age exploits of Richie, Potsie, Ralph, and the rest of the gang at the diner in Milwaukee. The past is rarely quite as simple or rosy as we remember, but let’s don our poodle skirts and letterman jackets to take a look back at seven fun facts about one of the longest-running series in TV history.

Henry Winkler and Scott Baio rehearse on the set of 'Happy Days'.
Credit: Michael Ochs Archives via Getty Images

A Revival of 1950s Nostalgia Ushered “Happy Days” to the Small Screen

As described in his memoir My Happy Days in Hollywood, series creator Garry Marshall initially drew little interest for his TV show about a family in 1950s middle America. The pilot, starring future regulars Ron Howard (Richie), Anson Williams (Potsie), and Marion Ross (Mrs. Cunningham), and featuring Harold Gould instead of Tom Bosley as Mr. Cunningham, was relegated to an episode of the ABC anthology series Love, American Style in February 1972. However, the off-Broadway production of Grease was gaining steam around that time, and when American Graffiti hit movie theaters the following summer, with the same Ron Howard leading the cast, the 1950s nostalgia wave was primed to carry Happy Days to its stand-alone premiere on January 15, 1974.

Scene of Ron Howard from Happy Days.
Credit: Bettmann via Getty Images

Ron Howard Accepted His “Happy Days” Offer to Avoid the Draft

Howard was ambivalent about accepting an offer to headline what became Happy Days, as he’d already experienced sitcom success with The Andy Griffith Show and was looking forward to starting film school at USC. However, he’d also been saddled with a “horrible draft number,” and given that he stood a better chance of avoiding the Vietnam War through work than through a college deferment, he elected to roll the dice with the good-natured ’50s sitcom.

Winkler as 'Arthur Fonzarelli', popularly known as 'The Fonz'.
Credit: Silver Screen Collection/ Moviepix via Getty Images

Henry Winkler Went Against the Grain to Land the Part of Fonzie

When auditioning for the role of tough guy Arthur “Fonzie” Fonzarelli, Henry Winkler decided to eschew the usual greaser moves of rolling cigarettes into his sleeve and combing his hair. When told he had to follow the script and comb his hair anyway, Winkler unveiled what became a trademark Fonzie move by reaching for his coif, before stopping to admire its perfection. Those efforts helped Marshall overcome his reservations about the actor’s physical appearance — he wanted the Fonz to be tall and blond, not short with dark hair — and Winkler demonstrated he was the right man for the part by devising catchphrases like “ayyyy” and “whoa” that proved irresistible to audiences.

The “Jump the Shark” Episode Was Instigated by Winkler’s Father

The show’s most infamous episode, in which a waterskiing Fonzie soars over a shark (thus launching a pop culture phrase about the decline of a once-prominent program), apparently stemmed from the insistence of Winkler’s dad, who pushed his son to show off his waterskiing skills on camera. Winkler brought the idea to the Happy Days producers, who proceeded to incorporate it into the 1977 three-part season premiere that brought the gang to sunny California. Episode writer Fred Fox Jr. isn’t exactly sure who decided that Fonzie would jump over a shark, but he is adamant that it wasn’t the moment when the show started going downhill; given that Happy Days aired another 164 episodes to mostly high ratings, he may have a point.

Promotional photo of the actors of Happy Days.

Chuck Cunningham Was Deemed Expendable During Season 2

Fans who watched the series from the beginning surely noticed the unexplained disappearance of the older Cunningham son, Chuck. Played by Ric Carrott in the pilot, the role was passed to Gavan O’Herlihy, who grew frustrated with simply bouncing a basketball in his limited screen time and quit halfway through the show’s first season. Randolph Roberts briefly picked up the reins for season 2, but it was becoming clear, as Marshall later explained, that “Fonzie was like the older brother and that was the relationship that was working.” Chuck was quietly written out of the show after the season 2 Christmas episode, the point driven home when Mr. C declared how proud he was of his “two kids” (referring to Richie and Joanie) at the conclusion of the series.

The “Happy Days” Theme Song Didn’t Open the Show Until Season 3

The famed “Happy Days” theme song, written by Norman Gimbel and Charles Fox and originally sung by Jim Haas, wasn’t used for the opening credits in seasons 1 and 2. That spot was reserved for a rerecorded take of Bill Haley’s “Rock Around the Clock,” with the similar-sounding Gimbel-Fox composition on the closing credits. However, an updated version of “Happy Days,” by Truett Pratt and Jerry McClain, accompanied the opening credits for season 3, and eventually made its way to No. 5 on the Billboard charts. “Happy Days” was later recorded again by Bobby Arvon and used to open the show for its final season in 1983-84.

A Happy Days tv sitcom photoshoot with the high school actors.
Credit: Allstar Picture Library Limited./ Alamy Stock Photo

Original Episodes Aired After the Finale

Speaking of conclusions, Happy Days delivered a memorable ending with the Joanie-Chachi wedding and Bosley breaking the fourth wall to thank viewers for sharing the experience. Of course, this wasn’t actually the final episode to air. Because ABC interrupted its usual schedule to broadcast the 1984 Winter Olympics, there were several unseen episodes by the time the emotional “Passages” story seemingly wrapped things up that May. ABC proceeded to broadcast the remaining episodes over the course of the summer, until “Fonzie’s Spots,” in which the Fonz, Roger, and Chachi try out for Mr. C’s Leopard Lounge club, formally brought Happy Days to an anticlimactic end on September 28, 1984.

Tim Ott
Writer

Tim Ott has written for sites including Biography.com, History.com, and MLB.com, and is known to delude himself into thinking he can craft a marketable screenplay.

Original photo by xalien/ Shutterstock

For most of recorded human history, time has been carved up into various numbers of days, months, and years. Some ancient cultures relied on the moon to note the passage of days, and this ancient tradition still impacts the way we talk about the calendar (the words “moon” and “month” are actually related). Eventually, mathematicians and astronomers encouraged counting the days using another prominent feature of Earth’s sky — the sun.

Over the course of a few millennia, the calendar has been shaped and rearranged to fit fleeting political whims, religious observances, bureaucratic challenges, and bizarre superstitions. The story of the calendar is the story of humanity, and the answers to these questions show why.

The months and days of the year on calendar paper.
Credit: JLGutierrez/ E+ via Getty Images

Why Are There 12 Months?

At its start in the eighth century BCE, Rome used a 10-month calendar traditionally believed to be created by its legendary wolf-suckled founder, Romulus. This was a lunar calendar: The beginning of a month, or a new moon, was called the “kalends,” while a waxing half-moon around the seventh of the month was called the “nones,” and a full moon around the 15th of the month was called the “ides.” In this calendar, the year started with March, ended in December, and only added up to about 304 days. So what happened to the 60 or so days between December and March? Well, nothing — Romans just waited for the first new moon before the spring equinox to start the new year, meaning that much of the winter passed in a period without a calendar.

This system, understandably, didn’t work well, and was soon reformed by Rome’s second king, Numa Pompilius, around 713 BCE. Pompilius added two months — now called January and February — to the end of the year, creating a 12-month calendar (they eventually moved to the front of the year by 450 BCE). The months totaled 354 days, but because of a Roman superstition around even numbers, an extra day was added to January. Since 355 days is still out of sync with the solar year, and thus the seasons and celestial events, the king then added extra days, called intercalation, to the latter part of February in certain years. This made the Roman calendar’s average length 366.25 days — still off, but much better than Romulus’ temporal train wreck.

Pompilius’ creation was eventually undermined by Roman pontifices, or priests, who wielded intercalation like a political cudgel — extending the rule of favored politicians while curtailing the terms of enemies. After 700 years, the Roman calendar was a mess, and the powerful general and statesman Julius Caesar decided to fix it. Following consultation with Rome’s greatest mathematicians and astronomers, he implemented the Julian calendar in 45 BCE. Influenced by the 365-day Egyptian calendar and the mathematics of the ancient Greeks, this calendar discarded Pompilius’ even-number superstition and brought the year to 365 days. But the most notable advancement of Caesar’s calendar was that it embraced the sun, rather than the moon, as its basis. Finally, after 700 long, horribly mismanaged years, the calendar was divided into our modern 12 months.

Monthly hand writing on colorful notes on a wooden board background.
Credit: nuntarat eksawetanant/ Shutterstock

Where Do the Names of the Months Come From?

The short answer is Rome, but the long answer is much more interesting. Remember Romulus’ 10-month calendar? Well, September, October, November, and December simply mean “seventh month,” “eighth month,” “ninth month,” and “tenth month” in Latin, respectively. But these names no longer made sense after the later additions of January, named after the Roman god Janus, and February, named after the Roman purification festival Februa. As for the rest of the months, March is named for the Roman god Mars, April after the Greek goddess Aphrodite (though there’s some debate about whether it might be based on the Latin word aperio, which means “I open” in relation to spring flowers), May after the Greek deity Maia, and June in honor of the powerful Roman goddess Juno.

The names of the last two months come from a few powerful Romans who got a little full of themselves. In 44 BCE, the month Quintilis (which means “fifth” in Latin) was changed to July in honor of Julius Caesar. His heir, Augustus, received the same honor in 8 BCE, when Sextilis (you guessed it, meaning “sixth” in Latin) was changed to August.

Aerial view of a 2022 desk calendar showing the month of February.
Credit: Pakin Songmor/ Moment via Getty Images

Why Is February the Shortest Month of the Year?

February has fewer days because of the superstitions of ancient Rome. In the late eighth century BCE, Romans — including their king Numa Pompilius — held a superstition that even numbers were somehow unlucky. Although he created a version of a 12-month calendar, Pompilius realized there was no mathematical way for every month to have an odd number of days and for the total number of days in the year to also be odd (the sum of 12 odd numbers is always even). So while the other months were either 29 or 31 days long, February became the unlucky month with only 28 days, leaving Pompilius’ calendar at the apparently less scary total of 355 days.

In 45 BCE, Caesar — disregarding Pompilius’ fear of even numbers — added days to a number of other months, but not February. Some experts believe Caesar didn’t want to disrupt the important festivals that took place in that month and so he just let it be. But with the introduction of the Julian calendar, February did receive a consolation prize in the form of an additional day every four years. Speaking of which …

February 29 written on a chalkboard, with a yellow Post-it note reading “Leap Day.”
Credit: Brigitte Pica2/ Shutterstock

Why Do We Need a Leap Day?

A year isn’t 365 days; it’s actually 365.24219 days. Because of our planet’s frustratingly imperfect solar orbit, calendars need small adjustments as the years pass to keep in alignment with equinoxes and solstices. Ancient astronomers and mathematicians figured that waiting four years and then adding a day made the most sense. In 45 BCE, Julius Caesar introduced the modern leap year, which added an extra day to February every four years (though originally that extra day was added between the 23rd and the 24th). This moved the calendar closer to solar reality, at 365.25 days. Close, but not close enough, which is where the pope comes in.

Smartphone screen showing a calendar for 2022, with a pen in a woman’s hands.
Credit: megaflopp/ iStock

Who Made the Modern Calendar?

In 1582, Pope Gregory XIII had a problem. As head of the Catholic Church, he realized that Easter — his religion’s holiest day — had drifted 10 days out of alignment with the spring equinox, which is used to calculate the date of Easter. That’s because Caesar’s small mathematical error had accumulated into a sizable drift when stretched across 1,600 years. Gregory XIII needed a very slight adjustment to the calendar, just enough to nudge it closer to that magical 365.24219 number. First, Gregory XIII lopped 10 days off the calendar to set things straight, then tweaked the leap year: whenever a new century began that wasn’t divisible by 400 (e.g., 1700, 1800, 1900), no extra day would be added. This edged things just enough in the right direction that the new calendar, named the Gregorian calendar, was now 365.2425 days long on average — close enough. Catholic nations adopted this new calendar immediately, but the Protestant British Empire, along with its American colonies, didn’t sign on until 1752. Today, the Gregorian calendar is used in nearly every country.
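To put the rule in concrete terms, here is a minimal Python sketch of the Gregorian leap-year test described above (the function name is purely illustrative):

def is_leap_year(year):
    # Every 4th year is a leap year, except century years that
    # aren't divisible by 400: 1700, 1800, and 1900 were skipped,
    # while 1600 and 2000 kept their extra day.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# Averaged over 400 years, the rule gives 365 + 97/400 = 365.2425 days per year,
# just shy of the true solar year of about 365.24219 days.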

An idyllic view of the ancient stone portico in the square of San Pellegrino.
Credit: Photo Beto/ iStock

When Did We Start Using B.C. and A.D.?

Before the invention of A.D. (“anno domini,” which means “in the year of our Lord”) and B.C. (“before Christ”), years were often tracked by the reigns of pharaohs, kings, and emperors. In a way, B.C. and A.D. still reflect this system but focus on just one moment: the birth of Jesus. It’s difficult to trace the exact origins of this system, but one of the earliest recorded uses of “anno domini” occurs in 525 in the work of Dionysius Exiguus, a monk who was trying to determine what days Easter would fall on in future years. Crucially, he started his tables with the year 532, stating that this year was “from the incarnation of our Lord Jesus Christ.”

The conception of “B.C.” is slightly murkier. Some believe the Venerable Bede, the famous medieval English historian, was the first to use it, or at least greatly popularized it, in his 731 work Ecclesiastical History of the English People. Others point to a 1627 work by a French Jesuit who used “ante Christum” to describe the pre-Jesus years. The anno domini system itself became more widespread during the reign of the Holy Roman Emperor Charlemagne, who used it as a standard form of dating across Europe in the ninth century.

Within the last few decades, more publications and organizations have opted to strip the years of their religious connotation, preferring BCE (Before the Common Era) and CE (Common Era) over the traditional B.C./A.D. system, although the move is not without some controversy. But this subtle change in phrasing doesn’t alter the fact that the world still counts the years in accordance with the birth of Jesus.

Aerial view of a person planning out their weekly calendar.
Credit: seb_ra/ iStock via Getty Images Plus

Why Is a Week Seven Days?

The seven-day week is a timekeeping oddity. Unlike days, months, and years, the week doesn’t align with any celestial reality, and it doesn’t divide elegantly into existing periods of time. For example, there aren’t 52 weeks in an average year; there are 52.1428571429 (365 divided by 7). So how did this happen? The Babylonians, the ancient superpower of Mesopotamia, put a lot of stock in the number seven thanks to the seven celestial bodies they could observe with the naked eye: the sun, the moon, Mercury, Venus, Mars, Jupiter, and Saturn. That reverence shaped the seven-day week, which was adopted by the Jewish people, who were held captive by the Babylonians in the sixth century BCE. Eventually, it spread to ancient Greece and elsewhere thanks to the battle-happy Macedonian Alexander the Great. Efforts have been made throughout history to reform the seven-day week, but this oddball unit of time has become ingrained in many religions, including Judaism, Christianity, and Islam, rendering any sort of tweak pretty unlikely.

Darren Orf
Writer

Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.

Original photo by Varavin88/ Shutterstock

With more private vehicles than ever on crowded freeways, drivers are spending more time on the road and more of it stuck in traffic. Over 90% of American households have access to at least one car, so chances are you’ve been caught in gridlock a few times before. What causes congestion, especially the jams that seem to come from nowhere? Is traffic getting worse? How much time do we spend in traffic, anyway? These seven facts about traffic will give you something to think about when the freeway slows to a crawl.

Aerial view of a traffic jam during rush hour.
Credit: Jens Herrndorff/ Unsplash

The Average American Lost 51 Hours in Traffic in 2022

According to the traffic analytics firm INRIX, Americans spent, on average, 51 hours stuck in traffic in 2022. That may seem like a lot, but the United Kingdom had it worse: Brits lost 80 hours on average. Those delays also cost American drivers an average of $546 in fuel, INRIX says. All of those numbers are a huge jump from the year before, but that doesn’t mean much, because …

A view of an empty street with no cars during a COVID-19 lockdown.
Credit: Education Images/ Universal Images Group via Getty Images

Traffic Dropped During the COVID-19 Pandemic

With many workers required to stay home at the beginning of the COVID-19 pandemic, fewer cars were on the road, and congestion dropped. The Texas A&M Transportation Institute noted that in 2020, January and February were pretty normal — but in March, after COVID lockdowns started to take effect in much of the country, traffic looked more like it did in the early 1990s. By the fall, traffic had started to creep back up to around 2005 levels.

Even after congestion started to ramp back up, however, the effects of the pandemic lingered. According to INRIX, some major American cities, including New York, Boston, and Philadelphia, still saw decreases in 2022 compared to 2019.

Traffic in downtown Chicago with people, train, car and bus.
Credit: f11photo/ Shutterstock

Chicago May Have the Worst Traffic in the U.S.

Unlike in some cities, traffic in Chicago is slightly worse than it was before 2020. According to INRIX’s annual global rankings, Chicago is the most congested city in the United States by a significant margin, with the average commuter losing 155 hours to traffic delays in 2022. On a global scale, it’s nearly tied with London, which INRIX ranked as the most congested city in the world.

A woman with a stressed look, sitting in traffic.
Credit: stefanamer/ iStock

Traffic Really Does Come Out of Nowhere Sometimes

If you drive, chances are you’ve ended up in traffic with no clear cause — there are no accidents or construction, and yet somehow you’re at a total standstill. These are called phantom traffic jams or jamitons, and experts say you can help avoid them by not riding anyone’s bumper.

These jams happen when one driver abruptly slows or brakes and the next driver slows or stops to avoid a collision. That slowdown travels backward through traffic like a wave, usually stretching 100 to 1,000 yards, even as the drivers at the front of the line return to normal speeds.

The easiest way to prevent phantom traffic jams, according to multiple studies, is to keep about an even distance between the car in front of you and the car behind you — so avoid tailgating, but also keep an eye on your rearview mirror. Giving cars more space to gradually adjust speed won’t completely eliminate phantom jams, but it will make them less likely to occur.
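
To make that wave concrete, here’s a toy car-following sketch in Python. It’s not a validated traffic model, and every number in it (speeds, gaps, reaction rates) is an illustrative assumption, but it shows how a single hard brake by a lead car ripples backward through a line of drivers who each react only to the car directly ahead.

```python
# Toy car-following sketch (not a validated traffic model): each driver
# relaxes toward a target speed set by the gap to the car ahead.
N, STEPS, DT = 20, 240, 0.5                  # cars, time steps, seconds per step
pos = [-20.0 * i for i in range(N)]          # cars start 20 meters apart
vel = [15.0] * N                             # everyone cruising at 15 m/s
slowest = [(15.0, 0)] * N                    # (lowest speed seen, step it happened)

for t in range(1, STEPS + 1):
    new_vel = vel[:]
    # Lead car brakes hard once at step 20, then eases back up to 15 m/s.
    new_vel[0] = 5.0 if t == 20 else min(vel[0] + 1.0, 15.0)
    for i in range(1, N):
        gap = pos[i - 1] - pos[i]
        target = min(15.0, max(0.0, 0.75 * gap))   # smaller gap -> slower target
        new_vel[i] = vel[i] + 0.5 * (target - vel[i]) * DT
    vel = new_vel
    pos = [p + v * DT for p, v in zip(pos, vel)]
    slowest = [min(s, (v, t)) for s, v in zip(slowest, vel)]

for i in (0, 5, 10, 15):
    print(f"car {i:2d}: slowest speed {slowest[i][0]:4.1f} m/s at step {slowest[i][1]}")
```

The printout shows each sampled car hitting its slowest speed later than the car ahead of it did, which is the phantom jam’s backward-traveling wave in miniature.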

Traffic street light at a city intersection.
Credit: Pgiam/ iStock

Traffic Lights Used To Be Manually Operated

The first traffic light was gas-powered and installed in London on December 9, 1868, just outside the Houses of Parliament. Automated lights were still several decades away, so the signal had to be manually operated by a police officer 24 hours a day. This first light lasted only a month. London wouldn’t attempt traffic lights again until the city installed automatic ones, but other cities adopted manual traffic lights and spent decades staffing them. The first electric traffic light was installed in Cleveland, Ohio, in 1914.

A man on the phone while looking at a scratch on his car door from a fender bender.
Credit: MarianVejcik/ iStock

Car Accidents Are More Likely Right After the Daylight Saving Switch

If you’re not a morning person, the days immediately following the switch to daylight saving time — or “spring forward” — might be a struggle, especially that first Monday morning. With so many sleepy drivers on the road, accidents are more likely. The average American’s risk of getting in a car accident rises by around 6% that day, along with an increase in other potentially fatal events such as heart attacks and strokes.

Cars lined up on a road during a traffic jam in China.
Credit: China Photos/ Getty Images News via Getty Images

A Traffic Jam in China Lasted More Than a Week

In August 2010, motorists outside Beijing got stuck in a real doozy of a traffic jam, caused by the grim combination of a construction project and too many cars on an already overburdened freeway. The jam stretched for 60 miles and lasted around 11 days. Some individual cars were stuck on the road for more than five days. Local villagers took advantage of the situation by selling food and water to drivers at a premium, including water bottles marked up to around 10 times their original price. Officials estimated the jam could have dragged on for a couple of weeks longer; thankfully it didn’t, although major traffic jams continued to be a problem.

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.

Original photo by microgen/ iStock

Harry Houdini and David Blaine may be household names, but their success stands on the shoulders of magicians who haven’t received much recognition. The “celebrity magician,” after all, is a recent phenomenon: For centuries, illusionists and escape artists were maligned as lowlifes (at best) and criminals (at worst). But none of that stopped these magic-makers, who helped pave the way for our modern superstars. From mythical sorcerers to skeptical writers, here are some of the most influential magicians in history.

Enigmatic surrealistic optical illusion.
Credit: edcophotogrphy/ iStock

Djedi: History’s Most Captivating Decapitator

An Egyptian magician who purportedly lived 4,700 years ago, Djedi may have been history’s first illusionist. According to the Westcar Papyrus, an ancient Egyptian text, Djedi could magically remove, and then reattach, the heads of living animals: geese, waterfowl, and even bulls. (Centuries later, David Blaine would re-enact the stunt with a chicken.) Historians, however, caution that the magician’s greatest trick was fooling us into believing he existed: Djedi might be a work of fiction.

The moment at the feast of Belshazzar.
Credit: Three Lions/ Hulton Archive via Getty Images

Belshazzar’s Incompetent Magicians: The Reason There’s Writing on the Wall

The Bible contains dozens of references to sorcerers, necromancers, and conjurers. In the First Book of Samuel, the Witch of Endor summons the spirit of a prophet. In the apocrypha, Simon Magus is able to levitate and even fly. But one of the most famous references to magicians appears in the tale of King Belshazzar’s Feast. As the story goes, the king was enjoying an opulent meal when a hand mystically appeared and began to write a cryptic message on a nearby wall, spelling out his doom. A panicked Belshazzar asked his magicians to interpret the message, but they failed, and Belshazzar soon died. The scene is now immortalized in the idiom “to see the writing on the wall.”

Portrait of Luca Bartolomeo de Pacioli or Paciolo.
Credit: DEA / A. DAGLI ORTI/ De Agostini via Getty Images

Luca Pacioli: The Accountant Who Could Breathe Fire

An Italian mathematician and friar who lived in the 15th century, Luca Pacioli is widely considered the “Father of Accounting.” But his skills extended beyond bookkeeping: He’s also one of the earliest writers on the art of magic. His unpublished 1508 manuscript De Viribus Quantitatis discusses an array of magic tricks: how to make an “egg walk over a table,” how to make a “cooked chicken jump on the table,” and how to “make a snow torch that burns.” He’s also the first known writer to discuss various card tricks, coin tricks, and fire-eating techniques.

Photograph of Ching Ling Foo.
Credit: The History Collection/ Alamy Stock Photo

Ching Ling Foo: America’s First Chinese Superstar

The first Chinese performer to hit it big in the U.S., Ching Ling Foo packed houses with his performances in 1899 and became a superstar. An expert in traditional Chinese illusions, Foo could throw a shawl into the air and, as it settled to the ground, conjure large objects out of thin air. Unfortunately, Foo would become the victim of a racist scam. An American magician named William Robinson stole Foo’s act, dressed in yellowface, called himself “Chung Ling Soo,” and billed himself as Foo’s competitor: “The Original Chinese Conjurer.” The two magicians would feud for the rest of their lives.

'Father of modern conjuring' performing at St James's Theatre in London.
Credit: Photo 12/ Universal Images Group via Getty Images

Jean-Eugène Robert-Houdin: The Clockmaker With Magic Hands

A French clockmaker, Robert-Houdin developed fine motor skills fixing cogs and gears in his family’s shop, and then began using them to learn sleight-of-hand tricks. He used this know-how to build automata and other mechanical wonders, which helped him draw audiences in the mid-19th century. It wasn’t long before Robert-Houdin was performing conjuring acts for mass audiences. Today, Robert-Houdin is widely recognized as the father of modern magic, having transformed it from a low-class art form into something the theater-going wealthy could enjoy. He’d also inspire a young Ehrich Weiss, a Hungarian-American escapologist whom you might know by a different name: Harry Houdini.

The Herrmann the Great Co.’s third annual tour.
Credit: Buyenlarge/ Archive Photos via Getty Images

Alexander and Adelaide Herrmann: Magicians with a Funny Bone

Few people have shaped our definition of a magician more than Alexander Herrmann. Called “Herrmann the Great,” the Victorian-era Frenchman was one of the first people to pull a live rabbit out of a hat. But Herrmann’s most important contribution to modern magic was his performing style: He was one of the first magicians to make a comedy routine central to his performance. His wife, Adelaide, was no slouch, either. Called the “Queen of Magic,” she’s believed to be the first woman to ever perform the dreaded “bullet catch” trick, and she continued to tour internationally for another 25 years after Alexander’s death.

Escapologist and conjuror Jasper Maskelyne after escaping from a coffin at the Kingscourt Hotel.
Credit: Keystone/ Hulton Archive via Getty Images

Jasper Maskelyne: The Illusionist Who Deceived the Nazis

Every magician, at their core, is a master of deception. But when Jasper Maskelyne moved his act from the stage to the theater of war, his deception skills were used to save lives. During World War II, Maskelyne joined the British military and used his knowledge as an illusionist to trick the Nazis. His team took camouflage to a new level, creating elaborate decoys to fool enemy pilots: fake harbors filled with phony boats, and dazzling light displays that, from above, looked like cities. The illusions reportedly caused the enemy to waste tons of ammunition.

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.