Original photo by Piotr Wytrazek/ Shutterstock

Decades ago, the world didn’t just look different; it sounded different, too. We communicated, watched our favorite movies, and did mundane tasks using different devices, and as technology has progressed, so has the noise we hear every day. A smartphone buzzing on a table would have been an unfamiliar noise 20 years ago — and a lot of stuff we used back then has fallen silent today.

Take a listen down memory lane with these eight sounds that we don’t hear much anymore, from old-timey internet accessories to vintage AV equipment.

An old, external dial-up modem.
Credit: Doug McLean/ Shutterstock

Dial-Up Modem

Back in the early days of the internet, your connection worked through your landline phone. Instead of having your internet on most of the time, you had to deliberately connect by asking your computer to dial in. That started a telltale series of intense-sounding noises, beginning with a dialing sound and escalating into bouncing beep-boops and several pitches of static. This song-and-dance served a purpose: The sounds were the audible steps of a handshake, two computers negotiating a connection over infrastructure borrowed from voice calls.

Because the connection tied up your phone line, if you didn’t have a second line and somebody tried to call you, they’d get another sound you don’t hear too often nowadays …

Detail of a woman's hand picking up an old white corded telephone.
Credit: Alicia Fdez/ Shutterstock

Busy Signal

It’s now really easy to put someone on hold to answer another call. But back when nearly everyone had a landline, it was common to call someone and hear a series of beeps indicating that they were on another call. Call waiting eventually became available for nonbusiness landlines, but it still wasn’t as easy to switch over as it is on a smartphone, since there wasn’t any visual interface to guide you.

You’ve got mail message concept with computer keyboard.
Credit: Eviart/ Shutterstock

“You’ve Got Mail!”

America Online, better known as AOL, used to be America’s biggest internet provider, and was so ubiquitous in everyday culture that you didn’t have to be a subscriber to know what it sounded like to get an email via the service. A male voice semi-enthusiastically stating, “You’ve got mail!” was so well known that it even lent its name to an A-list rom-com.

Closeup of an old fax machine.
Credit: Takaeshiro/ Shutterstock

“Accidentally Called a Fax Machine” Sound

Having to key in a number every time you called someone — as opposed to just finding someone in your contact list or making Siri call someone for you — meant that mistakes were inevitably made. Sometimes you’d read the wrong line of a business card, dial the wrong number, or just catch someone at the wrong time and get a screeching ringing sound, indicating that there was a fax machine on the other end.

Vintage television with test pattern.
Credit: shaunl/ iStock

TV Test Pattern Beep

If you still have TV service, there’s something on 24/7, even if it’s infomercials. Years ago, however, channels would eventually pack it in for the night and display a test pattern — a series of colorful bars designed for calibrating a color TV or, on the other end, a camera. (Before color TVs, they looked much different.) This was often accompanied by an obnoxious long beep for calibrating audio.

Man rewinding a cassette tape.
Credit: Pingun/ Shutterstock

Rewinding Tape Noise

From the 1970s until DVDs took over, most home video was on VHS tapes, which used a length of magnetic tape to store audio and video. Tape moved from one spool to the other as the video played, so if you wanted to go back to the beginning, you’d have to rewind it, which made a distinct whirring sound. The same thing applied to audiocassettes, although you could flip those over and play the other side to get back to square one.

Retro rotary telephone.
Credit: WPixz/ Shutterstock

Rotary Telephone Dials

When you dial a phone — even a landline — you’re typically pressing buttons, not actually dialing. Rotary telephones predate the touch-tone models most people are used to, and had an actual round dial, with different points corresponding to different numbers. To call someone, you rotated the dial from each number to the finger stop, let go, and waited for it to spin back to its resting position before entering the next digit. The dial’s return made a rapid clicking sound — each click a pulse that the telephone exchange counted to identify the digit.

Close-up of an adding machine.
Credit: Lloyd Paulson/ Shutterstock

Adding Machine Typing and Tape Sounds

If you needed to crunch some numbers and keep a record of your work, you used to need an adding machine — a calculator that printed out each equation and sum as you typed. You could then use the printout as a receipt or go back and check your work. It made a distinct series of sounds: an electric typewriter-esque tapping as you entered the numbers, then a big crunch when you told it to add or subtract and it advanced to the next line.

It was a pretty common sight (and sound), especially in stores, in banks, and around tax time, until everybody had a computer in their pocket that could do the same thing.

Sarah Anne Lloyd
Writer

Sarah Anne Lloyd is a freelance writer whose work covers a bit of everything, including politics, design, the environment, and yoga. Her work has appeared in Curbed, the Seattle Times, the Stranger, the Verge, and others.

Original photo by Trinity Mirror / Mirrorpix/ Alamy Stock Photo

He was an irresistibly compelling actor who exploded from the stage in A Streetcar Named Desire and from the screen in On the Waterfront and The Godfather, before seemingly rejecting the talent and beauty that had made him so famous. But even in a career marked by as many disappointments as triumphs, Marlon Brando was never anything less than an original character. Here are nine facts about the life of a Hollywood icon who raised the bar for all the leading men who followed.

Young Marlon Brando writing at a desk.
Credit: Pictorial Parade/ Archive Photos via Getty Images

A Teenage Brando Was Expelled From Military School

The youngest child of a strict father and an alcoholic mother, and hamstrung by dyslexia, Brando acted out in school. According to Peter Manso’s biography, Brando orchestrated an endless series of pranks while accumulating just six of 15 possible credits over three years at Libertyville (Illinois) High School, forcing his dad to arrange a transfer to Minnesota’s Shattuck Military Academy. But the disruptions continued in Shattuck’s hallways, with Brando at one point hiding the dining room silverware before he was kicked out at the end of his second year. Amazingly, the cadets who were often the butt of his jokes threatened to boycott classes over what they felt was an unfair expulsion, and Brando later framed the letter of support they wrote to him.

Scene From "A Streetcar Named Desire" featuring Marlon Brando.
Credit: Bettmann via Getty Images

He Nearly Blew the Opportunity for His Breakout Stage Role

After witnessing Brando’s impressive audition for A Streetcar Named Desire in August 1947, director Elia Kazan gave the magnetic young actor money to take a bus to Massachusetts for a further tryout with playwright Tennessee Williams. Brando instead spent the cash on party supplies, before hitchhiking his way to Massachusetts a week later. Upon reaching Williams’ home, Brando smoothed over any bad feelings about his late arrival by fixing a blown fuse and broken toilet. A quick read for the part sealed the deal, and Brando was on his way to revealing his preternatural talent to the world.

Marlon Brando In One-Eyed Jacks film.
Credit: Silver Screen Collection/ Moviepix via Getty Images

He Directed One Feature Film

Taking on an outsized role in the production of One-Eyed Jacks (1961), Brando drove out original helmer Stanley Kubrick and took on double duty as director and star of the Western. That worked out fine for his artistic sensibilities, but Brando’s habit of letting the camera endlessly roll as characters improvised their way through scenes took its toll on the schedule and budget. After viewing the costly, 4.5-hour director’s cut, producer Frank P. Rosenberg complained, “That’s not a picture. That’s just an assemblage of footage.” One-Eyed Jacks was whittled down to 141 minutes, and while the still-meandering final product has its admirers, the experience was apparently off-putting enough to discourage its star from returning to the director’s chair.

Marlon Brando's bungalow with his ham radio equipment on Tetiaroa Beach circa 1979.
Credit: Images Press/ Archive Photos via Getty Images

Brando Owned an Atoll in French Polynesia

After frolicking in the tropical locale of Tahiti during the filming of Mutiny on the Bounty (1962), Brando decided to take a slice of paradise for himself by buying the nearby atoll of Tetiaroa in 1966. Although the initial plan was to build a hotel as part of what would become a self-sustaining community, Brando preferred using the property as a private retreat for himself, family, and friends, though he neglected to put in the time and money needed for its upkeep. He steered clear of Tetiaroa following a tragedy involving his son and a daughter’s boyfriend in the early 1990s, and a section of the atoll was leased to a developer after the actor’s death in 2004. That area now boasts the Brando Resort, the sort of exclusive vacation destination its namesake was reluctant to develop while still alive.

Marlon Brando and Robert Duvall in a scene from Francis Ford Coppola's 1972 'The Godfather'.
Credit: Screen Archives/ Moviepix via Getty Images

He Spontaneously Created Vito Corleone’s Persona During a Screen Test

Although Paramount executives were loath to cast Brando in The Godfather (1972) following his string of poorly received films, director Francis Ford Coppola convinced them to at least consider a screen test. He subsequently brought a camera to Brando’s home, where the just-awakened host, realizing this was his audition, quickly slipped into his interpretation of Mafia boss Vito Corleone. Suggesting that Corleone should “look like a bulldog” and talk in a peculiar way, Brando stuffed tissues into his mouth and began acting out the character, even delivering that now-famous mumbling when answering a phone call. The once-leery execs were floored by the footage, paving the way for Brando’s highly celebrated comeback.

Sacheen Littlefeather refuses the Academy Award for Best Actor on Marlon Brando's behalf.
Credit: Bettmann via Getty Images

Brando Surprised Sacheen Littlefeather With His Plan for the 1973 Oscars

In one of the more infamous moments of his career, Brando sent actress and activist Sacheen Littlefeather to the 1973 Academy Awards to decline his Best Actor Oscar over “treatment of American Indians today by the film industry.” Littlefeather, who had struck up a friendship with the actor via their shared passion for Native American rights, reportedly wasn’t aware of the full scope of Brando’s plan until the afternoon of the Oscars telecast. She then waited as Brando worked on a lengthy speech, leaving her barely enough time to make it to the ceremony, and endured harassment in the parking lot before making it back to the safety of the actor’s home. According to Manso’s biography, Brando was happy with her effort, although he later noted that he’d “probably handle it differently” were he to do it all over again.

Marlon Brando And Jack Nicholson In 'The Missouri Breaks'.
Credit: Archive Photos/ Moviepix via Getty Images

He Was Close Friends With Fellow Star Jack Nicholson

While Brando and Jack Nicholson made for a fun pairing in the 1976 Western The Missouri Breaks, the two were far closer than your typical co-stars. The actors shared a driveway as Los Angeles-area neighbors for about 30 years, and at one point even lived together while Nicholson was going through a divorce. Nicholson helped care for Brando toward the end of his life, after which he penned a heartfelt obituary in Rolling Stone magazine. He also purchased the late actor’s mansion with the hope of making it available to Brando’s children, but reportedly turned it into a garden when none of them showed any interest in the property.

Marlon Brando plays the bongos in his Hollywood Hills home, 1955.
Credit: Bettmann via Getty Images

Brando Received Four Patents for a Drum Tuner

An enthusiastic percussionist with an ear for Afro-Cuban music and an innovative mind, Brando devoted much of his final years to developing a new and improved conga drum. Collaborating with a custom drum parts maker and a patent attorney, Brando obtained four patents for his drum tuner, a single lever and linkage system designed to replace the five or six bolts normally used for the purpose. He even produced a few working prototypes, but was unable to get the design licensed before his passing.

Close-up of Marlon Brando.
Credit: Ron Galella Collection via Getty Images

More Than 300 Hours of Confessional Audio Tapes Were Found After His Death

Although Brando published an autobiography in 1994, that book provided only a partial reveal of a celebrity who increasingly shunned the spotlight. Additional insights arrived around 20 years later, when a production team gained access to more than 300 hours of audio recordings of the actor reflecting on his troubled upbringing, his own struggles as a father, his relationship with fame, and much more. Producers also found rudimentary 3D scans of Brando’s head, and used updated technology to match some of the audio with his animated, talking face. The result was the 2015 documentary Listen to Me Marlon, a life story narrated solely by the enigmatic star amid old film and interview clips — one final posthumous screen performance.

Tim Ott
Writer

Tim Ott has written for sites including Biography.com, History.com, and MLB.com, and is known to delude himself into thinking he can craft a marketable screenplay.

Original photo by Photo 12/ Alamy Stock Photo

Most of us are familiar with holiday-themed movies like It’s a Wonderful Life and A Christmas Story, whether they’re remembered from childhood or part of an annual tradition today. But even the die-hard enthusiasts who’ve committed the dialogue to heart may not know the behind-the-scenes stories that helped bring these heartwarming films to life. Here are nine facts about some of the classics that regularly show up on our TVs in November and December, but of course can be enjoyed at any time of year.

James Stewart and Donna Reed in It's a Wonderful Life.
Credit: Screen Archives/ Moviepix via Getty Images

A New Kind of Fake Snow Was Created for “It’s a Wonderful Life”

The problem that plagues many holiday movies is how to create convincing snow when there isn’t any, and It’s a Wonderful Life (1946) director Frank Capra wasn’t satisfied with the bleached cornflakes that had been used to middling effect in other Hollywood features. Fortunately, RKO special-effects man Russell Shearman found a solution by mixing the carbon dioxide foam found in fire extinguishers with soap, sugar, and water. The resulting mix not only looked the part (and was much less noisy than cornflakes), but it also held up through fan-controlled applications that could be sped up to simulate a blizzard. More than enough of this “snow” was created to give the fictional Bedford Falls a wintry backdrop despite the film’s summertime shoot, and Shearman later received a technical achievement Oscar for his contribution to movie magic.

Parade scene in Miracle on 34th Street.
Credit: Collection Christophel/ Alamy Stock Photo

The Thanksgiving Day Parade in “Miracle on 34th Street” Was Real

Staging a parade in a movie can be an arduous undertaking with all the performers, set pieces, and choreography involved, but the creators of Miracle on 34th Street (1947) were fortunate to gain permission to hitch their wagons to New York City’s annual Macy’s Thanksgiving Day Parade in 1946. As co-star Maureen O’Hara recalled in her memoir, the experience of working around the event’s schedule was stressful for everyone involved: “They weren’t going to run the parade more than once on our account … It was a mad scramble to get all the shots we needed and we got to do each scene only once.” Nevertheless, the cameras got enough footage of Edmund Gwenn’s Kris Kringle waving to fans as he rode through Manhattan in Santa’s sleigh, and the authenticity of the scene set the tone for what became a true holiday classic of Hollywood’s golden age.

Actors singing in 1954's White Christmas.
Credit: John Swope/ The Chronicle Collection via Getty Images

“White Christmas” Was Supposed To Pair Fred Astaire With Bing Crosby

Following the success of 1942’s Holiday Inn and 1946’s Blue Skies, 1954’s White Christmas was meant to once again pair the singing and dancing talents of Bing Crosby and Fred Astaire. When Astaire declined to participate over his dissatisfaction with the script, the role of Phil was offered to Donald O’Connor. When O’Connor was stricken with illness before production began, the casting merry-go-round ended with Danny Kaye stepping in. Crosby at one point also backed out of the movie following the death of his wife in 1952, before returning to play the part of Bob the following year.

Rudolph the Red-Nosed Reindeer, 1964.
Credit: Pictorial Press Ltd/ Alamy Stock Photo

A Pioneering Japanese Stop-Motion Animator Was Behind “Rudolph the Red-Nosed Reindeer”

Tadahito Mochinaga created China’s first stop-motion puppet animation with a 1940s propaganda film mocking nationalist leader Chiang Kai-shek, and he created Japan’s first stop-motion puppet animation the following decade for a beer company. Those pioneering efforts caught the attention of American TV producers Arthur Rankin and Jules Bass, who tapped the Tokyo-based filmmaker to animate an adaptation of a Depression-era Christmas story turned hit holiday tune. Mochinaga brought his trademark detail to the project, even spending time in a Japanese deer sanctuary to better render the distinct features of the main characters. The mesmerizing result can still be witnessed many years later, as Rudolph the Red-Nosed Reindeer (1964) became the first in a string of popular Rankin/Bass seasonal holiday programs, en route to becoming the longest-running Christmas special in TV history.

A Charlie Brown Christmas scene.
Credit: Photo 12/ Alamy Stock Photo

Head Animator Bill Melendez Voiced Snoopy in “A Charlie Brown Christmas”

A Charlie Brown Christmas (1965) marked the Peanuts gang’s first major entry into the world of animated television. This brought numerous questions about how to translate the popular comic strip to the screen, among which was what to do about the voice of Snoopy. Although Peanuts creator Charles Schulz wanted to downplay Snoopy’s role, head animator Bill Melendez insisted on enhancing the beagle’s personality through his voice, and set about recording a series of noises that he hoped could be replicated by a trained voice actor. With time running out to finish the special, Melendez went with the sped-up, higher-pitched recordings he had been tinkering with instead of hiring another actor. Schulz was amused by Snoopy’s nonsensical ramblings, and Melendez was rewarded with the responsibility of voicing Charlie Brown’s pet for subsequent TV specials and animated features.

Tongue pole scene in A Christmas Story (1983).
Credit: Moviestore Collection Ltd/ Alamy Stock Photo

Flick’s Tongue Wasn’t Really Stuck to the Flagpole in “A Christmas Story”

You may have already figured there was nothing approaching actual danger for the actor in this enduring scene of A Christmas Story (1983), although clever set design ensured that the visual of a tongue stuck to a frozen flagpole seemed real enough. According to actor Scott Schwartz, the pole was wrapped with a layer of plastic, through which a clear tube ran down to a motorized vacuum buried in the snow. When Schwartz’s Flick plugged his tongue into a tiny hole in the plastic, the tube’s suction was strong enough to hold his tongue in place, but mild enough that he could withdraw it easily. All in all, it was painless enough for Schwartz to shoot the entire scene twice — after the first round of footage was damaged by underdeveloped film.

Car scene in Planes, Trains and Automobiles film.
Credit: United Archives GmbH/ Alamy Stock Photo

“Planes, Trains and Automobiles” Was Based on a True Story

Back when acclaimed screenwriter and director John Hughes was an unknown advertising copywriter, he regularly traveled from Chicago to New York City on behalf of a client. On one blustery winter day, strong winds nixed the return flight to Chicago and forced him to find a hotel for the night. More cancellations awaited the following day due to deteriorating weather in the Midwest, and Hughes wound up on a flight that was rerouted to Des Moines, Iowa, and then Denver, Colorado, before he decided to remain on board for the sunnier destination of Phoenix, Arizona. Hughes eventually made it to Chicago five days later than originally planned, and the torturous experience left a lasting imprint that became the basis of his 1987 Thanksgiving travel comedy Planes, Trains and Automobiles.

Macaulay Culkin in Home Alone, 1990.
Credit: AJ Pics/ Alamy Stock Photo

Macaulay Culkin’s Iconic Facial Gesture in “Home Alone” Was Improvised

Even fans who haven’t seen Home Alone (1990) in eons can recall the image of Macaulay Culkin, as the abandoned Kevin McCallister, slapping aftershave on his face and screaming into the mirror. However, that scene didn’t quite go according to plan; most people would move their hands after creating a burning sensation on their face, and director Chris Columbus instructed his young star to do so. Instead, Culkin kept his hands glued to his face as he screamed at his reflection, prompting everyone else to break up in laughter. Although different reactions were tried in subsequent takes, it was that first one that stuck and became a defining moment of the immensely successful comedy despite encompassing a tiny fraction of the 103-minute running time.

Zooey Deschanel in Elf film.
Credit: Album/ Alamy Stock Photo

The Shower-Duet Scene in “Elf” Was Written for Actress Zooey Deschanel

While Elf (2003) was built around the physical comedic chops and man-child persona of star Will Ferrell, the endearing scene of Ferrell’s Buddy discovering Jovie singing in the shower didn’t take shape until Zooey Deschanel joined the production. According to Deschanel, the scene was initially a fluid one that would showcase the individual talent of the actress cast as Jovie; once her crooning abilities became apparent, the specifics of Buddy naively wandering into the women’s changing room fell into place. An added bonus was Ferrell’s surprisingly solid pipes, which bring a layer of sweetness to the building tension until Jovie inevitably notices Buddy’s presence and orders him out.

Tim Ott
Writer

Tim Ott has written for sites including Biography.com, History.com, and MLB.com, and is known to delude himself into thinking he can craft a marketable screenplay.

Original photo by katalinks/ Shutterstock

When you think of vampires, what thoughts come to mind? Do you think of Dracula or Count von Count from Sesame Street? Or perhaps you think of more recent books, television series, and movies such as Twilight, Buffy the Vampire Slayer, and Blade? Once known as terrifying beings that would suck the lifeblood from people, these creatures somehow made the shift to become romantic and appealing. So what’s up with our collective fascination with vampires, and why do vampires keep appearing in pop culture?

Portrait of Vlad III the Impaler, or Dracula (1431-1476).
Credit: Stefano Bianchetti/ Corbis Historical via Getty Images

Origins of the Vampire

Long before Brad Pitt made vampires look sexy, tales of the creatures had circulated for centuries — and they were feared. Vampires have popped up in mythology as far back as the ancient Egyptians, but most historians agree that the vampire as we know it today got its start in Europe sometime during the 17th and 18th centuries. According to scholars, Bram Stoker's novel Dracula was inspired by the real Romanian Prince Vlad Tepes, who lived during the 15th century around Transylvania.

While Romania sometimes looks fondly on his legacy, he was known to be very cruel to those he conquered, earning himself the nickname “Vlad the Impaler.” Some stories go so far as to say that he even dined with his dying victims, dipping his bread in their blood. Tales of vampire-like creatures also come from Asia; in Chinese mythology they’re known as jiangshi (roughly pronounced “jyahng-shrr,” and meaning “stiff corpse”).

Old copy of Dracula book.
Credit: Torontonian/ Alamy Stock Photo

Vampires in Literature

One of the best-known works about vampires — and the one that is credited with propelling vampires into the realm of popular culture — is the 1897 book Dracula. Stoker’s version of vampirism — a blood-sucking ghoul who preys on the innocent in order to prolong its own immortal life — was burned into the collective psyche. Notably, it also kept to the then-common belief that vampires were dangerous and unholy, although there’s a case to be made that the Victorian-era novel is full of innuendo and is, in fact, a heavily sexual piece.

But through this novel, we get several characters who continually pop up in later works by other authors and by television and movie directors. You might be familiar with names like Van Helsing, the vampire slayer who is the central character portrayed by Hugh Jackman in a 2004 action movie, or Mina Murray, a love interest who features prominently in Dracula romance novel spin-offs.

Austrian-Hungarian born actor Bela Lugosi clenches his hand in the air.
Credit: Universal Pictures/ Moviepix via Getty Images

Vampires Get a Makeover

It wasn’t until 1931 that the vampire transitioned from vicious-looking monster into handsome rogue who just so happened to also suck people’s blood. You can thank the film Dracula, released that year, and actor Bela Lugosi’s suave turn in the titular role. Through the decades, vampires stayed attractive yet fearsome, until Sesame Street’s fourth season in 1972.

Best known as the vampire who loves to count, the friendly Count von Count manages to embrace popular vampire tropes — wearing a cape, living in a decrepit castle, laughing dramatically with a Transylvanian accent — while also delighting small children and teaching them their numbers. He’s probably the only family-friendly vampire most people can name, though Disney’s Vampirina also proves that Transylvania’s most famous cultural export can be for kids.

But vampires didn’t take a decidedly sexy turn until the 1970s, when Anne Rice began writing her Vampire Chronicles novel series centered on the handsome French vampire Lestat. The most famous book in the series was Interview With the Vampire, which in 1994 was turned into a movie starring Brad Pitt and Tom Cruise. It’s safe to say that after this movie was released, interest in vampires in pop culture experienced a rebirth, and there were plenty of people who were open to the idea of being bitten by love — literally.

Nosferatu, a horror film directed by F. W. Murnau and starring Max Schreck as Count Orlok.
Credit: Buyenlarge/ Archive Photos via Getty Images

In the Art House

Arguably the most influential vampire movie ever made is 1922's Nosferatu, F.W. Murnau's silent fantasia, which belongs to the German Expressionist movement. Max Schreck stars as Count Orlok in the loose (and unofficial) adaptation of Stoker's novel, with a number of names and other details changed for legal reasons. Its legacy is massive — so much so that none other than Werner Herzog remade it as Nosferatu the Vampyre in 1979. The filmmaker's frequent collaborator Klaus Kinski starred in that version, which is even stranger than its source material and just as worthwhile — not least because of Isabelle Adjani's performance.

Both films are part of a proud tradition of avant-garde vampire movies that continues today. French auteur Claire Denis threw her proverbial hat in the ring with 2001’s Trouble Every Day, in which American newlyweds find themselves among many tantalizing necks in Paris; Jim Jarmusch did likewise with Only Lovers Left Alive, a romantic drama starring Tilda Swinton and Tom Hiddleston as a bloodsucking couple whose centuries-long affair has made them as prone to waxing philosophical as they are to sucking blood. Similarly artful films are made all over the world, from Let the Right One In (Sweden) and The Transfiguration (America) to A Girl Walks Home Alone at Night (Iran) and Thirst (South Korea), all of them demonstrating how many different approaches there are to depicting these creatures of the night.

Buffy the Vampire Slayer film poster.
Credit: MARKA/ Alamy Stock Photo

Vampires Go Mainstream

Although Interview With the Vampire was a racy novel and movie, a more family-friendly version of vampire relations had already hit the big screen with Buffy the Vampire Slayer. The 1992 movie centered on a cheerleader who discovers that she’s a vampire hunter, and it was successful enough to be adapted into a beloved television series five years later. The “Buffyverse” is home to one of the most dedicated fan bases around, not least because of Sarah Michelle Gellar’s inspired performance in the title role — she even received a Golden Globe nomination. Buffy proved both that bloodsuckers can draw ratings and that a strong female lead could be accepted by a diverse audience. It also paved the way for the likes of True Blood and The Vampire Diaries, both of which spawned devoted followings of their own and suggest that, like the creatures themselves, this genre refuses to die.

Michael Nordine
Staff Writer

Michael Nordine is a writer and editor living in Denver. A native Angeleno, he has two cats and wishes he had more.

Original photo by Antares Light/ Shutterstock

Everybody knows the stories of Cinderella, Aladdin, and Sleeping Beauty. These centuries-old fairy tales have been immortalized in every art form imaginable, from books and ballets to musicals and movies. What’s often forgotten, however, is where these stories came from — and who was responsible for writing them down. Here’s a look at eight of history’s most important fairy-tale tellers.

Aesop, ancient Greek writer of fables.
Credit: Edward Gooch Collection/ Hulton Archive via Getty Images

Aesop: A (Literal) Legend

If you’ve ever taken “the lion’s share” or claimed that “necessity is the mother of invention,” then thank Aesop. The Greek fabulist — purportedly born around 620 BC — is responsible for some of our most famous phrases and fables, including The Hare and the Tortoise. Greek authors like Herodotus and Plutarch claim that Aesop was a slave who became an adviser to Croesus, the King of Lydia. The accuracy of their accounts, however, is disputed, and it’s possible that Aesop was never a real person.

Portrait of Marie-Catherine Le Jumel de Barneville.
Credit: DEA PICTURE LIBRARY/ De Agostini via Getty Images

Marie-Catherine Le Jumel de Barneville: Pioneer of the Fairy Tale

Countess d’Aulnoy’s life is like a folktale — it’s difficult to parse fact from fiction. A French author who lived during the 17th century, de Barneville may have been a spy who accused her husband of high treason. True or not, she established a literary salon later in life and published at least two collections of tall tales. Her works, like “The White Cat,” were famously conversational in style and were lauded for being popular with adults and children alike. In fact, she even coined the term “fairy tale.”

Antique artisanal Aladdin Arabian nights genie.
Credit: zef art/ Shutterstock

Hanna Diyab: The Man Who Conjured Aladdin

The brain behind Aladdin and Ali Baba and the Forty Thieves, Diyab was a Syrian storyteller who lived during the early 18th century. When Diyab was young, he bumped into a French collector of antiquities who hired him as a traveling assistant. Diyab visited Paris and met the folklorist Antoine Galland, whom he entertained with folktales from home. Years later, Galland published some of Diyab’s tales in his famous translation of The Thousand and One Nights. Diyab wouldn’t receive credit until centuries later.

Portrait of Jean de La Fontaine.
Credit: API/ Gamma-Rapho via Getty Images

Jean de La Fontaine: The Editor Who Turned Fairy Tales into an Art Form

In 1668, Frenchman Jean de La Fontaine released the first volume of Fables, a literary landmark that would lay out a formula for centuries of European folk and fairy tales. Born to a well-to-do family, de La Fontaine became interested in writing after being inspired by the work of the French poet Malherbe. Between 1668 and 1694, he released six volumes of fables — a total of 239 stories — that drew from diverse sources, from the Roman fabulist Phaedrus to the Panchatantra, an Indian book of fables. De La Fontaine’s fresh and artful retellings of stories such as “The Grasshopper and the Ant” and “The Raven and the Fox” turned Fables into an instant classic.

Portrait of Charles Perrault.
Credit: Bettmann via Getty Images

Charles Perrault: The Original Mother Goose

A major influence on the Brothers Grimm, Perrault — hailing from France as well — helped transform tales like “Puss in Boots,” “Cinderella,” “Blue Beard,” “Sleeping Beauty,” and “Little Red Riding-Hood” into cultural touchstones. His 1697 book Histoires ou Contes du Temps Passé — better known as The Tales of Mother Goose — was an unexpected departure from his life’s work. Perrault had spent decades working as a government official, but when political bickering forced him to change careers, he turned to writing literary fairy tales for aristocratic literary salons. The career change at age 67 is what made him famous.

Portrait of Wilhelm and Jacob Grimm.
Credit: Bettmann via Getty Images

The Brothers Grimm: Disney Before Disney

Jacob and Wilhelm Grimm didn’t write “Rapunzel” or “Snow White,” but they did popularize the tales among the masses. The German-born brothers attended college with the intention of becoming civil servants, but a pair of influential teachers changed their minds — and inspired a love of folk poetry (or naturpoesie) and the arts. The duo gave up any hopes of a law career and began collecting literature that, they believed, emphasized the character of German culture and people. The brothers didn’t view themselves as writers, but as preservationists and historians who were saving common tales from extinction. Published in 1812, their first edition contained 156 fairy tales, including “Hansel and Gretel,” “Rumpelstiltskin,” “The Elves and the Shoemaker,” and “The Fisherman and His Wife.”

Hans Christian Andersen, Danish author and poet.
Credit: Print Collector/ Hulton Archive via Getty Images

Hans Christian Andersen: The Original Ugly Duckling

The Danish writer of over 150 fairy tales — including “The Emperor’s New Clothes,” “The Little Mermaid,” “The Princess and the Pea,” and “Thumbelina” — Andersen, born in 1805, came from humble beginnings. His mother was illiterate and his father only had an elementary school education. And when his dad died, Andersen started working at a factory at the age of 11. But he always had an artistic side, and he tried to express his struggles through his work. As a teenager, for example, Andersen was routinely harassed by other boys because he had a high voice, and that abuse inspired him to write “The Ugly Duckling.” “The story is, of course, a reflection of my own life,” he once wrote.

Portrait of Alexander Nikolayevich Afanasyev.
Credit: DE AGOSTINI PICTURE LIBRARY via Getty Images

Alexander Afanasyev: From Bureaucrat to Bard

Russia’s answer to the Brothers Grimm, Afanasyev was a 19th-century Slavic folklorist who published nearly 600 folk and fairy tales. (His works include “The Firebird,” which was famously transformed into a ballet by composer Igor Stravinsky in 1910, and “Vasilisa the Beautiful and Baba Yaga.”) Much like Charles Perrault, Afanasyev spent decades clocking in at a normal day job for the government. But while working at the Ministry of Foreign Affairs of the Russian Empire, he developed an obsession with collecting and preserving local fairy tales. Unlike many of the other folklorists on this list, Afanasyev regularly cited his sources and often tried to pinpoint where each tale originated.

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.

Original photo by Prostock-studio/ Shutterstock

Some problems are eternal — like finding the best gifts for all the people on your holiday list. You have to make a list, set a budget… and potentially fight for the last item at the department store (or rush to beat the millions of other buyers online). While shoppers of yore weren’t brawling in the aisles over Beanie Babies, the act of picking out the right item has long been stressful. Let these popular gifts from the last few decades — and their modern upgrades — give your holiday shopping some inspiration this season.

An opened packet with tobacco filter cigarettes with a yellow lighter.
Credit: Thanasis F/ Shutterstock

1950s: Cigarettes

The years immediately after World War II created a Christmas craze more aligned with the holiday celebrations we know today, compared to the reserved celebrations of decades prior. Aluminum Christmas trees, holiday paper crafts, and outdoor lights sprang to life during Christmases of the 1950s, and with them came the idea of spreading cheer to everyone you know through an extensive holiday shopping list. While toy producers were rolling blockbusters off their assembly lines for kids — think Mr. Potato Head, Frisbees, Hula Hoops, Barbies, and other toys that defined the decade — one adult-oriented gift became so popular that Santa Claus himself endorsed it: cigarettes. Cartons of smokes were cheaply priced and came in festive boxes (with a convenient gift tag attached), making it easy to swing by a drugstore for a prewrapped holiday gift. So long as shoppers knew what brand their secretary, father-in-law, or friend smoked, they had what was then considered a stress-free, solid gift. Cigarette companies went all in, marketing rosy-cheeked Santas with sleighs full of cigarettes while celebrities such as Joan Crawford and Ronald Reagan attested to their affinity for the last-minute gift idea.

Modern Update: Dishing out cigarettes is decidedly outdated, but needing to pick up an affordable gift for your coworker or brother’s new girlfriend is a Christmas conundrum that will likely never be resolved. Virtually anything you can toss in your cart for $5 that doesn’t cause cancer is probably a winner, though a solid, popular choice is themed holiday socks. One YouGov poll reports 56% of Americans appreciate gifted socks, with the holidays being the main source of new socks for one in eight people. Sockmaker Bombas offers higher-end gift boxes, while sites like Goodly feature inexpensive holiday motifs that will warm the recipient’s heart (and feet).

Close-up of a person holding slime.
Credit: Tsyb Oleh/ Shutterstock

1960s: Wham-O SuperBall

The space race didn’t only occur at the most classified levels — some of the scientific breakthroughs that propelled humans to the moon were transformed into toys, too. Take Wham-O’s SuperBall, which debuted in 1964. Creator Norman Stingley, a chemical engineer who had developed a form of artificial rubber, transformed the material into an ever-bouncing ball that became an instant hit. At peak production, Wham-O rolled 170,000 SuperBalls off assembly lines per day, with more than 20 million sold before the end of the decade. SuperBalls reached peak fad when they influenced the name of an even bigger cultural icon: the Super Bowl. Lamar Hunt, founder of the American Football League, admitted that when it came to naming the first championship football game, his inspiration came from the top-selling toy, which just so happened to be his children’s favorite.

Modern Update: While bouncing balls are unlikely to ever go out of style, SuperBalls were inexpensive gifts that could round out a kid’s holiday haul. Today’s top science-adjacent stocking stuffer? Slime. The main appeal of this trendy goop is its sticky sensation and neon coloring, though physical therapists and psychologists say it has hidden benefits such as encouraging mindfulness and fine motor skills. Slime has moved from the DIY project of its early days to a commercially available goo — you can order a multipack from Play-Doh or give a creative kid their own slime-making kit. (For parents who don’t want to spend the holidays picking pet hair and crumbs from a slime ball, a squishable bead-foam ball is a mess-free alternative.)

Star Wars action figures, Boba Fett and Luke Skywalker.
Credit: Portland Press Herald via Getty Images

1970s: “Star Wars” Action Figures

Star Wars is one of the most successful sci-fi universes out there, but initially there wasn’t much confidence in its potential. Even creator George Lucas was expecting a box-office flop for his 1977 space opera. After the film’s surprising hyperjump to blockbuster status, Lucasfilm shopped for toy producers to move hit characters like Luke Skywalker, Han Solo, and Princess Leia from the big screen to under the Christmas tree. It settled on Kenner, a smaller manufacturer that crafted odds-and-ends merchandise such as the Escape from Death Star board game, puzzles, and posters. Alas, sure-sellers like action figures couldn’t be made and shipped in the seven months between the film’s release and the winter holidays, thanks to the extensive process of designing, molding, and producing the moveable toys.

In an attempt to still make holiday sales and keep up interest, Kenner debuted a risky idea: the “Early Bird Certificate Package,” a cardboard envelope that included some stickers, a Star Wars Space Club membership card, a cardboard display stand, and a certificate redeemable for four figurines that would be shipped out in early 1978. While some stores refused to stock the packages on the basis that they weren’t actually toys, the certificates sold out in many areas and Kenner’s gamble paid off.

Modern Update: The Star Wars franchise practically sells itself, and toys from its live-action spinoff, The Mandalorian, are still going strong. The six-inch, armor-clad Black Series Mandalorian Figure gets top reviews and is officially licensed by Disney. On the other end of the spectrum, this LEGO Millennium Falcon set, at more than 1,300 pieces, would delight diehard fans who are young or young at heart.

Two Cabbage Patch Kids.
Credit: Barbara Alper/ Archive Photos via Getty Images

1980s: Cabbage Patch Dolls

We’ve become accustomed to massive Black Friday crowds and fighting in aisles over limited-stock items, but 30-some years ago, seeing parents tear into each other over a toy wasn’t only unheard of, it was truly shocking. That’s why the Cabbage Patch Doll craze lives on in infamy, and hot-selling toys are often compared to the frenzy for the stuffed, one-of-a-kind dolls. While they had been available at some stores across the country earlier, peak demand hit in the winter of 1983. Despite the limited stock, eager-to-please parents scooped up more than 3 million of the dolls by the end of the holiday shopping season, with many shoppers waiting in line for hours to snag one, or nearly rioting when stores ran out. But what made Cabbage Patch Dolls the perfect gift? Probably a mix of exclusiveness and marketing. The Cabbage Patch concept centered around not purchasing, but adopting one of its unique dolls, complete with an included adoption certificate. And it worked well — by the end of the 1980s, the Cabbage Patch Kids brand had made around $2 billion from dolls and add-on accessories.

Modern Update: Dolls remain a tried-and-true holiday gift, but the modernized take on gifting a lifelike best friend is less about changing wet diapers (we’re looking at you, Baby Alive) and more about matching the doll to its prospective owner. Our Generation dolls are upsized, measuring 18 inches, and come as male and female dolls with a variety of hair types and skin tones. The dolls also have hobbies, jobs, and interests instead of needing round-the-clock parenting. While similar in size and appearance to the pricier American Girl dolls that were a 1990s staple, Our Generation dolls have become popular in part because of their cost — around $25 compared to American Girl’s $100-plus price tag. And with endless accessories, such as a hot dog cart and even a tiny, ergonomic neck pillow, it’s one doll that can grow with your little one.

A Hatchimals Egg Surprise is displayed.
Credit: Tristan Fewings/ Getty Images Entertainment via Getty Images

1990s: Furby

The 1990s were all about pets — primarily ones that you didn’t have to actually feed, walk, or clean up after. Digital creatures like Giga Pets and Tamagotchi ruled the decade, but one robotic toy took its place as the most loved (and hated) animatronic critter on every kid’s wishlist: Furby. The fur-covered robot could chirp, speak “Furbish,” and dance, and supposedly even learn language and tricks with regular interaction over time. Furby fever had parents stampeding displays only to find limited stock, in part because the gadgets were first released in October 1998, far too close to the holiday shopping season for retailers to build a sizable supply. Within two months, parents had snapped up 1.8 million Furbies, with another 14 million sold the next year. Furby’s run, like that of most fad toys, was short-lived — it ended by the early 2000s, amid a storm of concerns about the robot’s supposed artificial intelligence and potential ability to retain information. (At one point, the National Security Agency banned Furbies over concerns they could act as spying devices.)

Modern Update: Hatchimals are the newest iteration of an animatronic companion. Part of the appeal is waiting for their jumbo-sized eggs to actually hatch (hence the catchy name) and reveal the stuffed robotic plush inside. Despite a limited number first rolling into stores in 2016, Hatchimals unexpectedly became a top holiday contender. Ambitious resellers purchased large quantities that they hawked online at steep markups, creating a Hatchimal black market that the company spoke out against. In the years since, Hatchimals have expanded to include miniature figurines and characters that arrive in plastic eggs but don’t hatch on their own (which is great for parents who are still creeped out by a battery-operated furball).

A new iPod mini displayed at Macworld January 6, 2004 in San Francisco.
Credit: Justin Sullivan/ Getty Images News via Getty Images

2000s: iPod Mini

The year was 2004: Steve Jobs was rocking a black turtleneck, the original iPod had already been out for three years, and Apple’s brightly colored dancing-silhouette commercials were getting regular airtime. And that’s when Apple dropped its iPod Mini — the downsized, candy-colored music player that came with 4 (or an upgraded 6) GB of space for all your favorite songs. That holiday season, Apple sold more than 4.5 million iPods between its regular and Mini editions — more than half its total iPod sales for the entire year. Like any hot gift, iPod Minis could be difficult to come by; even with so many in production, many models were put on backorder while desperate gift-givers scoured eBay’s marked-up options. While the Mini (and its whopping $249 price tag) was quickly left behind for improved models, some people still swear by the simple interface and classic click-wheel design (which brings on some good ’00s nostalgia).

Modern Update: The iPod Mini (and nearly every other version of the iPod) faded into the background as faster smartphones began to provide an all-in-one experience. And along with the ability to listen on demand came the ability to pick and choose new music without paying for each song piecemeal. Music streaming services such as Spotify and Apple Music have made it easier to listen to all the newest beats and old favorites on loop without shelling out for entire albums. So while you can’t give the gift of an aluminum-backed MP3 player any longer, a music subscription gift card for ad-free listening will be just as prized. (Gift cards to the online music store Bandcamp also allow music fans to directly support the artists they love.) Because what says “Happy Holidays” better than a gift you didn’t have to fight for?

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.

Original photo by Pictorial Press Ltd/ Alamy Stock Photo

Countless movies are based on real events, and most of them are quick to let you know it. Whether the plot is ripped from the headlines or merely adapted, taking inspiration from real-world happenings can confer a sense of legitimacy — the implication being that even if creative license was taken (and it almost certainly was), the filmmakers are performing a kind of public service by bringing these stories to the big screen. Not all of these movies advertise their pedigree, however, and there’s a good chance you didn’t realize these four movies were based on real events.

Robert Englund attacks Heather Langenkamp in a scene from A Nightmare On Elm Street.
Credit: Michael Ochs Archives/ Moviepix via Getty Images

A Nightmare on Elm Street (1984)

Freddy Krueger isn’t real and there have been zero confirmed cases of teenagers being murdered in their dreams (thankfully!), but that doesn’t mean that Wes Craven’s landmark slasher series wasn’t inspired by a real story. Years before dreaming up Elm Street, the horror maestro became fascinated by a series of newspaper articles about refugees from Laos, Cambodia, and Vietnam who were afflicted with nightmares so disturbing that they forced themselves to stay awake — and, in some cases, died upon finally falling asleep.

“It was a series of articles in the L.A. Times; three small articles about men from South East Asia, who were from immigrant families and had died in the middle of nightmares — and the paper never correlated them, never said, ‘Hey, we’ve had another story like this,’” Craven explained in a 2008 interview. Other research has shown that the phenomenon primarily affected male Laotian refugees from the Hmong ethnic group, which fought alongside the U.S. in the Vietnam War and was subsequently persecuted in Laos after the war ended. Many later suffered traumatic resettlements in the U.S. In the newspaper articles, there were no reports of a man wearing a striped red-and-green sweater — but the core of the idea was the same.

A close-up of Joe Pesci, Ray Liotta, and Robert De Niro in Goodfellas.
Credit: Pictorial Press Ltd/ Alamy Stock Photo

Goodfellas (1990)

Whether or not the mobsters in Martin Scorsese’s crime classic are actually good fellas is debatable, but one thing is certain: They were at least based on real fellas. Adapted from Nicholas Pileggi’s book Wiseguy: Life in a Mafia Family, Goodfellas envisions mafioso-turned-informant Henry Hill as a made man whose life of crime represents a fulfillment of his childhood dream — there’s a reason the movie’s first line is, “As far back as I can remember, I always wanted to be a gangster.”

The fact that Scorsese had already directed revered crime pictures such as Mean Streets and Taxi Driver made him reluctant to make another, but coming across Wiseguy was more than enough to change his mind. “I just read your book. I’ve been looking for it for years,” Scorsese told Pileggi over the phone when pitching the idea of adapting it. “Well, I’ve been waiting for this call all my life!” Pileggi replied. The rest, as they say, is history.

A screen grab of Frances McDormand in Three Billboards Outside Ebbing, Missouri.
Credit: TCD/Prod.DB/ Alamy Stock Photo

Three Billboards Outside Ebbing, Missouri (2017)

If you believe that truth is stranger than fiction, you won’t be surprised to learn that Three Billboards Outside Ebbing, Missouri’s inventive premise was born of more than writer-director Martin McDonagh’s imagination. The Oscar-winning drama stars Frances McDormand as a grieving mother who, months after the rape and murder of her daughter, takes matters into her own hands by calling out law enforcement’s lack of progress on the case with a series of accusatory billboards.

McDonagh revealed how the idea came to him in an interview conducted shortly after the film’s release: “Twenty years ago I was on a bus going through the southern states of America, and somewhere along the line, I saw a couple of billboards in a field that were very similar to the billboards that we see in the start of our story,” he told Deadline in 2018. “They were raging and painful and tragic, and calling out the cops.” McDonagh received an Academy Award nomination for his screenplay, and a number of protest groups have since used similar billboards to make their voices heard.

Kitty Winn holds a flashlight at Linda Blair, as Jason Miller watches, in The Exorcist.
Credit: Bettmann via Getty Images

The Exorcist (1973)

Plenty of people consider The Exorcist the scariest movie ever made, and the fact that it’s based on a true story only adds to the terror. The actual practice of exorcism is highly controversial, so when writer William Peter Blatty based his 1971 novel on a particularly disturbing episode he’d first heard about in college, it was perhaps surprising that it was so well received. Blatty adapted the story of a 14-year-old boy whose family had believed he was possessed by a demon. A number of Jesuit priests performed the exorcism in 1949, which one account claims was witnessed by at least 48 people.

“The little boy would go into a seizure and get quite violent,” one of the priests recalled — the boy even went so far as to break that priest’s nose, and words like “hell” appeared etched into his skin. Skeptics doubt that the teenager was ever actually possessed, of course, and the boy reportedly went on to lead “a rather ordinary life.” Blatty wrote the script for William Friedkin’s hugely successful adaptation of his novel, and the author-turned-screenwriter won an Academy Award.

Michael Nordine
Staff Writer

Michael Nordine is a writer and editor living in Denver. A native Angeleno, he has two cats and wishes he had more.

Original photo by cineclassico / Alamy Stock Photo

History is filled with the stories of amazing women, from female pirates to tireless civil rights advocates. And while many of these stories are important feminist firsts, some feature the quirkier side of women’s history — like the famed mystery writer who helped popularize surfing. From women who authored important legal arguments to those whose inventions made our lives easier, here are 25 facts that will help you celebrate Women’s History Month.

Low angle view of female craft beer expert sampling beer.
Credit: xavierarnau/ iStock

Women Were the First Beer Brewers

On the list of things women don’t get enough credit for, being the first to brew beer might not seem like the most important. But fermented beverages have played a vital role in human culture for almost as long as society has existed, providing nutrients, enjoyment, and often a safer alternative to drinking water before the advent of modern sanitation. Scholars disagree over exactly when beer was first introduced — the earliest hard evidence for barley beer comes from 5,400-year-old Sumerian vessels that were still sticky with beer when archaeologists found them — but one thing has never been in question: “Women absolutely have, in all societies, throughout world history, been primarily responsible for brewing beer,” says Theresa McCulla, who curates the Smithsonian’s American Brewing History Initiative.

Ada Lovelace seated at retro computer.
Credit: HappySloth/ Shutterstock

Ada Lovelace Is Often Considered the World’s First Computer Programmer

Ada Lovelace followed a path many considered impossible for a woman in the early 19th century. Encouraged by her mother, Lady Byron, Lovelace developed a passion for mathematics at a young age. In 1833, a 17-year-old Lovelace met British mathematician Charles Babbage at a party, and he told her about a calculating machine he’d created called the Difference Engine. Fascinated, Lovelace eventually began a regular correspondence with Babbage.

About a decade later, while translating a French text regarding Babbage’s proposed Analytical Engine — often considered the first mechanical computer — Lovelace added a few notes of her own. “Note G” detailed a method through which Babbage’s creation could calculate a complicated sequence of figures known as Bernoulli numbers. This is often considered the world’s first computer program, making Lovelace the first computer programmer. And while Babbage was the brains behind the machine, Lovelace was the one who truly grasped its wider importance, foreseeing a future where engines could use the “abstract science of operations” to do things beyond mere computation.
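Note G laid out the calculation as a step-by-step recurrence that the Analytical Engine could grind through mechanically. For the curious, here is a minimal modern sketch of one standard recurrence for Bernoulli numbers; it’s written in Python, obviously not Lovelace’s notation, and the function name is ours, not hers:

```python
from fractions import Fraction
from math import comb

def bernoulli_numbers(n):
    """Return [B_0, ..., B_n] using the classic recurrence
    sum_{k=0}^{m} C(m+1, k) * B_k = 0 (for m >= 1), with B_0 = 1."""
    B = [Fraction(1)]
    for m in range(1, n + 1):
        s = sum(comb(m + 1, k) * B[k] for k in range(m))
        B.append(-s / (m + 1))  # solve the recurrence for B_m
    return B

# B_1 = -1/2, B_2 = 1/6, B_4 = -1/30; odd-indexed values past B_1 are 0
print(bernoulli_numbers(8))
```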

In fact, many early computer programmers were women. In the 1940s and ’50s, engineering computers was perceived as a man’s profession, but programming them was considered secretarial. As a result, many women took jobs as programmers — helping Alan Turing crack the Enigma machine during World War II, writing instructions for ENIAC, the world’s first general-purpose electronic computer, and creating the world’s first compiler (a program that translates programming languages into machine language). According to government data, around 27% of programmers in 1960 were women. In 2013, that number was 26% and falling. Today, many leading universities are working hard to reverse that trend.

portrait of actress Lucille Ball circa 1950's.
Credit: Archive Photos via Getty Images

Lucille Ball Helped Get “Star Trek” on TV

As the first female head of a major Hollywood studio — Desilu Productions, which Lucille Ball formed with then-husband Desi Arnaz but took over by herself after their divorce in 1960 — Ball helped produce some of the most influential television shows of all time. She was particularly instrumental in getting Star Trek on the air. Desilu board members apparently had some trepidation over the ambitious series’ budget, leaving Ball to personally finance not one but two pilots of the science fiction mainstay. One studio accountant, Edwin “Ed” Holly, even claimed: “If it were not for Lucy, there would be no Star Trek today.”

Ching Shih (1775-1844) was a prominent pirate in middle Qing China.
Credit: Science History Images/ Alamy Stock Photo

Ching Shih Was a Legendary Female Pirate

Not all pirates were men: Ching Shih was a fearless female pirate from China. Following the 1807 death of her husband Cheng I, who was head of the powerful Red Flag Fleet, she unofficially commanded a fleet of 1,800 pirate ships and approximately 80,000 men. She also took control of the Guangdong Pirate Confederation and spent the following years waging battle — and winning — against the Portuguese Empire, the Chinese Navy, and Britain’s East India Company. She’s widely considered one of the most successful pirates of all time.

American Civil rights activist Claudette Colvin.
Credit: The Washington Post via Getty Images

Before Rosa Parks, Claudette Colvin Refused to Give Up Her Seat on the Bus

Nine months before Rosa Parks was arrested for refusing to surrender her bus seat to a white passenger in Montgomery, Alabama, the same thing happened to 15-year-old Claudette Colvin. So why was the Parks incident the one that ignited the Montgomery bus boycott and transformed the issue into a national story? As Colvin herself later conceded, the then-42-year-old Parks, a secretary for the NAACP, was considered by some to be a more respectable symbol for the boycott, particularly after it was discovered that the unwed Colvin had become pregnant.

Nevertheless, Colvin wound up playing a crucial role as events unfolded: She was named a plaintiff in the 1956 Browder v. Gayle case that challenged the constitutionality of Alabama’s segregated buses and provided the legal backbone for the boycott’s triumph. Colvin left Alabama soon after and spent most of the following decades living anonymously in New York City, though her contributions have finally earned some long-overdue recognition in recent years.

Statue of St Lucy of Syracuse.
Credit: Science & Society Picture Library via Getty Images

St. Lucia Is the Only Country Named After a Woman

While Ireland is named after the mythical goddess Ériu, there’s only one sovereign nation in the world named for a real-life woman. That distinction lies with St. Lucia, a Caribbean island nation christened in honor of St. Lucy of Syracuse, patron saint of the blind, who died in the early fourth century CE.

St. Lucia was initially called Louanalao (meaning “Island of the Iguanas”) by the Indigenous Arawak people as early as 200 CE. The origins of its current name date to 1502, when shipwrecked French sailors dubbed the place “Sainte Alousie.” It was a common practice at the time to name islands after saints, and legend has it that the sailors reached the island on December 13 — St. Lucy’s feast day. Given the date’s significance, December 13 is now celebrated in the country as the National Day of St. Lucia. The Spanish who arrived around 1511 named the island “Sancta Lucia”; the current name took hold after waves of colonization by the English and French.

While female namesakes are rare on a national level, one woman has lent her name to dozens of smaller locations. The name of Queen Victoria, the U.K.’s reigning monarch from 1837 to 1901, appears in the titles of locations around the globe, such as the provincial capital of British Columbia, Canada, and Zimbabwe’s breathtaking Victoria Falls. You’d be hard-pressed to find an American woman with influence so vast. Even in the U.S., only a handful of places are named for women, including Barton County, Kansas — named after Clara Barton, founder of the American Red Cross — and Dare County, North Carolina, honoring Virginia Dare, the first child of English parents to be born in the New World.

Cleopatra and Octavianus, Portrait.
Credit: Universal History Archive/ Universal Images Group via Getty Images

Cleopatra Was a Victim of Roman Propaganda

Cleopatra’s legacy is so complicated because it tangles with historical biases against strong, female rulers and the propaganda of the early Roman Empire. Today, most people know Cleopatra as a seductress, one who had romances with two of the most powerful Roman leaders in the first century BCE, and who used her sex appeal to manipulate geopolitics in her favor. However, the source of many of these colorful tales is Octavian’s (later Caesar Augustus’) propaganda machine; he launched the equivalent of a fake news campaign to discredit the foreign queen and his rival Mark Antony. When Octavian proved victorious against Antony and Cleopatra at the Battle of Actium in 31 BCE, the victors became the authors of history, and it has taken millennia for scholars to learn more about the real life of this fascinating final pharaoh.

Amelia Earhart and Eleanor Roosevelt.
Credit: Bettmann via Getty Images

Amelia Earhart Once Took Eleanor Roosevelt on a Nighttime Joyride

Although her aviation career lasted just 17 years, Amelia Earhart remains one of the most famous people ever to take to the sky. In addition to being renowned for her many firsts — including being the first woman to fly solo across the Atlantic and the first person to fly alone from Hawaii to the mainland U.S. — she’s known for her 1937 disappearance and the many theories it spawned. Less well known but considerably more fun to imagine is the time she took Eleanor Roosevelt on a nighttime joyride from Washington, D.C., to Baltimore on April 20, 1933. The brief flight took place with both of them in their evening wear following a White House dinner party.

“I’d love to do it myself. I make no bones about it,” the First Lady told the Baltimore Sun after the flight. “It does mark an epoch, doesn’t it, when a girl in an evening dress and slippers can pilot a plane at night.” In fact, Roosevelt herself had recently received a student pilot license and briefly took over the controls of the twin-engine Curtiss Condor, borrowed from Eastern Air Transport at nearby Hoover Field. Eleanor’s brother Hall also ditched the dinner party in favor of the flight that night, as did Thomas Wardwell Doe, the president of Eastern Air Transport, and Eugene Luther Vidal (head of the Bureau of Air Commerce) and his wife Nina Gore, parents of author Gore Vidal. When the plane returned after the short journey, the Secret Service guided everyone back to the White House table for dessert. Roosevelt and Earhart remained friends for the rest of Earhart’s life, sharing an interest in women’s causes, world peace, and of course, flying.

Melitta Bentz, the inventor behind coffee filters.
Credit: INTERFOTO/ Alamy Stock Photo

A Woman Invented Disposable Coffee Filters

Melitta Bentz’s invention is one coffee drinkers now take for granted, but it was revolutionary in the early 1900s. At the time, other home brewing methods required a lot of time and cleanup — not to mention a tolerance for bitter coffee and sludgy grounds at the bottom of your mug. While pricey cloth coffee filters were available, they were used like tea bags, steeping grounds in hot water that produced a subpar cup and a mess. Many coffee connoisseurs brewed their morning java in percolators, but those could leave a burnt taste and failed to filter out smaller grounds.

Bentz, a German woman with an affinity for coffee, was determined to find a better brewing process that didn’t require extensive cleanup. During one experiment, she reached for notebook paper as a potential liner, filling the makeshift filter with coffee grounds. She placed the filter inside a pot she had punched holes in and poured hot water over the grounds, allowing brewed coffee to cleanly drip through to a cup below. With the creation of drip coffee brewing, Bentz began producing the paper filters at home, and was granted a patent for her drip-cup apparatus in 1908. With help from her family, she launched a line of drip-coffee makers and filters in 1909, branding the items with her own first name. Bentz died in 1950, but her company — now run by her grandchildren — produces nearly 50 million coffee filters each day.

Demonstrators from the National Women's Liberation Movement at 1968 Miss America Pageant.
Credit: Bettmann via Getty Images

The Origin Story of the “Bra-Burning Feminist” Is a Myth

Think of the Swinging ’60s and you might imagine one of the most popular stereotypes: women burning their brassieres to protest society’s rigid rules. The stunt has been referenced in discussions about gender equity for decades — but it turns out it never actually happened at the event most often mentioned in connection with it. Here’s what did: On September 7, 1968, members of the group New York Radical Women gathered outside the Miss America Pageant on New Jersey’s Atlantic City boardwalk to protest the event. Their argument? The pageant degraded women by promoting unrealistic beauty standards and strict social expectations. The protest was also meant to highlight larger issues American women faced, such as being denied their own credit cards or the right to continue working during pregnancy.

Protest organizers originally planned to burn items related to their discontent, such as bras and makeup, but local police shut down the stunt, citing safety concerns around a fire on the boardwalk. Instead, the group hauled out a metal “freedom trash can,” which became a disposal site for undergarments, cookware, wigs, and issues of Playboy magazine — all items participants deemed “instruments of female torture.” The gathering also crowned a live sheep in a mockery that compared beauty pageant participants to fairground show animals.

Even without a blaze, bra burning became synonymous with the women’s liberation movement. A New York Post article linking the pageant protest with draft card burning misconstrued events, an error some historians say popularized the belief that feminists were setting fire to their undergarments. And while it’s possible that later demonstrations inspired by the fictitious fire actually torched a bra or two, large-scale bra burnings weren’t recorded events. Some activists believe the lingerie legend overshadowed the event’s larger message, but that it wasn’t all bad — the famed protest helped catapult the women’s equality movement into mainstream conversations.

Famous world traveler and journalist Nellie Bly.
Credit: Bettmann via Getty Images

Nellie Bly’s Reporting Improved Mental Illness Treatment

In 1887, Nellie Bly launched her first undercover story for The New York World, becoming a “girl stunt reporter,” part of a then-popular movement of female reporters who embedded themselves in investigations to expose dangerous working conditions, corrupt public figures, and social atrocities. Bly’s initial investigation involved a 10-day stay at the infamous Blackwell’s Island Asylum in New York City, where women experiencing mental health crises (as well as others sent there for a variety of reasons, including not speaking English) were subjected to cruel “treatments,” rotten food, and abuse. After her release, Bly penned a story that exposed the institution’s horrors and led to public calls for improved conditions, including a grand jury investigation and budget increases to properly house and help patients. It also became one of her most famous books, titled Ten Days in a Mad-House.

State Flag of Idaho with seal.
Credit: Joseph Sohm/ Shutterstock

Idaho Has the Only State Seal Designed by a Woman

State seals are often crimped or stamped on legal documents, lending them authenticity. Yet these small symbols have another role, as miniature visual histories specific to each state, often simultaneously representing hopes for the future. At least that’s how artist Emma Edwards Green viewed the seal she created for Idaho in 1891 — which just so happens to be the only state seal designed by a woman.

Idaho became the 43rd state on July 3, 1890, formed from a territory that had once included land in present-day Montana and Wyoming. Upon statehood, Idaho legislators looked to commission the state seal’s design by way of a competition, with a generous $100 prize (about $3,300 today) for the winning artist. Green, an art teacher who had relocated to Boise after attending school in New York, was in part inspired by the fact that it seemed Idaho would soon give women the right to vote. In March 1891, Green’s work was selected as the winner, beating out submissions from around the country.

The final design, which is also featured on Idaho’s flag, is packed with symbolism. Worked into the design are cornucopias and wheat to represent Idaho’s agriculture, a tree meant to be reminiscent of the state’s vast timberlands, and a pick and shovel held by a miner. Green’s most forward-thinking detail, however, is a man and woman standing at equal heights in the seal’s center, a symbol of gender equality that would eventually come with voting rights for all. True to their word, Idaho legislators passed women’s suffrage in 1896 — five years after Green’s seal became the state’s official symbol — making Idaho the fourth state to enfranchise women, more than 20 years before the 19th Amendment gave the same right to women nationwide.

Close up of a security alarm keypad.
Credit: gchutka/ iStock

The Inventor of the Home Security System Was a Nurse

Necessity is the mother of invention, and that can certainly be said of Marie Van Brittan Brown and her home security system. In the mid-1960s, Brown lived in a rough neighborhood in Queens, New York, while working as a nurse. She was often alone at night, so she decided to design her own peace of mind. Her invention featured four peepholes on the front door and a motorized camera that could look through the holes at varying heights. The camera was connected to a television inside the home, and a microphone both inside and outside the door allowed her to interrogate uninvited visitors. For added security, Brown also devised a way to alert police via radio. This ingenious use of cameras and closed-circuit television helped Brown score a patent for her security system in 1969. Today, Brown’s invention is widely regarded as the cornerstone of modern home security systems.

Close-up of Pauli Murray.
Credit: Bettmann via Getty Images

Pauli Murray’s Legal Arguments Helped Win Landmark Supreme Court Cases

Pauli Murray was enormously influential as a lawyer, writer, and teacher. She became California’s first Black deputy attorney general in 1945, as well as the first African American to earn a Doctor of Juridical Science from Yale Law School two decades later. Additionally, the acclaimed scholar saw her legal arguments used in the groundbreaking cases of Brown v. Board of Education (1954), which struck down segregation in public schools, and Reed v. Reed (1971), which extended the rights under the 14th Amendment’s Equal Protection Clause to women.

Publicly critical of the sexism rife within the ranks of the Civil Rights Movement, Murray helped launch the National Organization for Women (NOW) in 1966. Eventually, she found herself out of step with its leadership and stepped away. On her own once again, Murray resigned from her teaching post and entered New York’s General Theological Seminary, en route to one final historic achievement in 1977 as the first African American woman to be vested as an Episcopal priest.

English detective novelist, Agatha Christie.
Credit: ullstein bild Dtl via Getty Images

Agatha Christie Helped Popularize Surfing

Agatha Christie’s characters have done it all — survived attempted murder, traveled to far-off lands, and solved mystery after mystery. But the bestselling author didn’t just write about adventure; she also sought it out, sometimes on a surfboard. Two years after publishing her first novel, Christie embarked on an international trip with her first husband, Archibald. Their 1922 stop in South Africa included an attempt at surfing, where she may have become the first Western woman to stand up on a surfboard. The globetrotting couple quickly fell in love with the sport, and went on to catch waves off the coasts of Australia, New Zealand, and Hawaii. Christie, in letters to her mother, recounted the tricky experience of learning to surf, describing the sport as “occasionally painful” thanks to a “nosedive down into the sand.” But the writer eventually became more skilled, detailing in her 1977 autobiography that nothing could compete with the rush of approaching shore at high speeds. She also wrote about surfing in her novel The Man in the Brown Suit, in which her protagonist, nicknamed “Anna the Adventuress,” goes surfing in Cape Town.

Christie’s pursuit of the perfect wave was unusual for an Englishwoman of her time. The Museum of British Surfing suggests she and her husband may have been two of the earliest Brits to attempt the activity. However, they did have regal company: Prince Edward, the British royal who would eventually abdicate the throne in 1936 to marry Wallis Simpson, was photographed surfing in Hawaii two years before Christie rented her first surfboard.

Close-up of a dishwasher with clean plates and cups.
Credit: Lilkin/ Shutterstock

A Busy Socialite Invented the Modern Dishwasher

Clearing away dinner dishes is easier (and faster) today than it was in 1886, when Josephine Cochrane patented the first mechanical dishwasher. As a frequent host of dinner parties at her Shelbyville, Illinois, mansion, Cochrane was concerned about maintaining her fine dishware’s pristine condition. But as a busy socialite, she didn’t want to do the tedious work of scrubbing each piece herself to ensure it stayed that way; she delegated the task to servants, whose work occasionally caused chips and cracks. Cochrane’s solution was to create a dishwashing unit that kept her costly tableware out of the slippery sink, holding it stationary while jets of water sprayed it clean.

Cochrane, the daughter of an engineer and granddaughter of a steamboat innovator, was likely familiar with inventive tinkering despite lacking formal education in science or math. But after her husband’s death in 1883 left her with looming debt and few resources to pay it off, her dishwashing contraption transformed from a time-saving idea into a path for financial security. Cochrane was awarded a patent for her dishwasher design three years after being widowed and displayed her innovation at the World’s Columbian Exposition of 1893, where visitors marveled at the event’s only machine created by a woman. With exposure from the fair, Cochrane began marketing her contraptions to hotels, restaurants, and hospitals. (The cost was often too much for homemakers.) After her death in 1913, Cochrane’s company was purchased by Hobart Manufacturing Company, the original producer of KitchenAid-brand products.

Vintage typewriter with unique detail.
Credit: agcuesta/ iStock

Mary Katharine Goddard Was the First Known Female Postmaster in Colonial America

Mary Katharine Goddard was among the first female publishers in the U.S., a socially precarious venture for a colonial woman during the country’s fight for independence. Working with her mother, Sarah, and brother, William, Mary Katharine founded multiple publications starting in the 1760s. William frequently traveled between cities to establish new papers, leaving the bulk of news collecting and printing to his sister. In 1774, he appointed Mary Katharine to run The Maryland Journal while he focused on other pursuits (such as lobbying for a national postal service) and served time in debtor’s prison. During the height of the Revolutionary War, Mary Katharine made a name for herself with fiery anti-British editorials. In 1775, she was appointed Baltimore’s first postmaster — likely the first woman to hold such a position in colonial America — and in 1777, Congress commissioned her to print copies of the Declaration of Independence. (Surviving copies feature her printer’s mark at the bottom.)

Despite her success, however, Mary Katharine was pushed out of both roles at the war’s end. In 1784, William rescinded her title as publisher, creating a lifelong rift between the siblings. Not long after, she was also removed from her postmaster job on the basis of sex. She wrote to George Washington asking to be reinstated, but the President passed her complaint to the postmaster general, who left her plea unanswered.

Jennifer Lopez in a green silk chiffon dress by Versace.
Credit: Scott Gries/ Hulton Archive via Getty Images

Jennifer Lopez Inspired the Creation of Google Images

Jennifer Lopez has worn a lot of memorable dresses on a lot of red carpets over the years, but only one broke the internet to such an extent that it inspired the creation of Google Images. The multi-hyphenate entertainer first wore the plunging leaf-print silk chiffon Versace gown to the 2000 Grammy Awards in L.A., which former Google CEO Eric Schmidt later revealed led to “the most popular search query we had ever seen.” The problem was that the then-two-year-old search engine “had no surefire way of getting users exactly what they wanted: J.Lo wearing that dress.” Thus, in July 2001, “Google Image Search was born.”

Two decades later, to the delight of everyone in attendance, Lopez also closed out Versace’s Spring 2020 show in Milan by wearing a reimagined version of the dress, after other models walked the catwalk to the tune of her hit 2000 single “Love Don’t Cost a Thing.” After a projected montage of Google Image searches for the original dress and a voice saying, “OK, Google. Now show me the real jungle dress,” J.Lo herself appeared in an even more provocative and bedazzled rendition of the gown.

View of the US Supreme Court building.
Credit: Douglas Rissing/ iStock

Lyda Conley Was the First Native American Woman to Argue a Supreme Court Case

Lyda Conley’s legacy was preserving that of her ancestors — specifically their final resting place. Conley acted as a staunch (and armed) defender of the Wyandot National Burying Ground, a Kansas cemetery at risk of sale and destruction some 60 years after its creation. The cemetery was established in 1843 following typhoid and measles outbreaks that took hundreds of Wyandot lives; the loss was a particular blow to an Indigenous community that was forcibly relocated thanks to broken treaties with the U.S. government and the cruel Indian Removal Act of 1830.

In 1890, Kansas senators introduced legislation to sell the burial ground. Although it failed, the effort encouraged Lyda Conley to attend law school to defend the cemetery in which her own parents, siblings, and grandparents were interred. Conley was admitted to the Missouri Bar in 1902, and within four years put her legal skills to work as the federal government moved to sell the cemetery. Conley and her sister Lena began a legal and physical siege for its protection, building an armed watch station called Fort Conley on the grounds and warning, “Woe be to the man that first attempts to steal a body.” In 1910, her legal fight made its way to the U.S. Supreme Court, where she became the first Native American woman (and third woman ever) to argue a case before the judges. While the court ruled against her, years of media coverage about the cemetery worked in her favor. In 1913, the Kansas Senate passed legislation protecting the cemetery, which was designated a National Historic Landmark in 2017.

Jacqueline Kennedy Onassis attends Save Grand Central Rally circa 1978.
Credit: Images Press/ Archive Photos via Getty Images

Jackie Kennedy Helped Save Grand Central Terminal From Being Demolished

Much like she did in preserving the history of the White House, Jackie Kennedy played a key role in maintaining one of New York City’s most prominent landmarks. In the mid-1970s, developers hatched a plan to demolish part of Grand Central Terminal to build an office tower. The former First Lady was among a group of notable New Yorkers who objected to the plan, and in 1975, she spoke at a press conference at Grand Central’s famed Oyster Bar restaurant to protest the destruction of the Beaux Arts-style structure. She and other preservationists worked to ensure the building’s protection, which was ultimately assured by the U.S. Supreme Court decision Penn Central Transportation Co. v. New York City. A plaque dedicated in 2014 at the entrance on 42nd Street and Park Avenue honors Jacqueline Kennedy Onassis for her role in saving the indelible Manhattan icon.

And Grand Central Terminal isn’t the only NYC landmark to commemorate her legacy. Located at the northern end of Central Park, where Jackie was known to jog, the Jacqueline Kennedy Onassis Reservoir pays homage to the former First Lady’s contributions to the city. The artificial body of water, constructed between 1858 and 1862, spans 106 acres and was the largest human-made body of water in the world at the time of its creation.

Hedy Lamarr portrait.
Credit: Silver Screen Collection/ Moviepix via Getty Images

Hedy Lamarr Invented a Frequency-Hopping System

During World War II, movie star Hedy Lamarr and modernist composer George Antheil came up with a “secret communication system” that used “frequency hopping” between radio signals to direct torpedoes without enemy interference. She and Antheil received a patent in August 1942 and offered their invention to the U.S. military. But the government wasn’t interested in the invention or Lamarr’s intelligence — instead, the actress was informed that her beauty was the best way to help the war effort. Rather than reject this sexist suggestion, Lamarr went on to sell millions of dollars in war bonds.
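The idea is easier to grasp in miniature: if transmitter and receiver derive the same unpredictable channel schedule from a shared secret, they can retune in lockstep while an eavesdropper or jammer hears only fragments. Here is a toy Python sketch of that principle; the seed, function name, and schedule length are illustrative choices rather than details from the patent, though the 88 available frequencies do come from it (matching the number of piano keys):

```python
import random

def hop_schedule(seed, n_hops, n_channels=88):
    """Derive a pseudorandom sequence of channels from a shared secret seed.
    The 1942 patent encoded its schedule on synchronized paper rolls;
    a seeded PRNG is only a loose modern analogy."""
    rng = random.Random(seed)
    return [rng.randrange(n_channels) for _ in range(n_hops)]

# Both ends hold the same seed, so they hop between channels in lockstep;
# a jammer without the seed can't predict which frequency comes next.
tx = hop_schedule(seed=1942, n_hops=10)
rx = hop_schedule(seed=1942, n_hops=10)
assert tx == rx
print(tx)
```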

The frequency-hopping system that Lamarr and Antheil invented during World War II was adapted by the U.S. Navy and used during the Cuban Missile Crisis of 1962. Later it contributed to technological innovations such as Bluetooth and GPS. Yet Lamarr’s contribution was ignored for decades. She expressed her feelings about this in a 1990 interview: “I can’t understand why there’s no acknowledgment when it’s used all over the world.” Lamarr was slightly mollified when the Electronic Frontier Foundation recognized her with a Pioneer Award in 1997.

Scott Fitzgerald with Zelda on the French Riviera, 1926.
Credit: Photo 12/ Universal Images Group via Getty Images

The Legend of Zelda Video Game Was Named for F. Scott Fitzgerald’s Wife

Video games aren’t often associated with literary figures, but The Legend of Zelda has always been unique. Take, for instance, the fact that its title character was named after writer, artist, and Jazz Age icon Zelda Fitzgerald, whose marriage to The Great Gatsby author F. Scott Fitzgerald generated nearly as many headlines as his professional output. Zelda, who’s been described as the first flapper of the Roaring ’20s (and the inspiration for Gatsby’s Daisy Buchanan), was chosen because a Nintendo PR rep suggested that the eponymous princess should be “a timeless beauty with classic appeal” and that Zelda Fitzgerald was one such “eternal beauty.”

Shigeru Miyamoto, the game’s creator, agreed: “She was a famous and beautiful woman from all accounts, and I liked the sound of her name,” he has said. The name chain didn’t end there; actor Robin Williams was such a fan of the series that he named his daughter after the Princess of Hyrule. As for Zelda F. herself, she was — rather fittingly — named for the fictional heroine of a 19th-century novel.

 1896 $1 Silver Certificate with Martha Washington.
Credit: Bettmann via Getty Images

Martha Washington Is the Only Real-Life Woman Featured on U.S. Paper Currency

Five presidents are featured prominently on U.S. bills currently in circulation — George Washington, Thomas Jefferson, Abraham Lincoln, Andrew Jackson, and Ulysses S. Grant. Yet only one First Lady has been given the same honor: Martha Washington. She also happens to be the only real-life woman (as opposed to mythical figures representing abstract concepts such as liberty) to have her portrait printed on U.S. paper currency. In 1896, Martha appeared alongside her husband on the back of the $1 note in a design commemorating 120 years of American history, but a decade prior she had her own bill — the U.S. Treasury’s $1 silver certificate. First released in 1886 — 84 years after her death and 17 years after $1 bills began featuring George Washington — the silver certificate could be exchanged for precisely one dollar’s worth of silver. The bills were eventually discontinued in 1957, yet the design featuring Martha remains the second-longest-issued paper money in U.S. history.

Eleanor Roosevelt Writing at Desk.
Credit: Bettmann via Getty Images

Eleanor Roosevelt Wrote a Newspaper Column for Nearly 30 Years

Starting at the very end of 1935 and continuing until her death in 1962, Eleanor Roosevelt kept a regular, nationally syndicated newspaper column called “My Day.” Eventually, it appeared in 90 different U.S. newspapers, detailing both her actions of the day and causes she supported — including ones that perhaps diverged a little from FDR’s views. After her husband’s death, she spoke even more freely about her viewpoints, and chose to keep advocating through her writing instead of running for office herself. Some newspapers dropped her column after she advocated for the election of Adlai Stevenson II in his run against Dwight D. Eisenhower in 1956, leading United Features Syndicate to instruct her to limit her support for candidates, which she did not do. For the majority of the run, Eleanor published six columns a week; only after her health began to decline in the last couple of years of her life did she cut that down to three.

Opening a Mother's Day card with flowers.
Credit: shine.graphics/ Shutterstock

Mother’s Day Was Originally an Anti-War Protest

Following four bloody years of the U.S. Civil War, two women called for a “mother’s day” to push for peace. In the summer of 1865, Ann Jarvis created Mothers’ Friendship Days in West Virginia that aimed to bring together Americans from all political backgrounds, and she continued the annual tradition for years. Inspired by Jarvis, Julia Ward Howe — who famously penned the lyrics to “The Battle Hymn of the Republic” — also wrote an “Appeal to Womanhood Throughout the World” in 1870, highlighting men’s role in war and calling on women to resist being “made a party to proceedings which fill the world with grief and horror.” She also tried to establish June 2 as “Mother’s Day for Peace.” However, it wasn’t until 1908 that Anna Jarvis (the daughter of the West Virginia peace activist) celebrated a “Mother’s Day” in May in honor of her deceased mother. Within a decade, the observance became a nationally recognized holiday.

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.

Original photo by Album/ Alamy Stock Photo

Take a trip down memory lane with this collection of nostalgic TV show facts from around the website. Did you know that M*A*S*H was based on a true story? How about the fact that the pilot of I Love Lucy was lost for over 40 years? Dig deeper into these stories and more with 25 of our favorite vintage TV show tidbits.

A publicity still issued for the US television series 'M*A*S*H'.
Credit: Silver Screen Collection/ Moviepix via Getty Images

The “M*A*S*H” Finale Was Watched by More People Than Any Other Series Finale

After 11 years on the small screen, M*A*S*H aired its series finale on February 28, 1983 — and made history in the process. More than 106 million people tuned in to watch “Goodbye, Farewell and Amen,” making it the most-viewed series finale ever. Until Super Bowl XLIV in 2010, which saw the post-Hurricane Katrina New Orleans Saints defeat the Indianapolis Colts, it was the most-watched television broadcast in U.S. history. No episode of a scripted series has come close in the decades since. The series finale of Cheers earned 80.4 million viewers, Seinfeld got 76.3 million, and Game of Thrones — the most-talked-about show on television for years — had 19.3 million.

Golden Girls Rue McClanahan, Estelle Getty, Betty White, and Beatrice Arthur.
Credit: Steve Fontanini/ Los Angeles Times via Getty Images

More Than 100 Cheesecakes Were Eaten on “The Golden Girls”

On The Golden Girls, there were very few problems that a slice of cheesecake couldn’t solve, from small scuffles to big life crises. Throughout seven seasons, more than 100 cheesecakes were eaten during the ladies’ late-night kitchen table commiserations.

However, if you look closely, you’ll notice that Dorothy rarely takes a bite. In real life, Bea Arthur reportedly hated cheesecake.

Lucille Ball & Desi Arnaz In 'I Love Lucy'.
Credit: Hulton Archive via Getty Images

Lucille Ball Was Only the Second Woman to Appear Pregnant on Network TV

When Lucille Ball became pregnant in real life, she and her husband and co-star, Desi Arnaz, considered taking a hiatus from I Love Lucy — but then thought it would be an opportunity to break the mold. “We think the American people will buy Lucy’s having a baby if it’s done with taste,” Arnaz said. “Pregnant women are not kept off the streets, so why should she be kept off television? There’s nothing disgraceful about a wife becoming a mother.” Ball ended up being just the second woman to appear pregnant on a major television network and received more than 30,000 supportive letters from fans, despite the fact that the cast wasn’t allowed to say the word “pregnant” on-screen.

Angela Lansbury In 'Murder, She Wrote'.
Credit: Archive Photos/ Moviepix via Getty Images

Angela Lansbury Wasn’t the First Choice for Jessica Fletcher in “Murder, She Wrote”

It’s nearly impossible to imagine anyone but Angela Lansbury playing Jessica Fletcher, but she wasn’t a shoo-in for the job. Doris Day turned it down; Jean Stapleton (aka Edith Bunker) also declined, partly because she didn’t feel ready to jump into another series so soon after wrapping up the 1970s sitcom All in the Family. “Every time I saw Angela during those years, she’d say, ‘Thank you, Jean,’” Stapleton once said.

Out of all of her roles, Lansbury ended up identifying the most with Fletcher. “The closest I came to playing myself … was really as Jessica Fletcher,” Lansbury told Parade magazine in 2018. However, in 1985 — a year after the show began — she also told The New York Times: “Jessica has extreme sincerity, compassion, extraordinary intuition. I’m not like her. My imagination runs riot. I’m not a pragmatist. Jessica is.”

Old 1970s television.
Credit: son Photo/ Shutterstock

“Masterpiece Theatre” Is the Longest-Running Prime-Time Drama in the History of U.S. Television

Masterpiece Theatre premiered its first episode on January 10, 1971, following the success of a 1967 adaptation of John Galsworthy’s The Forsyte Saga. Stanford Calderwood, who was then the president of WGBH, Boston’s PBS affiliate, saw that success and wondered whether there might be a growing American appetite for British drama. His instincts proved spot-on. While on vacation in London, he convinced executives at the BBC that a partnership could prove fruitful for both networks; now, 50 years later, American viewers continue to clamor for classic British stories told with beautiful sets and elaborate costumes.

Actress Valerie Harper who appears in TV series The Mary Tyler Moore Show.
Credit: Bettmann via Getty Images

Valerie Harper Almost Didn’t Get the Role of Rhoda on “The Mary Tyler Moore Show” Because She Was Too Pretty

Rhoda Morgenstern, Mary Tyler Moore’s Bronx-born sidekick, was the last major role to be cast in the series, with more than 50 actresses reading opposite Moore for the part. Valerie Harper nailed her audition as Rhoda and even brought her own cloth for washing Mary’s apartment window in her first scene. But the producers weren’t sure she matched their vision.

“She was something we never expected the part to be… which is someone as attractive as she was,” series co-creator Allan Burns said in the book Mary and Lou and Rhoda and Ted. “But you’ve got to go with the talent.” Director Jay Sandrich felt strongly that Harper was right for the role and suggested she not wear any makeup for her callback. Producers immediately changed their minds when they brought Moore in to read a scene with Harper. Rhoda’s character shifted gears a little as a result — rather than being unattractive, which is subjective anyway, Rhoda simply believed she was unattractive.

“Rhoda felt inferior to Mary, Rhoda wished she was Mary,” Harper later recalled. “All I could do was, not being as pretty, as thin, as accomplished, was: ‘I’m a New Yorker, and I’m going to straighten this shiksa out.’”

“Rubber Duckie” Was a Billboard Hit Song

Of all the catchy and memorable songs on Sesame Street, the only one to ever become a certified Billboard hit was “Rubber Duckie,” which was on the Hot 100 for seven weeks in 1970, topping out at No. 16. The tune was performed by Jim Henson himself, in character as Ernie — and was also nominated for a Grammy for Best Recording for Children that year. Little Richard covered the song in 1994, and an all-star version for National Rubber Duckie Day, featuring Tori Kelly, James Corden, Sia, Jason Derulo, Daveed Diggs, and Anthony Mackie, was released in 2018.

Ron Howard, the clean-cut All-American boy of the "Happy Days" series.
Credit: Bettmann via Getty Images

Ron Howard Accepted His “Happy Days” Offer to Avoid the Draft

Ron Howard was ambivalent about accepting an offer to headline what became Happy Days, as he’d already experienced sitcom success with The Andy Griffith Show and was looking forward to starting film school at USC. However, he’d also been saddled with what he called a “horrible draft number,” and given that he stood a better chance of avoiding the Vietnam War through work than a college deferment, he elected to roll the dice with the good-natured ’50s sitcom.

An aerial view of the empty set after the filming of the last episode of Seinfeld.
Credit: David Hume Kennerly/ Hulton Archive via Getty Images

The Theme Song for Each “Seinfeld” Episode Is Different

For the first seven seasons of Seinfeld, every episode started with Jerry Seinfeld doing a stand-up routine. But what only keen-eared listeners will notice is that the theme song was made to match those monologues, which means every single episode had a slightly different one. Composer Jonathan Wolff used instruments like the bass — plus his fingers and mouth — to improvise the sounds, and synced them to Seinfeld’s stand-up timing to build a simple melody that could easily start and stop for jokes.

“I have no idea how many themes we did for Seinfeld…” he told Great Big Story. “The timing, the length, had to be adjustable in a way it would still hold water and still sound like the Seinfeld theme.”

Robin Williams, US actor and comedian, in costume for Mork & Mindy.
Credit: Silver Screen Collection/ Moviepix via Getty Images

Mork From “Mork & Mindy” Originated on “Happy Days”

Fans may remember that Mork from Ork initially appeared in Richie Cunningham’s dream during a February 1978 episode of Happy Days, a premise apparently conceived of by the 8-year-old son of series creator Garry Marshall. Although this seemed like a terrible idea to the writers, they quickly realized the potential of the situation when the little-known actor Robin Williams wowed during his audition and rehearsals. Mork then proved a hit after going toe-to-toe with the Fonz on-screen, prompting Marshall and his cohorts to devise a spinoff series about the character in time for the fall 1978 TV season. Meanwhile, the “My Favorite Orkan” Happy Days episode was reedited for syndication to show that the alien encounter was real.

Alan Alda, US actor, in a promotional portrait for the television series 'M*A*S*H'.
Credit: Silver Screen Collection/ Moviepix via Getty Images

Only One Actor Appeared in Every “M*A*S*H” Episode

M*A*S*H experienced several significant cast changes, and a few favorite characters were replaced with equally dynamic new ones — a standard practice on long-running shows today, but rare back then. Of the many actors who appeared on the show, Alan Alda (Benjamin Franklin “Hawkeye” Pierce) was the only star to appear in every episode. Through its run, the actor took increasing creative control of the series, directing 31 episodes (including the finale) and co-writing 13 episodes. He became the first person ever to win Emmy Awards for acting, directing, and writing for the same show. Loretta Swit (Margaret “Hot Lips” Houlihan) was a close second in terms of longevity; she appeared in all 11 seasons but missed a handful of episodes along the way.

Estelle Getty, Rue McClanahan, Bea Arthur and Betty White.
Credit: Jim Smeal/ Ron Galella Collection via Getty Images

Each of the Four “Golden Girls” Stars Won an Emmy Award

The Golden Girls was an Emmys darling from the start, eventually accumulating 68 nominations and 11 awards, with each of the four leads taking home a trophy at one point. Bea Arthur, Rue McClanahan, and Betty White all received Best Actress nods in 1986, with White winning the honors. The following year, it was McClanahan who clinched the title, and then in 1988, it was Arthur’s turn — as well as Estelle Getty’s, who earned the Supporting Actress honor. During her speech, Arthur noted that her thank-yous were from “the four of us” since “we’ve all won.”

American actress Lucille Ball (1911 - 1989) with her mother DeDe (Desiree).
Credit: Frank Edwards/ Archive Photos via Getty Images

Lucille Ball’s Mom Was at Every Single Taping of “I Love Lucy”

Ball’s mother, DeDe Ball, went to every single taping of her daughter’s sitcom. In fact, her laughter can often be heard coming from the live audience — and she can even be heard saying, “Uh oh!” at times.

Speaking of famous mothers: long before Cher became a household name, her mother appeared in one episode of the show. Her mother, Georgia Holt, was a model who made a few TV cameos, including one memorable — but brief — appearance in a 1956 episode of I Love Lucy in which the gang goes to Paris and is baffled by the avant-garde fashion. At the end, Holt is seen walking by as a model in an outfit inspired by the potato sack.

Mary Tyler Moore and cast on set.
Credit: Bettmann via Getty Images

The Real Owner of Mary Tyler Moore’s Apartment Building Displayed Political Banners to Keep Producers From Coming Back

The 1892 home that provided the exteriors for Mary’s apartment became so famous that the owners were inundated with visitors and tour buses, and eventually, they’d had enough. When they got word that the crew was coming back to film more exterior shots in 1972, owner Paula Giese displayed a large “Impeach Nixon” banner across the front. (She was a prominent political activist, so it was a two-for-one deal.) It worked. They didn’t get their new shots, and Mary eventually ended up moving.

Cast members of M*A*S*H.
Credit: Archive Photos/ Moviepix via Getty Images

“M*A*S*H” Is Based on a True Story

M*A*S*H was loosely based on the 1970 Robert Altman film of the same name, which was an adaptation of the 1968 novel MASH: A Novel About Three Army Doctors, by Richard Hooker, the pen name of former U.S. Army surgeon H. Richard Hornberger. The Mobile Army Surgical Hospital, or MASH (the asterisks between the letters were a creative design element used in the fictional versions), was first deployed by the U.S. Army during World War II as an attempt to move surgical care closer to wounded soldiers.

The charismatic character of Benjamin Franklin “Hawkeye” Pierce (played by Alan Alda) was created by Hornberger based on his own medical heroics. During the Korean War, Hornberger was assigned to the 8055th MASH, which traveled along the 38th parallel dividing the Korean Peninsula, now the demilitarized zone that divides North and South Korea. His novel took 12 years to write and five more to find a publisher, and eventually, Hornberger sold the television rights for the incredibly low sum of $500 (still only a few thousand dollars today) per episode.

Sesame Street characters pose under a "123 Sesame Street" sign.
Credit: Astrid Stawiarz/ Getty Images Entertainment via Getty Images

The Original Name of “Sesame Street” Was “123 Avenue B”

While names like The Video Classroom and Fun Street were tossed around, the most serious contender for the name of what later became known as Sesame Street was 123 Avenue B, since it fit the vibe of the inner-city set of the show. But the name was abandoned because it was an actual street address — and also because there was concern that those outside of New York City might not relate. Show writer Virginia Schone came up with the name Sesame Street, though it wasn’t immediately embraced, as many worried it would be hard for young kids to pronounce. After a weekend of brainstorming and no better options, it became the official title. “We went with it because it was the least bad title,” series co-creator Joan Ganz Cooney told Sesame Workshop.

Maggie Smith in Downton Abbey.
Credit: PictureLux/ The Hollywood Archive/ Alamy Stock Photo

“Downton Abbey” Is the Most-Nominated Non-U.S. Series in Emmy History

In its 50-year history, no Masterpiece miniseries has drawn as much buzz as Downton Abbey, which debuted in the U.K. on September 26, 2010, and on PBS the following January. The series, which aired its final season in the U.S. in 2016, chronicled the lives of an aristocratic family and their domestic servants in the fictional Yorkshire county estate of Downton Abbey. It tackled historic events ranging from the First World War to the 1918 influenza pandemic to the Irish War of Independence, all through the lens of the highly hierarchical household. It’s the most nominated non-U.S. series in Emmy history, with a total of 59 nominations and 12 wins. In 2019, a full-length feature film was released due to popular demand, followed by another film in 2022.

Fred Rogers of public TV’s “Mister Rogers’ Neighborhood.”
Credit: H. Mark Weidman Photography/ Alamy Stock Photo

The Red Trolley on “Mister Rogers’ Neighborhood” Traveled 5,000 Miles Annually

The beloved children’s television program Mister Rogers’ Neighborhood wasn’t complete without the anthropomorphic Trolley, which helped transport viewers into the Neighborhood of Make-Believe. In a given year of the show, Trolley’s commutes covered 5,000 miles, according to PBS, more than the length of the world’s longest river, the 4,132-mile Nile.

Trolley’s precise origins are somewhat mysterious, but we do know the one-of-a-kind model was hand-built from wood by a Toronto man named Bill Ferguson in 1967, the year before Mister Rogers’ Neighborhood premiered. The TV host’s love for trolleys went all the way back to his own childhood; during one 1984 episode of Mister Rogers’ Neighborhood, he visited the Pennsylvania Trolley Museum and remembered accompanying his dad on long trolley trips. Today, Trolley is on permanent display at the Fred Rogers Center at Saint Vincent College in Rogers’ hometown of Latrobe, Pennsylvania.

Henry Winkler (left) and Ron Howard as Arthur 'The Fonz' Fonzarelli and Richie Cunningham.
Credit: Michael Ochs Archives/ Moviepix via Getty Images

The “Happy Days” Theme Song Didn’t Open the Show Until Season 3

The famed Happy Days theme song, written by Norman Gimbel and Charles Fox and originally sung by Jim Haas, wasn’t used for the opening credits in seasons 1 and 2. That spot was reserved for a re-recorded take of Bill Haley’s “Rock Around the Clock,” with the similar-sounding Gimbel-Fox composition on the closing credits. However, an updated version of “Happy Days,” performed by Truett Pratt and Jerry McClain, accompanied the opening credits for season 3, and eventually made its way to No. 5 on the Billboard charts. “Happy Days” was later recorded again by Bobby Arvon and used to open the show for its final season in 1983-84.

Moore And Knight In 'The Mary Tyler Moore Show,'.
Credit: Fotos International/ Archive Photos via Getty Images

“The Mary Tyler Moore Show” Was Likely the First American Sitcom to Feature Birth Control Pills

On The Dick Van Dyke Show, which Moore starred in from 1961 to 1966, the actress and her on-screen husband, Dick Van Dyke, slept in separate beds and couldn’t say the word “pregnant.” However, just a few years later on The Mary Tyler Moore Show, not only did Mary have sex out of wedlock, but she openly took birth control pills. In a 1972 episode — the same year that a Supreme Court decision made birth control available to unmarried women in all states — Mary is having dinner with her father when her mother shouts, “Don’t forget to take your pill!” Mary and her father both yell, “I won’t,” and the embarrassed look on Mary’s face shows that she doesn’t just take a pill, but The Pill.

'Sesame Street' Cast Members.
Credit: Hulton Archive via Getty Images

The Show Idea for “Sesame Street” Started at a Dinner Party

A producer at New York City’s Channel 13 public television station, Joan Ganz Cooney, was hosting a dinner party in 1966 when she chatted up Lloyd Morrisett, a Carnegie Corporation educator. He told her that one morning he found his 3-year-old staring at the television’s test pattern, waiting for something to begin. They started discussing whether there was any way for young minds to learn from the medium, and thus the entire concept of educational television — and Sesame Street — was born. The show was first described as a preschool for children who couldn’t afford to attend one.

Publicity Still from "Mork & Mindy" Robin Williams circa 1978.
Credit: PictureLux / The Hollywood Archive/ Alamy Stock Photo

Mork’s Spacesuit Was Recycled From an Episode of “Star Trek”

Since Mork was originally meant to be a one-off character, there wasn’t a whole lot of thought put into his appearance; someone simply grabbed a red spacesuit from the Paramount wardrobe collection, added a silver triangle, and the Ork uniform was born. It’s unknown whether anyone at the time caught the uncanny resemblance between Mork’s suit and the one worn by Colonel Green in the 1969 Star Trek episode “The Savage Curtain,” but we do know that Mork & Mindy dipped into the Star Trek archives at least one more time: The spaceman costume worn by Mindy’s father (Conrad Janis) in the “Mork Goes Public” episode of season 1 combined a helmet and suit from two separate episodes of the sci-fi predecessor.

Lucille Ball (1910 - 1989) with her husband and co-star Desi Arnaz (1917 -1986), together.
Credit: MPI/ Archive Photos via Getty Images

The Pilot for “I Love Lucy” Was Lost for Four Decades

I Love Lucy’s pilot episode, shot March 2, 1951, couldn’t be found for about 40 years. But one of Arnaz’s collaborators, Pepito Perez, later found a 35-millimeter version of it in his house. Though some of it was damaged, most of the footage aired as part of a 1990 CBS special.

 Actresses Betty White, Rue McClanahan and Bea Arthur.
Credit: Carlo Allegri/ Getty Images Entertainment via Getty Images

The “Golden Girls” Cast Once Performed for the Queen Mother

Queen Elizabeth II’s mom, the Queen Mother, was such a fan of The Golden Girls that she had the four leads perform at the London Palladium in 1988 during the Royal Variety Performance. The cast performed two of their kitchen table scenes and made sure to censor a few things to not offend the royals in attendance.

That said, the Queen Mum did have a sense of humor. One joke that was left intact was Dorothy asking Blanche how long she waited to have sex after her husband died, with Sophia wittily interjecting, “Until the paramedics came.” The response made the often-reserved royal laugh out loud.

The cast and crew of the hit television show "Seinfeld".
Credit: David Hume Kennerly/ Archive Photos via Getty Images

The First and Last Conversations Between Jerry and George in “Seinfeld” Were the Same

In a full-circle moment, the first scene of the series started in a coffee shop with Jerry telling George that a button on his shirt was too high and that it “makes or breaks” the shirt since it’s in “no man’s land.” And in the very last scene of the finale, when they’re all sitting in a jail cell, he alludes to it again, saying: “The second button is the key button. It literally makes or breaks the shirt.”

As the camera pans back, George says, “Haven’t we had this conversation before?” to which Jerry ends the series with “Maybe we have.”

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.

Original photo by Buddy Mays / Alamy Stock Photo

Did you know that Ben and Jerry learned to make ice cream via a correspondence course? Or that before Lamborghini was famous for luxury cars, it sold tractors? Many of the world’s top brands have fascinating stories, and we’ve collected some of our favorites from around the site for your reading pleasure.

Close-up of a spread of Mcdonald's food.
Credit: Brett Jordan/ Unsplash

McDonald’s Once Tried Making Bubblegum-Flavored Broccoli

Kids weren’t lovin’ it when McDonald’s tried to add bubblegum-flavored broccoli to Happy Meals. In 2014, the fast-food giant’s then-CEO, Donald Thompson, revealed the bizarre experiment at an event hosted by a venture capital firm. Under pressure to make Happy Meals healthier, the company reflected on how toothpaste and amoxicillin producers had used artificial bubblegum flavoring to make their goods more palatable to children. McDonald’s decided to try a similar tactic with the divisive cruciferous veggie.

“Mickey D’s” food scientists did successfully make broccoli taste like bubblegum, likely by employing a combination of strawberry, banana, and cherry flavors. However, a focus group of kids was confused by the final product, which they enjoyed about as little as standard broccoli (we’re guessing it wasn’t pink). The item was never added to the McDonald’s menu, so parents who want to impress their kids with a tastebud switcheroo will have to settle for cotton candy grapes.

Marlboro Man billboard.
Credit: William Nation/ Sygma via Getty Images

The Marlboro Man Never Smoked

We all remember the Marlboro Man: an able-bodied outdoorsman, usually a cowboy, who enjoyed a hard-earned puff from his cigarette amid a day of honest labor, his steely gaze beckoning us to “come to where the flavor is” in the land of Marlboro Country. Except the real Marlboro Man never smoked — at least not the “original,” an individual by the name of Bob Norris who featured in the brand’s early TV commercials. A Colorado rancher who was offered the job after being seen in a photo with his friend John Wayne, Norris reluctantly became the face of an overwhelmingly successful advertising campaign by the Leo Burnett agency that made Marlboro the world’s top-selling cigarette brand by the 1970s.

But while Norris epitomized the Marlboro Man’s image of rugged individuality, he ultimately proved too principled to last in the role; when his children asked why he was promoting a product they were forbidden to try, he reportedly hung up his Stetson after 12 years of cigarette pitch work. Of course, Norris was an anomaly among his Marlboro brethren. While he lived to the ripe old age of 90, others who followed in his bootsteps learned the hard way what decades of smoking could yield, with several later publicly speaking out against the habit before dying from smoking-related illnesses.

Google Maps application on a phone.
Credit: Justin Sullivan/ Getty Images News via Getty Images

Google Maps Once Listed a Town That Never Existed

There’s off the map, and then there’s Argleton. The English town was visible on Google Maps until 2009, which is notable for one major reason: No such place exists. So how did it get listed? Though never confirmed by Google, it’s been speculated that Argleton may have been a trap street — a fictitious road used by cartographers to catch anyone copying their work. The reasoning is as simple as it is clever: If a street (or, in this case, town) that you made up ends up on another map, you’ll have caught its creator red-handed in copyright infringement.

Though little more than an empty field in West Lancashire, Argleton once had its own (presumably auto-generated) job listings and weather forecasts. Once its (non-)existence became known on the internet, humorous T-shirts with slogans such as “New York, London, Paris, Argleton” and “I visited Argleton and all I got was this T-shirt” appeared online, too. Google itself was tight-lipped on the subject, releasing a brief statement noting that “Google Maps data comes from a variety of data sources. While the vast majority of this information is correct there are occasional errors.”

View of the big Mall of America sign outside.
Credit: Mark Erickson/ Getty Images News via Getty Images

The Mall of America Is Owned by Canadians

Baseball, apple pie, and shopping — all three are American favorites. So it may be a bit surprising that one of the country’s largest shopping destinations is overseen by our neighbors to the north. That’s right: The Mall of America is owned by Canadians. Despite its name, the supersized shopping complex — found just outside Minneapolis in Bloomington, Minnesota — was developed by the Triple Five Group, a Canadian retail and entertainment conglomerate. (Notably, while the Mall of America is truly humongous, it was once surpassed in sheer size by the West Edmonton Mall, a Canadian shopping center built by the same company in the 1980s, which reigned for decades as the largest mall in North America.)

In the decades since its opening, the Mall of America has grown to 5.6 million square feet, stuffed with 520 stores and 60 restaurants. For those who aren’t into shopping, there’s more to do than just wait around in the food court — today, the Mall of America is home to a 13-screen movie theater, an indoor theme park, a mini-golf course, and the largest aquarium in the state of Minnesota.

Lamborghini tractor in the outdoors.
Credit: David Fowler/ Alamy Stock Photo

Lamborghini Began by Making Tractors

Today, the name Lamborghini is synonymous with automotive opulence, but the Bologna, Italy-based company has an origin story that’s more humble than you might expect. Born in 1916, Ferruccio Lamborghini served in the Italian Air Force as a mechanic during World War II, learning the ins and outs of some of the most advanced vehicles in the world. Returning home after the war, Lamborghini knew his home country would need to increase agricultural output to recover from the devastation of the conflict. With tractors from established makers (FIAT among them) too expensive for his war-weary compatriots, Lamborghini put his mechanical skills to work and built cheap yet powerful tractors from salvaged surplus military material.

Starting with its first tractor, named Carioca, in 1948, Lamborghini Trattori became an immensely successful business. The tractor fortune, along with proceeds from Lamborghini’s dabblings in air-conditioning and heating systems, provided enough capital for him to buy his own Ferrari 250 GT sports car in 1958. Ever the mechanic, Lamborghini was unimpressed with his Ferrari (especially its less-than-luxurious clutch) and even began a feud with Enzo Ferrari himself. So he decided to make his own sports car, and in 1963, Automobili Lamborghini launched a legacy of fine automobile craftsmanship that has lasted for 60 years and counting. (The company still makes tractors, too.)

Pringles chip containers opened.
Credit: DIRK WAEM/ AFP via Getty Images

Pringles Inventor Fredric Baur’s Ashes Were Buried in a Pringles Can

When considering a final resting place, most people ponder the conventional options, such as a coffin or, for those who prefer cremation, an urn. That was not the case for Pringles inventor Fredric Baur, whose devotion to his innovative packaging method (which stacks his perfectly curved creations in a tall tube) was so intense that he had his ashes buried in a Pringles can.

“When my dad first raised the burial idea in the 1980s, I chuckled about it,” Baur’s eldest son, Larry, has said of his father’s wishes. But this was no joke. So after the inventor died in 2008, his children made a stop on their way to the funeral home: a Walgreens, where they had to decide which can to choose. “My siblings and I briefly debated what flavor to use,” Larry Baur added. “But I said, ‘Look, we need to use the original.’” Baur’s ashes now rest, in the can, at his grave in a suburban section of Cincinnati, Ohio.

Closeup of a Nintendo gaming system.
Credit: Jason Leung/ Shutterstock

Nintendo Was Founded Before the Fall of the Ottoman Empire

The Ottoman Empire feels like an entity of a time long past, while the name Nintendo conjures up images of modernity — electronics, video games, arcades, and mustachioed plumbers. However, Nintendo was actually founded before the Ottoman Empire ended, and the period of overlap isn’t measured in months or even a few years. When the Ottoman sultanate was abolished in 1922 amid the geopolitical reshuffling that followed World War I, Nintendo had already been in business for 33 years.

Of course, this wasn’t the Nintendo that many of us know today — the company didn’t make its first electronic video game until 1975. Founded on September 23, 1889, Nintendo had a humble original mission: selling playing cards, specifically Japanese-style cards called Hanafuda. The company did pretty well, but decided to expand further in later decades. Nintendo struck a deal with Disney in 1959 to create playing cards with Disney characters on them, and in the 1960s, it sold a series of successful children’s toys, including Ultra Hand and Home Bowling, before becoming the official Japanese distributor of the Magnavox Odyssey — the first commercial home video game console. Seeing the promise of such a machine, Nintendo threw its weight behind this emerging entertainment category. The rest, as they say, is history.

Seatbelt in a car.
Credit: Alexandria Gilliott/ Unsplash

Volvo Gave Away Its Seat Belt Patent to Save Lives

Seat belts are a standard feature in today’s cars and trucks, but it hasn’t always been that way. In the 1950s and ’60s, car manufacturers weren’t required to include safety belts in their vehicles. Where they were included, the earliest seat belts were simple two-point restraints that secured across the waist (aka lap belts). While a step in the right direction, lap belts had some downsides — they didn’t protect the upper body during a collision and could even cause injuries in high-speed crashes. Recognizing these issues, Swedish carmaker Volvo hired Nils Bohlin, a former aviation engineer who had helped create pilot ejection seats, as the company’s safety engineer and tasked him with a redesign.

Bohlin’s creation — a more comfortable V-shaped belt that stays in position across both the chest and hips — was drafted in under a year, and it’s the style still used in cars today. Volvo quickly added the belts to its cars in 1959, before the inventor had even secured a patent. When he did, Bohlin and Volvo didn’t seek to profit from the safety feature. Instead, they released the design publicly and urged all car manufacturers to adopt the upgraded belts. After years of presentations and crash test dummy demonstrations, Volvo eventually made headway — the proof is in our cars today, in a design credited with saving lives around the world.

A Michelin tire being prepared.
Credit: Boyer/ Roger Viollet via Getty Images

Michelin Stars Were Originally Connected to an Effort to Boost Tire Sales

In the restaurant business, there is no greater honor than the Michelin star. Awarded on a scale of one to three, Michelin stars are the standard of greatness when it comes to fine dining. Chefs pin their reputations on them, and having (or not having) them can make or break a business. So it might seem strange to discover that this culinary accolade is intimately entwined with… car tires. Brothers André and Édouard Michelin, founders of the Michelin tire company, created the Michelin Guide in 1900 — a free booklet full of useful information for French motorists.

To help raise the guide’s prestige (and to help motorists explore Europe again following World War I), the brothers reintroduced the handbooks in 1920, featuring more in-depth hotel and restaurant information — and instead of being free, they now cost seven francs. Within a few years, Michelin also recruited “mystery diners” to improve its restaurant reviews (they still work undercover), and in 1926, the company began awarding single Michelin stars to the very best restaurants. Five years later, Michelin upped the number of possible stars to three, and it has continued searching for the world’s best food in the nearly 100 years since. Today, the guides — and stars — cover more than 30 territories across three continents.

Three Oxford English Dictionaries.
Credit: Tessa Bunney/ Corbis News via Getty Images

It Took the Editors of the Oxford English Dictionary Five Years Just To Reach the Word “Ant”

If you think reading the dictionary sounds exhausting, try writing one — largely by hand, no less. That’s what the editors of the original Oxford English Dictionary had to do after the Philological Society of London deemed existing dictionaries “incomplete and deficient” in 1857. They had their work cut out for them: In 1884, five years after beginning what they thought would be a decade-long project, principal editor James Murray and his team reached an important milestone — the word “ant.” That year, they began publishing A New English Dictionary on Historical Principles (as it was then known) in installments called fascicles, with the 10th and final fascicle seeing the light of day in 1928. To say that the project’s scope was larger than anticipated would be putting it mildly. What was intended as 6,400 pages spread across four volumes ballooned into a 10-volume tome containing 400,000 words and phrases. The dictionary took so long to finish, in fact, that Murray died 13 years before its completion.

Ben & Jerry's ice cream.
Credit: Hybrid Storytellers/ Unsplash

Ben and Jerry Learned How to Make Ice Cream by Taking a $5 Correspondence Course

The founders of the country’s leading ice cream brand spent only a pint-sized sum learning how to make their product. Ben Cohen and Jerry Greenfield both grew up on Long Island, New York, where they became friends in seventh grade back in 1963. Originally, they set their sights on becoming a doctor (Greenfield) and an artist (Cohen). But once they reached their 20s — by then a rejected medical school applicant and a potter who had dropped out of college — they decided to enter the food industry instead. The duo came close to becoming bagel makers, but realized that producing ice cream was cheaper (bagel-making equipment can be pretty pricey). Their dessert education arrived through a Penn State College of Agricultural Sciences correspondence course, which sent them a textbook in the mail and required only open-book tests.

All of the ice cream was made in a 5-gallon machine, and Ben & Jerry’s shop originally sold eight flavors: Oreo Mint, French Vanilla, Chocolate Fudge, Wild Blueberry, Mocha Walnut, Maple Walnut, Honey Coffee, and Honey Orange. As the flavors got wilder — think Chunky Monkey, Cherry Garcia, and Phish Food — many more outposts and a wholesale delivery business followed, as did an IPO. In 2000, Unilever — the parent company of Breyers and Klondike — paid $326 million to acquire Ben & Jerry’s.

A can of Pepsi.
Credit: NIKHIL/ Unsplash

Pepsi Was Originally Called “Brad’s Drink”

Pepsi has been nearly synonymous with cola for more than a century, but it wasn’t always called that. We have pharmacist Caleb Bradham to thank for the bubbly beverage, as well as its original name: Brad’s Drink. Believing that his concoction had digestive benefits, Bradham sold it at his pharmacy in New Bern, North Carolina. Brad’s Drink didn’t last long, however — it was renamed Pepsi-Cola in 1898. The new name was partly derived from the word “dyspepsia,” a technical term for indigestion, and was meant to convey the tasty beverage’s supposed medicinal properties. Bradham trademarked the name in 1903, and the company grew exponentially over the next few years, with 240 franchises opening across 24 states by 1910.

View of the IKEA logo.
Credit: Jueun Song/ Unsplash

“IKEA” Is an Acronym

You’d be forgiven for assuming that IKEA is a Swedish word related to furniture. In fact, it’s an acronym that combines the initials of founder Ingvar Kamprad (IK) with the name of the farm where he grew up (Elmtaryd) and a nearby village (Agunnaryd). Kamprad was just 17 when he founded the company in 1943, initially selling small household items — think pens and wallets — rather than beds and sofas. He likely had no idea that there would one day be more than 450 IKEA stores across the globe.

Walt Disney sketching cartoons.
Credit: Hulton Archive/ Archive Photos via Getty Images

Walt Disney’s Cartoons Were Originally Called “Laugh-O-Grams”

Before founding the animation studio that bears his name, Walt Disney was a commercial artist in Kansas City, Missouri. It was there, around 1919, that he began making hand-drawn cel animations of his own, which were screened in a local theater and dubbed “Laugh-O-Grams.” The studio he acquired following his cartoons’ success had the same moniker, but it was a short-lived venture — Laugh-O-Gram’s seven-minute fairy tales and other works were popular with audiences, but financial troubles forced Disney to declare bankruptcy in 1923.

Disney, his brother Roy, and cartoonist Ub Iwerks moved to Hollywood the same year and founded Disney Brothers Cartoon Studio, which quickly changed its name to Walt Disney Studios at Roy’s behest. Had it not been for Laugh-O-Gram, however, it’s likely that Disney’s most famous creation would never have been born. The inspiration for Mickey Mouse came from a brown mouse who frequented his Kansas City studio trash basket — a “timid little guy” Disney was so fond of that before leaving for Hollywood, he “carefully carried him to a backyard, making sure it was a nice neighborhood,” at which point “the tame little fellow scampered to freedom.”

Guinness World Record contestant.
Credit: Maja Hitij/ Getty Images News via Getty Images

Guinness World Records Started Out as a Guinness Brewery Promotion Intended To Help Settle Bar Bets

In 1954, Sir Hugh Beaver, the managing director of Guinness, thought up a way to reduce pub disputes so bartenders could focus on pouring his company’s signature beers. He suspected that every bar could benefit from a book filled with verified facts and stats about subjects that might arise mid-conversation over a drink. Two events in particular prompted his decision: Earlier in the decade, he and fellow guests at a hunt in Ireland memorably argued about Europe’s fastest game bird, which they had no means of identifying. Then, on May 6, 1954, English athlete Roger Bannister became the first person to run a mile in less than four minutes, causing public interest in records-related news to surge. Sports journalist Norris McWhirter had served as the stadium announcer during Bannister’s historic run, and Beaver hired both him and his identical twin, Ross McWhirter — another sports journalist — to assemble The Guinness Book of World Records.

The McWhirter twins spent about three months working feverishly on their 198-page compendium. Although initially meant to be given out for free at bars to promote Guinness, the book became so popular that the company started selling it — to great success. To date, more than 150 million books from the series — eventually renamed Guinness World Records — have been purchased, educating readers in 40-plus languages.

Stacks of Chef Boyardee pastas.
Credit: Dorann Weber/ Moment Mobile via Getty Images

Chef Boyardee Was a Real Person

The world knows him as the jovial-looking fellow whose face has graced untold numbers of ravioli cans, but to those who knew him in life, he was Ettore “Hector” Boiardi — which is to say, Chef Boyardee was a real person. Born October 22, 1897, in Piacenza, Italy, Boiardi was working as an apprentice chef by the age of 11 and founded the company bearing his name in 1928, after he and his family settled in Cleveland. The business began because Boiardi’s restaurant there was so successful that patrons wanted to learn how to make the dishes at home — remarkable for a time when Italian food wasn’t nearly as well known (or beloved) in America as it is today. In fact, Chef Boyardee has been credited with helping to popularize the cuisine in the United States. There was just one problem: “Boiardi” was difficult for Americans to pronounce, so his products were sold under the phonetic name of Chef Boy-Ar-Dee (since simplified to its current spelling).

Close-up of a Zelda game insert.
Credit: Jacob Spaccavento/ Unsplash

The Zelda Video Game Was Named for F. Scott Fitzgerald’s Wife

Video games aren’t often associated with literary figures, but The Legend of Zelda has always been unique. Take, for instance, the fact that its title character was named after writer, artist, and Jazz Age icon Zelda Fitzgerald, whose marriage to The Great Gatsby author F. Scott Fitzgerald generated nearly as many headlines as his professional output. Zelda, who’s been described as the first flapper of the Roaring ’20s (and the inspiration for Gatsby’s Daisy Buchanan), was chosen because a Nintendo PR rep suggested that the eponymous princess should be “a timeless beauty with classic appeal,” and Zelda Fitzgerald was one such “eternal beauty.” The name chain didn’t end there: Actor Robin Williams was such a fan of the series that he named his daughter after the Princess of Hyrule. As for Zelda F. herself, she was — rather fittingly — named for the fictional heroine of a 19th-century novel.

A cup of Starbucks coffee.
Credit: Samule Sun/ Unsplash

Starbucks Coffee Was Almost Called “Cargo House”

The world’s largest coffeehouse chain, Starbucks, almost had a very different name. According to a 2008 Seattle Times interview with the company’s co-founder Gordon Bowker, the famous java chain was once “desperately close” to being called “Cargo House,” a name meant to tie the first store (in Seattle’s Pike Place Market) to the idea of beans coming from far away. Anxious for a more pleasing moniker, Bowker turned to a brand consultant, who mentioned that words starting with “st” felt especially strong. Bowker ran with the idea, listing every “st” word he could think of.

The breakthrough moment occurred after the consultant brought out some old maps of the Cascade Mountains and Mount Rainier — both close to the company’s hometown of Seattle — and Bowker stumbled across an old mining town named “Starbo.” The name lit up a literary reference embedded in his mind: Starbuck, the name of a character from Herman Melville’s 1851 masterpiece Moby-Dick; or, The Whale. Bowker readily admits that the character has nothing to do with coffee, but the moniker stuck, and the company doubled down on the nautical theme by introducing a mythological siren, likely influenced by a seventh-century Italian mosaic, as its now-famous green-and-white logo.

Cats roaming Disneyland.
Credit: MediaNews Group/Orange County Register via Getty Images

About 200 Feral Cats Roam Disneyland, Where They Help Control Rodents

Spend enough time at Disneyland and you’ll see them. Maybe you’ll spot one snoozing in the bushes near the Jungle Cruise or observing you warily as you ride the tram, but one thing is certain: However many cats you see, there are more out of sight. About 200 feral cats roam the Happiest Place on Earth, where they earn their keep by helping to control the rodent population. The felines were first seen not long after Disneyland opened in 1955, when they took up residence in Sleeping Beauty Castle, and it soon became evident that keeping them around had more advantages than trying to escort them off the premises.

The mutually beneficial alliance even includes permanent feeding stations for the cats, as well as spaying or neutering and vaccinations. Though not official cast members, these adept hunters — who mostly come out at night — have earned a devoted following of their own, complete with websites, Instagram feeds, and YouTube videos dedicated to them. They’re not quite as popular as the actual rides at Disneyland, of course, but for cat lovers, they’re an attraction all their own.

Chupa Chups lollipops.
Credit: John Keeble/ Getty Images News via Getty Images

Salvador Dalí Designed the Chupa Chups Logo

You may not know it by name, but you’re almost certainly familiar with Salvador Dalí’s best-known work, “The Persistence of Memory,” which depicts melting clocks on a bleak landscape. No less famous, albeit in an entirely different way, is the Chupa Chups logo — which Dalí also designed. While the idea of a surrealist collaborating with a lollipop company may sound odd, it begins to make sense when you learn a bit more about the eccentric artist — starting with the fact that he was close friends with Chupa Chups founder Enric Bernat, a fellow Spaniard.

The two met at a café one day in 1969, with Bernat making Dalí aware of his need for a logo and the world-renowned artist quickly taking care of it for him. He did so with great intention, of course: “Acutely aware of presentation, Dalí insisted that his design be placed on top of the lolly, rather than the side, so that it could always be viewed intact,” Phaidon notes. Dalí reportedly designed the instantly recognizable daisy-based logo in less than an hour on that fateful day, and it’s still in use decades — not to mention billions of sales — later.

Sir Isaac Newton under an apple tree.
Credit: Hulton Archive via Getty Images

Apple’s First Logo Featured Sir Isaac Newton

Apple has always been known for its design. Before its iconic logo resembled an actual apple, however, it featured Sir Isaac Newton sitting under an apple tree. This is, of course, a reference to the legend of Newton formulating his law of universal gravitation after getting bonked on the head by a falling apple — which ranks among history’s best-known “aha!” moments. (The more widely accepted version of events is that Newton merely observed a falling apple, but that doesn’t make the story any less fun to ponder.) In addition to the drawing, the logo featured a line from poet William Wordsworth: “Newton … a mind forever voyaging through strange seas of thought … alone.” The logo — which debuted when the company was founded in 1976 — was short-lived, however, in part because co-founder Steve Jobs felt the design couldn’t be effectively rendered at smaller sizes. Soon, he hired graphic designer Rob Janoff, who came up with the logo now recognized worldwide.

A box of Cracker Jack.
Credit: Felix Choo/ Alamy Stock Photo

Some Historians Consider Cracker Jack America’s First Junk Food

It all started with Chicago candy and popcorn peddlers Frederick and Louis Rueckheim, German immigrants who crafted a non-sticky caramelized popcorn as a way to stand out from other popcorn vendors. Their version — with a sweet, crunchy coating that was different from the salted popcorn and kettle corn available at the time — became a hit after it was mass-produced in 1896.

Cracker Jack’s early marketing warned prospective customers about the effects of the product. “Do not taste it,” one 1896 article cautioned. “If you do, you will part with your money easy.” Some historians believe that the caramel-coated popcorn and peanut treat jump-started the American snack food industry around the turn of the 20th century. It may even hold the title of the country’s first junk food, though the types of junk food popular today didn’t make their appearances until the 1950s. It was a song, however, that helped cement Cracker Jack’s snack status. In 1908, songwriter Jack Norworth — entirely unknown to the Rueckheims — composed “Take Me Out to the Ball Game” after seeing an advertisement for an upcoming game. The song, which mentions the snack by name, led to a surge in sales that forever linked Cracker Jack with sports.

Shelves of movies and TV shows inside a Blockbuster.
Credit: ANDREW MARSZAL/ AFP via Getty Images

There Is Only One Remaining Blockbuster Location — In Bend, Oregon

At its peak in 2004, Blockbuster, the wildly successful movie rental chain, boasted 9,094 locations. Today it has just one. Bend, Oregon, is home to the former giant’s last remaining outpost, a status the store attained when its counterpart in a suburb of Perth, Australia, closed in 2019. Originally opened in 1992 as Pacific Video, the location became a Blockbuster franchise store eight years later — and doesn’t look to be closing any time soon. That’s thanks in part to the 2020 documentary The Last Blockbuster, which helped cement the brick-and-mortar store’s status as a tourist attraction for nostalgia-minded visitors.

Boxes of Girl Scout Cookies.
Credit: John Moore/ Getty Images News via Getty Images

There Are Three Mandatory Flavors of Girl Scout Cookies Sold Each Year

Though there have been many changes to the kinds of Girl Scout Cookies sold over the decades, three stalwart flavors are mandated each year: Thin Mints, Do-si-dos (also called Peanut Butter Sandwiches), and Trefoils. None of these varieties existed in their current form in the earliest years of cookie sales, but a version of Thin Mints can be traced back to 1939, when troops started selling a flavor known as “Cooky-Mints.” By the 1950s, shortbread had joined the lineup, alongside the renamed Chocolate Mints and sandwich cookies in vanilla and chocolate varieties. Peanut Butter Sandwiches hit the scene soon after, and by 1966, all three of the aforementioned flavors were among the group’s bestsellers. Other cookies came and went in the decades that followed, but Thin Mints, Do-si-dos, and Trefoils have been staples since the 1970s — and for good reason.

A variety of Snapple bottles.
Credit: Mario Tama/ Getty Images News via Getty Images

The Name “Snapple” Is a Portmanteau

The brand name Snapple is a portmanteau of two words — “snappy” and “apple.” When the company began in 1972, founders Leonard Marsh, Hyman Golden, and Arnold Greenberg (who ran a health food store in New York City’s East Village) aimed to sell fruit juice-based soft drinks. One early product was a carbonated apple soda called “Snapple.” That original product wasn’t without its issues, however: Some of the bottles would ferment, sending the caps flying. That didn’t deter the trio, who went on to become some of the first to sell soft drinks made with natural ingredients. They officially changed the company’s name from Unadulterated Food Products to “Snapple” in the early 1980s.

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.