Original photo by MaraZe/ Shutterstock

A dash of pepper, a teaspoon of cumin — most of our interactions with spices are limited to tossing them into dinner dishes. However, some of the spices stashed in kitchen cabinets have uses besides cooking, or interesting backstories of their own. These seven facts about common household spices may surprise you, or at least give you something to think about while you’re hovering over the stove.

A variety of spices in rectangular containers.
Credit: Unsplash+ via Getty Images

There’s a Difference Between Spices and Herbs

The terms “herbs” and “spices” are often used interchangeably, but there is a difference between the two. Herbs come from the leaves of herbaceous plants — aka those that don’t have woody stalks. Spices also come from plants, of course, but are harvested from other parts, including the roots, bark, fruit, or seeds. (So, for example, mint leaf is an herb, while cumin seed is a spice.) And while you may find salt adjacent to cinnamon and oregano at the grocery store, or store it on your spice rack, it’s actually a mineral.

Cinnamon sticks arranged on a wooden table with cinnamon powder.
Credit: Light Stock/ Shutterstock

Cinnamon Was Sometimes Burned at Roman Funerals

Researchers believe cinnamon may be one of the world’s oldest spices, and during its earliest known history, the heavily scented bark was actually more valuable than gold. However, the civilizations that collected the spice back then had a variety of uses for it besides eating it. In ancient Egypt, cinnamon was used in medicine and religious practices. Similarly, ancient Romans considered the scent sacred, sometimes burning it at funerals of the wealthy. Cinnamon only gained its most common modern use — as a food flavoring — around the Middle Ages.

Woman chopping garlic on wooden cutting board.
Credit: RESTOCK images/ Shutterstock

Garlic Is Actually a Vegetable

Historians believe humans have been harvesting garlic for about 5,000 years, with people in ancient Egypt and India being the first to adopt the pungent seasoning. Garlic has remained a pantry staple ever since, though unlike most of its spice-rack neighbors, it’s botanically a vegetable. Garlic belongs to the onion family, and the entire plant — bulb, leaves, and flowers — is edible.

Close-up of black peppercorns alongside ground black pepper.
Credit: Tim UR/ iStock

Peppercorns Are Fruits, Not Seeds

Before they’re cracked or ground down, peppercorns tend to resemble seeds, though they’re actually dried fruit. Black pepper plants (Piper nigrum) are flowering vines native to India and Southeast Asia, where their fruits grow in tiny, grape-like clusters. Peppercorns start out green and darken as they dry, eventually reaching their deep, nearly black color.

Ground nutmeg in measuring spoon.
Credit: seanrmcdermid/ iStock

Nutmeg May Help You Sleep Better

Nutmeg sees most of its use during the holiday baking season, added to pies and baked goods from Thanksgiving through the winter holidays. However, researchers believe adding the antioxidant-rich spice to your diet more regularly may help you sleep better and longer. Nutmeg may also boost your mood and even help balance blood sugar.

Close up view of dill seeds.
Credit: Mulevich/ Shutterstock

Dill Seeds Were Once Used as Mints

Feathery dill leaves are best known for flavoring pickles, though the plant’s seeds release an anise-like flavor when chewed. Colonial Americans took advantage of this refreshing quality, using dill seeds as a natural breath mint and giving them another name: “meetinghouse seeds.” During long church services, the edible seeds were occasionally given to fidgety children to keep them calm, or chewed to perk up sleepy congregants.

The Baltic Sea marked with a red circle on a map.
Credit: hyotographics/ Shutterstock

Archaeologists Have Discovered Preserved Spices on a Sunken Ship

If your pantry contains ginger, peppercorns, and clove, you have something in common with King Hans, the 15th- and 16th-century ruler of Denmark and Norway. Researchers have recovered all three spices, along with saffron, dill, and the remnants of other foods, from the Gribshunden, a ship once owned by the king that has sat at the bottom of the Baltic Sea since 1495. Despite being underwater for more than 500 years, the spices were preserved thanks to the cold waters and low salinity. Some, such as the five-centuries-old saffron, even retained their aromas. As for the spices in your above-ground pantry, most stay fresh for between two and four years before losing their potency.

Nicole Garner Meeker
Writer

Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.

Original photo by Tadeusz Wejkszo/ Shutterstock

On any given day, 37% of adults in the United States pick up a greasy bag of treats from their local fast-food joint. Fast food has been blamed for all kinds of societal ills, particularly ones that are nutrition-related, but we keep eating it anyway. Whether you consume fast food every day or avoid it at all costs, you might not know these six facts about some of America’s most popular fast-food chains.

Signs of Hungry Jack's restaurant.
Credit: ullstein bild via Getty Images

In Australia, Burger King Is Called Hungry Jack’s

Like many fast-food restaurants, Burger King expands with franchising — individual locations with separate owners who license the larger brand’s identity and business model. When the first Burger King franchise hit Australia in 1971, there was already a local (unrelated) restaurant called Burger King, trademark and all. So Canadian Australian entrepreneur Jack Cowin decided to call his new restaurant Hungry Jack’s, even though it was otherwise identical to any other Burger King.

Over the next couple of decades, Hungry Jack’s expanded throughout Australia, but when the Australian Burger King trademark expired in the mid-’90s, things got a little weird. The American Burger King bought the naming rights, opened a batch of its own Australian locations under the Burger King name, and then tried to terminate its agreement with Hungry Jack’s.

After a 2001 court case, Hungry Jack’s retained control of the brand’s Australian presence, and it’s still called Hungry Jack’s today.

Colonel Sanders in the kitchen.
Credit: Leila Grossman/ Michael Ochs Archives via Getty Images

Colonel Sanders Opened a Competing Fried Chicken Spot

Harland Sanders, better known as Colonel Sanders, opened his first restaurant, Sanders Court & Café, in Kentucky in 1940. He licensed his chicken recipe in 1952 to a restaurant in Utah, which became the first KFC franchise. By 1963, the chain had 600 franchised locations. Sanders sold his company in 1964 to a group of investors led by John Y. Brown Jr. and Jack Massey, but maintained a promotional role in the brand as “goodwill ambassador.”

By the time the 1970s rolled around, he didn’t have a lot of goodwill left. The new corporate ownership (food conglomerate Heublein Inc., which acquired KFC in 1971) had changed the recipes, and the quality had, in his words, “slipped mightily.” So Sanders and his wife Claudia decided to open their own sit-down restaurant, originally called The Colonel’s Lady, and began franchising talks for the new concept in 1972. Heublein sued, claiming it had exclusive rights to the Colonel’s name, and the Sanderses countersued, claiming that Heublein was interfering with their business operations. They eventually settled out of court for $1 million.

A Subway sandwich is seen in a restaurant.
Credit: Joe Raedle/ Getty Images News via Getty Images

Subway Used to Be Called Pete’s Super Submarines

Subway started as a partnership between a 17-year-old kid named Fred DeLuca, who needed to raise money for college, and Dr. Peter Buck, a family friend who was able to lend him $1,000. Their sandwich shop, Pete’s Super Submarines, opened in August 1965. The pair opened a second store in 1966, and in 1968 changed their business’s name to Subway.

White Castle Hamburger Restaurant.
Credit: Education Images/ Universal Images Group via Getty Images

White Castle Sold Paper Hats to Other Restaurants

White Castle, founded in Wichita, Kansas, in the 1920s, grew in clever ways — including creating other businesses to meet its needs. In 1932, co-founder Billy Ingram grew frustrated with how quickly linen caps looked dingy and gray, and devised a machine to make paper ones; he then started manufacturing them under the subsidiary Paperlynen. That business expanded quickly beyond White Castle; the company made 240,000 hats the first year and 42 million by 1955. The hats shipped all over the world, including to other restaurants. Paperlynen even manufactured hats for Dwight D. Eisenhower’s 1956 presidential campaign.

That wasn’t the only White Castle subsidiary to outgrow its original purpose. In 1934, the business launched Porcelain Steel Buildings to construct its iconic castle-shaped restaurants, and that subsidiary ended up manufacturing amphibious vehicles during World War II (and, after the war was over, lawn equipment from the spare parts).

Close-up of McDonald’s chicken nuggets.
Credit: Brett Jordan/ Unsplash

McDonald’s Chicken Nuggets Come in Four Shapes

Think McDonald’s chicken nugget shapes develop randomly from the raw pink goo? Think again! The nuggets actually come in four shapes, although they’re all a little rough around the edges: the boot, the bow tie, the ball, and the bell. They come out of a rotating mold and everything. After being shaped and dropped onto a conveyor belt, they’re breaded and partially cooked before going out to restaurants, where they finish cooking and are served to customers.

The Doritos Locos Tacos in a row.
Credit: Joshua Blanchard/ Getty Images Entertainment via Getty Images

There Is No Taco Bell in Mexico (But Not for Lack of Trying)

Taco Bell has locations all around the world, but it can’t break into Mexico — the country that very loosely inspired the restaurant’s fare. The chain has tried twice, attempting to correct for the difference between Taco Bell’s approximation of a taco and an actual Mexican taco.

The first attempt, in 1992, was met with confusion because customers didn’t know exactly what they were ordering. Crispy tacos are very rare in Mexico, so Taco Bell had to rename the dish the “Tacostada” — a blend of “taco” and “tostada,” a flat, crisp tortilla served with toppings on top. The effort was still unsuccessful, and Taco Bell pulled out of Mexico two years later.

The chain tried again in 2007. This time, it decided to embrace its Americanness by adding French fries and soft-serve ice cream to the menu, but the venture was again unsuccessful. The closest thing Mexico has to a Taco Bell today is a little independent taco stand coincidentally called Taco Bell, which has nothing to do with the chain.

Sarah Anne Lloyd
Writer

Sarah Anne Lloyd is a freelance writer whose work covers a bit of everything, including politics, design, the environment, and yoga. Her work has appeared in Curbed, the Seattle Times, the Stranger, the Verge, and others.

Original photo by Nodar Chernishev/ iStock

Big grocery brands carefully plan the layouts of their stores, and it’s not usually about convenience for customers. Even if you’re perfectly prepared for a grocery shopping trip — you’re not hungry, you have a list, your coupons are clipped — one well-placed psychological trick can leave you with a higher bill than you planned at checkout. Even some of the more obvious sales strategies, like free samples, run deeper than they appear. Here are six ways that stores upsell you on even the quickest of grocery runs.

Buyer hands with pork meat packages at the grocery store.
Credit: Andrii Spy_k/ Shutterstock

Listing the Sale Price for Multiple Items

You’ve probably seen a sale tag that advertises multiple items at a certain price — like two cans of soup for $5 — but that doesn’t necessarily mean that you have to buy two to get the deal. Take a closer look, because chances are those cans are actually $2.50 each. It’s worth looking carefully at the tag just in case you do actually have to buy multiple things, but most of the time it’s just a technique to upsell you.

View down a grocery aisle.
Credit: Fikri Rasyid/ Unsplash

Displaying Items From Different Aisles Together

Chocolate syrup isn’t frozen, so why do you sometimes find it near ice cream? It’s the same reason you might find marshmallows next to graham crackers, whipped cream in the produce aisle, or red pepper flakes near frozen pizza: to get you to go in for one thing and leave with two. You were perfectly happy to just buy cheese when you walked in the store — you don’t need fancy crackers, too.

Close-up of prices at a grocery store.
Credit: Franki Chamaki/ Unsplash

Displaying Full-Price Items Like They’re on Sale

The displays at either end of an aisle are called end caps, and they’re often the source of deals. Sometimes the producer negotiates a low price with the store in exchange for the visibility; other times, especially at the back of the store, end caps are where discontinued or clearance items go. But new or seasonal products sometimes end up in flashy end-cap displays, too — at full price, occasionally with the bonus upsell of pairing two items from different aisles that go together.

Woman choosing a dairy products at supermarket.
Credit: LADO/ Shutterstock

Stocking Essentials in the Back

One thing you might notice about shopping at a grocery store is that staples like eggs, cheese, and bread are rarely placed near the front entrance — making you walk through a labyrinth of potential impulse purchases (and other sales techniques) on your way to the essentials. That makes it hard for even the most diligent list-makers to stay immune to heavy merchandising. Keep this in mind on your long journey to the back, especially if you can’t afford to buy extra snacks.

A woman picking up a grocery item from the shelf.
Credit: Boxed Water Is Better/ Unsplash

Stocking Expensive Items at Eye Level

Ever notice that store brands tend to be lower on the shelves than name brands? This makes the more expensive items easier to spot and more likely to end up in a shopping cart. There’s a very common exception to this rule: More expensive children’s cereals tend to be a little lower down, at eye level with smaller shoppers. Some are even designed so that the cartoon characters on the boxes are looking directly at the kids.

Close-up of samples at a grocery store.
Credit: Tyler Olson/ Shutterstock

Free Samples

This one may seem obvious: Of course you’ll want to buy an item you’ve tried if it’s delicious — and that’s a big part of it. Sales of an item can go up by as much as 2,000% when customers get to sample it, partly because they know what they’re getting, and partly because they feel obligated after getting something for free.

Yet the psychology goes deeper than just the product itself. After sampling something good, customers may be more likely to buy other things that they like throughout the store, not just the sampled product. In other words, while free samples can be great, just make sure to check your instincts after filling up on tiny bites.

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.

Original photo by Annmell_sun/ Shutterstock

Some flavors just beckon with the changing of the seasons. When fall rolls around, many of us begin craving the layered, warming flavors of cinnamon, nutmeg, and the infamous mix known as “pumpkin spice.” It’s not unusual to turn to spices as we celebrate shorter days and sweater weather — as it turns out, humans have been relying on spices to flavor the season for centuries. Here are seven tantalizing facts you might not know about our favorite autumn spices.

Mix of spices, including pumpkin spice.
Credit: Joanna Musisz/ Shutterstock

Pumpkin Spice Is Almost as Old as the United States

Food trends come and go, but pumpkin spice has an enduring power over Americans, perhaps because it originated here in the days of the Founding Fathers. Colonial newcomers quickly learned to cook pumpkins, turning the once-unfamiliar squash into table fare and even brewing it into beer. So it makes sense that the first cookbook written by an American and published in the U.S., Amelia Simmons’ 1796 work American Cookery, offered up two recipes for “pompkin” pie — which just so happened to be flavored with a blend of nutmeg, ginger, allspice, and mace (a spice made from the webbed covering that grows around nutmeg).

It would take a while, but spice companies eventually caught on to pumpkin spice’s harvest-time popularity, launching their own blends around the same time canned pumpkin puree hit the market in 1929. Despite having few ingredients and being easy to replicate, pumpkin spice has become a spice of its own — McCormick’s first pumpkin pie spice, released in 1934, features the same ingredients nearly 90 years later: cinnamon, ginger, nutmeg, and allspice.

Cinnamon sticks in the form in which they leave the plantations.
Credit: Mattias Cook/ Shutterstock

Cinnamon Trees Were Once Guarded by Secrets and Myths

Cinnamon may just be fall’s favorite spice, considering how often we use its scent to freshen our homes and sprinkle the spice itself onto desserts and flavored coffees. But as popular as cinnamon is now, it was once in such demand that spice traders concealed its real origins to help line their pockets. Cinnamomum trees are native to India, Sri Lanka, and Myanmar, but early merchants drove up the price by telling outlandish tales of how harvesting the bark was dangerous thanks to aggressive “winged creatures” in distant lands. Amazingly, their efforts helped keep cinnamon’s real habitat a secret for centuries.

Closeup of cloves in wooden teaspoon on wooden background.
Credit: Drbouz/ iStock

Cloves Were the Original Breath Mint

Cloves are grown in Madagascar, Tanzania, Sri Lanka, and elsewhere, but a staggering amount of the spice — 74% in 2019 — comes from Indonesia, where the dried flower buds of the native Syzygium aromaticum tree are harvested. Today, cloves are mostly used to add sweetness and warmth to dishes, but these buds — also called “nails” for their resemblance to the fastener — were once used in ancient China as breath mints. To avoid the offense of bad breath, visitors to the Han dynasty royal court around 200 BCE would pop a clove into their mouths before meeting with the emperor, though the spice was also used as a natural anesthetic to treat toothaches.

By the Middle Ages, cloves had reached Europe, where they were used to season food and commanded a high price that wouldn’t deflate for centuries. In the 1600s, Dutch traders who held the clove monopoly regularly destroyed Syzygium trees and portions of the clove harvest to create scarcity and drive up prices. But starting in 1770, French smugglers whisked clove seedlings out of Indonesia to create a rival supply, eventually pushing down the price. Today, you don’t have to go far to find cloves: They’re a common ingredient in ketchup, but get the most use in autumnal fare like pumpkin pie and spice cake.

Fresh ginger at the farmer's market.
Credit: BruceBlock/ iStock

There’s No Such Thing as “Wild” Ginger

Humans have revered ginger for more than 2,500 years, rightly crediting the spicy rhizome with calming nausea and upset stomachs. But a few other beliefs about ginger aren’t so true — it can’t cure the plague, as medieval doctors once thought, and it’s not a naturally occurring plant species. Botanists consider ginger a cultigen: a plant that doesn’t exist in the wild, having been bred by early humans so extensively that it became fundamentally different from its wild ancestors.

Ginger slowly spread across the world from India over the centuries, thanks to Arab, Spanish, and Portuguese traders. In Europe, the pungent rhizome was a 16th-century favorite — beloved even by Queen Elizabeth I, who’s credited with serving gingerbread men at royal banquets.

An antique scoop tin scoop filled with cardamom seeds and cloves.
Credit: BruceBlock/ iStock

Nordic Countries Consume the Most Cardamom

Cardamom is native to India, where farmers have undertaken the labor-intensive harvest of its green, seed-filled pods for at least 5,000 years. With its spicy citrus flavor, cardamom is commonly used in rice, desserts, and the chai blends that warm up a crisp day. But nowhere is it more sought after than in Scandinavia: Sweden claims the second-highest cardamom consumption in the world, trailing only Norway, where people are estimated to consume almost 30 times more of the spice per capita than in most other nations.

It’s unclear how cardamom — often considered the world’s third-most expensive spice because it must be harvested by hand — became so popular in Scandinavian countries. Some historians believe the spice took hold between the eighth and 13th centuries, and it continues to fuel cold-weather dishes like meatballs, sweet buns, and holiday glögg.

A girl rubs a nutmeg on a fine grater in a bowl.
Credit: Anakumka/ Shutterstock

Nutmeg Enthusiasts Once Carried Their Own Spice Graters

Modern chefs reach for nutmeg when cooler temperatures linger, generally using its warm, nutty flavor to spice up pies, drinks, and other sweets. But at one point in history, nutmeg was used as freely as black pepper is today. Sourced from the seeds of Myristica fragrans, an evergreen tree found in Indonesia’s Banda Islands, nutmeg was first used as far back as 3,500 years ago, archaeologists believe. By the 14th century, spice traders considered it more valuable than gold. Nutmeg’s popularity peaked in the 17th and 18th centuries, when wealthy connoisseurs began carrying their own miniature graters so they could season meals to their liking.

Pile of dried allspice berries.
Credit: Annmell_sun/ Shutterstock

Allspice Is Just One Spice

Allspice has gone by many names. Because the dried berry harvested from Pimenta dioica trees found in the West Indies looks like a peppercorn, it’s sometimes called Jamaica pepper or myrtle pepper. Londoners introduced to the spice in the 17th century deemed it the unimaginative “newspice.” But the name that stuck arose around the same time, because the flavor resembled a blend of cloves, cinnamon, and nutmeg. The name has since become something of a confusing misnomer, but the spice used to flavor apple cider, spiced wine, and other autumnal treats is really in a category of its own.

Interesting Facts
Editorial


Original photo by Pictorial Press Ltd/ Alamy Stock Photo

Gone With the Wind, the 1939 film adaptation of Margaret Mitchell’s Pulitzer Prize-winning novel of the same name, has a complicated and even contradictory legacy. It is a towering achievement of Old Hollywood, as well as an overly long melodrama of more than three and a half hours; a faithful re-creation of the ways of the antebellum South and a whitewashed representation of the horrors of slavery. But regardless of one’s views, Gone With the Wind was undeniably a seismic force in Depression-era America, and it remains a cultural touchstone even now. Here are eight facts about this famous book and even more famous movie.

Novelist Margaret Mitchell sits at her desk at home.
Credit: Bettmann via Getty Images

Its Author Was Reluctant To Share the Story

In 1926, Mitchell began work on her sweeping Civil War-era novel while recovering from a badly sprained ankle. Featuring a main character named Pansy O’Hara, under a working title that wavered between options including Bugles Sang True and Tote the Weary Load, the story was largely written in secret over the following decade. After a friend tipped off Macmillan Publishers editor Harold Latham to the manuscript’s existence, Mitchell declined his initial request to read the story during his 1935 trip to Atlanta, only to hand it over shortly before his departure. Latham sensed a bestseller within the yellowed, scribbled-over pages, and his instinct bore fruit when Gone With the Wind sold 1 million copies in the first six months after its 1936 release.

American actor Clark Gable (1901-1960), star of 'Gone With The Wind'.
Credit: General Photographic Agency/ Hulton Archive via Getty Images

Some Arm-Twisting Was Required To Get Clark Gable on Board

Having already garnered acclaim for his starring roles in It Happened One Night (1934) and Mutiny on the Bounty (1935), Clark Gable was producer David O. Selznick’s preferred choice for the screen version of Rhett Butler. However, because Gable was under contract with MGM (which was run by Selznick’s father-in-law, Louis B. Mayer), the star was made available only after a profit-sharing agreement that also gave the film’s distribution rights to MGM’s parent company. Gable himself nearly torpedoed the arrangement, as he was unsure of his ability to handle the demanding role, although he reportedly acquiesced for financial reasons amid his looming divorce from Maria Langham.

Vivien Leigh as Scarlett O'Hara.
Credit: Bettmann via Getty Images

Shooting Began Before the Film Had Its Scarlett O’Hara

After a nationwide search for Scarlett O’Hara failed to yield anyone promising, and a deal for would-be star Paulette Goddard was scrapped, the film was still lacking a female lead when shooting commenced with the burning-of-Atlanta sequence in December 1938. That same evening, Selznick’s agent brother, Myron, was dining with a group that included English actress Vivien Leigh, who coveted the part of the Southern belle. When Myron brought his dinner party to the set, his brother was struck by the intensity of Leigh’s gaze, which seemed to echo Mitchell’s very description: “The green eyes in the carefully sweet face were turbulent, willful, lusty with life, distinctly at variance with her decorous demeanor.” A hastily arranged screen test revealed that Leigh possessed a plucky attitude to match her appearance, and the film had its long-awaited leading lady.

Portrait of Sidney Coe Howard.
Credit: Historical/ Corbis Historical via Getty Images

Only One Screenwriter Was Credited for the Work of Many

Although Sidney Howard is the sole credited screenwriter for the movie, some 15 different people contributed to the script. After Howard completed his long-winded draft, Oliver H. P. Garrett, John Van Druten, Jo Swerling, and F. Scott Fitzgerald all took their turns at meeting Selznick’s exacting standards. Ben Hecht later worked on an intensive weeklong rewrite alongside Selznick and director Victor Fleming, with 20-hour days fueled by peanuts, bananas, and the amphetamine Dexedrine. Ultimately, however, much of this work was undone by Selznick, who claimed 80% of the shooting script as his own.

There Were Nearly Two Dozen Alternate Choices for the Film’s Riskiest Line

“Frankly, my dear, I don’t give a hoot!” That was one of the proposed alternate choices for Butler’s memorable final line, as Selznick was unsure whether “damn” would make it past the censors of the Motion Picture Production Code (more popularly known as the Hays Code). Although it’s been reported that the producer was fined $5,000 for electing to go ahead with the line as is, an amendment to Production Code rules enacted before the film’s release actually permitted the use of mild swearing in “proper historical context” or in “a quotation from a literary work.” Thus, “I don’t give a damn” was given the green light, sparing the audience from decidedly less juicy put-downs such as “the whole thing is a stench in my nostrils” or “it makes my gorge rise.”

Portrait of American actress Hattie McDaniel.
Credit: John Kisch Archive/ Moviepix via Getty Images

Hattie McDaniel Accepted Her Landmark Academy Award in a Segregated Venue

While Hattie McDaniel made history as the first Black actor to win an Academy Award, for her performance as Mammy, her victory was clouded by the attitudes of a society that in some ways hadn’t changed much since the Reconstruction era. With the awards ceremony set to be held in February 1940 at the segregated Ambassador Hotel’s Cocoanut Grove nightclub, Selznick reportedly had to make a special request to have McDaniel admitted — and even then, she was seated at a back table separate from the film’s other stars. This came two months after McDaniel and other Black cast members were told not to show up for the film’s lavish premiere in Atlanta, an act that nearly pushed an angered Gable into boycotting the event.

Actors Clark Gable and Vivien Leigh as Rhett Butler and Scarlett O'Hara.
Credit: Silver Screen Collection/ Moviepix via Getty Images

It Is the Highest-Grossing Film Ever When Adjusted for Inflation

Made for the then-whopping cost of around $4 million (estimates vary), Selznick’s opus proved worth the investment when audiences flocked to theaters to watch Scarlett and Rhett match wits on screen. Gone With the Wind took home $189 million domestically across its original run and subsequent re-releases, an enormous sum for a film that debuted when movie tickets cost around a quarter. Worldwide, the feature grossed approximately $393 million, translating to about $3.44 billion in 2014 dollars, which puts the Hollywood classic above more recent blockbusters such as Avatar (2009) and Avengers: Endgame (2019) on the list of the highest-grossing films when accounting for inflation.

Scarlett by Alexandra Ripley, the sequel to Gone With the Wind.
Credit: CBW/ Alamy Stock Photo

Multiple Prequels and Sequels Have Appeared in Print

Although Mitchell had no intention of delivering a sequel, the heirs to her intellectual property had different ideas, resulting in the 1991 publication of Alexandra Ripley’s Scarlett: The Sequel to Margaret Mitchell’s Gone With the Wind. Featuring a story line that brings its heroine to Ireland, Scarlett drew plenty of criticism but nevertheless sold well enough to be adapted into a 1994 TV miniseries starring Joanne Whalley-Kilmer and Timothy Dalton. Two more authorized novels by Donald McCaig followed: Rhett Butler’s People (2007), which expands on the original book from the title character’s point of view, and Ruth’s Journey (2014), which tells Mammy’s backstory. Meanwhile, the Mitchell estate attempted to block the 2001 release of Alice Randall’s The Wind Done Gone, about Scarlett’s enslaved half-sister, before publication was permitted with an “unauthorized parody” label on the cover.

Tim Ott
Writer

Tim Ott has written for sites including Biography.com, History.com, and MLB.com, and is known to delude himself into thinking he can craft a marketable screenplay.

Original photo by Piotr Wytrazek/ Shutterstock

Decades ago, the world didn’t just look different; it sounded different, too. We communicated, watched our favorite movies, and did mundane tasks using different devices, and as technology has progressed, so has the noise we hear every day. A smartphone buzzing on a table would have been an unfamiliar noise 20 years ago — and a lot of stuff we used back then has fallen silent today.

Take a listen down memory lane with these eight sounds that we don’t hear much anymore, from old-timey internet accessories to vintage AV equipment.

An old, external dial-up modem.
Credit: Doug McLean/ Shutterstock

Dial-Up Modem

Back in the early days of the internet, your connection worked through your landline phone. Instead of being online all the time, you had to deliberately connect by having your computer dial in. That kicked off a telltale series of intense-sounding noises, beginning with a dialing sound and escalating into bouncing beep-boops and several pitches of static. This song and dance served a purpose: The sounds were the audible steps of two computers negotiating a connection over infrastructure borrowed from the telephone network.

Because the connection tied up your phone line, if you didn’t have a second line and somebody tried to call you, they’d get another sound you don’t hear too often nowadays …

Detail of a woman's hand picking up an old white corded telephone.
Credit: Alicia Fdez/ Shutterstock

Busy Signal

It’s now really easy to put someone on hold to answer another call. But back when nearly everyone had a landline, it was common to call someone and hear a series of beeps indicating that they were already on another call. Call waiting eventually became available for nonbusiness landlines, but switching between calls still wasn’t as easy as it is on a smartphone, since there was no visual interface to guide you.

You’ve got mail message concept with computer keyboard.
Credit: Eviart/ Shutterstock

“You’ve Got Mail!”

America Online, better known as AOL, used to be America’s biggest internet provider, and was so ubiquitous in everyday culture that you didn’t have to be a subscriber to know what it sounded like to get an email via the service. A male voice semi-enthusiastically stating, “You’ve got mail!” was so well known that it even lent its name to an A-list rom-com.

Closeup of Old fax machine.
Credit: Takaeshiro/ Shutterstock

“Accidentally Called a Fax Machine” Sound

Having to key in a number every time you called someone — as opposed to just finding someone in your contact list or making Siri call someone for you — meant that mistakes were inevitable. Sometimes you’d read the wrong line of a business card, misdial, or simply catch a shared phone and fax line at the wrong time, and be met with a screeching tone indicating that there was a fax machine on the other end.

Vintage television with test pattern.
Credit: shaunl/ iStock

TV Test Pattern Beep

If you still have TV service, there’s something on 24/7, even if it’s infomercials. Years ago, however, channels would eventually pack it in for the night and display a test pattern — a series of colorful bars designed for calibrating a color TV or, on the broadcast end, a camera. (Before color TV, test patterns looked much different.) The pattern was often accompanied by an obnoxious, continuous beep for calibrating audio.

Man rewinding a cassette tape.
Credit: Pingun/ Shutterstock

Rewinding Tape Noise

From the 1970s until DVDs took over, most home video was on VHS tapes, which stored audio and video on a length of magnetic tape. The tape moved from one spool to the other as the video played, so if you wanted to go back to the beginning, you had to rewind it, which made a distinct whirring sound. The same applied to audiocassettes, although you could flip those over and play the other side to get back to square one.

Retro rotary telephone.
Credit: WPixz/ Shutterstock

Rotary Telephone Dials

When you dial a phone today — even a landline — you’re typically pressing buttons, not actually dialing. Rotary telephones predate the touch-tone models most people are used to, and had an actual round dial, with finger holes corresponding to different numbers. To call someone, you pulled the dial around from each digit’s position, let go, and waited for the dial to return to its resting point before entering the next digit. As it spun back, the dial made a kind of rapid clicking sound.

Close-up of an adding machine.
Credit: Lloyd Paulson/ Shutterstock

Adding Machine Typing and Tape Sounds

If you needed to crunch some numbers and keep a record of your work, you used an adding machine — a calculator that printed each entry and total onto a paper roll as you typed. You could then use the printout as a receipt or go back and check your work. It made a distinct series of sounds: an electric-typewriter-esque tapping as you entered the numbers, then a big crunch when you told it to add or subtract and it advanced to the next line.

It was a pretty common sight (and sound), especially in stores, in banks, and around tax time, until everybody had a computer in their pocket that could do the same thing.

Sarah Anne Lloyd
Writer


Original photo by Trinity Mirror / Mirrorpix/ Alamy Stock Photo

He was an irresistibly compelling actor who exploded from the stage in A Streetcar Named Desire and from the screen in On the Waterfront and The Godfather, before seemingly rejecting the ability and beauty that had made him so famous. But even in a career marked by as many disappointments as triumphs, Marlon Brando was never anything less than an original character. Here are nine facts about the life of a Hollywood icon who raised the bar for all the leading men who followed.

Young Marlon Brando writing at a desk.
Credit: Pictorial Parade/ Archive Photos via Getty Images

A Teenage Brando Was Expelled From Military School

The youngest child of a strict father and an alcoholic mother, and hamstrung by dyslexia, Brando acted out in school. According to Peter Manso’s biography, Brando orchestrated an endless series of pranks while accumulating just six of 15 possible credits over three years at Libertyville (Illinois) High School, forcing his dad to arrange a transfer to Minnesota’s Shattuck Military Academy. But the disruptions continued in Shattuck’s hallways, with Brando at one point hiding the dining room silverware before he was kicked out at the end of his second year. Amazingly, the cadets who were often the butt of his jokes threatened to boycott classes over what they felt was an unfair expulsion, and Brando later framed the letter of support they wrote to him.

Scene From "A Streetcar Named Desire" featuring Marlon Brando.
Credit: Bettmann via Getty Images

He Nearly Blew the Opportunity for His Breakout Stage Role

After witnessing Brando’s impressive audition for A Streetcar Named Desire in August 1947, director Elia Kazan gave the magnetic young actor money to take a bus to Massachusetts for a further tryout with playwright Tennessee Williams. Brando instead spent the cash on party supplies, before hitchhiking his way to Massachusetts a week later. Upon reaching Williams’ home, Brando smoothed over any bad feelings about his late arrival by fixing a blown fuse and broken toilet. A quick read for the part sealed the deal, and Brando was on his way to revealing his preternatural talent to the world.

Marlon Brando In One-Eyed Jacks film.
Credit: Silver Screen Collection/ Moviepix via Getty Images

He Directed One Feature Film

Taking on an outsized role in the production of One-Eyed Jacks (1961), Brando drove out original helmer Stanley Kubrick and took on double duty as director and star of the Western. That suited his artistic sensibilities fine, but Brando’s habit of letting the camera roll endlessly as characters improvised their way through scenes blew past the production’s time and budget constraints. After viewing the costly, 4.5-hour director’s cut, producer Frank P. Rosenberg complained, “That’s not a picture. That’s just an assemblage of footage.” One-Eyed Jacks was whittled down to 141 minutes, and while the still-meandering final product has its admirers, the experience was apparently off-putting enough to keep its star from ever returning to the director’s chair.

Marlon Brando's bungalow with his HAM radio equipment on Tetiaroa Beach circa 1979.
Credit: Images Press/ Archive Photos via Getty Images

Brando Owned an Atoll in French Polynesia

After frolicking in the tropical locale of Tahiti during the filming of Mutiny on the Bounty (1962), Brando decided to take a slice of paradise for himself by buying the nearby atoll of Tetiaroa in 1966. Although the initial plan was to build a hotel as part of what would become a self-sustaining community, Brando preferred using the property as a private retreat for himself, family, and friends, though he neglected to put in the time and money needed for its upkeep. He steered clear of Tetiaroa following a tragedy involving his son and a daughter’s boyfriend in the early 1990s, and a section of the atoll was leased to a developer after the actor’s death in 2004. That area now boasts the Brando Resort, the sort of exclusive vacation destination its namesake was reluctant to develop while still alive.

Marlon Brando and Robert Duvall in a scene from Francis Ford Coppola's 1972 'The Godfather'.
Credit: Screen Archives/ Moviepix via Getty Images

He Spontaneously Created Vito Corleone’s Persona During a Screen Test

Although Paramount executives were loath to cast Brando in The Godfather (1972) following his string of poorly received films, director Francis Ford Coppola convinced them to at least consider a screen test. He subsequently brought a camera to Brando’s home, where the just-awakened host, realizing this was his audition, quickly slipped into his interpretation of Mafia boss Vito Corleone. Suggesting that Corleone should “look like a bulldog” and talk in a peculiar way, Brando stuffed tissues into his mouth and began acting out the character, even delivering that now-famous mumble while answering a phone call. The once-leery execs were floored by the footage, paving the way for Brando’s highly celebrated comeback.

Sacheen Littlefeather refuses the Academy Award for Best Actor on Marlon Brando's behalf.
Credit: Bettmann via Getty Images

Brando Surprised Sacheen Littlefeather With His Plan for the 1973 Oscars

In one of the more infamous moments of his career, Brando sent actress and activist Sacheen Littlefeather to the 1973 Academy Awards to decline his Best Actor Oscar over “treatment of American Indians today by the film industry.” Littlefeather, who had struck up a friendship with the actor via their shared passion for Native American rights, reportedly wasn’t aware of the full scope of Brando’s plan until the afternoon of the Oscars telecast. She then waited as Brando worked on a lengthy speech, leaving her barely enough time to make it to the ceremony, and endured harassment in the parking lot before making it back to the safety of the actor’s home. According to Manso’s biography, Brando was happy with her effort, although he later noted that he’d “probably handle it differently” were he to do it all over again.

Marlon Brando And Jack Nicholson In 'The Missouri Breaks'.
Credit: Archive Photos/ Moviepix via Getty Images

He Was Close Friends With Fellow Star Jack Nicholson

While Brando and Jack Nicholson made for a fun pairing in the 1976 Western The Missouri Breaks, the two were far closer than your typical co-stars. The actors shared a driveway as Los Angeles-area neighbors for about 30 years, and at one point even lived together while Nicholson was going through a divorce. Nicholson helped care for Brando toward the end of his life, after which he penned a heartfelt obituary in Rolling Stone magazine. He also purchased the late actor’s mansion with the hope of making it available to Brando’s children, but reportedly turned it into a garden when none of them showed any interest in the property.

Marlon Brando plays the bongos in his Hollywood Hills home, 1955.
Credit: Bettmann via Getty Images

Brando Received Four Patents for a Drum Tuner

An enthusiastic percussionist with an ear for Afro-Cuban music and an innovative mind, Brando devoted much of his final years to developing a new and improved conga drum. Collaborating with a custom drum parts maker and a patent attorney, Brando obtained four patents for his drum tuner, a single lever and linkage system designed to replace the five or six bolts normally used for the purpose. He even produced a few working prototypes, but was unable to get the design licensed before his passing.

Close-up of Marlon Brando.
Credit: Ron Galella Collection via Getty Images

More Than 300 Hours of Confessional Audio Tapes Were Found After His Death

Although Brando published an autobiography in 1994, that book provided only a partial reveal of a celebrity who increasingly shunned the spotlight. Additional insights arrived around 20 years later, when a production team gained access to more than 300 hours of audio recordings of the actor reflecting on his troubled upbringing, his own struggles as a father, his relationship with fame, and much more. Producers also found rudimentary 3D scans of Brando’s head, and used updated technology to match some of the audio with his animated, talking face. The result was the 2015 documentary Listen to Me Marlon, a life story, narrated solely by the enigmatic star between old film and interview clips, that marked one final posthumous screen performance.

Tim Ott
Writer


Original photo by Photo 12/ Alamy Stock Photo

Most of us are familiar with holiday-themed movies like It’s a Wonderful Life and A Christmas Story, whether they’re remembered from childhood or part of an annual tradition today. But even the die-hard enthusiasts who know the dialogue by heart may not be aware of the behind-the-scenes stories that helped bring these heartwarming films to life. Here are nine facts about some of the classics that regularly show up on our TVs in November and December, but can of course be enjoyed at any time of year.

James Stewart and Donna Reed in It's a Wonderful Life.
Credit: Screen Archives/ Moviepix via Getty Images

A New Kind of Fake Snow Was Created for “It’s a Wonderful Life”

The problem that plagues many holiday movies is how to create convincing snow when there isn’t any, and It’s a Wonderful Life (1946) director Frank Capra wasn’t satisfied with the bleached cornflakes that had been used to middling effect in other Hollywood features. Fortunately, RKO special-effects man Russell Shearman found a solution by mixing the carbon dioxide foam found in fire extinguishers with soap, sugar, and water. The resulting mix not only looked the part (and was much less noisy than cornflakes), but it also held up through fan-controlled applications that could be sped up to simulate a blizzard. More than enough of this “snow” was created to give the fictional Bedford Falls a wintry backdrop despite the film’s summertime shoot, and Shearman later received a technical achievement Oscar for his contribution to movie magic.

Parade scene in Miracle on 34th Street.
Credit: Collection Christophel/ Alamy Stock Photo

The Thanksgiving Day Parade in “Miracle on 34th Street” Was Real

Staging a parade in a movie can be an arduous undertaking with all the performers, set pieces, and choreography involved, but the creators of Miracle on 34th Street (1947) were fortunate to gain permission to hitch their wagons to New York City’s annual Macy’s Thanksgiving Day Parade in 1946. As co-star Maureen O’Hara recalled in her memoir, the experience of working around the event’s schedule was stressful for everyone involved: “They weren’t going to run the parade more than once on our account … It was a mad scramble to get all the shots we needed and we got to do each scene only once.” Nevertheless, the cameras got enough footage of Edmund Gwenn’s Kris Kringle waving to fans as he rode through Manhattan in Santa’s sleigh, and the authenticity of the scene set the tone for what became a true holiday classic of Hollywood’s golden age.

Actors singing in 1954's White Christmas.
Credit: John Swope/ The Chronicle Collection via Getty Images

“White Christmas” Was Supposed To Pair Fred Astaire With Bing Crosby

Following the success of 1942’s Holiday Inn and 1946’s Blue Skies, 1954’s White Christmas was meant to once again pair the singing and dancing talents of Bing Crosby and Fred Astaire. But Astaire declined to participate, dissatisfied with the script, and the role of Phil was offered to Donald O’Connor; when O’Connor was stricken with illness before production began, the casting merry-go-round ended with Danny Kaye stepping in. Crosby, too, at one point backed out of the movie, following the death of his wife in 1952, before returning to play the part of Bob the following year.

Rudolph the Red-Nosed Reindeer, 1964.
Credit: Pictorial Press Ltd/ Alamy Stock Photo

A Pioneering Japanese Stop-Motion Animator Was Behind “Rudolph the Red-Nosed Reindeer”

Tadahito Mochinaga created China’s first stop-motion puppet animation with a 1940s propaganda film mocking nationalist leader Chiang Kai-shek, and he created Japan’s first stop-motion puppet animation the following decade for a beer company. Those pioneering efforts caught the attention of American TV producers Arthur Rankin and Jules Bass, who tapped the Tokyo-based filmmaker to animate an adaptation of a Depression-era Christmas story turned hit holiday tune. Mochinaga brought his trademark detail to the project, even spending time in a Japanese deer sanctuary to better render the distinct features of the main characters. The mesmerizing result can still be witnessed many years later, as Rudolph the Red-Nosed Reindeer (1964) became the first in a string of popular Rankin/Bass seasonal holiday programs, en route to becoming the longest-running Christmas special in TV history.

A Charlie Brown Christmas scene.
Credit: Photo 12/ Alamy Stock Photo

Head Animator Bill Melendez Voiced Snoopy in “A Charlie Brown Christmas”

A Charlie Brown Christmas (1965) marked the Peanuts gang’s first major entry into the world of animated television, raising numerous questions about how to translate the popular comic strip to the screen, among them what to do about the voice of Snoopy. Although Peanuts creator Charles Schulz wanted to downplay Snoopy’s role, head animator Bill Melendez insisted on enhancing the beagle’s personality through his voice, and set about recording a series of noises that he hoped could be replicated by a trained voice actor. With time running out to finish the special, Melendez went with the sped-up, higher-pitched recordings he had been tinkering with instead of hiring another actor. Schulz was amused by Snoopy’s nonsensical ramblings, and Melendez was rewarded with the responsibility of voicing Charlie Brown’s pet for subsequent TV specials and animated features.

Tongue pole scene in A Christmas Story (1983).
Credit: Moviestore Collection Ltd/ Alamy Stock Photo

Flick’s Tongue Wasn’t Really Stuck to the Flagpole in “A Christmas Story”

You may have already figured there was nothing approaching actual danger for the actor in this enduring scene from A Christmas Story (1983), although clever set design made the visual of a tongue stuck to a frozen flagpole seem real enough. According to actor Scott Schwartz, the pole was wrapped in a layer of plastic, through which a clear tube ran down to a motorized vacuum buried in the snow. When Schwartz’s Flick pressed his tongue against a tiny hole in the plastic, the tube’s suction was strong enough to hold it in place, but mild enough that it could be easily withdrawn. All in all, it was painless enough for Schwartz to shoot the entire scene twice — after the first round of footage was damaged by underdeveloped film.

Car scene in Planes, Trains and Automobiles film.
Credit: United Archives GmbH/ Alamy Stock Photo

“Planes, Trains and Automobiles” Was Based on a True Story

Back when acclaimed screenwriter and director John Hughes was an unknown advertising copy man, he regularly traveled from Chicago to New York City on behalf of a client. During one blustery winter day, strong winds nixed the return flight to Chicago and forced him to find a hotel for the night. More cancellations awaited the following day due to deteriorating weather in the Midwest, and Hughes wound up on a flight that was rerouted to Des Moines, Iowa, and then Denver, Colorado, before he decided to remain on board for the sunnier destination of Phoenix, Arizona. Hughes eventually made it to Chicago five days later than originally planned, the torturous experience leaving a lasting imprint that became the basis of his 1987 Thanksgiving travel comedy Planes, Trains and Automobiles.

Macaulay Culkin in Home Alone, 1990.
Credit: AJ Pics/ Alamy Stock Photo

Macaulay Culkin’s Iconic Facial Gesture in “Home Alone” Was Improvised

Even fans who haven’t seen Home Alone (1990) in eons can recall the image of Macaulay Culkin, as the abandoned Kevin McCallister, slapping aftershave on his face and screaming into the mirror. However, that scene didn’t quite go according to plan: Most people would pull their hands away from a burning sensation on their face, and director Chris Columbus instructed his young star to do just that. Instead, Culkin kept his hands glued to his face as he screamed at his reflection, prompting everyone else to break up in laughter. Although different reactions were tried in subsequent takes, it was that first one that stuck, becoming a defining moment of the immensely successful comedy despite taking up a tiny fraction of its 103-minute running time.

Zooey Deschanel in Elf film.
Credit: Album/ Alamy Stock Photo

The Shower-Duet Scene in “Elf” Was Written for Actress Zooey Deschanel

While Elf (2003) was built around the physical comedy chops and man-child persona of star Will Ferrell, the endearing scene of Ferrell’s Buddy discovering Jovie singing in the shower didn’t take shape until Zooey Deschanel joined the production. According to Deschanel, the scene was initially a fluid one, meant to showcase the individual talent of whichever actress was cast as Jovie; once her crooning abilities became apparent, the specifics of Buddy naively wandering into the women’s changing room fell into place. An added bonus was Ferrell’s surprisingly solid pipes, which bring a layer of sweetness to the building tension until Jovie inevitably notices Buddy’s presence and orders him out.

Tim Ott
Writer


Original photo by katalinks/ Shutterstock

When you think of vampires, what thoughts come to mind? Do you think of Dracula or Count von Count from Sesame Street? Or perhaps you think of more recent books, television series, and movies such as Twilight, Buffy the Vampire Slayer, and Blade? Once known as terrifying beings that would suck the lifeblood from people, these creatures somehow made the shift to become romantic and appealing. So what’s up with our collective fascination with vampires, and why do vampires keep appearing in pop culture?

Portrait of Vlad III the Impaler, or Dracula (1431-1476).
Credit: Stefano Bianchetti/ Corbis Historical via Getty Images

Origins of the Vampire

Long before Brad Pitt made vampires look sexy, tales of the creatures had been around for centuries, and for most of that time they inspired fear rather than fascination. Vampire-like beings have popped up in mythology as far back as ancient Egypt, but most historians agree that the vampire as we know it today got its start in Europe sometime during the 17th and 18th centuries. According to scholars, Bram Stoker’s novel Dracula was inspired by the real Romanian prince Vlad Tepes, who lived during the 15th century in and around Transylvania.

While Romania sometimes looks fondly on his legacy, he was known to be very cruel to those he conquered, earning himself the nickname “Vlad the Impaler.” Some stories go so far as to say that he even dined among his dying victims, dipping his bread in their blood. Tales of vampire-like creatures also come from Asia; in Chinese mythology, they’re known as jiangshi (meaning “stiff corpse”).

Old copy of Dracula book.
Credit: Torontonian/ Alamy Stock Photo

Vampires in Literature

One of the best-known works about vampires — and the one credited with propelling them into the realm of popular culture — is Stoker’s 1897 novel Dracula. His version of vampirism — a blood-sucking ghoul who preys on the innocent in order to prolong its own immortal life — was burned into the collective psyche. Notably, it also hewed to the then-common belief that vampires were dangerous and unholy, although there’s a case to be made that the Victorian-era novel is full of innuendo and is, in fact, a heavily sexual piece.

Through this novel, we also get several characters who continually pop up in later works by other authors and filmmakers. You might be familiar with names like Van Helsing, the vampire hunter Hugh Jackman portrayed in a 2004 action movie, or Mina Murray, a love interest who features prominently in Dracula romance spin-offs.

Austrian-Hungarian born actor Bela Lugosi clenches his hand in the air.
Credit: Universal Pictures/ Moviepix via Getty Images

Vampires Get a Makeover

It wasn’t until 1931 that the vampire transitioned from a vicious-looking monster into a handsome rogue who just so happened to also suck people’s blood. You can thank that year’s film adaptation of Dracula, and actor Bela Lugosi’s suave turn in the titular role. Through the decades, vampires stayed attractive yet fearsome, until Sesame Street’s fourth season in 1972.

Best known for his love of counting, the friendly Count von Count embraces popular vampire tropes such as wearing a cape, living in a decrepit castle, and laughing dramatically in a Transylvanian accent, all while delighting small children and teaching them to count. He’s probably the only family-friendly vampire most people can name, though Disney’s Vampirina also proves that Transylvania’s most famous cultural export can be for kids.

But vampires didn’t take a decidedly sexy turn until the 1970s, when Anne Rice began writing her Vampire Chronicles novel series centered on the handsome French vampire Lestat. The most famous book in the series was Interview With the Vampire, which was turned into a 1994 movie starring Brad Pitt and Tom Cruise. It’s safe to say that after the movie’s release, interest in vampires in pop culture experienced a rebirth, with plenty of people open to the idea of literally being bitten by love.

Nosferatu, a horror film directed by F. W. Murnau and starring Max Schreck as Count Orlok.
Credit: Buyenlarge/ Archive Photos via Getty Images

In the Art House

Arguably the most influential vampire movie ever made is 1922’s Nosferatu, F.W. Murnau’s silent fantasia, which belongs to the German Expressionist movement. Max Schreck stars as Count Orlok in the loose (and unofficial) adaptation of Stoker’s novel, with a number of names and other details changed for legal reasons. Its legacy is massive, so much so that none other than Werner Herzog remade it as Nosferatu the Vampyre in 1979. The filmmaker’s frequent collaborator Klaus Kinski starred in that version, which is even stranger than its source material and just as worthwhile — not least because of Isabelle Adjani’s performance.

Both films are part of a proud tradition of avant-garde vampire movies that continues today. French auteur Claire Denis threw her proverbial hat in the ring with 2001’s Trouble Every Day, in which American newlyweds find themselves among many tantalizing necks in Paris; Jim Jarmusch did likewise with Only Lovers Left Alive, a romantic drama starring Tilda Swinton and Tom Hiddleston as an undead couple whose centuries-long affair has made them as prone to waxing philosophical as they are to sucking blood. Similarly artful films are made all over the world, from Let the Right One In (Sweden) and The Transfiguration (United States) to A Girl Walks Home Alone at Night (Iran) and Thirst (South Korea), all of them demonstrating how many different approaches there are to depicting these creatures of the night.

Buffy the Vampire Slayer film poster.
Credit: MARKA/ Alamy Stock Photo

Vampires Go Mainstream

Although Interview With the Vampire was a racy novel and movie, a more family-friendly take on vampire relations hit the big screen two years before that film with Buffy the Vampire Slayer. The 1992 movie centered on a cheerleader who discovers she’s a vampire hunter, and it was successful enough to be adapted into a beloved television series five years later. The “Buffyverse” is home to one of the most dedicated fan bases around, not least because of Sarah Michelle Gellar’s inspired performance in the title role — she even received a Golden Globe nomination. Buffy proved both that bloodsuckers could draw ratings and that a strong female lead could be embraced by a diverse audience. It also paved the way for the likes of True Blood and The Vampire Diaries, both of which spawned devoted followings of their own and suggest that, like the creatures themselves, this genre refuses to die.

Michael Nordine
Staff Writer

Michael Nordine is a writer and editor living in Denver. A native Angeleno, he has two cats and wishes he had more.

Original photo by Antares Light/ Shutterstock

Everybody knows the stories of Cinderella, Aladdin, and Sleeping Beauty. These centuries-old fairy tales have been immortalized in every art form imaginable, from books and ballets to musicals and movies. What’s often forgotten, however, is where these stories came from — and who was responsible for writing them down. Here’s a look at eight of history’s most important fairy-tale tellers.

Aesop, ancient Greek writer of fables.
Credit: Edward Gooch Collection/ Hulton Archive via Getty Images

Aesop: A (Literal) Legend

If you’ve ever taken “the lion’s share” or claimed that “necessity is the mother of invention,” then thank Aesop. The Greek fabulist — purportedly born around 620 BC — is responsible for some of our most famous phrases and fables, including “The Hare and the Tortoise.” Greek authors like Herodotus and Plutarch claimed that Aesop was a slave who became an adviser to Croesus, the King of Lydia. The accuracy of their accounts, however, is disputed, and it’s possible that Aesop was never a real person.

Portrait of Marie-Catherine Le Jumel de Barneville.
Credit: DEA PICTURE LIBRARY/ De Agostini via Getty Images

Marie-Catherine Le Jumel de Barneville: Pioneer of the Fairy Tale

Countess d’Aulnoy’s life reads like a folktale: it’s difficult to parse fact from fiction. A French author who lived during the 17th century, de Barneville may have been a spy who accused her husband of high treason. True or not, she established a literary salon later in life and published at least two collections of fairy tales. Her works, like “The White Cat,” were famously conversational in style and were lauded for being popular with adults and children alike. In fact, she even coined the term “fairy tale.”

Antique artisanal Aladdin Arabian nights genie.
Credit: zef art/ Shutterstock

Hanna Diyab: The Man Who Conjured Aladdin

The brains behind Aladdin and Ali Baba and the Forty Thieves, Diyab was a Syrian storyteller who lived during the early 18th century. When Diyab was young, he bumped into a French collector of antiquities, who hired him as a traveling assistant. Diyab visited Paris and met the folklorist Antoine Galland, whom he entertained with folktales from home. Years later, Galland published some of Diyab’s tales in his famous translation of The Thousand and One Nights. Diyab wouldn’t receive credit until centuries later.

Portrait of Jean de La Fontaine.
Credit: API/ Gamma-Rapho via Getty Images

Jean de La Fontaine: The Editor Who Turned Fairy Tales into an Art Form

In 1668, the Frenchman released the first volume of Fables, a literary landmark that would lay out a formula for centuries of European folk and fairy tales. Born to a well-to-do family, de La Fontaine became interested in writing after discovering the work of the French poet Malherbe. Between 1668 and 1694, he released six volumes of fables — a total of 239 stories — that drew from diverse sources, from the Roman fabulist Phaedrus to the Panchatantra, an Indian book of fables. De La Fontaine’s fresh and artful retellings of stories such as “The Grasshopper and the Ant” and “The Raven and the Fox” turned Fables into an instant classic.

Portrait of Charles Perrault.
Credit: Bettmann via Getty Images

Charles Perrault: The Original Mother Goose

A major influence on the Brothers Grimm, Perrault — hailing from France as well — helped transform tales like “Puss in Boots,” “Cinderella,” “Blue Beard,” “Sleeping Beauty,” and “Little Red Riding Hood” into cultural touchstones. His 1697 book Histoires ou Contes du Temps Passé — better known as The Tales of Mother Goose — was an unexpected departure from his life’s work. Perrault had spent decades working as a government official, but when political bickering forced him to change careers, he turned to writing literary fairy tales for aristocratic literary salons. The career change at age 67 is what made him famous.

Portrait of Wilhelm and Jacob Grimm.
Credit: Bettmann via Getty Images

The Brothers Grimm: Disney before Disney

Jacob and Wilhelm Grimm didn’t write “Rapunzel” or “Snow White,” but they did popularize the tales among the masses. The German-born brothers attended college with the intention of becoming civil servants, but a pair of influential teachers changed their minds — and inspired a love of folk poetry (or naturpoesie) and the arts. The duo gave up any hopes of a law career and began collecting literature that, they believed, emphasized the character of German culture and people. The brothers didn’t view themselves as writers, but as preservationists and historians who were saving common tales from extinction. Their first edition, published in two volumes in 1812 and 1815, contained 156 fairy tales, including “Hansel and Gretel,” “Rumpelstiltskin,” “The Elves and the Shoemaker,” and “The Fisherman and His Wife.”

Hans Christian Andersen, Danish author and poet.
Credit: Print Collector/ Hulton Archive via Getty Images

Hans Christian Andersen: The Original Ugly Duckling

The Danish writer of over 150 fairy tales — including “The Emperor’s New Clothes,” “The Little Mermaid,” “The Princess and the Pea,” and “Thumbelina” — Andersen, born in 1805, came from humble beginnings. His mother was illiterate, and his father had only an elementary school education. When his father died, Andersen went to work in a factory at the age of 11. But he always had an artistic side, and he tried to express his struggles through his work. As a teenager, for example, Andersen was routinely harassed by other boys because he had a high voice, and that abuse inspired him to write “The Ugly Duckling.” “The story is, of course, a reflection of my own life,” he once wrote.

Portrait of Alexander Nikolayevich Afanasyev.
Credit: DE AGOSTINI PICTURE LIBRARY via Getty Images

Alexander Afanasyev: From Bureaucrat to Bard

Russia’s answer to the Brothers Grimm, Afanasyev was a 19th-century Slavic folklorist who published nearly 600 folk and fairy tales. (His works include “The Firebird,” which was famously transformed into a ballet by composer Igor Stravinsky in 1910, and “Vasilisa the Beautiful and Baba Yaga.”) Much like Charles Perrault, Afanasyev spent decades clocking in at a regular government day job. But while working at the Ministry of Foreign Affairs of the Russian Empire, he developed an obsession with collecting and preserving local fairy tales. Unlike many of the other folklorists on this list, Afanasyev regularly cited his sources and often tried to pinpoint where each tale originated.

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.