Original photo by apomares/ iStock

Olives are a vital crop in many countries around the world, and their oil has so many uses — some you might not even be aware of. You may like to pour it on your pasta and salads, but do you know where it originally comes from, or what “extra-virgin” really means? Here are eight rich facts about delicious, nutritious olive oil for you to soak in.

A wooden shelf with many clay jugs of olive oil.
Credit: Irina Khabarova/ iStock via Getty Images Plus

Olive Oil Has Been Produced by Humans for Millennia

Although it’s not definitively known which culture first began pressing olives for culinary uses, the earliest historical evidence of humans producing olive oil is a clay pot found in the Galilee region of northern Israel that bore olive oil residue. The pot was made between 7,000 and 7,600 years ago, at a time when the Neolithic people of the area are thought to have been only just learning how to make clay pots. The oldest known olive oil press was also found in this region, near the modern-day city of Haifa; the press is somewhat younger than the pot, at roughly 6,500 years old.

Oil pouring and dripping onto a spoon close up.
Credit: DUSAN ZIDAR/ Shutterstock

Extra-Virgin Olive Oil Is Defined as the Pure, Unprocessed, Unrefined Oil of an Olive

Although it’s well known that extra-virgin olive oil is the highest grade of olive oil, many people don’t know how its production differs from that of other kinds. For olive oil to be classed as extra-virgin, it must be made by grinding olives and then cold-pressing them to extract their oil, without the use of heat or chemical solvents. Per the European Commission, the oil must also have no more than 0.8% acidity and a median of defects of zero to qualify as EVOO. As a result of these stipulations, extra-virgin olive oil is lighter in color than lower grades of olive oil and has a fruity, slightly peppery flavor and aroma.

Fresh olives in sacks in a field in Crete, Greece.
Credit: Georgios Tsichlis/ Shutterstock

It Takes 11 Pounds of Olives to Make a Quart of Olive Oil

Olives have a surprisingly low yield when it comes to oil: It takes about 11 pounds of olives to make 32 ounces (one quart) of olive oil. That’s between 5,200 and 8,000 olives, depending on the variety, which is also why olive oil is often more expensive than other edible oils. About 90% of the world’s harvested olives are destined to become oil; the rest are sold as table olives.

Detail of olive oil production line.
Credit: Mrak.hr/ Shutterstock

Most of the World’s Olive Oil Is Made in Spain, but It’s Most Consumed in Greece

Spain leads the world in olive oil production, but it’s the Greeks who consume the most olive oil by far: about 24 liters per person per year. The Spanish are in second place, but it’s not even close: They use only about 15 liters per person annually.

Olive Oil used as dressing on top of a salad.
Credit: Pinkyone/ Shutterstock

Olive Oil Has Antimicrobial Properties

Olive oils with a high polyphenol content (that is, extra-virgin olive oils) have been shown to prevent or delay the growth of pathogenic bacteria and microfungi, and the oil is therefore considered antimicrobial. This effect is attributed chiefly to oleuropein, tyrosol, and hydroxytyrosol, the major phenolic compounds found in olives. These antimicrobial properties were known even to ancient peoples, as evidenced by the biblical passage Luke 10:33-34, in which a Samaritan pours olive oil onto a bleeding man’s wounds.

Olive oil being used during a religious service.
Credit: Andreas Politis/ Shutterstock

Many Religions Use Olive Oil in Sacred Rituals and Practices

Olive oil is overwhelmingly the oil of choice used by churches and synagogues across the world. The Christian Orthodox, Anglican, and Roman Catholic churches all use olive oil to bless those preparing to be baptized and to anoint the sick, as do the Church of Jesus Christ of Latter-day Saints and others. Catholic bishops use olive oil mixed with balsam (called “chrism”) in the sacrament of confirmation, at baptism, and in other rites. Eastern Orthodox churches use olive oil in lamps for vigils. Under Jewish Halakhic law, olives are one of the seven foods that require the recitation of me’eyn shalosh after they are consumed, and olive oil is the preferred oil for lighting Shabbat candles. As for Islam, olives are mentioned in the Quran as a “precious fruit.”

Detail of olive tree branch.
Credit: Tomo Jesenicnik/ iStock

Olive Trees Can Live for Thousands of Years

Olive trees, and therefore olive oil, originated in the Levant and were probably cultivated from wild trees growing near the Syria–Turkey border. Although the average life span for these trees is between 300 and 600 years, there are several trees throughout Greece, Israel, and Lebanon that have lived for over 2,000 years. One tree in the Portuguese village of Mouriscas is estimated to be 3,500 years old, having been planted in the Atlantic Bronze Age — and it still produces olives! Meanwhile, a grove of 16 olive trees (called “The Sisters”) in Bechealeh, Lebanon, is said to be as many as 6,000 years old.

Female hands beginning to write in a notebook.
Credit: PhotoSunnyDays/ Shutterstock

There’s a Literary Prize for Extra-Virgin Olive Oil-Themed Writing

Sponsored by the Pandolea Association, a Rome-based group of women olive oil producers, the Ranieri Filo della Torre International Award is a yearly literary prize for poets and authors moved to write about extra-virgin olive oil. Named for a late journalist and member of the National Academy of the Olive and Olive Oil who co-authored several publications on olive cultivation, the award honors poetry, fiction, and nonfiction on the subject. Writers from all nations are invited to apply, but entries must be about extra-virgin olive oil specifically.

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.

Original photo by anilakkus/ iStock

Apples serve all sorts of useful purposes: They’re great for a snack, fun to pick, perfect for giving to a teacher, and ideal as a subject for a still life. They also offer an entire smorgasbord of interesting facts for those who enjoy nourishing their minds along with their bodies. Here are eight bite-sized tidbits to chew on regarding this wondrous fruit.

Wild apple (Malus sieversii) native of Central Asia.
Credit: Ron Ramtang/ Shutterstock

Modern Apples Are Descended From a Single Wild Ancestor

All varieties of the domestic apples we know and cherish stem from a single wild ancestor, Malus sieversii. Though the apple was originally found in the foothills of the Tien Shan mountains of Central Asia, its seeds may have spread from its native region via birds and bears. Sometime after the fruit’s domestication more than 4,000 years ago, apples made their way to Europe and beyond by way of pre-Silk Road trading routes. In the early 20th century, Russian biologist Nikolai Vavilov traced the modern apple’s origins to the forests outside Almaty, Kazakhstan. Today, the town still celebrates its status as the birthplace of this botanic marvel.

Clean and fresh gala apples on a conveyor belt.
Credit: Paula Cobleigh/ Shutterstock

The U.S. Ranks Second to China in Apple Production

Apples consistently rank among Americans’ favorite fruits, which explains the impressive productivity of the U.S. apple industry. According to the USApple association, more than 26,000 apple growers farm some 382,000 acres across all 50 states, together producing 11.1 billion pounds of apples per year. Even that output, however, pales in comparison to the prodigious harvest in China, which produced more than 47 million metric tons of the fruit in 2022.

Close-up of apple slices.
Credit: PTP034/ Shutterstock

Galas Are the Top-Selling Apples in the United States

One of the great things about apples is the variety of tasting experiences offered by the numerous types available at your neighborhood store. Honeycrisps are crunchy and sweet. Granny Smiths deliver a burst of tartness. Cortlands are great for baking. So which is the favorite of the American public? By sales, at least, that honor goes to Galas, which made up 19% of the U.S. apple market in 2021.

Watering freshly planted fruit tree in a garden.
Credit: encierro/ Shutterstock

It Takes at Least Four Years for a Tree to Produce Apples

A standard (non-dwarf) apple tree will normally take at least four years to begin bearing fruit, and can continue to do so for another three decades. That covers nearly the entire life span of many trees, which generally live up to about 50 years of age, although some can survive for 100 years or longer. In 2020, an apple tree believed to be the oldest in the Pacific Northwest died at the ripe old age of 194.

A man sitting on cliff and eating a green apple.
Credit: C_Production/ Shutterstock

Apples Are Very Healthy

It may take more than an apple a day to keep you out of the doctor’s office, but these fruits are loaded with nutritional benefits. Along with high levels of vitamin C and fiber, apples are packed with disease-fighting antioxidants such as quercetin. Research has shown that a steady diet of apples can reduce the chances of heart disease, diabetes, and certain types of cancers, while supporting weight loss and gut health. Of course, these benefits are best realized by eating raw, unpeeled apples, as opposed to gorging on sugar-filled ciders and pies.

Apple pie and ice cream.
Credit: MSPhotographic/ Shutterstock

Apple Pie Originated in England

Speaking of apple pie, this quintessentially American dish actually hails from England, with one of the first known recipes appearing in the late-14th-century manuscript The Forme of Cury. Arriving in the New World with European settlers, the dessert was well known within the borders of the nascent United States by the late 1700s, as evidenced by the presence of two recipes in the 1796 cookbook American Cookery. By the mid-1900s, the combination of advertising and war-fueled patriotism had embedded the “American as apple pie” concept in popular culture.

Three ripe apples from biggest to smallest.
Credit: TRR/ Shutterstock

The Heaviest Apple Ever Weighed More Than 4 Pounds

According to the Guinness World Records scorekeepers, the heaviest known apple was a 4-pound, 1-ounce behemoth picked at a Hirosaki City, Japan, farm in 2005. The prize specimen was a Hokuto, a cross between the Fuji and Mutsu varieties.

Ripe apples in a fruit box in the refrigerator.
Credit: Endless luck/ Shutterstock

Apples Are Best Stored Separately From Other Produce in the Refrigerator

Those who return from the store or orchard with a hefty quantity of apples should consider stuffing as many as possible in the fridge; although apples will last about a week when left out on the counter, they can remain suitably edible for up to two months when refrigerated. But regardless of where you leave them, apples should be stored separately from other fruits and vegetables, as they release a gas called ethylene that speeds up the ripening and spoiling of nearby produce.

Tim Ott
Writer

Tim Ott has written for sites including Biography.com, History.com, and MLB.com, and is known to delude himself into thinking he can craft a marketable screenplay.

Original photo by ready made/ Pexels

With over 99 billion burgers served, chances are that most people have set foot in a McDonald’s at one point or another. What began as a small California hot dog stand during the Great Depression has since blossomed into an international operation, with more than 36,000 locations in over 100 countries. Throughout the decades, McDonald’s has amassed a rich history filled with facts that tantalize the brain, much as its burgers tantalize the taste buds.

Some of the Happy Meal toys developed specifically for the Canadian Market.
Credit: Chris So/ Toronto Star via Getty Images

McDonald’s Is the World’s Largest Toy Distributor

Since the creation of the Happy Meal in 1979, McDonald’s has leapfrogged industry giants such as Hasbro and Mattel to become the world’s largest toy distributor. Early Happy Meal toys included stencils and spinning tops, though the trinkets were later designed as part of advertising campaigns to promote family movies, like 1989’s The Little Mermaid. All told, McDonald’s distributes 1.5 billion toys worldwide each year. As part of a recent push to be more environmentally conscious, however, the company has pledged to largely phase out plastic Happy Meal toys in favor of plant-based or recycled alternatives.

McDonald's famous golden arches.
Credit: Darren McCollester/ Hulton Archive via Getty Images

One McDonald’s in Arizona Features Turquoise Arches

Golden arches may be synonymous with McDonald’s, but they’re nowhere to be found at one location in Sedona, Arizona. Due to a local law that prevents buildings from infringing on the region’s natural beauty, this McDonald’s instead features turquoise arches. City officials determined the gold would clash with the surrounding red rocks, whereas the turquoise was a more appropriate hue. Other unique color schemes at McDonald’s around the world include white arches at Paris and Belgian locations, as well as a big red “M” in place of the traditional yellow at one Rocklin, California, joint.

McDonald's worker wearing a protective gloves serves a meal to the client in a car.
Credit: NurPhoto via Getty Images

McDonald’s Used to Sell Hot Dogs and Barbecue

The men who founded the chain that would become McDonald’s — Dick and Mac McDonald — opened the fast food giant as a modest California hot dog stand in 1937. They later pivoted to a different food: On May 15, 1940, they opened McDonald’s Bar-B-Que in San Bernardino. The foray into BBQ was somewhat short-lived, however; by October 1948, the brothers had realized that most of their profits came from selling burgers. The pair pared the menu down to hamburgers, potato chips, and orange juice, adding french fries and Coke a year later. In 1954, the brothers licensed the franchise to Ray Kroc, who transformed McDonald’s into the chain we know today.

Fluorescent lighting, neon writing, Coca-Cola and McDonald Logo.
Credit: ullstein bild via Getty Images

Coca-Cola Tastes “Better” at McDonald’s

No, it’s not your imagination: Coke actually does taste different — and many would say better — at McDonald’s restaurants. This is largely due to the way it’s packaged. While the flavoring is identical to that served elsewhere, McDonald’s gets its Coke syrup delivered in stainless steel tanks instead of the more common plastic bags, which keeps the syrup fresher. McDonald’s also filters its water before adding it to the soda machines, and calibrates its syrup-to-water ratio to account for melting ice. In addition, McDonald’s uses wider straws than normal, allowing more Coke to “hit your taste buds,” according to the company.

The Queen arrives at the McDonalds restaurant at the Chesire Oaks Designer Outlet Village.
Credit: PA Images/ Alamy Stock Photo

Queen Elizabeth II Technically Owned a McDonald’s

While she was never actually there whipping up McFlurries, Britain’s former monarch technically owned a branch in Oxfordshire, England, situated on the Crown Estate, land held by the British monarchy. The queen used to own a second location in Slough, but sold the land in 2016. The Oxfordshire location is truly fit for royalty, with leather couches and table service, plus a menu that includes English breakfast tea and porridge. It’s not the only royal association with McDonald’s: Princess Diana used to frequently take her sons William and Harry there, despite the fact that they had access to a personal chef.

Close-up of a person eating a Big Mac.
Credit: Cate Gillon/ Getty Images News via Getty Images

A McDonald’s Superfan Has Eaten Over 34,000 Big Macs

In a tradition that began over 50 years ago on May 17, 1972, Wisconsin’s Don Gorske has consumed upwards of 34,000 Big Macs — and counting. While Gorske originally ate an average of a whopping (no Burger King pun intended) nine Big Macs per day, he has since scaled back to about two a day. Gorske claims that in those 50-plus years, he has missed eating a Big Mac on only eight days. The previous record for Big Macs eaten in one lifetime was 15,490, a number Gorske smashed back in 1999 and has been dwarfing ever since.

A wrapped cheeseburger and hamburger display sits inside the McDonald's.
Credit: Tim Boyle/ Getty Images News via Getty Images

McDonald’s Sells 75 Burgers Per Second

According to its own training manual, McDonald’s locations combined sell more than 75 hamburgers per second. The average hamburger is cooked and assembled in 112 seconds, whereas a Quarter Pounder takes a lengthier but still lightning-quick 180 seconds to prepare (assuming there are no other orders being worked on at the same time). McDonald’s produces and sells so many burgers that it had already sold its 100 millionth by 1958. In 1963, it sold its billionth burger live on TV during an episode of Art Linkletter’s variety show. The chain officially stopped keeping track in 1993, when it updated its signs to say “Over 99 Billion Served.”

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.

Original photo by james benjamin/ Shutterstock

There’s no wrong way to eat a banana — in a smoothie, underneath a mountain of ice cream, or even green (according to a 2019 poll, 5% of Americans prefer bananas in that unripened state). This grocery store staple is one that humans have been eating for at least 6,000 years, with no sign of slowing anytime soon; on average, people around the globe eat 130 bananas per year. These eight facts highlight a few things you may not know about one of the planet’s most beloved fruits.

Bananas arrive at Jerrard & De Camp, wholesale fruit dealers in St. Paul, Minnesota, ca. 1886.
Credit: Minnesota Historical Society/ Corbis Historical via Getty Images

Americans Didn’t Eat Bananas Until the 1870s

Bananas made their U.S. debut in Philadelphia in 1876, sold to fairgoers attending the Centennial Exhibition (the first world’s fair held in America). For 10 cents, visitors could purchase a foil-wrapped banana and get a taste of a fruit many had never seen before. Today, bananas are one of the most popular fruits among American snackers, who consume an average of 13.2 pounds per person each year.

A banana being eaten on the ocean.
Credit: Christopher Moswitzer/ iStock

Sailors Once Believed Bananas Were Bad Luck

Transporting easily bruised, temperature-sensitive bananas by boat was no easy feat hundreds of years ago, which could be how sailors became wary of bringing the fruit aboard. Many fishermen and sailors believed that having bananas on a ship invited bad luck, leading to accidents, broken equipment, and smaller catches. While the origin of the superstition is unclear, some believe it could have started after crew members got sick from eating spoiled bananas or slipped on the discarded peels.

Close-up of green bananas growing.
Credit: Leonsbox/ iStock

Bananas Don’t Grow on Trees

While banana plants can reach upwards of 40 feet tall, these towering plants technically aren’t trees — they’re considered giant herbs. Botanists define trees as having woody stems that contain lignin, a substance that makes them sturdier. Banana plants are instead large, herbaceous stalks made of cellulose, a material that decomposes much faster — a necessity considering that after fruiting, the large trunks die back and fall over to make way for new growth.

A banana peel on wooden background.
Credit: Bigc Studio/ Shutterstock

Bananas Are Actually Berries

The way scientists classify berries doesn’t always jibe with how fruit eaters categorize them. That’s certainly the case for bananas, which are botanically berries. To be considered a true berry, a fruit must develop from a flower that contains an ovary; bananas form from nearly foot-long flowers that meet this criterion. Botanists also require a fruit to have three layers: an outer skin (the exocarp), a fleshy layer (the mesocarp), and an interior that holds two or more seeds (the endocarp). While commercially grown bananas don’t have seeds, their wild counterparts do.

A group of potassium rich foods.
Credit: Danijela Maksimovic/ Shutterstock

Bananas Are Radioactive

Don’t worry — you don’t need a Geiger counter to pick out a bunch of bananas at the supermarket. The potassium in bananas contains trace amounts of radioactive atoms, though because our bodies regularly flush the nutrient out, it’s unable to build up to dangerous levels in our system. Bananas aren’t the only radioactive food: spinach, potatoes, and oranges are, too.

Banana peels are soaked in bottles to be used as liquid organic fertilizer.
Credit: johan kusuma/ Shutterstock

Banana Peels Can Purify Water

Researchers experimenting with ways to remove heavy metals from water have found that banana peels can get the job done. Natural materials like coconut fibers and peanut shells have been used successfully, but a 2011 study found that minced banana peels removed lead and copper from water as quickly as those materials, or faster. The peels can be reused up to 11 times before they lose their toxin-attracting properties.

Close-up of a banana half eaten.
Credit: Vadym Sh/ Shutterstock

Humans Generally Eat Just One Type of Banana

There are more than 1,000 varieties in the banana family, though it’s rare to see more than one kind at the grocery store. More than 55 million tons of Cavendish bananas are harvested each year, making them the most widely grown and consumed variety. Cavendish bananas get their name from William Spencer Cavendish, Britain’s sixth Duke of Devonshire, whose estate was home to numerous exotic plants. The duke’s eponymous banana stalks would eventually play a huge role in worldwide banana production — all modern Cavendish banana plants are descended from those grown at the U.K. estate in the 1830s.

Local worker in the banana plantation.
Credit: Somchai_Stock/ Shutterstock

No, Bananas Aren’t Going Extinct

Commercially grown Cavendish banana plants aren’t nurtured from seed, but cloned from existing plants. Farming this way means the crop lacks genetic diversity, making it vulnerable to pests and diseases that other varieties might have evolved to fight off. The main threat to commercial bananas is Fusarium wilt, aka “Panama disease,” a soil-borne fungus that is fatal to some varieties — such as the Gros Michel, previously the world’s preferred banana. The fungus wiped out most Gros Michel crops in the 1950s, after which farmers switched to the Cavendish. Modern botanists worry that Panama disease could strike again, destroying the Cavendish, which was once (incorrectly) believed to be immune to the fungus. But with genetic crop engineering and other varieties to choose from, it’s unlikely bananas will disappear altogether.

Nicole Garner Meeker
Writer

Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.

Original photo by New Africa/ Shutterstock

In America, the act of tipping — chipping in a little extra on top of a bill in recognition of service workers — is both customary and controversial. How much is polite to tip, and how did tipping even come to be in the first place? Is it just an American thing? These eight facts about tipping help answer some questions about when to tip, how tipping evolved, and where you might not want to tip at all.

Retro glass jar used for tipping.
Credit: nutcd32/ Shutterstock

“Tip” Is Not an Acronym

There’s a persistent rumor that the word “tip,” when used to refer to a gratuity, is an acronym for “To Insure Promptness,” “To Insure Performance,” or “To Insure Prompt Service.” This is false. In the early 1700s, “tip” was underworld jargon among petty criminals, a verb meaning “give” or “share.” It entered the mainstream, as both a verb and a noun, later in the 18th century.

Minimum wage word written on wood block with American Dollar-bills underneath.
Credit: Jenn Miranda/ Shutterstock

The U.S. Minimum Wage Is Different for Tipped Workers

The federal minimum wage for most workers in the United States is $7.25 per hour as of 2023, but for tipped workers it’s just $2.13. Tipped workers must still end up earning the equivalent of $7.25 an hour once tips are tallied; if they don’t, their employer must make up the difference. So for staff at some restaurants, a tip isn’t always a bonus — up to a certain point, it’s simply supplementing wages. (Many states and cities have higher minimum wages.)
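The make-up rule is simple arithmetic: base pay plus tips must reach $7.25 for every hour worked, and the employer covers any gap. Here's a minimal sketch using the figures above; the function name and the sample week are illustrative, not from any official source:

```python
def employer_topup(hours, tips, base=2.13, minimum=7.25):
    """Return what an employer must add so that base pay plus tips
    reaches the federal minimum wage for the hours worked."""
    earned = base * hours + tips   # cash wage plus tips received
    required = minimum * hours     # what the law guarantees
    return round(max(0.0, required - earned), 2)

# A 40-hour week with $150 in tips: $85.20 base + $150 = $235.20,
# against a guaranteed $290.00, so the employer owes $54.80.
print(employer_topup(40, 150))
```

With $300 in tips over the same week, the worker clears the minimum on tips alone and the top-up drops to zero.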

Close-up of a man's hands counting European money.
Credit: Viktor Kintop/ Shutterstock

Tipping Was Imported From Europe

As it exists now, tipping is a very American phenomenon, but it was customary in Europe for hundreds of years before wealthy Americans imported it back home. It dates back to the feudal system of the Middle Ages, when servants would perform duties for their wealthy masters and receive a paltry sum in exchange. Eventually, this evolved into gratuities paid to service industry workers by their customers. Wealthy Americans who traveled to Europe in the 19th century brought the practice back, just as a wave of poor European immigrants accustomed to working under the European tip system was arriving. The idea got major pushback at the time as “un-American,” but tipping picked up after the Civil War because …

Illustration of American slaves being forced to work.
Credit: Kean Collection/ Archive Photos via Getty Images

U.S. Tipping Has Roots in Slavery

When formerly enslaved people entered the U.S. workforce after the Civil War, many had few employment prospects — so restaurants would “hire” them, but force them to rely on tips for payment instead of a reliable wage. Six states then abolished tipping in an attempt to force employers to pay their employees, but those bans were eventually overturned. When the Fair Labor Standards Act established the minimum wage in 1938, tipping was codified as a way to earn those wages.

Close-up of tapping a phone to pay for a bill.
Credit: Chay_Tee/ Shutterstock

Tipping Is a Faux Pas in Japan

In America, tipping is customary not only to say thank you for good service but also to help service workers make ends meet — even in states without tipped wages — so not tipping or undertipping is considered an insult in many industries. Yet at most businesses in Japan, tipping is embarrassing at best and insulting at worst. Many restaurants do add service charges, though, so you should still be prepared to pay something on top of face value when the bill comes.

Housekeeper cleaning a hotel room.
Credit: Rawpixel/ Shutterstock

At Hotels, Tips for Housekeeping Are Customary

When you’re staying at a hotel — at least, an American hotel — it’s customary to tip the housekeeping staff. Don’t worry: The etiquette isn’t to leave 20% of your hotel bill. Experts recommend anywhere from $1 to $5 per day, and a bigger tip if you’ve been messy, made special requests, or there’s a pandemic on.

Paying for a bill tab at a restaurant cafe with cash money.
Credit: MargJohnsonVA/ Shutterstock

65% of Americans Always Tip at Sit-Down Restaurants

According to a poll by Bankrate, slightly less than two-thirds of all Americans always tip at sit-down restaurants. Furthermore, 42% say they always tip 20% or more. This doesn’t include takeout or coffee shops — Americans are much less likely to tip consistently there, at only 13% and 22%, respectively.

Takeout food being picked up from a restaurant.
Credit: Chay_Tee/ Shutterstock

You Should Still Tip for Takeout

Americans are far less likely to tip when they get takeout vs. when they sit down to eat because they’re not getting table service, but restaurant staff still work to prepare your food and package it up. You may not be expected to tip as much, but a little something is still warranted — after all, the kitchen staff perform essentially the same job whether you sit down or take it to go.

Sarah Anne Lloyd
Writer

Sarah Anne Lloyd is a freelance writer whose work covers a bit of everything, including politics, design, the environment, and yoga. Her work has appeared in Curbed, the Seattle Times, the Stranger, the Verge, and others.

Original photo by amoklv/ iStock

Whether it’s a hot date or a birthday brunch, even the most seasoned among us can feel a little adrift when it comes to dining out etiquette. Where do your hands go? What do you do with your napkin when you step away from the table? How do you know what to order when it’s someone else’s treat? These eight tips could help your next big culinary outing go a little more smoothly.

Restaurant table setting with food.
Credit: Pablo Merchán Montes/ Unsplash+

Put Your Phone on Silent — and Keep It Off the Table

Etiquette for dining with smartphones is threefold. First, put your phone on silent so it doesn’t disturb the meal. Second, don’t just turn down the volume and set the phone on the table; put it away in your pocket or bag, along with anything else that’s not part of the meal. Lastly, don’t check your phone while you’re eating. You can take a peek at your notifications when you’re away from the table; some experts say to wait until the meal is finished, but you can probably discreetly check in the restroom, too. That said, rules are flexible, and there are extenuating circumstances (like check-ins from a babysitter) that your dining companions will likely understand.

A table setting including napkins.
Credit: Joe Vaughn/ iStock

Napkin in the Lapkin

It’s pretty well-trodden etiquette territory to say that your napkin goes in your lap — but when do you put it there, and what do you do with it when you leave the table?

First: Place your napkin in your lap once everybody is seated. If you get up to use the restroom, place the napkin loosely to the left of your plate. That spot is less likely to soil the napkin than the plate itself, and you won’t risk transferring food smears from the napkin to your chair (and potentially your clothes).

A couple with menu in a restaurant making order.
Credit: Minerva Studio/ Shutterstock

Let the Person Paying Order First

If dinner’s on someone else, it can be hard to know exactly what the expectations are in terms of price point. A good general rule is to follow the lead of the person treating you. Letting them order first can give you a sense of what budget they had in mind. Regardless, you probably shouldn’t order the most expensive thing on the menu.

A businesswomen enjoying lunch in comfortable cafe.
Credit: JohnnyGreig/ iStock

Your Elbows Are Probably Fine Where They Are

If you struggle to keep your elbows off the table — as your grandmother scolded you to do — there’s some good news. The rule originally existed to keep your elbows clean and prevent slouching, but most experts now agree that it’s outdated, particularly when there isn’t any food actually on the table. The Emily Post Institute, a five-generation family powerhouse of etiquette advice, warns against putting your elbows on the table while eating, but notes that it has always been acceptable to rest your elbows there between courses. Elbows on the table are generally fine before and after a meal as well, although you might want to play it safe while actually eating to avoid dipping your sleeves in gravy.

Woman using knife and fork to cut her dinner.
Credit: a-wrangler/ iStock

Raise Food Issues Quietly

If there’s a hair in your salad or a smudge on your glass, there’s no need to turn it into a tablewide conversation topic. Flag down your server and explain your issue discreetly and politely. They should be back with a replacement momentarily, and meanwhile, the mood at the table stays light.

A woman enjoying a meal at a restaurant.
Credit: skynesher/ iStock

Chew With Your Mouth Closed

The global COVID-19 pandemic kept a lot of people out of sit-down dining establishments, so you may have reverted to some old habits, like talking with your mouth full. But remember: At no point should anybody you’re eating with see the food inside your mouth. One study says that food does taste better if you chew with your mouth open, but it’s not worth alienating your dining companions over.

A guest check with cash and coins.
Credit: studiocasper/ iStock

Yes, You Need To Tip

Unless the establishment has a specific policy against it, tip your server — at least, if you’re dining in America. It’s not just good etiquette: Tips can amount to more than half of the income of servers and bartenders, and that money is often shared with back-of-house workers such as cooks. Experts say that 15% to 20% of the pre-tax total is customary, but 42% of Americans always tip 20% or more. A 20% tip is easy to calculate, too: Calculate 10% by moving the decimal point on the total once to the left. Then double it.
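The move-the-decimal trick above is easy to put into a few lines of Python. This is just an illustrative sketch; the function name and the sample check amount are ours:

```python
def quick_tip(pretax_total: float) -> float:
    """Estimate a 20% tip: shift the decimal one place left for 10%, then double."""
    ten_percent = pretax_total / 10  # moving the decimal point once to the left
    return round(ten_percent * 2, 2)

# For a $48.50 pre-tax check, 10% is $4.85, doubled to a $9.70 tip.
print(quick_tip(48.50))
```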

Empty tables inside a nice restaurant.
Credit: Hispanolistic/ iStock

Don’t Overstay

Don’t linger for too long after you finish your meal, especially if the restaurant is full or you have an especially large party. It’s disrespectful to the establishment, which needs to serve more customers to stay in business, and to other customers who are waiting for a seat. In some cases, you may even be holding up a reservation. That said, some diners take offense when they feel rushed away from their table; try to be understanding if that happens to you. If you want to stay longer and there’s no line, order something else, like a dessert, a shared plate, or a round of cocktails, or at least check in with your server.

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.

Original photo by Thanatip S./ Shutterstock

It’s hard to imagine life without a spoonful of sugar. It helps fuel our sweet tooth and our bursts of energy, and it just may be a future way to power high-flying jets. It’s also been with us for a while: Scientists believe the saccharine substance likely made its way into our guts by accident millennia ago, eventually becoming a standard human craving as it spread across the globe. Take a bite out of sugar’s backstory with these six sweet facts.

Sugar beet root crop organically grown in a cultivated field.
Credit: Bits And Splits/ Shutterstock

All Plants Produce Sugar

Not all plants are made for eating, and chances are most aren’t palatable to human taste buds. However, nearly all plants make sugar, particularly those with green leaves. Sugar, aka sucrose, consists of two simple sugars, glucose and fructose; glucose is a naturally occurring byproduct of photosynthesis, the process plants undergo to convert sunlight to energy. Plants produce glucose in their leaves and then send it to their roots, storing the energy they need to grow.

All plants store their sugar differently; some, like potatoes, transform it into starch, while others, like apple and orange trees, store sugar in their fruits. Plants with particularly high concentrations of glucose are the ones humans harvest for table sugar — specifically sugar cane and sugar beets.

Sugar and Sugar cane on leaf and wooden background.
Credit: apichart sripa/ Shutterstock

Sugar Cane Originally Comes From New Guinea

More than 60 million acres of land worldwide are used for sugar cane farming, often in regions that were once tropical forests. The crops thrive in warm climates with consistent year-round temperatures, generally in spots close to the equator. However, biologists believe sugar cane plants, aka Saccharum officinarum, originated in just one spot, New Guinea, where Indigenous peoples may have cultivated the crop starting 10,000 years ago. Some researchers believe sugar cane was originally grown for chewing, like gum, and early farmers selected the sweetest, softest stalks for consumption. Over time, humans helped spread Saccharum plants through Southeast Asia, India, and the Pacific islands, where they merged with other wild sugar canes to create the modern variety we know and grow. By the late 15th century, sugar cane plants had made their way to the Americas, where they became established crops; today, Brazil is the world’s leading exporter of sugar cane.

Agronomist inspects a sugar beetroot at sunset.
Credit: DedovStock/ Shutterstock

More Than One-Third of the World’s Sugar Comes From Beets

Not all commercially produced sugar comes from sugar cane plants; about one-third of the world’s sugar supply comes from sugar beets, a root crop that thrives in cooler temperatures far from the equator. More than half of the U.S. sugar supply comes from sugar beets, which are grown in Michigan, Minnesota, Montana, and other northern and western states, and each year more than 4.5 million tons of sugar are produced from American-grown sugar beets. Each beet grows for about five months before reaching its maximum size: about a foot long, and weighing between 2 and 5 pounds. While sugar cane and sugar beets are grown and processed differently, the final sugar product is chemically identical. Nevertheless, some chefs believe the two sugars cook slightly differently and can have contrasting colors when caramelized or used to make syrups.

1904 World's Fair in St. Louis, Missouri.
Credit: Education Images/ Universal Images Group via Getty Images

The 1904 World’s Fair Was a Sugar Showcase

World’s fairs may feel like a relic of the past; the last one in North America was held in Vancouver in 1986. Yet they were the launching point for some of today’s favorite sugary treats. At the 1904 world’s fair (aka the Louisiana Purchase Exposition) in St. Louis, attendees got their first sample of fairy floss, the fluffy spun sugar that’s now more commonly called cotton candy. The confection was so popular that creators William J. Morrison and John C. Wharton sold more than 65,000 boxes at 25 cents each (about half the price of admission to the fair).

“Cornucopias,” aka ice cream cones, also hit American taste buds on a wide scale for the first time at the fair, crafted from rolled waffles and stuffed with ice cream. And while Jell-O had already been around in its fruit-flavored form since 1897, the world’s fair helped launch the jiggly sweet’s advertising campaign, with demonstrations that showed how easy it was to make by just adding hot water. The fair’s influence was immediately noticeable: Jell-O sales quadrupled between 1902 and 1906, reaching $1 million.

Spaceship takes off into the starry sky.
Credit: Alones/ Shutterstock

There’s Sugar in Space

If you’re trying to curb your sweet tooth, it can feel like sugar is everywhere. And in some ways, you’re not wrong — sugar isn’t just on Earth; it can also be found in space. In 2000, space scientists discovered a simple sugar called glycolaldehyde while looking for other molecules that could potentially support life outside our atmosphere. Despite being labeled a “simple sugar,” glycolaldehyde plays a huge role in DNA creation; when combined with a chemical called propenal, it makes ribose, a major component of ribonucleic acid (aka RNA, a chemical chain found in all living things). However, this clue for potential space life has only been found in two spots: the center of the Milky Way, and near a star some 400 light-years from Earth.

A Jack Russell terrier eating an ice cream cone on a green lawn.
Credit: Reshetnikov_art/ Shutterstock

Dogs Can Taste Sugar

Man’s best friend shares our ability to taste different flavors, albeit at a diminished level. While humans have between 2,000 and 10,000 taste buds (a number that shrinks with age), dogs have a mere 1,700. Yet studies have shown that dogs can taste sweetness. This trait may have developed from ancient dogs who lived as omnivores, consuming fruits and vegetables along with meat. However, not all household pets have a sweet tooth. Cats are unable to taste sugars and sweets because they lack the necessary taste buds thanks to genetic mutations that occurred millions of years ago — meaning that while dog owners may have to give up a scoop of ice cream or order a “pup cup,” cat parents are free to indulge without sharing, guilt-free.

Nicole Garner Meeker
Writer

Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.

Original photo by Africa Studio/ Shutterstock

The world’s love affair with coffee seemingly knows no bounds. Beginning in 13th-century Arabia and later migrating across the globe, coffee — and coffee culture — has shaped the way we work, socialize, and savor our daily rituals. Today, gourmet espresso drinks are an important piece of many people’s lives, often multiple times per day. But with so many different beverage choices, how do you choose what to order? What’s the difference between a caffè latte and a flat white, and how can you know which drinks are petite pick-me-ups and which are more, ahem, grande? Here are six captivating, caffeinating explanations of well-known espresso drinks to sort it all out.

An Iced Americano Coffee.
Credit: Hyeong-Taek Lee/ Shutterstock

What’s an Americano?

The humble Americano is a product of World War II, born of American soldiers’ attempts to recreate their good old cup of joe from back home while they were stationed in Italy. The “black coffee” of espresso drinks, the caffè Americano is just water and espresso. It contains no dairy or plant milks, and the ratio is usually one-third espresso to two-thirds hot water — although some serve it with half espresso and half water. To make this drink, one must first pour the espresso, and then add the water. Ice can be added to serve it cold. The Americano’s close cousin, the long black, is served in Australia and New Zealand, and it’s a slightly stronger beverage, due to the espresso being poured second.

A woman pouring a latte into a mug.
Credit: Brooke Cagle/ Unsplash+

What’s a Latte?

Latte means “milk” in Italian, so this drink is aptly named. In most coffee shops, if you order a caffè latte — or “latte” for short — what you’ll get is two shots of espresso poured into between 4 and 6 ounces of steamed milk. (If you order a larger size, you’ll get more steamed milk added, but the amount of espresso will stay the same — unless you request an extra shot, that is.) Coffee has been commonly taken with cream or milk, either steamed or unsteamed, in Europe since at least the 17th century. But the modern latte we know today was established on the West Coast of the U.S., starting out on the menu at the historic Caffe Mediterraneum in Berkeley, California, in the 1950s and later popularized in cafes throughout Seattle, Washington, in the ’80s. Depending on the barista, your latte may have a small amount of foamed milk added to the top — and you may even receive a pretty design in the foam, if they’re skilled at latte art. Just don’t order a “latte” when in Italy — if you don’t use its full name, caffè latte, you might end up with a regular glass of milk!

Iced Mocha Coffee with Whipped Cream and Chocolate.
Credit: Brent Hofacker/ Shutterstock

What’s a Mocha?

Are you a chocolate fan? Well, a mocha is just a latte with chocolate in it, usually in the form of either cocoa powder mixed with sugar or chocolate syrup. Its name alludes to the port city of Mokha, Yemen, an early center of the coffee trade starting in the 15th century. As the story goes, the beans from Mokha were said to have an element of chocolate in their flavor, and eventually, around the turn of the 20th century, people just started adding the chocolate themselves. A caffè mocha (or just “mocha” for short) is traditionally served in a glass rather than a mug and is frequently topped with a dusting of cinnamon. Because it has added sugar, some people consider the mocha a dessert rather than a breakfast drink. But be careful: Since it contains chocolate, a mocha can contain significantly more caffeine than other coffee drinks, so it may not be the best choice before bedtime.

A flat white coffee.
Credit: anotherwork91/ Shutterstock

What’s a Flat White?

The flat white comes to us from New Zealand (though some say Australia), and it’s essentially a more intense latte with a thinner layer of foam. (Or a “flatter” layer, as some might describe it; the name may also come from the unique type of foam the drink uses, which is flatter than the super-airy foam in, say, a cappuccino.) The flat white is only about 6 ounces total, so it’s a smaller, shorter drink than a latte, which is lengthened with more milk. This gives the flat white a more powerful espresso flavor, thanks to the higher coffee-to-milk ratio. To make one, start with a heatproof glass, pour in a single or double shot of espresso, add about 4 ounces of steamed milk, then cap the drink with a thin layer of “microfoam,” which is steamed at a lower temperature. This microfoam has smaller, tighter bubbles that impart a velvety texture, contributing to the creamy overall mouthfeel of the drink. The flat white’s popularity in the U.S. jumped significantly around 2010, and it’s now such a standard part of the American coffee repertoire that you can even order one at Starbucks.

Barista preparing a cortado coffee.
Credit: Al Gonzalez/ iStock

What’s a Cortado?

The cortado originates in Spain — the word cortado means “cut,” because it’s an espresso that’s been cut with warm milk. This is a mini-drink, usually around 4 ounces, intended as an afternoon pick-me-up, often accompanied by a pastry or some other snack. It’s simple to make, too: A cortado is half espresso and half steamed milk, usually 2 ounces of each, and it’s typically served without sugar. If you prefer your coffee a little sweeter, you can opt for the Cuban version, the cortadito, with a dab of sugar or sweetened condensed milk added.

What’s a Cappuccino?

Named in the 19th century after the tan-and-white robes of Rome’s Capuchin monks, a cappuccino is about balance. It’s a trinity of espresso, steamed milk, and foamed milk, layered in the cup in equal thirds and served in a ceramic cup that’s a little wider and flatter than the usual coffee mug. The result is a rich drink, usually only 3 or 4 ounces in volume, with low acidity, a strong espresso flavor, and a mild sweetness from the milk. To drink a cappuccino correctly, don’t mix the elements together; sip it as is, with a visual demarcation between the ingredients, so that each flavor can be tasted separately. Although a cappuccino is usually served unsweetened, in the United States the milk foam is often lightly dusted with cinnamon, while in Australia, New Zealand, and the U.K., cocoa is dusted on instead. Fun fact: In Italian, cappuccio means “hood,” which is how the monks themselves got their name, thanks to that feature of their distinctive robes.
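The espresso drinks described above differ mostly in their proportions, which can be summarized in a few lines of Python. The ratios are simplified from the descriptions in this article and real cafes vary, so treat the numbers (and the helper function, which is ours) as illustrative:

```python
# Approximate ingredient parts for each drink, simplified from the text above.
DRINK_PARTS = {
    "americano":  {"espresso": 1, "hot water": 2},
    "latte":      {"espresso": 1, "steamed milk": 3},
    "flat white": {"espresso": 1, "steamed milk": 2},
    "cortado":    {"espresso": 1, "steamed milk": 1},
    "cappuccino": {"espresso": 1, "steamed milk": 1, "foamed milk": 1},
}

def espresso_share(drink: str) -> float:
    """Fraction of the drink that is espresso, per the rough ratios above."""
    parts = DRINK_PARTS[drink]
    return parts["espresso"] / sum(parts.values())

# The cortado has the strongest ratio here: half espresso, half milk.
print(f"cortado: {espresso_share('cortado'):.0%}")
```

Seen this way, a latte is simply the most milk-diluted of the group, which matches its reputation as the mildest-tasting drink on the list.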


Original photo by Adam Melnyk/ Shutterstock

The typical American grocery store carries 40,000 to 50,000 items, including hundreds of fruits and vegetables. Those photogenic displays of dew-kissed leafy greens, pyramids of shiny apples, and baskets of sunny lemons are carefully organized, of course, to entice shoppers into purchasing them. Let’s investigate some secrets of the produce aisle below.

A look at the produce aisle at a supermarket.
Credit: Stefan Malloch/ Shutterstock

The Produce Aisle Is Strategically Placed in the Grocery Store

Imagine walking into your favorite grocery store. You push a cart through the automatic door, and bam — a wonderland of vibrant vegetables and fragrant fruits hits your senses. Displays of colorful produce at peak freshness are designed to entice shoppers and encourage spending, even if a consumer didn’t initially come into the store to buy bananas or Brussels sprouts. Produce aisles — which, these days, can be more like smorgasbords of fruits, veg, flowers, salad dressings, fresh juices, and more — employ flattering lighting and pleasant music to whet customers’ appetites and prime them for more shopping throughout the store.

Local produce outside a farmers market.
Credit: Arina P Habich/ Shutterstock

“Local” Produce Might Come From Hundreds of Miles Away

According to a 2017 study by the Food Marketing Institute, 54% of shoppers want a large selection of locally grown produce in their grocery stores. The USDA doesn’t have a set definition for what “local” means in terms of miles, though, so grocery store chains have devised their own. In 2018, ABC10 in Sacramento surveyed several chains operating in Northern California and found widely varying radii: Safeway considers produce grown and packaged in the state to be “local,” Sprouts Farmers Market pegs its definition to produce grown in state or within 500 miles of a store, and Raley’s said that produce tagged with a “living local” label had been grown within 50 miles of the location.

A view of a grocery store's fresh salad bar counter.
Credit: 8th.creator/ Shutterstock

Cooking Demos, Salad Bars, and Pineapple Corers Helped Popularize the Produce Section

In the early 1970s, produce sections accounted for only 3% of space in mom-and-pop grocery stores. Fresh fruits and vegetables were considered specialty items instead of an essential food group. After concluding that there was a communication breakdown between fruit and vegetable producers and consumers, grocery managers launched campaigns to educate shoppers about their products, complete with cooking demonstrations, free samples, and salad bars in the produce aisles. Displays of pineapple corers and other tools were positioned near the fresh items, demystifying the prep involved with buying and eating produce. By 1993, the average grocery store carried around 250 different produce types (up from 100 in 1980), fresh fruit and vegetable consumption had increased, and produce aisles contributed 20% of stores’ net profits.

Fresh flower bouquets displayed in grocery store.
Credit: Noel V. Baebler/ Shutterstock

Flowers Are Placed in the Produce Aisle for a Specific Reason

Cut flowers and potted plants are commonly placed in the produce aisle for the same reason that the aisles are positioned in the front sections of grocery stores: They reinforce consumers’ perception of freshness. Flowers introduce color, texture, fragrance, and beauty to shoppers as soon as they walk through the doors. That makes shoppers more likely to associate freshness and desirability with the other items in the store — even things that are canned or frozen — and feel encouraged to buy additional products.

Various vegetables on shelves in grocery store.
Credit: sirtravelalot/ Shutterstock

The Freshest Produce Is at the Back of the Display

Have you ever moved a piece of fruit to the front of your fridge so you eat it before it rots? The same technique is at work in the produce aisle. Managers frequently rotate the fruit or veggies in the displays so that older items are brought to the front, a system dubbed FEFO (for “first expired, first out of the store”). The arrangement encourages shoppers to buy the older bananas or boxes of blueberries, which translates to less waste and higher profits for the store. Thus, it pays to reach to the back of the bags of spinach or piles of pears to glean the freshest and most flavorful specimens. Another tip: Produce deliveries usually arrive in the middle of the week, so if your favorite store gets shipments on Tuesdays, for example, you’ll find the freshest foods on Wednesdays.
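The FEFO rotation described above boils down to sorting stock by expiration date, soonest first. Here is a toy sketch; the batch names and dates are invented for illustration:

```python
from datetime import date

# Hypothetical batches of spinach with their expiration dates.
inventory = [
    ("batch A", date(2024, 6, 3)),
    ("batch B", date(2024, 6, 1)),
    ("batch C", date(2024, 6, 5)),
]

# FEFO ("first expired, first out"): sort soonest-to-expire to the front.
display_order = sorted(inventory, key=lambda item: item[1])
print([name for name, _ in display_order])  # front of the display comes first
```

The soonest-to-expire batch ends up at the front of the display, which is exactly why reaching toward the back tends to turn up fresher produce.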

Bunches of ripe bananas that can be found in the produce aisle of a grocery store.
Credit: Baloncici/ Shutterstock

Bananas Should Be Yellow, but Not Just Any Yellow

The bananas you see at the grocery store are Cavendish bananas — a cultivar that the banana industry adopted in the 1950s after a tropical fungus wiped out an earlier variety. Growers produce more than 60.6 million tons of Cavendish bananas every year for export all over the world. The bananas are prized for their sweetness, creamy texture, and appealing bright-yellow skin.

Stores buy unripe green bananas from growers so that by the time the fruit hits grocery store shelves, it has ripened to its more well-known yellow shade. Marketing expert Martin Lindstrom has written that bananas matching Pantone 12-0752 TPX Buttercup — a warm, inviting yellow — tend to sell better than bananas in Pantone 13-0858 TCX Vibrant Yellow, just one shade cooler.

A farmer washes fresh lettuce at a farmers market stall.
Credit: carterdayne/ iStock

Misting Produce Is a Clever Way To Make You Buy More

Many grocery stores display produce in open cases fitted with tiny jets to periodically bathe the veggies in a cool mist. (Some supermarkets even pipe in the sound of thundering rain to add to the rainy vibe.) The purpose behind misting is not to keep produce clean or extend its shelf life — it’s a clever way for grocers to make the fruits and vegetables look fresher and healthier so consumers purchase more. Water clinging to leafy greens also adds weight, which increases revenue for the store when vegetables are sold by the pound.

Ironically, misting actually shortens produce’s shelf life because water allows bacteria and mold to take hold. Misted veggies will likely not last as long in your fridge as those that weren’t misted in the produce aisle — which is another, perhaps sneakier, way to get you to buy produce more often.

A woman cutting broccoli on a cutting board.
Credit: alvarez/ iStock

Brussels Sprouts, Broccoli, and Kale Are All Subspecies of the Same Plant

A surprising number of veggies in the produce aisle are the same species, Brassica oleracea — but you wouldn’t know it by looking at them. Brussels sprouts, broccoli, cauliflower, kale, collard greens, purple and green cabbage, and kohlrabi are all domesticated cultivars of wild cabbage, a plant native to western and southern Europe. For the last few thousand years, farmers have selectively bred the wild plant to augment some part of its form, such as the leaves, buds, or stems. Today, each cultivar is classified as a subspecies of B. oleracea.

Close-up of the popular “trail mix” snack on a steel countertop.
Credit: billnoll/ iStock

Many “Nuts” in the Produce Aisle Aren’t Actually Nuts

Botanically speaking, a nut is a fruit with a hard shell containing a single seed. The true nuts you might encounter in the produce aisle include hazelnuts and chestnuts. Many of the products sold as “culinary nuts” belong to other botanical classifications. Cashews, almonds, and pistachios are known as “drupes,” a type of fruit with thin skin and a pit containing the seed. (Peaches, mangos, cherries, and olives are also drupes.) And the jury is still out on whether walnuts and pecans fall into the nut or drupe category, since they have characteristics of both. Some botanists call them drupaceous nuts.

Woman standing in front of a row of produce in a grocery store.
Credit: Adam Melnyk/ Shutterstock

The Produce Industry Has a Special Lingo

Like any business, the produce industry has its own slang, describing everything from a cosmetic flaw in a tomato (“catfacing”) to the practice of hiding some less-than-ideal specimens in a box of otherwise fresh fruit (“stovepiping”). In produce slang, veggies “with legs” have a long shelf life, unlike delicate items that require special handling and frequent rotation on the display. A flawless fruit, whether it’s a peach, pear, or pineapple, is a “diamond.” And a quality cantaloupe will exhibit a “full slip” on the blossom end, meaning it separated easily from the vine when it was picked, which indicates the best flavor.


Original photo by il21/ Shutterstock

Who invented the pumpkin spice latte? What country has its own national standard for brewing the perfect cup of tea? And what fierce bicoastal debate surrounds the martini? We’ve rounded up our favorite facts about beverages from around the website, so brew up a cuppa or pour yourself a cold one and pore over these facts about coffee, tea, wine, beer, cocktails, and more.

Coffee beans being scooped up in a factory.
Credit: Tim Mossholder/ Unsplash

Coffee Beans Aren’t Actually Beans

It turns out that the name we use for those tiny pods that are ground and brewed into a cup of joe is a misnomer. Coffee “beans” are actually the seeds found within coffee cherries, a reddish fruit harvested from coffee trees. Farmers remove the skin and flesh from the cherry, leaving only the seed inside to be washed and roasted.

Coffee farming is a major time investment: On average, a tree takes three to four years to produce its first crop of cherries. In most of the Coffee Belt — a band along the equator where most coffee is grown that includes the countries of Brazil, Ethiopia, and Indonesia — coffee cherries are harvested just once per year. In many countries, the cherries are picked by hand.

Two cups of healthy herbal tea with mint, cinnamon, dried rose and chamomile flowers.
Credit: Foxys Forest Manufacture/ Shutterstock

Herbal Tea Isn’t Actually Tea

This may be a shocking revelation, but herbal “teas” like chamomile and peppermint aren’t officially teas at all. To be classified as tea, a drink must come from the Camellia sinensis plant, as white, green, oolong, and black teas all do. Herbal teas, however, are tisanes: infusions that incorporate various leaves, fruits, barks, roots, flowers, and other edible non-tea plants. So while drinking a minty tisane may feel much like drinking a warm cup of green tea, the two beverages fall into completely different botanical categories.

Man having glass of fresh sparkling mineral water with his dinner.
Credit: Bignai/ Shutterstock

Seltzer Water Was Named After the German Town of Selters

Germany loves its beer, but seltzer is a close second. The country is so entwined with the fizzy beverage that the word “seltzer” comes from the name of the German town of Selters (located about 40 miles northwest of Frankfurt), which is famous for its naturally carbonated mineral springs. The springs have been well known in the area for more than 1,000 years, and by 1791, fizzy water from Selters was so popular, it was exported throughout the world in jugs stamped with the name “selters-wasser,” or “Selters water.” The word transformed into “seltzer” when the beverage became popular in North America, especially in New York and Philadelphia, around the early 19th century. Today, the Selterswassermuseum (in Selters, of course) chronicles the local spring’s long history.

White Starbucks cup.
Credit: Aaron Ho/ Unsplash

Starbucks Invented the Pumpkin Spice Latte

Love it or hate it, the pumpkin spice latte is a part of the American coffee identity, and it’s hard to imagine that the drink didn’t exist as recently as the early 2000s. We have Starbucks to thank for the seasonal treat, which the company introduced in 2003. The pumpkin spice latte was created by the “Liquid Lab” at Starbucks’ Seattle headquarters and is considered the brainchild of Peter Dukes. Dukes had the idea for the latte back in 2001, when Starbucks was trying to conceive of a fall-themed beverage that would become as popular as its seasonal holiday drinks. Lacking an actual recipe, the testers brought pumpkin pies into the lab, poured espresso on top, and found the combination delicious. Once they matched that taste in drink form, the result blew up into a worldwide sensation.

Pumpkin spice lattes were first tested in 100 Starbucks stores in 2003 before launching worldwide the following year; upwards of 500 million cups were sold in the drink’s first 18 years on the market. The drink has since expanded far beyond Starbucks, becoming an autumnal staple of coffee shops everywhere.

Close-up of a Negroni cocktail.
Credit: Allan Francis/ Unsplash

The Negroni Was Invented by One Count Negroni

In 1919, Count Camillo Negroni bellied up to the bar at Café Casoni in Florence and asked for something stronger than his usual Americano (Campari, club soda, and vermouth). Bartender Fosco Scarselli obliged, replacing the club soda with gin, and the Negroni was born. While the bar’s ownership and name have changed a few times over the years, you can still visit the original space on Piazza della Libertà, now known as Caffè Lietta.

Hot tea being poured into a cup.
Credit: Barrett Baker/ Unsplash

The British Have Their Own Official Standard for the Perfect Cup of Tea

The British are serious about tea. So much so that British Standards — a national body that produces technical specifications for products and services — released an edict in 1980 on the official British guidelines for making the perfect cup of tea. Though some may disagree with the standard, the rules include the following: Use a porcelain pot and a ratio of 2 grams of tea per every 100 ml of water, brew for six minutes, maintain a temperature of 60 to 85 degrees Celsius (140 to 185 degrees Fahrenheit) when serving the tea, and add milk to the mug first if using tea that’s already been steeped.
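The 2-grams-per-100-ml ratio in the guideline scales easily to any pot size. A small sketch follows; the constant and function names are ours, not part of the British Standards text:

```python
# Proportions from the guideline described above: 2 g of tea per 100 ml of water.
TEA_GRAMS_PER_100ML = 2
BREW_MINUTES = 6            # recommended brewing time
SERVING_TEMP_C = (60, 85)   # serving temperature window, in Celsius

def tea_for_pot(water_ml: float) -> float:
    """Grams of loose tea for a pot of the given volume."""
    return water_ml * TEA_GRAMS_PER_100ML / 100

# A 500 ml pot calls for 10 grams of tea.
print(tea_for_pot(500))
```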

A glass of almond milk on a wooden table with almonds and dates.
Credit: Madeleine Steinbach/ Shutterstock

Plant Milks Have Been Around for 5,000 Years

For years, dairy producers have sued alternative milk companies for using the word “milk” on their packaging, but history is not on their side. Even the ancient Romans applied the idea of “milk” beyond dairy: the word “lettuce” derives from the Latin lac, meaning “milk” (as in “lactate”), a nod to the plant’s milky sap. Many medieval cookbooks make reference to almond milk, and the earliest mention of soy milk appears on a Chinese stone slab dating from around the first to third century CE. Coconut milk has the longest history of all, however; archaeologists have recovered coconut graters among relics from Madagascar and Southeast Asia that date back to around 3000 to 1500 BCE.

A view of a California vineyard landscape.
Credit: Getty Images/ Unsplash+

California Wines Beat French Rivals in a Blind Taste Test

In a legendary event dubbed “The Judgment of Paris,” held on May 24, 1976, French wine experts preferred upstart California wines to the finest French ones in a taste test. An English wine shop owner staged the event to drum up business, and everyone assumed a French victory was a foregone conclusion. The nine experts swirled, sniffed, and sipped a variety of reds and whites, then tallied the number of points they awarded to each sample; shockingly, a cabernet sauvignon and a chardonnay from Napa Valley won out, proving that countries besides France could produce the world’s finest wines. A bottle of each winning wine is now in the Smithsonian collection.

Brewed coffee getting poured it into a mug.
Credit: Natalia Belay/ Shutterstock

Decaf Coffee Is Still a Tiny Bit Caffeinated

Decaf coffee has helped coffee drinkers enjoy the taste of coffee without (much of) the jolting effects of caffeine, but its creation was entirely accidental. According to legend, around 1905 German coffee merchant Ludwig Roselius received a crate of coffee beans that had been drenched in seawater. Trying to salvage the beans, the merchant roasted them anyway and discovered that cups brewed with the beans retained their taste (with a little added salt) but didn’t have any jittery side effects. Today, the process for making decaf blends remains relatively similar: Beans are soaked in water or other solvents to remove the caffeine, then washed and roasted. However, no coffee is entirely free of caffeine. An estimated 97% of the caffeine is removed during preparation, so a cup of decaf still contains around 2 milligrams of caffeine, compared to regular coffee’s 95 milligrams.

Painted building advertising Dr. Pepper.
Credit: Katelyn Perry/ Unsplash+

Dr Pepper Was Once Marketed as a Warm Beverage

Dr Pepper was first served around 1885 at Morrison’s Old Corner Drug Store in Waco, Texas. The drink was created by Charles Alderton in an effort to capture the fruity and syrupy smells wafting through the store. Though Dr Pepper was initially served cold, the drink was briefly marketed as a warm beverage, a plan that was developed to ensure the brand’s continued popularity throughout the colder holiday months.

Hot Dr Pepper was first conceived of in 1958, when company president Wesby Parker found inspiration while visiting a bottling plant during a blizzard. The result was a new recipe developed by the company that encouraged consumers to heat Dr Pepper over a stovetop to 180 degrees Fahrenheit and then pour it over a thin slice of lemon. The drink was marketed in ads using taglines such as “Devilishly Different” and “Winter Warmer,” and an alcoholic version containing rum, called the Schuss-Boomer, was later popularized. Hot Dr Pepper remained a beloved holiday drink into the 1970s, and though it has since faded in popularity, the beverage continues to be made each year by certain pockets of loyal fans.

A close-up of a Martini with two olives.
Credit: Katelyn Perry/ Unsplash+

There’s a Debate Over Whether the Martini Was Invented in California or New York

The “shaken or stirred” debate has nothing on the origin of America’s most iconic cocktail, which is vigorously argued by both of the nation’s coasts. The historic town of Martinez, California, swears the gin-and-vermouth classic was created as a celebratory Champagne replacement for a gold miner who struck it rich. New Yorkers insist it’s solely the invention of the bar staff at the Knickerbocker Hotel, who named it after Martini & Rossi vermouth. As for us? We’ll think about it while we have another.

Smiling photo of Wendy Kaufman, the Snapple lady.
Credit: Pool BASSIGNAC/REGLAIN/ Gamma-Rapho via Getty Images

The Snapple Lady Was an Actual Employee

Wendy Kaufman was hired in 1991 to work in Snapple’s shipping department. A hardworking, dedicated employee, she noticed the fan mail piling up in the mail room and made it her mission to answer the letters personally, writing or even calling fans back to thank them for their devotion to the brand. Kaufman ultimately rocketed to stardom after being cast as “Wendy the Snapple Lady,” a character who appeared in 37 commercials between 1993 and 1995. The commercials featured a fictionalized version of Wendy doing what she did best — reading and answering fan mail — and some of the ads even involved filming at the homes of fans who had written letters.

Coca-Cola advertisement featuring Santa Claus.

Coca-Cola Ads Helped Popularize Santa Claus’ Modern-Day Likeness

Coke has a surprising connection to Santa Claus. In 1931, Coca-Cola hired illustrator Haddon Sundblom to paint Santa Claus for a series of holiday advertisements. Using friend and retired salesman Lou Prentiss as a model, Sundblom produced a version of Santa that depicted the jolly, bearded man with rosy cheeks that we all recognize today. Sundblom would continue painting Santa for Coke’s advertisements until 1964.

While the character of Santa Claus predated Coke, of course, he had been depicted in a variety of ways, ranging from tall and thin to elf-like. An 1862 drawing of Santa Claus by Thomas Nast for Harper’s Weekly portrayed him as a tiny figure, a far cry from the booming presence he is today, though Nast was also the first to draw Santa wearing a red jacket, and some of his other drawings showed a version of Santa resembling the jolly man we now know. Still, it wasn’t until Coca-Cola debuted its holiday advertisements that Americans began to fully associate Santa Claus with the large, jovial figure we recognize today.

Glasses with light and dark beer in a cafe.
Credit: Nick Starichenko/ Shutterstock

The Czech Republic Consumes the Most Beer of Any Country

For over 25 years running, the country that drinks the most beer per capita — by quite a large margin — is the Czech Republic. The average resident there guzzles 142.6 liters of the golden bubbly beverage annually. By comparison, people in other major beer-drinking countries such as Austria and Germany barely crack 100 liters.

It’s fair to say that the Czech Republic has a strong beer culture. After all, it is the birthplace of pilsner, one of the most popular styles of beer, and in many Czech cities, a beer will set you back less than a bottle of water. And it doesn’t seem likely that the country will reverse course anytime soon: Consumption increases each year, though recent trends favor take-home bottles from breweries over old-fashioned pints at the local pub.

Vitis vinifera grape vine.
Credit: jessicahyde/ Shutterstock

Almost All Wines Are Grown From a Single Species of Grape

The mother vine of almost all wines today is Vitis vinifera, a grape likely native to Western Asia. Over millennia, winemakers have domesticated and crossbred the vines to create subspecies with distinct colors, flavors, and suitability to different climates. About 8,000 cultivars exist today, including well-known varieties like pinot noir, chardonnay, sauvignon blanc, and merlot. V. vinifera vines have long been cultivated in regions with hot, dry summers and mild winters, such as Italy, Spain, and France, but the U.S., Chile, Australia, and South Africa are also major producers, among other countries.

Close-up of a cup of coffee.
Credit: Lala Azizli/ Unsplash

Your Genes Might Determine How Much Coffee You Drink

If you can’t get through the day without several cups of coffee, you may have your genes to blame. A 2018 study suggests inherited traits determine how sensitive humans are to bitter compounds like caffeine and quinine (found in tonic water). Researchers found that people with genes that allow them to strongly taste bitter caffeine were more likely to be heavy coffee drinkers (defined as consuming four or more cups daily). It seems counterintuitive that people more sensitive to bitter tastes would drink more coffee than those with average sensitivity — after all, bitter-detecting taste buds likely developed as the body’s response to prevent poisoning. But some scientists think that human brains have learned to bypass this warning system in favor of caffeine’s energizing properties. The downside? Constant coffee consumers are at higher risk of developing caffeine addiction.

Woman taking a tea bag out of a cup.
Credit: New Africa/ Shutterstock

Tea Bags Were Popularized by Accident

Before individual tea bags came into wide use, it was more common to make an entire pot of tea at once by pouring hot water over tea leaves and then using a strainer. In 1901, Wisconsin inventors Roberta C. Lawson and Mary Molaren filed a patent for a “tea leaf holder,” a concept that resembles the tea bags we use today. It wasn’t until about seven years later, however, that another individual inadvertently helped popularize the concept of tea bags — at least according to legend. Around 1908, American tea importer Thomas Sullivan reportedly sent samples of tea inside small silken bags to his customers. His clients failed to remove the tea leaves from the bags as Sullivan assumed they would, and soon Sullivan realized that he’d stumbled onto an exciting new concept for tea brewing. He later reimagined the bags using gauze, and eventually paper.

Tea bags were booming in popularity throughout the United States by the 1920s, but it took a while for residents of the United Kingdom to adopt the concept. In fact, tea bags wouldn’t make their way to the U.K. until 1952, when Lipton patented its “flo-thru” bag, but even then the British weren’t keen to change their tea-brewing ways. By 1968, only 3% of tea in the U.K. was brewed using tea bags, a figure that rose to 12.5% by 1971. By the end of the 20th century, however, 96% of U.K. tea was brewed with bags.

Farmer pouring fresh milk from milk churn container can into another.
Credit: PixHound/ Shutterstock

Dairy Milk Was Revolutionized by Bacteriology

In 1857, French chemist and microbiologist Louis Pasteur discovered that microorganisms in the air caused lactic acid fermentation, aka the souring of milk. Pasteur also discovered (at the request of French Emperor Napoleon III) that certain microbes caused wine to go bad, but briefly heating the libation to around 140 degrees Fahrenheit caused those microbes to die off, leaving behind a sterilized (or as it would be later known, “pasteurized”) liquid that would stay fresh for longer.

Pasteurization for milk wasn’t introduced until 1886, but it was a game-changer, as diseases introduced via contaminated milk killed scores of infants in the 19th century. With the introduction of pasteurization, that number dropped significantly.

Close-up of group of wine corks.
Credit: Michal Zak/ Shutterstock

Humans Invented Alcohol Before We Invented the Wheel

The wheel is credited as one of humankind’s most important inventions: It allowed people to travel farther on land than ever before, irrigate crops, and spin fibers, among other benefits. Today, we often consider the wheel to be the ultimate civilization game-changer, but it turns out, creating the multipurpose apparatus wasn’t really on humanity’s immediate to-do list. Our ancient ancestors worked on other ideas first: boats, musical instruments, glue, and alcohol. The oldest evidence of booze comes from China, where archaeologists have unearthed 9,000-year-old pottery coated with beer residue; in contrast, early wheels didn’t appear until around 3500 BCE (about three millennia later), in what is now Iraq. But even when humans began using wheels, they had a different application — rudimentary versions were commonly used as potter’s wheels, a necessity for mass-producing vessels that could store batches of brew (among other things).

Homemade chocolate Brooklyn egg cream in a glass.
Credit: Brent Hofacker/ Shutterstock

Egg Creams Contain Neither Eggs nor Cream

Foods tend to get their names from their appearance or ingredients, though not all are so clear-cut. Take, for instance, the egg cream, a beverage that has delighted the taste buds of New Yorkers (and other diner patrons) since the 1890s. But if you’ve never sipped on the cool, fizzy drink known for its chocolate flavor and foamy top, you should know: There are no eggs or cream in a traditional egg cream drink.

According to culinary lore, the first egg cream was the accidental invention of Louis Auster, a late-19th- and early-20th-century candy shop owner in New York’s Lower East Side. Auster’s sweet treat arrived in the 1890s, at a time when soda fountains had started selling fancier drinks, and it was a hit — the enterprising inventor reportedly sold upwards of 3,000 egg creams per day by the 1920s and ’30s. However, Auster kept his recipe well guarded; the confectioner refused to sell his formula, and eventually took his recipe to the grave. The origins of the drink’s name have also been lost to time. Some believe the name “egg cream” came from Auster’s use of “Grade A” cream, which could have sounded like “egg cream” with a New York accent. Another possible explanation points to the Yiddish phrase “echt keem,” meaning “pure sweetness.”

Regardless of the misleading name, egg creams are once again gaining popularity in New York, though you don’t have to be a city dweller to get your hands on the cool refreshment. Egg creams can be easily made at home with just three ingredients: milk, seltzer, and chocolate syrup.

A rum Hemingway daiquiri with lime and grapefruit.
Credit: Brent Hofacker/ Shutterstock

Hemingway Has His Own Type of Daiquiri

Ernest Hemingway had more than one favorite bar, but in Cuba, it was El Floridita. The bar was founded in Havana’s Old Quarter in 1817, and it was already an institution as la cuna del daiquiri — the cradle of the daiquiri — when the famous author walked in. After sampling the original, Hemingway requested “more rum, less sugar” from legendary barman and owner Constantino Ribalaigua. You can still order a Papa Doble, Hemingway’s favorite, while sitting next to his life-sized statue.

Snapple apple juice 12-pack.
Credit: Wirestock, Inc/ Alamy Stock Photo

Snapple’s Apple Juice Once Contained No Apple

Though Snapple has since updated the ingredients to list both apple and pear concentrate, there was a time when its apple juice drink didn’t contain a single drop of real apple juice. Instead, the company used pear juice flavored to taste like apple, perhaps because doctored pear concentrate came closer to what the public expected of an apple drink than actual apple juice did.

Agave plants planted in red soil in the mountains of Jalisco.
Credit: Jose de Jesus Churion Del/ Shutterstock

It Takes Eight Years To Grow Agave Plants for Tequila

When European colonists first encountered Mexico’s native agave plants, they were intrigued by the succulents the Aztecs had been using to make clothing, rope, and intoxicating drinks. The spike-tipped plants, which grow as tall as 20 feet, were dug up and transplanted to greenhouses and botanical gardens throughout Spain, Portugal, and other parts of Europe starting in the 16th century. But most agave plants struggled to flourish in areas lacking their natural arid climate; in cooler countries, they were dubbed “century plants,” because those that survived the overseas journey didn’t bloom for nearly 100 years. Agave plants mature much faster when left in their natural habitats, but growing the crop for today’s tequila production is still a time investment. It traditionally takes about eight years before the plants are ready to harvest, though some agave crops are left to grow even longer.

Bottles of reformulated "100% Natural" 7-Up soda.
Credit: Scott Olson/ Getty Images News via Getty Images

7Up Once Contained Mood Stabilizers

While it’s somewhat common knowledge that early versions of Coca-Cola contained cocaine, it wasn’t the only soda with unusual and potentially harmful ingredients. When 7Up launched in 1929, its formula contained a prescription mood stabilizer: lithium citrate, a drug used in modern times to treat conditions such as bipolar disorder.

At the time of its inception, the soda was called “Bib-Label Lithiated Lemon-Lime Soda,” an accurate, if unwieldy, description of its ingredients back then. The name was shortened to 7Up in 1936, but it wasn’t until 1948 that lithium citrate was deemed potentially harmful and removed from the recipe, after the U.S. Food and Drug Administration outlawed the chemical’s use in sodas.

Viennese chocolate on the terrace of a small cafe near a touristic place in Portugal.
Credit: Pierre-Olivier/ Shutterstock

Portugal Drinks More Hot Chocolate Than Any Country in the World

Hot chocolate is a decadent treat for children and a guilty pleasure for adults, and no country in the world drinks more of it per person than Portugal. The Portuguese drink a whopping 100.2 cups per capita annually — an amount that sounds either soothing or sickening, depending on your sweet tooth.

The hot chocolate that originated in Spain during the 1600s consisted of ground cocoa beans, water, wine, and chili peppers. Although the powdered packets today are quite different, Spain is fourth worldwide in per-capita consumption (76.6 cups). Ahead of Spain are Finland (90.1 cups) and Colombia (84 cups).

Interesting Facts
Editorial

Interesting Facts writers have been featured in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.