It’s easy to lose track of items in the back of a dark pantry, which is why expiration dates can be so helpful in determining when to toss old foods. However, the “best by” dates we rely on aren’t always a true picture of how long a food is shelf-stable. Food dating is mostly a voluntary process for grocery manufacturers, who often just estimate when their products will no longer be at their best quality. Luckily, there are some foods — like the six listed below — that are safe to keep using even if their expiration date has long passed.
Most foods produce a noxious smell when they’ve spoiled, but vinegar always smells pretty potent, so it may be hard to use the old-fashioned sniff test to gauge its quality. Luckily, you don’t have to, since vinegar doesn’t expire. Vinegar is a fermented product, created when yeast consume sugars or starches to produce alcohol; that byproduct is then exposed to oxygen and a bacterium called Acetobacter, which continues fermenting to create the final acidic product. That acidity makes vinegar self-preserving, which is why it generally doesn’t need to be refrigerated. Over time, vinegar can become hazy or develop sediment, particularly a gelatinous substance called a “mother,” though that doesn’t mean you need to toss it — in fact, vinegar mothers (colonies of beneficial bacteria that form in fermented liquids) can even be used to create a new batch of the multipurpose solution.
Comedian Mitch Hedberg once joked that rice is the perfect meal if you’re “really hungry and want to eat 2,000 of something.” It’s also a great food for long-term storage. White rice — which starts as brown rice but is milled to remove its exterior husk, bran, and germ — keeps best, so long as it’s properly stored away from moisture and pests. At temperatures under 40 degrees Fahrenheit, white rice’s life span stretches to 25 or even 30 years, but even when stored at warmer temperatures, it can last up to 10 years if packed with oxygen absorbers. However, not all rice keeps long-term: Opened bags should be used within two years, and brown rice lasts only about six months at room temperature because of its naturally occurring oils, which can go rancid.
Sugar has a particularly sweet characteristic: It doesn’t really go “bad.” Granulated sugars (along with some syrups, like corn syrup) are so inhospitable to bacteria that they’re often the primary ingredient used to preserve jellies, jams, and canned fruits. However, like all long-stored pantry staples, sugar keeps best away from any source of condensation or moisture, which it readily absorbs and which can turn it into a hardened block. Even with its ability to last indefinitely, food storage experts say sugar is best consumed within two years of opening — just another reason to mix up a batch of fresh cookies.
Vegetable, animal, or mineral? Salt falls in the last category, which is one reason it can enjoy an indefinite stay in your pantry without spoiling. Salt has been used to preserve foods (especially meats) for centuries because it’s so effective at inhibiting bacteria; the mineral breaks down enzymes that help germs grow while also dehydrating food, removing the water that bacteria need to thrive. That same ability to draw out water keeps salt indefinitely useful, though some kinds of processed salt are more likely to deteriorate in quality over time — specifically those with additives such as iodine or anti-caking agents (these are best used within five years). As for plain salt, it can last forever, especially if kept in a cool, dry place.
Pure vanilla extract can be a grocery store splurge, but if your oven is known for taking a hiatus between bursts of baking, it could be worth the extra cost. That’s because real vanilla extract doesn’t spoil, thanks to its high alcohol content — over time, it can actually develop a deeper flavor. Imitation vanilla extract, however, has a drastically shorter shelf life. While real vanilla is created by soaking vanilla beans in alcohol (which acts as a preservative), the flavoring dupe is made from vanillin, a manufactured compound that mimics vanilla’s sweet, syrupy flavor. On the shelf, imitation vanilla lasts just six to 12 months before it begins to degrade and lose its flavor.
Humans have risked bee swarms for thousands of years in the hopes of collecting a little honey. Beyond its use in cooking, the substance has also been used for healing wounds and even as a natural preservative — because the insect-produced food is one of the few that rarely expires. Honey’s indefinite shelf-life is thanks to its sugar-dense composition, with less than 20% of its makeup coming from water. The nectar also has two other preserving factors: It has an acidic pH level that is unsuitable for bacteria, and its viscous state creates an oxygen barrier that prevents pathogens from growing. However, there is a catch: To maintain these properties, honey must be stored in a sealed container safe from humid conditions. Even then, the USDA suggests honey is at its best when consumed within a year.
Interesting Facts
Editorial
Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.
Interesting Facts is part of Optimism, which publishes content that uplifts, informs, and inspires.
Most of us probably just enjoy our food without thinking too deeply about it. But the world of culinary delights holds many mysteries. What’s the secret history of the bagel? What does “continental breakfast” really mean? Which nut has been known to explode during transport, and which favored breakfast item is slightly radioactive? And finally, what’s the difference between sweet potatoes and regular potatoes? The following 25 facts will give you plenty of fodder for your next dinner party.
Green Bell Peppers Are Just Unripe Red Bell Peppers
If you’ve ever found yourself in the grocery store struggling to decide between red and green bell peppers, you may be interested to learn that they’re the very same vegetable. In fact, green bell peppers are just red bell peppers that haven’t ripened yet, while orange and yellow peppers are somewhere in between the two stages. As they ripen, bell peppers don’t just change color — they also become sweeter and drastically increase their beta-carotene, vitamin A, and vitamin C content. So while the green variety isn’t quite as nutritious as its red counterpart, the good news is that one eventually becomes the other.
It turns out there’s a price to pay for how tasty and nutritious pistachios are: Under the right circumstances, they can spontaneously combust. Everyone’s favorite shelled nut is especially rich in fat, which is highly flammable. Thankfully, that only becomes a problem when pistachios are packed too tightly during shipping or storage. It’s important to keep the nuts dry lest they become moldy — but if they’re kept too dry and there are too many of them bunched together, they can self-heat and catch fire without an external heat source.
Though exceedingly rare and easy to avoid if the proper instructions are followed, pistachio self-combustion is a real enough concern that the German Transport Information Service specifically advises that pistachios “not be stowed together with fibers/fibrous materials as oil-soaked fibers may promote self-heating/spontaneous combustion of the cargo.” Don’t worry, though: It won’t happen in your pantry with just a few bags, so you can indulge in the shelled snack without worrying about their flavor becoming unexpectedly smoky.
Philadelphia Cream Cheese Isn’t Actually From Philadelphia
The City of Brotherly Love has clear-cut claims on many food origins — cheesesteaks, stromboli, and even root beer. But despite the name, Philadelphia Cream Cheese is definitely not from Philly. The iconic dairy brand secured its misleading name (and gold-standard status) thanks to a marketing ploy that’s been working for more than 150 years … and it’s all because of Pennsylvania’s reputation for impeccable dairy. Small Pennsylvania dairies of the 18th and early 19th centuries were known for using full-fat milk and cream to make rich cheeses — in contrast to New York dairies, which mostly used skim milk — and because the perishables couldn’t be easily transported, they gained a reputation as expensive luxury foods. So when upstate New York entrepreneur William Lawrence began making his own cream cheese (from skim milk, with lard added for richness) in the 1870s, he needed a name that would entice customers and convey quality despite it being made in Chester, New York, and not Philadelphia. Together with cheese broker and marketing mastermind Alvah Reynolds, Lawrence branded his cheese under the Philadelphia name in 1880, which boosted sales and promoted its popularity with home cooks well into the early 1900s.
Bagels Were Once Given as Gifts to Women After Childbirth
After a woman has had a bun in the oven for nine months, presenting her with a bagel might seem like a strange choice. But some of the earliest writings on bagels relate to the idea of giving them as gifts to women after labor. Many historians believe that bagels were invented in the Jewish community of Krakow, Poland, during the early 17th century. Their circular shape echoes the round challah bread eaten on the Jewish new year, Rosh Hashanah. Enjoying round challahs is meant to bring good luck, expressing the hope that endless blessings — goodness without end — will arrive in the coming year. Likewise, in Krakow centuries ago, a bagel signified the circle of life and longevity for the child. In addition to the symbolism of the round shape, the bread was believed to bring a pregnant woman or midwife good fortune in a delivery by casting aside evil spirits. Some pregnant women even wore bagels on necklaces as protection, or ensured bagels were present in the room where they gave birth.
The Word for a Single Spaghetti Noodle Is “Spaghetto”
If you go into an Italian restaurant and order spaghetto, chances are you’ll leave hungry. That’s because “spaghetto” refers to just a lone pasta strand; it’s the singular form of the plural “spaghetti.” Other beloved Italian foods share this same grammatical distinction — one cannoli is actually a “cannolo,” and it’s a single cheese-filled “raviolo” or “panino” sandwich. Though this may seem strange given that these plural terms are so ingrained in the English lexicon, Italian language rules state that a word ending in -i means it’s plural, whereas an -o or -a suffix (depending on whether it’s a masculine or feminine term) denotes singularity. (Similarly, “paparazzo” is the singular form of the plural “paparazzi.”) As for the term for the beloved pasta dish itself, “spaghetti” was inspired by the Italian word “spago,” which means “twine” or “string.”
If you asked for ketchup thousands of years ago in Asia, you might have been handed something that looks more like today’s soy sauce. Texts as old as 300 BCE show that southern Chinese cooks mixed together salty, fermented pastes made from fish entrails, meat byproducts, and soybeans. These easily shipped and stored concoctions — known in different dialects as “ge-thcup,” “koe-cheup,” “kêtsiap,” or “kicap” — were shared along Southeast Asian trade routes. By the early 18th century, they had become popular with British traders. Yet the recipe was tricky to recreate back in England because the country lacked soybeans. Instead, countless ketchup varieties were made by boiling down other ingredients, sometimes including anchovies or oysters, or marinating them in large quantities of salt (Jane Austen was said to be partial to mushroom ketchup). One crop that the English avoided in their ketchup experiments was tomatoes, which for centuries were thought to be poisonous.
What do Neil Armstrong, tortoises, and jelly beans have in common? Why, they’ve all been to space, of course. President Ronald Reagan was known for being a connoisseur of the chewy candy, so much so that he provided the astronauts aboard the Challenger shuttle with a bag full of them in 1983 — a gift that resulted in charming footage of them tossing the jelly beans in zero gravity before happily eating them. Reagan was also known to break the ice at high-level meetings by passing around jelly beans, even commenting that “you can tell a lot about a fella’s character by whether he picks out all of one color or just grabs a handful.”
The brothers behind your favorite frozen waffles took a while to iron out the details of their signature product. Working in their parents’ basement in San Jose, California, in the early 1930s, Frank, Anthony, and Sam Dorsa first whipped up their own brand of mayonnaise. Since the base ingredient of mayonnaise is egg yolks — and the brothers took pride in using “100% fresh ranch eggs” — they christened their fledgling company “Eggo.” Despite launching the business during the Great Depression, Eggo mayonnaise sold like hotcakes, motivating the Dorsas to extend their product line. Soon, they were selling waffle batter — another egg-based product. To simplify shipping, they also whipped up a powdered mix that required only the addition of milk.
When the frozen food industry took off in the 1950s, the brothers wanted to take advantage of the rush to the freezer aisle. Frank Dorsa (a trained machinist) repurposed a carousel engine into a rotating device that could anchor a series of waffle irons, each cooking a breakfast treat that was flipped by a factory employee. The machine allowed Eggo to prepare thousands of freezer-bound waffles per hour. These debuted in grocery stores in 1953 under the name Froffles, a portmanteau of “frozen” and “waffles.” Customers referred to them simply as “Eggos,” and the Froffles moniker was dropped within two years.
A rainy-day cache of sweet, sticky maple syrup may seem more like a luxury than a necessity, but it’s a big deal to Canada, which produces more than 70% of the world’s supply from maple trees grown in the province of Quebec. As such, the Federation of Quebec Maple Syrup Producers (QMSP) founded the Global Strategic Maple Syrup Reserve in 2000 to help regulate the profitable business. Covering an area of 267,000 square feet across three facilities, the reserve has endured poor sugaring seasons and the dastardly theft of some $20 million worth of barrels in 2012. And even when the COVID-19 pandemic forced many families to fulfill their pancake cravings at home, the QMSP promised to keep pace by announcing that it would release more than half of its 100 million-pound reserve in 2022.
Pineapples Were Once So Valuable People Rented Them for Parties
In the 1700s, party hosts and guests looking to make a statement were in the rental market for a special kind of accessory: pineapples. The message they were trying to send? That they were extravagantly wealthy. Prior to the 20th century, when pineapple plantations made the fruit widely available, pineapples were incredibly expensive imports to Europe (and most other places). In the 18th century, a single fruit bought in Britain could cost upwards of $8,000 in today’s money.
Christopher Columbus is credited with introducing pineapples to Europe in the 1490s after voyaging to the Americas. Just one survived his return journey, and the bromeliad quickly had an impact. Dubbed the “king of fruits,” the pineapple became a symbol of opulence and royalty because of its scarcity. Pineapples were featured in paintings of kings, printed on linens and wallpaper, and even carved into furniture. Obtaining a rare pineapple meant the buyer had money and status — and for that reason, the fruit was also often featured as decor at parties and events. Eventually, European botanists learned to grow pineapples in greenhouses and reduce their cost. But until the fruits were widely available, many partygoers in Britain would seek out a pineapple for just one night, renting the fruit for a fraction of its full price and sometimes even carrying it around at the party as the ultimate (uneaten) accessory.
Today carrots are practically synonymous with the color orange, but that hue is a relatively recent development. When the carrot was first cultivated 5,000 years ago in Central Asia, it was often a bright purple. Soon, two different groups emerged: Asiatic carrots and Western carrots. Yellow carrots in the Western group (which may have developed as mutants of the purple variety) eventually took on their recognizable orange color around the 16th century, helped along by the master agricultural traders of the time — the Dutch.
A common myth says the Dutch grew these carrots to honor William of Orange, the founding father of the Dutch Republic, but there’s no evidence of this. What’s more likely is that the Dutch took to the vegetable because it thrived in the country’s mild, wet climate. (Although the orange color may have first appeared naturally, Dutch farmers made it the predominant hue by selectively growing orange roots — scholars say these carrots likely performed more reliably, tasted better, and were less likely to stain than the purple versions.) The modern orange carrot evolved from this period of Dutch cultivation, and soon spread throughout Europe before making its way to the New World. Today, there are more than 40 varieties of carrots of various shapes, sizes, and colors — including several hues of purple.
Mentions of radioactivity can send the mind in a dramatic direction, but many ordinary items are technically radioactive — including the humble banana. Radioactivity occurs when elements decay, and for bananas, this radioactivity comes from a potassium isotope called K-40. Although it makes up only 0.012% of the atoms found in potassium, K-40 can spontaneously decay, which releases beta and gamma radiation. That amount of radiation is harmless in one banana, but a truckload of bananas has been known to fool radiation detectors designed to sniff out nuclear weapons. In fact, bananas are so well known for their radioactive properties that there’s even an informal radiation measurement named the Banana Equivalent Dose, or BED.
So does this mean bananas are unhealthy? Well, no. The human body stores roughly 16 mg of K-40 at any given time, which technically makes humans 280 times more radioactive than your average banana. Although bananas do introduce more of this radioactive isotope, the body keeps potassium in balance (homeostasis), and your metabolism excretes any excess. A person would have to eat many millions of bananas in one sitting to receive a lethal dose (at which point you’d likely have lots of other problems).
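Those figures hang together arithmetically. A quick back-of-the-envelope check in Python, using only the numbers quoted above (the per-banana potassium total is implied by them, not stated in the text):

```python
# Sanity check of the banana radioactivity figures quoted above.
body_k40_mg = 16.0      # K-40 stored in the human body (from the text)
human_to_banana = 280   # humans are ~280x more radioactive than one banana (from the text)
k40_fraction = 0.00012  # K-40 as a share of potassium atoms, i.e., 0.012% (from the text)

# Implied K-40 content of a single banana
banana_k40_mg = body_k40_mg / human_to_banana

# Implied total potassium per banana, given K-40's tiny share
banana_potassium_mg = banana_k40_mg / k40_fraction

print(f"K-40 per banana: ~{banana_k40_mg:.3f} mg")        # ~0.057 mg
print(f"Potassium per banana: ~{banana_potassium_mg:.0f} mg")  # ~476 mg
```

The result, roughly 476 mg of potassium per banana, lands close to the commonly cited figure of about 450 mg, so the 280-to-1 ratio is consistent with the other numbers in the paragraph.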
Each year, about 4% of the world’s cheese supply is stolen — making it the most-stolen food in the world. Cheese, after all, is big business: Global sales exceeded $114 billion in 2019. In Italy, Parmesan is so valuable it can be used as loan collateral, according to CBS News. Consequently, the black market for cheese is thriving. From 2014 to 2016, organized crime was responsible for stealing about $7 million of Parmesan. And dairy-based crime definitely isn’t limited to Italy: In 2009, a duo of cheese thieves in New Zealand led police on a high-octane car chase — and tried to throw off the pursuit by tossing boxes of cheddar out the window.
The Ancient Romans Thought Eating Butter Was Barbaric
Our friends in ancient Rome indulged in a lot of activities that we would find unseemly today — including and especially gladiators fighting to the death — but they drew the line at eating butter. To do so was considered barbaric, with Pliny the Elder going so far as to call butter “the choicest food among barbarian tribes.” In addition to a general disdain for drinking too much milk, Romans took issue with butter specifically because they used it for treating burns and thus thought of it as a medicinal salve, not a food.
The Greeks also considered the dairy product uncivilized, and “butter eater” was among the most cutting insults of the day. In both cases, this can be partly explained by climate — butter didn’t keep as well in warm southern climates as it did in northern Europe, where groups such as the Celts gloried in their butter. Instead, the Greeks and Romans relied on olive oil, which served a similar purpose.
Sweet potatoes and common potatoes share part of a name and the spotlight at Thanksgiving meals, but the two are entirely different plants — and sweet potatoes aren’t even potatoes. While both root vegetable species are native to Central and South America, they’re classified as unrelated. Sweet potatoes belong to the Convolvulaceae family, a group of flowering plants that’s also called the morning glory family. Potatoes belong to the nightshade (Solanaceae) family, and are cousins to peppers, tomatoes, and eggplants. Both species get their name from an Indigenous Caribbean term, batata, which eventually morphed into the English “potato.” By the 1740s, “sweet” was added to the orange-fleshed tuber’s name to differentiate the two root crops.
Meanwhile, yams are biologically unrelated to either sweet potatoes or common potatoes. These tubers belong to the Dioscoreaceae family, a group of flowering plants usually cultivated in tropical areas. Luckily, you don’t have to know their scientific classification to distinguish between the two non-spuds at the grocery store: Sweet potatoes have tapered ends and relatively smooth skin, while true yams are generally larger, with rough, bark-like skin and a more cylindrical shape. At most U.S. grocery stores, what’s labeled as a yam is probably actually a sweet potato.
Today, nutmeg is used in the kitchen to add a little zing to baked goods and cool-weather drinks, though at various times in history it’s been used for fragrance, medicine … and its psychotropic properties. That’s possible thanks to myristicin, a chemical compound found in high concentrations in nutmeg, but also produced in other foods like parsley and carrots. Myristicin is able to cause hallucinations by disrupting the central nervous system, causing the body to produce too much norepinephrine — a hormone and neurotransmitter that transmits signals among nerve endings. While the idea of conjuring illusions of the mind might sound intriguing, nutmeg intoxication also comes with a litany of unpleasant side effects, including dizziness, confusion, drowsiness, and heart palpitations, so don’t try this at home.
As long as it’s stored properly, honey will never expire. Honey has an endless shelf life, as proven by the archaeologists who unsealed King Tut’s tomb in 1923 and found containers of honey within it. After performing a not-so-scientific taste test, researchers reported the 3,000-year-old honey still tasted sweet.
Honey’s preservative properties have a lot to do with how little water it contains: Roughly 80% of honey is sugar, and only about 18% is water. Having so little moisture makes it difficult for bacteria and other microorganisms to survive. Honey is also so thick that little oxygen can penetrate it — another barrier to bacterial growth. Plus, the substance is quite acidic, thanks to an enzyme in bee stomachs called glucose oxidase. When mixed with nectar to make honey, the enzyme produces gluconic acid and hydrogen peroxide, byproducts that lower the sweetener’s pH and kill off bacteria. In most cases, honey can be safely stored for years on end — just make sure it’s in a sealed container.
Misting Produce Is a Clever Way To Make You Buy More
Many grocery stores display produce in open cases fitted with tiny jets to periodically bathe the veggies in a cool mist. (Some supermarkets even pipe in the sound of thundering rain to add to the rainy vibe.) The purpose behind misting is not to keep produce clean or extend its shelf life — it’s a clever way for grocers to make the fruits and vegetables look fresher and healthier so consumers purchase more. Water clinging to leafy greens also adds weight, which increases revenue for the store when vegetables are sold by the pound.
Ironically, misting actually shortens produce’s shelf life because water allows bacteria and mold to take hold. Misted veggies will likely not last as long in your fridge as those that weren’t misted in the produce aisle — which is another, perhaps sneakier, way to get you to buy produce more often.
For years, dairy producers have sued alternative milk companies for using the word “milk” on their packaging — but history is not on their side. Evidence suggests the Romans already used “milk” broadly: The root of the word “lettuce” comes from “lact” (as in “lactate”), a reference to the plant’s milky sap. Many medieval cookbooks make reference to almond milk, and the earliest mention of soy milk can be found on a Chinese stone slab dating from around the first to third century CE. Coconut milk, however, has the longest history; archaeologists have recovered coconut graters among relics from Madagascar and Southeast Asia dating back to around 3000 to 1500 BCE.
“Continental Breakfast” Is a British Term for Breakfast on the European Continent
Many hotels offer guests a free “continental” breakfast with their stay, but what exactly makes a breakfast “continental”? The term originated in mid-19th-century Britain as a way to distinguish the hearty English breakfast — typically consisting of eggs, bacon, sausage, toast, and beans — from the lighter fare found in France and other countries on the European continent. A continental breakfast typically consists of pastries, fruit, toast, and coffee served buffet-style. As American breakfasts also tended to feature outsized helpings of protein and fruit, the “continental” moniker proved useful for hotels on the other side of the Atlantic as well.
Chickens Might Be Among the Closest Living Relatives of the Tyrannosaurus Rex
Dinosaurs still live among us — we just call them birds. Today, scientists consider all birds a type of dinosaur, descendants of creatures who survived the mass extinction event at the end of the Cretaceous period. And yes, that even includes the chicken. In 2008, scientists performed a molecular analysis of a shred of 68 million-year-old Tyrannosaurus rex protein, and compared it to a variety of proteins belonging to many different animals. Although proteins from alligators were relatively close, the best match by far belonged to ostriches — the largest flightless birds on Earth — and the humble chicken.
Following the initial 2008 study, further research has proved that a chicken’s genetic lineage closely resembles that of its avian dinosaur ancestors. Scientists have even concluded that a reconstruction of T. rex’s chromosomes would likely produce something similar to a chicken, duck, or ostrich. Meanwhile, some archaeological evidence supports an idea that the earliest human-raised chickens may not have been eaten, but instead revered and possibly even used as psychopomps, aka animals tasked with leading the deceased to the afterlife.
Botanically speaking, a nut is a fruit with a hard shell containing a single seed. The true nuts you might encounter in the produce aisle include hazelnuts and chestnuts. Many of the products sold as “culinary nuts” belong to other botanical classifications. Cashews, almonds, and pistachios are drupes, a type of fruit with thin skin and a pit containing the seed. (Peaches, mangos, cherries, and olives are also drupes.) And the jury is still out on whether walnuts and pecans fall into the nut or drupe category since they have characteristics of both. Some botanists call them drupaceous nuts.
We humans have somewhere between 20,000 and 25,000 genes — a sizable number to be sure, but still considerably fewer than the 31,760 in everyone’s favorite nightshade, the tomato. Though scientists still aren’t sure why tomatoes have such a complex genome, an emerging theory relates to the extinction of the dinosaurs. Around the time those giant creatures disappeared from Earth, the nightshade family (Solanaceae) tripled its number of genes. Eventually the superfluous copies of genes that served no biological purpose disappeared, but that still left a lot of functional ones; some believe the extra DNA helped tomatoes survive during an especially perilous time on the planet, when it was likely still recovering from the aftereffects of a devastating asteroid.
Humans, meanwhile, have two copies of every gene: one from their mother and one from their father. The number of genes doesn’t necessarily reflect biological sophistication so much as how an organism “manages its cells’ affairs” — simply put, humans make more efficient use of the genes they have.
In 19th-century Europe, it wasn’t uncommon to see trained bears frolicking down the streets in celebration of a parade or festival. Called “dancing bears,” these animals would skip, hop, whirl, twirl, and perform an array of tricks. Fast-forward to the 1920s, when German candymaker Hans Riegel was searching for a clever way to sell his gelatin-based confections to children. Recalling the two-stepping bears of yore, Riegel decided to make an Ursus-shaped candy called Tanzbär (literally “dancing bear”). The snacks were a huge success. Today, you probably know Riegel’s company as Haribo.
Foods tend to get their names from their appearance or ingredients, though not all are so clear-cut. Take, for instance, the egg cream, a beverage that has delighted the taste buds of New Yorkers (and other diner patrons) since the 1890s. But if you’ve never sipped on the cool, fizzy drink known for its chocolate flavor and foamy top, you should know: There are no eggs or cream in a traditional egg cream drink.
According to culinary lore, the first egg cream was the accidental invention of Louis Auster, a late-19th- and early-20th-century candy shop owner in New York’s Lower East Side. Auster’s sweet treat arrived in the 1890s, at a time when soda fountains had started selling fancier drinks, and it was a hit — the enterprising inventor reportedly sold upwards of 3,000 egg creams per day by the 1920s and ’30s. However, Auster kept his recipe well guarded; the confectioner refused to sell his formula, and eventually took his recipe to the grave. The origins of the drink’s name have also been lost to time. Some believe the name “egg cream” came from Auster’s use of “Grade A” cream, which could have sounded like “egg cream” with a New York accent. Another possible explanation points to the Yiddish phrase “echt keem,” meaning “pure sweetness.”
Feature image credit: Original photo by kate_sept2004/ iStock
According to Ethiopian legend, the first beings to get a java jolt were a herd of goats who nibbled on the fruit of a coffee shrub; a goat herder named Kaldi quickly followed their lead. While it’s unclear whether Kaldi and his caprines truly discovered coffee, today the brewed beverage is one of the most widely consumed drinks in the world. Introduced to America in the mid-17th century, coffee quickly replaced heavily taxed tea as a more patriotic staple during the fight for independence. In the time since, soldiers have relied on coffee to boost morale overseas, children of U.S. Presidents have founded their own coffeehouses, and American coffee brands have expanded across the globe. Can’t get enough coffee? Discover 12 amazing facts you might not know about this beloved morning beverage.
Regularly Consuming Coffee May Have Health Benefits
Coffee was once believed to be an unhealthy (and possibly dangerous) indulgence, but newer research suggests it actually has health benefits that could extend your life. Researchers have found that people who drink moderate amounts of coffee — about two to five cups — have lower risks of developing Type 2 diabetes, Parkinson’s disease, heart disease, and some cancers. Both regular and decaf coffee offer these perks; it’s not the caffeine that helps, but likely the polyphenols, plant compounds found in coffee that work as antioxidants.
At least three Italian inventors played a role in creating espresso. Angelo Moriondo was the first; his “steam machinery for the economic and instantaneous confection of coffee beverage” was patented in 1884, though it could only brew large batches of coffee and never became commercially available. Nearly two decades later, Luigi Bezzera created his own espresso machine, which cut down brewing time from several minutes to just 30 seconds (much better for workers on coffee breaks). With the help of inventor Desiderio Pavoni, Bezzera’s reworked machine — which most closely resembles espresso machines used in coffee shops today — debuted at the 1906 World’s Fair in Milan, where the duo sold “caffè espresso.”
New Orleans Is Known for an Herby Coffee Alternative
Coffee drinkers in France have long blended their brews with chicory, a blue-flowered herb native to Europe and Asia with roots that offer a coffee-like flavor when roasted. French settlers brought the practice to Louisiana, helping chicory coffee become a mainstay in times of conflict when real coffee imports were hard to come by — such as during the Napoleonic Wars in the early 1800s, and later during the Civil War. However, amid some conflicts, like World War I, chicory was in so much demand that the once-cheap substitute cost more than actual coffee. Today, many coffee drinkers in New Orleans continue to enjoy their java with chicory included.
Italy is often called the world’s coffee capital, so it’s no wonder that many of the words we use to describe a cup of joe come from Italian. Take, for example, the cappuccino (a drink made from espresso and steamed milk), which gets its name from a 16th-century order of Italian monks. The Capuchin friars were known for helping those experiencing poverty; as such, they themselves renounced wealth and wore simple brown robes, with long, pointed hoods called “cappuccio.” The earliest cappuccino drinks, which emerged around the 1700s, were nicknamed after these religious figures, because adding milk to espresso resulted in a color similar to that of the monks’ attire.
Adding a little cream to your cup of coffee can improve the flavor and possibly help it stay warm for longer. Some food scientists believe that coffee with cream cools about 20% more slowly than plain black coffee, thanks to three rules of physics. Darker colors emit heat faster than lighter colors, so adding cream to lighten the drink’s hue may slow down heat loss. Hotter surfaces also radiate heat faster, so plain coffee will cool faster than a cup that’s been slightly chilled by adding cold cream. Viscosity is also a factor: Cream thickens coffee, making a steamy cup evaporate more slowly. Since evaporation causes heat loss, the less there is, the more time you’ll have to enjoy coffee before it’s too cold.
Home cooks of the early 1800s could try their hand at making coffee jelly, a dessert that originated in England and later spread to the Eastern U.S. Coffee jelly was promoted as an alternative to the hot beverage for those who didn’t like the taste or whose stomachs didn’t agree with the acidity. The jiggly dessert was considered a healthy option for people who were sick, or could be eaten as an after-dinner curative for people who drank too much alcohol at mealtimes. While coffee jelly is now a rarity in the United States, it is commonly found in Japan, where it’s a popular treat.
It turns out that the name you’re familiar with for those tiny pods that are ground and brewed for a fresh cup of joe is a misnomer. Coffee “beans” are actually the seeds found within coffee cherries, a reddish fruit harvested from coffee trees. Farmers remove the skin and flesh from the cherry, leaving only the seed inside to be washed and roasted.
Coffee farming is a major time investment: On average, a tree takes three or four years to produce its first crop of cherries. In most of the Coffee Belt — a band along the equator where most coffee is grown that includes the countries of Brazil, Ethiopia, and Indonesia — coffee cherries are harvested just once per year. In many countries, the cherries are picked by hand, a laborious process.
Decaf coffee has helped coffee drinkers enjoy the taste of coffee without (much of) the jolting effects of caffeine, but its creation was entirely accidental. According to legend, around 1905 German coffee merchant Ludwig Roselius received a crate of coffee beans that had been drenched with seawater. Trying to salvage the beans, the salesman roasted them anyway, discovering that cups brewed with the beans retained their taste (with a little added salt) but didn’t have any jittery side effects. Today, the process for making decaf blends remains relatively similar: Beans are soaked in water or other solvents to remove the caffeine, then washed and roasted. However, no coffee is entirely free of caffeine. It’s estimated that 97% of caffeine is removed during preparation, but a cup of decaf still contains around 2 milligrams of caffeine — compared to regular coffee’s 95 milligrams.
Credit: Rischgitz/ Hulton Archive via Getty Images
Bach Wrote an Opera About Coffee
Johann Sebastian Bach is remembered as one of the world’s greatest composers, known for orchestral compositions such as the Brandenburg Concertos. But one of Bach’s lesser-known works is Schweigt stille, plaudert nicht (“Be Still, Stop Chattering”) — a humorous ode to coffee popularly known as the Coffee Cantata. Written sometime in the 1730s, Bach’s comic opera makes light of fears at the time that coffee was an immoral beverage entirely unfit for consumption. In the 18th century, coffee shops in Europe were known to be boisterous places of conversation, unchaperoned meeting places for young romantics, and the birthplaces of political plots. A reported lover of coffee, Bach wrote a 10-movement piece that pokes fun at the uproar over coffee. The opera tells the story of a father attempting to persuade his daughter to give up her coffee addiction so that she might get married, but in the end, she just becomes a coffee-imbibing bride.
We can credit coffee-craving inventors for creating the first webcam. In the early 1990s, computer scientists working at the University of Cambridge grew tired of trekking to the office kitchen for a cup of joe only to find the carafe in need of a refill. The solution? They devised a makeshift digital monitor — a camera that uploaded three pictures per minute of the coffee maker to a shared computer network — to guarantee a fresh pot of coffee was waiting the moment their mugs emptied. By November 1993, the in-house camera footage made its internet debut, and viewers from around the globe tuned in to watch the grainy, real-time recording. The world’s first webcam generated so much excitement that computer enthusiasts even traveled to the U.K. lab to see the setup in real life. In 2003, the coffee pot sold at auction for nearly $5,000.
Coffee has a long political history in the U.S. — colonists who tossed heavily taxed tea into Boston Harbor switched to drinking the caffeinated brew as part of their rebellion. But even after the Revolutionary War’s end, American leaders held an enduring love for the beverage. George Washington grew coffee shrubs at his Mount Vernon estate (though because of climate, they likely never produced beans), while Thomas Jefferson loved coffee so much that he estimated using a pound per day at Monticello during retirement. Similarly, Theodore Roosevelt reportedly consumed an entire gallon of coffee each day, and George H.W. Bush was known for imbibing up to 10 daily cups.
Your Genes Might Determine How Much Coffee You Drink
If you can’t get through the day without several cups of coffee, you may have your genes to blame. A 2018 study suggests inherited traits determine how sensitive humans are to bitter compounds like caffeine and quinine (found in tonic water). Researchers found that people with genes that allow them to strongly taste bitter caffeine were more likely to be heavy coffee drinkers (defined as consuming four or more cups daily). It seems counterintuitive that people more sensitive to bitter tastes would drink more coffee than those with average sensitivity — after all, bitter-detecting taste buds likely developed as the body’s response to prevent poisoning. But some scientists think that human brains have learned to bypass this warning system in favor of caffeine’s energizing properties. The downside? Constant coffee consumers are at higher risk of developing caffeine addiction.
Fries without ketchup, pancakes without syrup — what would your favorite dishes be like without a little sauce? Condiments can make or break a meal; the word, after all, comes from the Latin condimentum, meaning “seasoning” or “spice.” Take a moment to appreciate all the taste bud sensations that sidekick sauces can provide with these eight facts.
The soy sauce you find at grocery stores today typically contains just four simple ingredients — soybeans, wheat, salt, and water — which are blended and fermented over several months or years to give the sauce its umami flavor. However, the oldest known types of soy sauce used meat in place of legumes. Called jiang, the flavoring was a thick and pasty blend of meat, a fermenting agent made from millet, and salt that fermented for about 100 days; it was ready when the meat had entirely dissolved. Food historians believe Chinese soy sauce makers eventually ditched using meat and switched to soybeans about 2,000 years ago.
Mayo Became Popular After a French and British Battle
The origins of mayonnaise are heavily debated among food historians, particularly regarding the issue of whether the creamy spread was invented by the Spanish or the French. One commonly told tale dates back to 1756 during the Seven Years’ War, when French forces laid siege to Minorca’s Port Mahon (then ruled by the British). After the battle, a French chef working for the invading forces reportedly blended egg and oil together in a celebratory meal, calling the finished product “mahonnaise” for the region. However, some researchers believe residents of Port Mahon had already been making and using mayonnaise (their version was called Salsa Mahonesa). Regardless of who created it, mayo became linked with French cooking by the early 19th century, and the multipurpose dressing reached American menus by the 1830s.
White House Staff Kept Ketchup on Hand for One President’s Breakfast
Among White House staff, Richard Nixon’s love of cottage cheese was well known. During his time in the Oval Office, the 37th President regularly enjoyed a breakfast of fruit, wheat germ, coffee, and cottage cheese topped with ketchup. (His last meal in office nixed the condiment, but did include a tall glass of milk and cottage cheese atop pineapple slices.)
There’s one condiment you’ll have a good chance of finding in pantries across the country: peanut butter. In 2023, 90% of U.S. households included the smooth and creamy spread on their grocery lists. Americans consumed an average of 4.4 pounds of peanut butter per person in 2023, a culinary craving that first became popular during World War I, when peanut butter was an inexpensive and easily accessible protein during wartime rationing.
Making pure maple syrup is a time-intensive process that starts inside “sugarbushes,” aka groves of maple trees. Syrup farmers can wait up to 40 years for a maple tree to grow large enough to be tapped, and even then, trees typically produce just 10 gallons of sap per tap hole per season. Because sap boils down at roughly a 40-to-1 ratio, that’s enough to make only about 1 quart of maple syrup.
Not many foods are the stars of an opera performance, though one kind of hot sauce is. Boston composer George Whitefield Chadwick debuted Tabasco: A Burlesque Opera in 1894. It tells the story of an Irish traveler lost at sea who washes ashore in Morocco and works as a chef, and who creates spicy dishes (his secret ingredient: Tabasco). Chadwick’s opera was partially financed by the McIlhenny Company — the maker of Tabasco. In its first week, it turned a profit of $26,000.
Credit: Fotosearch/ Archive Photos via Getty Images
Ernest Hemingway’s Burger Recipe Used Tons of Condiments
One of Ernest Hemingway’s lesser-known creations wasn’t a novel, but a hamburger. His recipe included a smattering of condiments inside the mixture rather than on top. The author’s technique called for wine, garlic, and sometimes ground almonds, but also several different spice blends and relishes. His recommendation for getting the meat perfectly ready for the grill? Let it “sit out of the icebox for 10 to 15 minutes while you set the table and make the salad.”
Historians Have Recreated a 2,000-Year-Old Condiment
You can get a taste of how ancient Romans and Greeks once ate with a little dash of garum, a fish sauce that was popular about 2,000 years ago. Historians relied on surviving recipes, which included steps like leaving fish to break down in open containers for three months. But it wasn’t until clay pots from a garum-making shop in Pompeii were unearthed that researchers found physical traces of the sauce, which could be analyzed for additional ingredients such as dill, fennel, and coriander that helped the salty, umami-flavored sauce shine.
Nicole Garner Meeker
Writer
Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.
It’s hard to resist the call of break room doughnuts, or the allure of a leisurely stroll down the grocery store snack food aisle. While it may not have much nutritional value, sometimes a good helping of a crispy, crunchy, sweet, salty, or sour snack is just what we need to remedy a bad mood or a tough day. Grab a snack and unwrap these eight facts about popular junk foods.
Ever wondered why doughnuts have holes? Historians aren’t certain why (or when) the doughy centers disappeared, but one theory suggests it may have been to help the pastries cook more evenly. According to food lore, American sailor Hansen Gregory created the doughnut’s modern shape around 1847 while at sea; by his account, doughnuts of the time were twisted or diamond-shaped and often cooked faster on the outsides than in the centers. Removing the dense middles helped create uniformly cooked treats that fried quickly and didn’t absorb as much oil.
The spongy, cream-filled cakes we call Twinkies were first created in 1930 in an attempt to put unused bakery pans back into production. Creator James Dewar was a manager at the Continental Baking Company outside Chicago, where he noticed the factory’s strawberry shortcake-making equipment sat idle once strawberry season ended. Dewar used the pans to bake small cakes injected with cream fillings, naming his invention Twinkies after seeing a billboard for Twinkle Toe Shoes.
7-Eleven and Coca-Cola Teamed Up To Create the Big Gulp
Supersized drinks are just one of the junk food finds you can pick up at 7-Eleven, thanks to a partnership between the convenience store chain and soda manufacturer Coca-Cola. Representatives from the beverage brand approached 7-Eleven leadership in 1976 about upgrading cup sizes from 20 ounces to 32 ounces; after successful market testing at locations in Southern California, 7-Eleven rolled out its larger cups nationally in 1980. However, supersizing drinks didn’t stop there — 7-Eleven rolled out its 44-ounce Super Gulp six years later, and launched the 64-ounce Double Gulp in 1989.
Potato Chips Were Nearly Discontinued During World War II
In the midst of World War II, the U.S. War Production Board was tasked with making the most of limited materials for the war effort, pausing manufacturing of noncritical foods and items. One of the items on the chopping block: potato chips. The snack was initially considered “nonessential,” a move that would stop factories from producing potato chips until the war ended. However, chip manufacturers lobbied to rescind the ruling and even secured contracts to produce chips for troops overseas and workers in manufacturing plants. One such company — Albany, New York’s Blue Ribbon potato chip brand — chipped in about 7 million pounds of crisps to the war effort in just nine months.
Spoon straws make it easier to gulp down frosty drinks, but usually get little thought once the Slurpee is done. That’s exactly why the Museum of Modern Art keeps one in its collection. In 2004, a single spoon straw was featured as part of the museum’s “Humble Masterpieces” exhibit, which highlighted around 120 simple, everyday items. The spoon straw’s inventor — engineer Arthur Aykanian — held more than 40 patents, some straw-related, and others not so much, like medical tools used in skin cancer treatment.
The first toaster pastries — called Country Squares — hit grocery store shelves in 1963, created by Post Cereals. Kellogg Company released its own version six months later, called the Fruit Scone. After further workshopping, Kellogg changed the name to Pop-Tarts (a play on the Pop Art movement of the 1950s and ’60s), and produced the treats in four flavors: strawberry, blueberry, apple-currant, and brown sugar cinnamon. However, the iconic hard icing didn’t top the toaster treats until 1967, four years after the snacks debuted.
There’s a reason ordering your favorite fast-food snack or indulging in a candy bar feels good. It all has to do with dopamine, the “feel-good” hormone inside our brains that influences our moods, behaviors, and motivation. Eating junk food — especially items packed with sugar — triggers the brain’s reward system to release large amounts of dopamine, which makes us feel happy. Over time, our brains adapt to these dopamine rushes, blunting the response so that it takes larger amounts of junk food to produce the same satisfying feeling, which in turn fuels cravings.
You Can Celebrate National Junk Food Day Each Summer
Hot dogs aren’t the only food with their own day of celebration in July — the ever-expanding junk food category is honored in the same month. National Junk Food Day lands on July 21, giving you the chance to celebrate by indulging in all your favorite sweet and savory treats.
Olive oil is a vital crop in many countries around the world, and it has so many uses — some you might not even be aware of. You may like to pour it on your pastas and salads, but do you know where it originally comes from, or what “extra-virgin” really means? Here are eight rich facts about delicious, nutritious olive oil for you to soak in.
Olive Oil Has Been Produced by Humans for Millennia
Although it’s not definitively known which culture first began pressing olives for culinary uses, the earliest historical evidence of olive oil being produced by humans is a clay pot relic found near Galilee, Israel, that bore olive oil residue. The pot was made between 7,000 and 7,600 years ago, and it’s thought that the Neolithic people in this area were only just learning how to make clay pots at the time. The oldest known olive oil press was also found in this region, near the modern-day city of Haifa, Israel; the press is slightly younger than the oil pot, at only 6,500 years old.
Extra-Virgin Olive Oil Is Defined as the Pure, Unprocessed, Unrefined Oil of an Olive
Although it’s well known that extra-virgin olive oil is the highest grade of olive oil, many people don’t know how its method of production differs from that of other kinds of olive oil. In order for olive oil to be classed as extra-virgin, it must be made by grinding olives and then cold-pressing them to extract their oil, without the use of heat or chemical solvents. Olive oil is also required to have no more than 0.8% acidity in order to qualify as EVOO, per the European Commission, as well as a median of zero defects. As a result of these stipulations, extra-virgin olive oil is lighter in color and flavor than lower grades of olive oil and has a fruity, slightly peppery flavor and odor.
Olive oils with a high polyphenol content (that is, extra-virgin olive oils) have been shown to prevent or delay the growth of pathogenic bacteria and microfungi, and the oil is therefore considered antimicrobial. This phenomenon is specifically attributed to oleuropein, tyrosol, and hydroxytyrosol, the major phenolic compounds found in olives. These antimicrobial properties were even known by ancient people, as evidenced by the biblical passage Luke 10:33-34, wherein a Samaritan pours olive oil onto a bleeding man’s wounds.
Many Religions Use Olive Oil in Sacred Rituals and Practices
Olive oil is overwhelmingly the oil of choice used by churches and synagogues across the world. The Christian Orthodox, Anglican, and Roman Catholic churches all use olive oil to bless those preparing to be baptized, as well as to anoint the sick — as do the Church of Jesus Christ of Latter-day Saints and others. Catholic bishops use olive oil mixed with balsam (called “chrism”) in the sacrament of confirmation, in the rites of baptism, and other rites. Eastern Orthodox churches use olive oil in lamps for vigils. Under Jewish Halakhic law, olives are one of the seven foods that require the recitation of me’eyn shalosh after they are consumed, and olive oil is the preferred oil used to light Shabbat candles. As for Islam, olives are mentioned in the Quran as a “precious fruit.”
There’s a Literary Prize for Extra-Virgin Olive Oil-Themed Writing
Sponsored by the Pandolea Association, a Rome-based group of women olive oil producers, the Ranieri Filo della Torre International Award is a yearly literary prize awarded to poets and authors who are moved to write about extra-virgin olive oil. Named for a late journalist and member of the National Academy of the Olive and Olive Oil who co-authored several publications on olive cultivation, the award honors poetry, fiction, and nonfiction about extra-virgin olive oil. Writers from all nations are invited to apply, but entries must be about extra-virgin olive oil specifically.
Apples serve all sorts of useful purposes: They’re great for a snack, fun to pick, perfect for giving to a teacher, and ideal as a subject for a still life. They also offer an entire smorgasbord of interesting facts for those who enjoy nourishing their minds along with their bodies. Here are eight bite-sized tidbits to chew on regarding this wondrous fruit.
Modern Apples Are Descended From a Single Wild Ancestor
All varieties of the domestic apples we know and cherish stem from a single wild ancestor, Malus sieversii. The apple was originally found in the foothills of the Tien Shan mountains of Central Asia, and its seeds may have spread from its native region via birds and bears. Sometime after the fruit’s domestication more than 4,000 years ago, apples made their way to Europe and beyond by way of pre-Silk Road trading routes. In the early 20th century, Russian biologist Nikolai Vavilov traced the modern apple’s origins to the forests outside Almaty, Kazakhstan. Today, the town still celebrates its status as the birthplace of this botanic marvel.
The U.S. Ranks Second to China in Apple Production
Apples consistently rank among Americans’ favorite fruits, which explains the impressive productivity of the U.S. apple industry. According to the USApple association, there are more than 26,000 apple growers farming 382,000 acres across all 50 states, who combine to produce 11.1 billion pounds of apples per year. However, even that output pales in comparison to the prodigious amount of apples reaped in China, which produced more than 47 million metric tons of the fruit in 2022.
Galas Are the Top-Selling Apples in the United States
One of the great things about apples is the variety of tasting experiences offered by the numerous types available at your neighborhood store. Honeycrisps are crunchy and sweet. Granny Smiths deliver a burst of tartness. Cortlands are great for baking. So which is the favorite of the American public? By sales, at least, that honor goes to Galas, which made up 19% of the U.S. apple market in 2021.
It Takes at Least Four Years for a Tree to Produce Apples
A standard (non-dwarf) apple tree will normally take at least four years to begin bearing fruit, and can continue to do so for another three decades. That covers nearly the entire life span of many trees, which generally live up to about 50 years of age, although some can survive for 100 years or longer. In 2020, an apple tree believed to be the oldest in the Pacific Northwest died at the ripe old age of 194.
It may take more than an apple a day to keep you out of the doctor’s office, but these fruits are loaded with nutritional benefits. Along with high levels of vitamin C and fiber, apples are packed with disease-fighting antioxidants such as quercetin. Research has shown that a steady diet of apples can reduce the chances of heart disease, diabetes, and certain types of cancers, while supporting weight loss and gut health. Of course, these benefits are best realized by eating raw, unpeeled apples, as opposed to gorging on sugar-filled ciders and pies.
Speaking of apple pie, this quintessentially American dish actually hails from England, with one of the first known recipes appearing in the late-14th-century manuscript The Forme of Cury. Arriving in the New World with European settlers, the dessert was well known within the borders of the nascent United States by the late 1700s, as evidenced by the presence of two recipes in the 1796 cookbook American Cookery. By the mid-1900s, the combination of advertising and war-fueled patriotism had embedded the “American as apple pie” concept in popular culture.
The Heaviest Apple Ever Weighed More Than 4 Pounds
According to the Guinness World Record scorekeepers, the heaviest known apple was a 4-pound, 1-ounce behemoth picked at a Hirosaki City, Japan, farm in 2005. The prize specimen was a Hokuto, a cross between the Fuji and Mutsu varieties.
Apples Are Best Stored Separately From Other Produce in the Refrigerator
Those who return from the store or orchard with a hefty quantity of apples should consider stuffing as many as possible in the fridge; although apples will last about a week when left out on the counter, they can remain suitably edible for up to two months when refrigerated. But regardless of where you leave them, apples should be stored separately from other fruits and vegetables, as they release a gas called ethylene that speeds up the ripening and spoiling of nearby produce.
Tim Ott
Writer
Tim Ott has written for sites including Biography.com, History.com, and MLB.com, and is known to delude himself into thinking he can craft a marketable screenplay.
With over 99 billion burgers served at the chain, chances are that most people have set foot in a McDonald’s at one point or another. What began as a small California hot dog stand during the Great Depression has since blossomed into an international operation, with more than 36,000 locations in over 100 countries. Throughout the decades, McDonald’s has amassed a rich history filled with facts that tantalize the brain, much as its burgers tantalize the taste buds.
Since the creation of the Happy Meal in 1979, McDonald’s has leapfrogged industry giants such as Hasbro and Mattel to become the world’s largest toy distributor. Early Happy Meal toys included stencils and spinning tops, though the trinkets were later designed as part of advertising campaigns to promote family movies, like 1989’s The Little Mermaid. All told, McDonald’s distributes 1.5 billion toys worldwide each year. As part of a recent effort to be more environmentally conscious, the company has pledged to largely phase out plastic toys in Happy Meals, and vowed to work to provide kids with plant-based or recycled toys instead.
One McDonald’s in Arizona Features Turquoise Arches
Golden arches may be synonymous with McDonald’s, but they’re nowhere to be found at one location in Sedona, Arizona. Due to a local law that prevents buildings from infringing on the region’s natural beauty, this McDonald’s instead features turquoise arches. City officials determined the gold would clash with the surrounding red rocks, whereas the turquoise was a more appropriate hue. Other unique color schemes at McDonald’s around the world include white arches at Paris and Belgian locations, as well as a big red “M” in place of the traditional yellow at one Rocklin, California, joint.
The men who founded the chain that would become McDonald’s — Dick and Mac McDonald — opened the fast food giant as a modest California hot dog stand in 1937. They would later pivot to a different food: On May 15, 1940, they opened McDonald’s Bar-B-Que in San Bernardino. The foray into BBQ was somewhat short-lived, however, because by October 1948 the brothers had realized that most of their profits came from selling burgers. The pair decided to establish a simple menu featuring hamburgers, potato chips, and orange juice, and added french fries and Coke a year later. In 1954, the brothers licensed the franchise to Ray Kroc, who transformed McDonald’s into the chain we know today.
No, it’s not your imagination, Coke actually does taste different — and many would say better — at McDonald’s restaurants. This is largely due to the way it’s packaged. While the actual flavoring is identical to other restaurants, McDonald’s gets its Coke syrup delivered in stainless steel tanks instead of the more common plastic bags, which in turn keeps the syrup fresher. McDonald’s also filters its water prior to adding it to the soda machines, and calibrates its syrup-to-water ratio to account for melting ice. In addition, McDonald’s utilizes wider straws than normal, allowing more Coke to “hit your taste buds,” according to the company.
While she was never actually there whipping up McFlurries, Britain’s former monarch technically owned a branch located in Oxfordshire, England, on land that is part of the Crown Estate, the property portfolio held by the reigning monarch. The queen used to own a second location in Slough, but sold the land in 2016. The location is truly fit for royalty, with leather couches and table service, plus a menu that includes English breakfast tea and porridge. This is not the only royal association with McDonald’s: Princess Diana used to frequently take her sons William and Harry to McDonald’s, despite the fact that they had access to a personal chef.
A McDonald’s Superfan Has Eaten Over 34,000 Big Macs
In a tradition that began over 50 years ago, on May 17, 1972, Wisconsin’s Don Gorske has consumed upwards of 34,000 Big Macs — and counting. While Gorske originally ate, on average, a whopping (no Burger King pun intended) nine Big Macs per day, he has since scaled back to about two a day. Gorske claims that in those 50 years, he has missed eating a Big Mac on only eight days. The previous record for Big Macs eaten in one lifetime was 15,490, a number that Gorske smashed back in 1999 and has been dwarfing ever since.
McDonald’s Sells 75 Burgers Per Second
According to its own training manual, McDonald’s locations combined sell more than 75 hamburgers per second. The average hamburger is cooked and assembled in 112 seconds, whereas a Quarter Pounder takes a lengthier but still lightning-quick 180 seconds to prepare (assuming there are no other orders being worked on at the same time). McDonald’s produces and sells so many burgers that it had already sold its 100 millionth by 1958. In 1963, it sold its billionth burger live on TV during an episode of Art Linkletter’s variety show. The chain officially stopped keeping track in 1993, when it updated its signs to say “Over 99 Billion Served.”
Interesting Facts
Editorial
Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.
Interesting Facts is part of Optimism, which publishes content that uplifts, informs, and inspires.
There’s no wrong way to eat a banana — in a smoothie, underneath a mountain of ice cream, or even green (according to a 2019 poll, 5% of Americans prefer bananas in that unripened state). This grocery store staple is one that humans have been eating for at least 6,000 years, with no sign of slowing anytime soon; on average, people around the globe eat 130 bananas per year. These eight facts highlight a few things you may not know about one of the planet’s most beloved fruits.
Bananas made their U.S. debut in Philadelphia in 1876, sold to fairgoers attending the Centennial Exhibition (the first world’s fair held in America). For 10 cents, visitors could purchase a foil-wrapped banana and get a taste of a fruit many had never seen before. Today, bananas are one of the most popular fruits among American snackers, who consume an average of 13.2 pounds per person each year.
Transporting bruisable, temperature-sensitive bananas by boat was no easy feat hundreds of years ago, which could be how sailors became wary of bringing the fruit aboard. Many fishermen and sailors believed that having bananas on a ship invited bad luck, leading to accidents, broken equipment, and a reduction in the number of fish caught. While the origin of the superstition is unclear, some believe it could have started after crew members got sick from eating spoiled bananas or skidded on the slippery peels.
While banana trees can reach upwards of 40 feet tall, these towering plants technically aren’t trees — they’re instead considered giant herbs. Botanists designate trees as having woody stems that contain lignin, a substance that makes them sturdier. Banana plants are instead large, herbaceous stalks made from cellulose, a material that decomposes much faster — a necessity considering that after the fruiting process, the large trunks die back and fall over to make way for new growth.
The way scientists classify berries doesn’t always jibe with how fruit eaters categorize them. That’s certainly the case for bananas, which are botanically berries. To be considered a true berry, a fruit must develop from a flower that contains an ovary; bananas form from nearly foot-long flowers that meet this criterion. Botanists also require fruit to have three layers: an outer skin (the exocarp), a fleshy layer (called a mesocarp), and an interior that holds two or more seeds (the endocarp). While commercially grown bananas don’t have seeds, their wild counterparts do.
Don’t worry — you don’t need a Geiger counter to pick out a bunch of bananas at the supermarket. The potassium in bananas contains trace amounts of radioactive atoms, though because our bodies regularly flush the nutrient out, it’s unable to build up to dangerous levels in our system. Bananas aren’t the only radioactive food: spinach, potatoes, and oranges are, too.
Researchers experimenting with ways to remove heavy metals from water have found that banana peels can get the job done. While natural materials like coconut fibers and peanut shells have been used successfully, a 2011 study found that minced banana peels removed lead and copper from water as quickly as those materials, or faster. The slippery peels can be reused up to 11 times before they lose their toxin-attracting properties.
There are more than 1,000 species in the banana family, though it’s rare to see more than one kind at the grocery store. More than 55 million tons of Cavendish bananas are harvested each year, making them the most widely grown and consumed species. Cavendish bananas get their name from William Spencer Cavendish, Britain’s sixth Duke of Devonshire, whose estate was home to numerous exotic plants. The duke’s eponymous banana stalks would eventually play a huge role in worldwide banana production — all modern Cavendish banana plants are descendants of those grown at the U.K. estate in the 1830s.
Commercially grown Cavendish banana plants aren’t nurtured from seed, but cloned from existing plants. Farming this way means the species lacks genetic diversity, making it vulnerable to pests and diseases that other species might have evolved to fight off. The main threat concerning commercial bananas is Fusarium wilt, aka “Panama disease,” a soil-borne fungus that is fatal to some species — such as Gros Michel, which was the world’s previously preferred banana. The fungus wiped out most Gros Michel crops in the 1950s, after which farmers switched to the Cavendish species. Modern botanists worry that Panama disease could strike again, destroying the Cavendish species that was once (incorrectly) believed to be immune to the fungus. But with genetic crop engineering and other species to choose from, it’s unlikely bananas will disappear altogether.
Nicole Garner Meeker
Writer
Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.
In America, the act of tipping — chipping in a little extra on top of a bill in recognition of service workers — is both customary and controversial. How much is polite to tip, and how did tipping even come to be in the first place? Is it just an American thing? These eight facts about tipping help answer some questions about when to tip, how tipping evolved, and where you might not want to tip at all.
There’s a persistent rumor that the word “tip,” when used to refer to a gratuity, is an acronym for “To Insure Promptness,” “To Insure Performance,” or “To Insure Prompt Service.” This is false. Around the 1700s, “tip” was underworld jargon among petty criminals as a verb meaning “give” or “share.” It’s been in the mainstream, both as a verb and a noun, since the 18th century.
The U.S. Minimum Wage Is Different for Tipped Workers
The federal minimum wage for most people in the United States is $7.25 per hour as of 2023, but for tipped workers it’s just $2.13. Those tipped workers need to get paid the equivalent of $7.25 an hour once tips are tallied, and their employer needs to make up the difference, but for staff at some restaurants, a tip isn’t always a bonus — up to a certain point, it’s just supplementing staff wages. (Many states and cities have higher minimum wages.)
As it exists now, tipping is a very American phenomenon, but it was customary in Europe for hundreds of years before wealthy Americans imported it back home. It dates back to the feudal system in the Middle Ages, when servants would perform duties for their wealthy masters and receive a paltry sum in exchange. Eventually, this evolved into gratuities for service industry workers from their customers. Wealthy Americans who traveled to Europe in the 19th century brought back the practice, just as a wave of poor European immigrants, used to working under the European tip system, were arriving. The idea got major pushback at the time as “un-American,” but tipping picked up after the Civil War because …
When formerly enslaved people entered the U.S. workforce after the Civil War, many had few employment prospects — so restaurants would “hire” them, but force them to rely on tips for payment instead of a reliable wage. Six states then abolished tipping in an attempt to force employers to pay their employees, but those bans were eventually overturned. When the Fair Labor Standards Act established the minimum wage in 1938, tipping was codified as a way to earn those wages.
In America, tipping is customary not only to say thank you for good service but also to help service workers make ends meet — even in states without tipped wages — so not tipping or undertipping is considered an insult in many industries. Yet at most businesses in Japan, tipping is embarrassing at best and insulting at worst. Many restaurants do implement service charges, though, so you should still prepare yourself for something on top of face value when the bill comes.
When you’re staying at a hotel — at least, an American hotel — it’s customary to tip the housekeeping staff. Don’t worry, the etiquette isn’t to leave 20% of your hotel stay. Experts recommend a minimum of anywhere from $1 to $5 per day, and to leave a bigger tip if you’ve been messy, you made special requests, or if there’s a pandemic on.
65% of Americans Always Tip at Sit-Down Restaurants
According to a poll by Bankrate, slightly less than two-thirds of all Americans always tip at sit-down restaurants. Furthermore, 42% say they always tip 20% or more. This doesn’t include takeout or coffee shops — Americans are much less likely to tip consistently there, at only 13% and 22%, respectively.
Americans are far less likely to tip when they get takeout vs. when they sit down to eat because they’re not getting table service, but restaurant staff still work to prepare your food and package it up. You may not be expected to tip as much, but a little something is still warranted — after all, the kitchen staff perform essentially the same job whether you sit down or take it to go.
Sarah Anne Lloyd
Writer
Sarah Anne Lloyd is a freelance writer whose work covers a bit of everything, including politics, design, the environment, and yoga. Her work has appeared in Curbed, the Seattle Times, the Stranger, the Verge, and others.