Original photo by nmedia/ Shutterstock

From the shimmering green of leaves on a forest tree to the deep azure of the oceans, vibrant color fills our world — but it’s more than just pretty to look at. Often there are fascinating stories behind the colors that exist both in nature and in human artifacts, whether they’re school bus yellow, Golden Gate Bridge orange, or the blue-green of surgical scrubs. Below, we’ve rounded up some of our most dazzling stories about color from around the rainbow (and beyond).

Close-up of cut up fresh oranges.
Credit: Sheraz Shaikh/ Unsplash

The Color Orange Is Named After the Fruit

The word “orange” refers to both a citrus fruit and the color of said fruit, so which usage came first? The color isn’t exclusive to the fruit, of course, but the term did come from it. The fruit’s name entered English first, by the 15th century, derived from pomme d’orenge, the Old French term for the citrus.

“Orange” started appearing in written English works as a color around the 16th century. Before that, English speakers simply described the hue as “yellow-red.” Renowned 14th-century author Geoffrey Chaucer didn’t even have a single word to describe a fox in his famous work The Canterbury Tales: “His colour was bitwixe yelow and reed.”

DSLR camera with green screen on a tripod.
Credit: kurgenc/ Shutterstock

Hollywood Uses Green Screens Because of Human Skin Tones

If you’ve seen any big-budget Hollywood film, it probably used some variety of green screen-enabled special effects. In fact, some version of green screen technology, also known as “chroma keying,” has been around since the early days of film. The reason why screens are green is actually pretty simple — human skin is not green. When a camera analyzes chrominance, or color information, it can easily separate green (or blue) from the rest of the shot so that a video backdrop can be inserted.
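The separation step can be sketched in a few lines of Python. This is a toy illustration, not any studio’s actual pipeline: pixels are plain (r, g, b) tuples, and the margin threshold is an arbitrary assumption.

```python
# Toy chroma-key sketch: a pixel counts as "screen" when its green channel
# clearly dominates red and blue, and is swapped for the backdrop pixel.

def chroma_key(foreground, backdrop, margin=50):
    """Composite two same-sized lists of (r, g, b) pixels."""
    result = []
    for (r, g, b), bg_pixel in zip(foreground, backdrop):
        if g > r + margin and g > b + margin:  # green dominates: key it out
            result.append(bg_pixel)
        else:
            result.append((r, g, b))
    return result

actor = [(200, 160, 140), (20, 230, 30)]  # a skin-tone pixel, a screen pixel
space = [(5, 5, 40), (5, 5, 40)]          # dark-blue starfield backdrop
print(chroma_key(actor, space))           # skin tone kept, green replaced
```

Because skin tones never trip the green-dominates test, actors survive the key while the uniform screen behind them vanishes, which is exactly why the screen is green (or blue) in the first place.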

However, the technology isn’t foolproof, as green clothes can blend in with backgrounds. (That’s why meteorologists don’t wear green on St. Patrick’s Day.) Because of this deficiency, among other reasons, some productions are shifting to high-tech LED panels to recreate otherworldly locations.

A baby lying on a red blanket.
Credit: Master the moment/ Shutterstock

The First Color Humans Can Perceive Is Red

After only a few weeks of life, babies begin to distinguish their first color (after white and black): red. Humans perceive color thanks to three types of cones found in our eyes, each tuned to short (blue), medium (green), and long (red) wavelengths. Although cones respond to color from birth, it takes time for the human brain to make sense of those inputs. Because an infant’s vision is blurry during the first few weeks of life, red, with its long wavelength, is the only color the retina can register clearly when an object is held around 12 inches from a baby’s face. At around 2 months old, babies can begin to distinguish between reds and greens, followed closely by yellows and blues a few weeks later.

Abstract colorful background.
Credit: Gegham Davtyan/ Shutterstock

Some People Can See Nearly 100 Million Colors

Researchers estimate that some 300 million people around the world are colorblind, most of them male. On the opposite end of the spectrum are those with an exceedingly rare genetic condition that allows them to see nearly 100 million colors — or 100 times as many as the rest of us. It’s called tetrachromacy, or “super vision,” and it’s the result of having four types of cone cells in the retina rather than the usual three. Because of the way the condition is passed down via the X chromosome, the mutation occurs exclusively in women.

One tetrachromat describes her ability this way: “If you and I look at a leaf, I may see magenta running around the outside of the leaf or turquoise in certain parts where you would just see dark green. Where the light is making shadows on the walls, I’m seeing violets and lavenders and turquoise. You’re just seeing gray.” In short, tetrachromats see colors within colors, and even the tiniest change in the color balance of a particular hue will be apparent to them. It’s estimated that 12% of women have a fourth type of retinal cone, but only a fraction of them experience tetrachromacy. In total, only about 1% of humans have the condition.
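The “100 times as many” figure falls out of simple combinatorics. If each cone type can distinguish on the order of 100 intensity levels (a rough, commonly cited approximation, not a measured constant), the number of discriminable colors is about 100 raised to the number of cone types:

```python
shades_per_cone = 100  # rough approximation of distinguishable levels per cone type

trichromat = shades_per_cone ** 3    # three cone types: ~1 million colors
tetrachromat = shades_per_cone ** 4  # four cone types: ~100 million colors

print(f"{trichromat:,} vs. {tetrachromat:,}")  # 1,000,000 vs. 100,000,000
print(tetrachromat // trichromat)              # 100
```

Each extra cone type multiplies the palette, which is also why dichromatic animals (like the bulls discussed later) see far fewer colors than we do.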

Black paint and brush on a wood background.
Credit: Ilya Karnaukhov/ Shutterstock

Black Isn’t Technically a Color

Picture a rainbow, which comprises the visible spectrum of light, and you’ll notice that black isn’t in it. Scientifically speaking, black is the absence of light, and because light is required for color, black contains no color. (Black’s opposite, white, is the combination of all colors of the visible spectrum.)

However, people usually think of black as a color in an artistic sense: as a pigment that absorbs visible light and reflects almost none, approximating the absence of light. Thus, the “black” we see is really a reflection of a mix of very dark colors. Here’s another mind-bending fact: Nothing in nature can be pure, absence-of-light black except the inner reaches of a black hole (although researchers have come close with Vantablack and other materials).

Close-up of a woman with blue eyes.
Credit: Ashwin Vaswani/ Unsplash

Blue Eyes Aren’t Actually Blue

Until somewhere between 6,000 and 10,000 years ago, all humans had brown eyes; then a single genetic mutation caused one human to be born without the usual brown-black melanin pigment that colors irises brown. Irises without this pigment experience what’s known as the Tyndall effect. Because of blue’s short wavelength, that part of the spectrum is scattered most strongly by the fibers in the iris, causing eyes to take on a bluish color even though no blue pigment is present. Today about 10% of the world’s population has blue eyes, though that number is skewed heavily by northern Europeans. In Finland and Estonia, for example, 89% of people have blue eyes — the highest percentage in the world. The U.S. comes in much lower, at around 27%.

Beams of light refracting and creating a rainbow spectrum.
Credit: Andrew E Gardner/ Shutterstock

Human Eyes Are Most Sensitive to the Green Wavelength of Light

Electromagnetic radiation comes in a variety of types, including radio waves, gamma rays, and visible light. The human eye can perceive wavelengths from around 380 to 740 nanometers (nm), also known as the visible light range. The wavelength determines the color we see: For example, at 400 nm our eyes perceive the color violet (hence the name “ultraviolet” for wavelengths just below that), whereas at 700 nm our eyes glimpse red (but can’t see the “infrared” wavelengths just beyond it).

In the middle of this spectrum of visible light is the color green, which occupies the range from 520 to 565 nm and peaks at 555 nm. Because this is right in the middle of our visual range, our eyes are particularly sensitive to the color under normal lighting conditions, which means we can more readily differentiate among different shades of green. Scientists have also found that the color green positively affects our mood in part because our visual system doesn’t strain to perceive the color — which allows our nervous system to relax.

View of a school bus coming up a mountain.
Credit: Denisse Leon/ Unsplash

There’s a Good Reason School Buses Are Yellow

Glimpse a fleet of buses parked at any U.S. public school, and you’ll notice they’re all the same deep yellow — and it’s been that way for nearly a century. In an effort to standardize school bus construction around the country, thus ideally making them both safer and cheaper to mass-produce, school transportation officials met at Columbia University in 1939 to discuss a universal color for these vehicles. Fifty shades were hung up on the walls, ranging from lemon to deep orange. The color that was finally selected — known today as National School Bus Glossy Yellow, or Color 13432 — was chosen because of its ability to stand out from the background. Education officials didn’t know it at the time, but Color 13432 is wired to capture our attention, as the shade stimulates two of the three types of cones in the human eye — sending double the transmission to the brain compared to many other colors. That’s one reason a big yellow school bus is just so hard to miss.

Protruding veins on female hands close-up.
Credit: Mariia Kurlova/ Shutterstock

Your Blue Veins Are Actually an Illusion

Look at your arm, and you’ll see blue veins crisscrossing just beneath the skin. That’s an optical illusion. Human veins are not blue; the vessels themselves are essentially colorless. While deoxygenated blood is a darker hue of red (which you’ve likely seen if you’ve ever donated blood or had blood drawn), the blue appearance comes from the way your skin scatters light before it reaches the vessels, so we perceive the veins beneath it as blue. The color perception of veins can also change depending on skin tone, as darker skin will lend veins more of a greenish cast. While blue blood doesn’t occur naturally in humans, it is found in animals such as the horseshoe crab (whose blue blood has saved countless human lives). Horseshoe crab blood is blue because it carries oxygen with a copper-based pigment rather than the iron-based hemoglobin found in ours.

View of the Golden Gate Bridge.
Credit: Patrick Tomasso/ Shutterstock

The Golden Gate Bridge’s Orange Color Wasn’t Planned

San Francisco’s Golden Gate Bridge, completed in 1937, has a bright earthy tone dubbed “international orange” — but when construction began in 1933, it was on track to be a boring, standard bridge color like black or silver (although the Navy also suggested yellow and black stripes so that it would be highly visible for ships). Consulting architect Irving Morrow noticed that some of the beams were primed in a reddish-orange color, and made it his personal mission to bring a similar shade to the finished product.

The warm color, he argued, was uniquely suited to San Francisco. It would stand out even on foggy days, and when the sun was out, the hue would pop against the blue sky and water. Such a distinct look would highlight the massive scale and stunning architecture of the bridge.

Morrow made his case to the Department of War, the permitting agency for the bridge, in 1935, and successfully convinced them. Today, the color gets touched up in small segments, since repainting the whole bridge would be a massive undertaking.

Want to replicate the bridge’s tone in your own home? The exact mix is on the bridge’s website.

Cosmic Latte solid color.
Credit: Giffany/ Shutterstock

According to Astronomers at Johns Hopkins, the Color of the Universe Is “Cosmic Latte”

We tend to think of space as cold and dark, but that’s only because most stars are light-years away from the pale blue dot we call home. The universe is actually quite bright on the whole, and its color has been given an appropriately celestial name: “cosmic latte.” In 2002, astronomers at Johns Hopkins University determined the shade after studying the light emitted by 200,000 different galaxies. They held a contest to give the result — a kind of creamy beige — its evocative moniker. (Other entries in the contest included “univeige” and “skyvory.”)

As with just about everything in the universe, however, the color isn’t fixed: It’s become less blue and more red over the last 10 billion years, likely as a result of redder stars becoming more prevalent. In another 10 billion years, we may even need to rename the color entirely.

View of a beautiful blue sky.
Credit: Kumiko SHIMIZU/ Unsplash

The Sky Is Blue Because of the Color’s Wavelength

The sun sends its rays to Earth as white light, meaning they contain everything in the color spectrum (red, orange, yellow, green, blue, indigo, violet). But blue is unlike the other colors, because its specific wavelength (between 450 and 495 nm) is more frequently scattered by particles in the atmosphere in a process known as Rayleigh scattering. At midday, the sky is pale blue as the sun’s light travels through less of the atmosphere, but as the sun heads toward the horizon, the sky becomes a richer blue because light travels through more of the atmosphere (thus scattering more blue light).
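The atmosphere’s preference for blue can be made concrete with the Rayleigh scattering law, under which scattered intensity varies as 1/wavelength⁴. A quick sketch (the wavelengths are representative values, not exact band edges):

```python
# Rayleigh scattering: intensity scales as 1 / wavelength**4, so shorter
# (bluer) wavelengths scatter far more strongly than longer (redder) ones.

def relative_scatter(wavelength_nm, reference_nm=700):
    """Scattering strength of a wavelength relative to red (~700 nm)."""
    return (reference_nm / wavelength_nm) ** 4

print(round(relative_scatter(450), 1))  # blue vs. red: ~5.9x
print(round(relative_scatter(400), 1))  # violet vs. red: ~9.4x
```

Note that by this math, violet should scatter even more strongly than blue, a wrinkle the next paragraph takes up.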

However, this is only half of the answer, because indigo and violet have even shorter wavelengths than blue, which raises the question: Why isn’t the sky violet? Figuring out this conundrum means taking a closer look at the human eye. The cones inside the eye are coded to perceive red, green, and blue, and it’s the combinations of these inputs that determine variations of color. Because of the eye’s sensitivity to the color blue, the sky takes on that particular hue instead of violet. Other animals likely perceive the sky (and the rest of the world) in a different hue because most mammals have only two different types of cones.

Hand pointing an advertisement in Yellow Pages.
Credit: Michal Mrozek/ Shutterstock

Yellow Pages Were Created Because a Printer Ran Out of White Paper

One day in 1883, a printer in Cheyenne, Wyoming, was busy printing the latest edition of the local phone directory when he unexpectedly ran out of white paper. Unwilling to put off production until he could restock, he instead resorted to finishing the job with yellow paper, unknowingly creating an icon of the then-nascent information age. After subscribers commented on how easy these yellow pages were to find amid piles of white-hued publications, printer Reuben H. Donnelley produced the first official Yellow Pages phone book three years later. Using the color yellow for telephone business directories then became the norm around the world.

National flags of various countries flying in the wind.
Credit: artpage/ Shutterstock

Red Is the Most Common Color on National Flags

Purple is the least common color found on national flags (gracing only the banner of the Caribbean country Dominica), while red is the most common, found on a whopping 74% of flags, according to Guinness World Records. Around 50% of these flags use red to represent the blood of those who fought for the country, extolling the virtues of bravery and valor. This includes the U.S. flag, with the red standing for “hardiness and valor.” Meanwhile, when red is considered alongside vexillology’s second-favorite color, blue, only nine of the world’s 196 countries have flags that feature neither red nor blue.

A dressing table in a "green room" back stage.
Credit: Pixel-Shot/ Shutterstock

No One Is Sure Why the Backstage Room Is Called a “Green Room”

One early reference to a “green room” in the sense of a waiting room appears in The Diary of Samuel Pepys, the famed journal kept by a civil servant in 1660s London. Pepys mentions a “green room” when going to meet the royal family — likely a reference to the color of the walls. A “green room” was then tied to the theater in English playwright Thomas Shadwell’s 1678 comedy A True Widow, which includes the line: “Selfish, this Evening, in a green Room, behind the Scenes.” However, Shadwell doesn’t mention why it was called a green room. One notable London theater did have a dressing room covered in green fabric, but other theories behind the term reference actors going “green” because of nervousness, amateur or young (aka “green”) actors, or a place where early actors literally waited “on the green” lawns of outdoor theaters — among many other ideas. It’s possible we’ll never know the origin of the phrase for sure.

Baby accessories in blue and pink.
Credit: Inspiration GP/ Shutterstock

Pink Was Once Considered a Color for Baby Boys, While Blue Was for Baby Girls

Before pink and blue, there was white. For much of the 19th century, most infants and toddlers wore white dresses regardless of their biological sex. Dresses facilitated diaper-changing, after all, and white cotton could easily be cleaned with bleach. But around 1900, childcare experts began to push for a greater distinction between little girls and boys, amid fears that boys were growing up “weaker” and “lazier” than their fathers had. Many U.S. publications and stores responded in part by recommending pink clothing for boys and blue clothing for girls, although some also recommended the opposite color scheme. According to Dressmaker magazine, “Blue is reserved for girls as it is considered paler, and the more dainty of the two colors, and pink is thought to be stronger (akin to red).”

But around World War II, everything changed. Soon pink was heavily marketed as the preferred color for girls, and blue for boys. It’s not entirely clear what led to the switch, and the colors chosen were somewhat arbitrary — the focus was primarily on creating clothes specific for each child in an attempt to curb hand-me-downs, and thus sell more product. Once the 1950s began, hospitals wrapped newborns in pink or blue blankets, based on their sex (today’s standard blankets contain pink and blue stripes). All of this likely didn’t matter much to the babies themselves: Research has shown that children generally do not become conscious of their gender until age 3 or 4.

Purple paint cans aerial view.
Credit: Anna Mente/ Shutterstock

The Color Purple Technically Doesn’t Exist

Our eyes perceive color in the visible spectrum due to particular wavelengths, and violet is the shortest, at 380 nm. This is why the invisible wavelengths just below this threshold are known as ultraviolet, or UV rays (and why wavelengths directly above 700 nm are known as “infrared”).

The color purple, however, is what physicists call a “nonspectral color,” meaning it isn’t represented by a particular wavelength of light, but is instead a mixture of them as perceived by our brain. While some people use violet and purple interchangeably, the two colors are distinct; violet (which is part of the visible spectrum) has a more bluish hue, whereas purple is more red. The cones in our eyes receive inputs, and our brain uses ratios of these inputs to represent subtleties of color. Purple is therefore a complete construction of our brain, as no single wavelength represents the color naturally. But purple isn’t alone — the same can be said for other colors such as black and white, as well as hues mixed with white, black, or gray, such as pink and brown.

Colorful autumn trees landscape during the fall season.
Credit: logoboom/Shutterstock

The Sky Looks Bluer in the Fall

Although the sky is blue throughout the year, it’s often a richer blue in the fall and winter, especially in latitudes farther from the equator. Why? Well, the answer has to do with both electromagnetism and the biology of the human eye. As a refresher: When sunlight enters Earth’s atmosphere, gas and dust particles scatter the shorter wavelengths of visible light (such as blue) more than longer wavelengths (such as red). That — and the sensitivity of the human eye to the color blue — is why the sky appears as a cool sapphire.

However, as the seasons progress, one part of this equation changes: the sun’s position. As the sun sits lower in the sky during its annual journey back toward the equator (and, for the Northern Hemisphere, toward the Tropic of Capricorn), the shallower angle at which sunlight hits the atmosphere scatters even more blue light, while the red and green portions diminish. That causes the sky to turn an even richer blue. These blue skies are especially easy to see in much of North America, as cooler temperatures mean less moisture (and therefore fewer clouds), giving you an uninterrupted view of that deep azure atmosphere.

Medical professionals walking down a corridor together.
Credit: SolStock/ iStock

Here’s Why Surgical Scrubs Are Blue-Green

Walk into any hospital (or watch any medical drama), and surgeons are almost always wearing bluish-green scrubs. Because blue and green sit far from red on the color spectrum, these cooler colors help refresh a surgeon’s eyes when operating on a patient (whose insides are essentially various shades of red). Because surgeons are visually focused on red-hued environments, glancing at a white background (the chosen hospital color of times past) can leave a ghostly green afterimage, much like what your eyes experience after a camera flash. However, if the surrounding environment is green, those afterimages simply blend into the background.

Woman wearing an Elizabethan era gown.
Credit: Yuri_Arcurs/ iStock

In Elizabethan England, It Could Be Illegal To Wear Purple

From ancient times until as recently as the 19th century, the color purple was closely associated with royalty — often because they were the only class that could afford such luxury, which was extremely expensive to produce in the days when the color was still made from sea snails. Persian kings and Egyptian rulers wore the illustrious hue, and Julius Caesar similarly donned a purple toga, setting a 1,500-year-long trend for subsequent emperors in Rome and Byzantium.

The color was so intimately tied to the ruling class that the children of kings, queens, and emperors were said to be “born to the purple.” By the 16th century, however, things slowly began to change, as a wealthy merchant class began snatching up purple-dyed garments of their own. In 1577, fearing that such lavish spending on “unnecessary foreign wares” could bankrupt the kingdom, Queen Elizabeth I passed sumptuary laws that essentially outlined a strict dress code based on class. Of course, the color purple (and crimson) was reserved for Her Majesty and her extended royal family, “upon payne to forfett the seid apparel.”

Painting of walls in a white color.
Credit: 22 TREE HOUSE/ Shutterstock

Scientists Recently Created the World’s Whitest Paint

In April 2021, scientists from Purdue University revealed a new shade of white paint. At first glance, it may look like any other plain white hue found at the local paint store. But unlike those other pigments, Purdue’s white paint reflects 98.1% of the sun’s rays. (Most white paints, by contrast, reflect only about 80% to 90%.)

According to Guinness World Records, that reflective ability makes the paint the whitest white that’s ever been created. And what Purdue’s hue lacks in chromatic sophistication, it more than makes up for in utility. According to The New York Times, if 1% to 2% of the world’s surface (about half the size of the Sahara) could be coated with this ultra-white paint, “the planet would no longer absorb more heat than it was emitting.” Although painting half the Sahara is not in the cards, painting the many, many rooftops that dot the world could help fight our current planetary fever while also cutting A/C costs. At midday, for example, the new paint makes surfaces 8 degrees Fahrenheit cooler than the surrounding ambient air temperature.

Rare hummingbirds from Costa Rica.
Credit: Vaclav Sebek/ Shutterstock

Hummingbirds Can See Colors That Humans Can’t

Colorblindness is relative. Just as we can perceive hues that dogs can’t, hummingbirds can see colors that humans can’t. Whereas the three types of color-sensitive cone cells in our eyes allow us to see red, green, and blue light, hummingbirds (and most other birds) have a fourth type of cone attuned to ultraviolet light. In addition to UV light, birds may even be able to see combination colors like ultraviolet+green and ultraviolet+red — something we mere humans can only imagine. Having four types of cone cells, known as tetrachromacy, is also common in fish and reptiles, and researchers believe that dinosaurs possessed it as well.

Being able to see this way is especially useful for hummingbirds, whose endless quest for sugar is aided by their ability to discern different-colored flowers — including “nonspectral” colors that combine hues from widely different parts of the color spectrum. Purple is the only nonspectral color we humans can perceive (it involves both blue and red, or both short and long wavelengths of light), but some birds might see as many as five: purple, ultraviolet+red, ultraviolet+green, ultraviolet+yellow, and ultraviolet+purple.

Spanish bull in Spain.
Credit: alberto clemares exposito/ Shutterstock

Bulls Can’t Actually See the Color Red

If the very idea of bullfights makes you see red, you’re not alone — even though bulls themselves can’t actually see the color. As is the case with other cattle and grazing animals such as sheep and horses, bulls’ eyes have two types of color receptor cells (as opposed to the three types that humans have) and are most attuned to yellows, greens, blues, and purples. This condition, a kind of colorblindness known as dichromatism, makes a bullfighter’s muleta (red cape) look yellowish-gray to the animals.

So why are bulls enraged by the sight of matadors waving their muletas? The answer is simple: motion. The muleta isn’t even brought out until the third and final stage of a bullfight. The reason it’s red is a little unsavory — it’s actually because the color masks bloodstains. In 2007, the TV show MythBusters even devoted a segment to the idea that bulls are angered by the color red, finding zero evidence that the charging animals care what color is being waved at them and ample evidence that sudden movements are what really aggravate the poor creatures.

Young Man Sleeping Cozily on a Bed.
Credit: Gorodenkoff/ Shutterstock

Around 12% of People Dream in Black and White

Whether they’re about showing up to school in your underwear or having your teeth fall out, most dreams have one thing in common: They’re in color. Not for everyone, though. Roughly 12% of people dream entirely in black and white, making their nightly visions much like watching an old movie. That comparison isn’t a coincidence, either. The number used to be much higher: In the 1940s, 75% of Americans reported seeing color in their dreams only rarely or never, and some researchers believe that black-and-white television is part of the reason why. Color TV didn’t become common until the 1950s and ’60s, so for many years, most people’s most common experiences with visual stories were in gray scale.

Giraffe's head trying to reach a leaf with its tongue.
Credit: marseus/ Shutterstock

Giraffes Have Purple Tongues

In addition to their spots and long necks, giraffes have another distinguishing feature: Their tongues are often dark purple. Whereas most animals have pink tongues, a giraffe’s is infused with melanin that makes it darker — sometimes it’s even blue or black rather than purple — although the base and back are pink. And while it hasn’t been proved definitively, the most widely accepted theory is that the melanin provides ultraviolet protection, preventing giraffe tongues from getting sunburned while the animals feed on tall trees. Giraffe tongues are also long (up to 21 inches) and covered in thick bumps known as papillae, which help protect them from the spiky defensive thorns of the animal’s favorite snack: acacia trees.

Giraffes aren’t the only creatures with darker tongues, of course; okapis, polar bears, impalas, and chow chow dogs have them as well, among other animals. However, giraffes are distinguished from their purple-tongued friends not only by their status as the world’s tallest mammal, but also because they give birth standing up. Newborn giraffes fall to the ground from a height of more than 5 feet, not that they mind — they can stand within half an hour and run within 10 hours, usually alongside their doting (and similarly dark-tongued) mother.

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.

Original photo by nonnie192/ iStock

Cars really have one job: Getting you from point A to point B. But there are plenty of conveniences that can make your journey a little easier — some obvious, some a little more subtle. Nothing’s a replacement for driving safely and watching the road, of course, but some features are designed to give you a little backup. Others are just convenient, like special hidey-holes. Not sure what that dash light or that button on your key fob does? These seven stealthy features could make you feel like you have a brand-new car.

Fuel gauge resting past the empty position.
Credit: Hanjo Stier/ Shutterstock

Gas Tank Side Indicator

It can be hard to remember which side your gas cap is on, especially in a car you’re not used to driving. Fortunately, there’s usually a pretty easy way to tell: In many cars, a little arrow next to the gas symbol on your fuel gauge points to the side of the car that should face the pump. This tiny, easy-to-miss feature can save you a whole lot of awkwardness pulling a rental car into the gas station. Even if you’re pretty sure your car doesn’t have it, double-check — it’s sneaky!

Car dashboard panel icon.
Credit: Powerlightss/ Shutterstock

Tire Pressure Monitor

If you’re used to driving a car from before 2008, there’s a new standard dash light that may look unfamiliar: a yellow exclamation point in the middle of what looks like two parentheses with a dotted line below. It’s supposed to look like a cross section of a tire, and that light tells you that your tire pressure is low. Some cars have more fully featured tire pressure monitors that show all four wheels.

The dash light is designed to illuminate when at least one of your tires is 25% below the recommended tire pressure. If you’re seeing it, either check your tire pressure or head to a tire store — many of them will check for free. (But remember that you should be checking your tire pressure monthly anyway; tire pressure can become dangerously low before this light comes on.)
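The trigger rule above is just arithmetic, sketched below in Python. This is an illustration of the stated 25% rule only; any given car’s system may behave differently, and the pressures are hypothetical.

```python
# TPMS warning rule from above: the light comes on once a tire falls
# 25% or more below the recommended cold inflation pressure.

def tpms_warning(measured_psi, recommended_psi, drop_fraction=0.25):
    """True when measured pressure is at or below the warning threshold."""
    return measured_psi <= recommended_psi * (1 - drop_fraction)

print(tpms_warning(27, 35))  # False: low, but above the 26.25 psi threshold
print(tpms_warning(26, 35))  # True: the dash light would come on
```

Note the gap this leaves: at 27 psi on a tire rated for 35 psi, you’re meaningfully underinflated yet no light appears, which is why the monthly manual check still matters.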

The indicators became mandatory in American vehicles after the United States Congress passed the TREAD Act; the law dates to 2000, but the requirement didn’t fully kick in until 2008. Some car manufacturers got a head start and included the monitors in 2006 and 2007 models, too.

operating ESP (electronic stability program) control.
Credit: Luis Viegas/ Shutterstock

Stability Control

Do you sometimes see a dash light that looks like a car with wiggly lines underneath it? That means you have an electronic stability control (ESC) system, sometimes called vehicle stability control, electronic stability program, or dynamic control system. It closely monitors your steering to determine when your car might be out of control, and softly adjusts the brakes on each wheel to compensate for over- or understeering and to prevent rollovers.

If you see the dash light flickering, chances are the system has been activated and is trying to keep your car on track, or driving conditions are just slippery. If it’s steady, it could mean the system is malfunctioning. Some cars have a button that can turn it off.

Car shape keyring and remote control key in vehicle interior.
Credit: Brian A Jackson/ Shutterstock

Backup Mechanical Keys

If you drive a newer car, chances are you’re not turning a key in the ignition. Key fobs have become the standard way to unlock vehicles, which is convenient until your key battery dies or you have some other kind of tech malfunction.

The good news is that you might have a lower-tech backup plan built right in. Many key fobs have little mechanical keys hidden inside that you can usually release by pressing or sliding a small button, although you may have to check the owner’s manual to figure it out. Some key slots are better hidden than others; yours could be right next to the unlock button on the door, underneath the door handle, or under some sort of cap.

Special storage compartment in the back of an SUV.
Credit: Mariaprovector/ Shutterstock

Secret Storage Compartments

Whether you have something to hide or you’re just trying to squeeze a little extra storage out of your vehicle, it’s worth looking for secret pockets of space.

Some Toyota Prius models have storage underneath the floor of the trunk. The Buick Enclave has both subfloor storage and a false floor under the center console. The Infiniti G35 had a flap in the rear armrest with a small compartment behind it. Some Volkswagen models even have a little drawer under the driver’s seat that’s perfect for documents.

Blind Spot Monitoring system warning light/icon.
Credit: Yauhen_D/ Shutterstock

Blind Spot Monitoring

The rear sides of your car are called “blind spots” for a reason — they’re really hard to see, and even if you dutifully check them before changing lanes, accidents can happen. Some newer cars (and some not-so-new luxury vehicles) come with blind spot monitors (BSMs) that let you know when a car is occupying this sneaky spot next to you. Some of these monitors are more obvious than others: the alert could be lights on your side mirrors, on your dashboard, or on the pillar between your driver’s-side window and your windshield. A few cars even sound an audible warning if your turn signal is on but the lane next to you is occupied.

Even lower-end cars have BSMs now, but some higher-end systems go above and beyond and actively steer your car away from danger. A couple of trucks even have BSMs that extend to the trailers they’re towing. Just make sure to keep using your eyes — the monitors aren’t foolproof, and often miss vulnerable road users like pedestrians and cyclists.

Car window open with a view of the blue sky.
Credit: Pongsak A/ Shutterstock

Shortcut for Rolling Down Your Windows

If you look at your key fob, you’re probably not going to see a button that rolls down your windows, but that doesn’t mean it can’t do it. Check your manual, because sometimes a specific key sequence can lower all your windows from outside the vehicle so you can cool it down on a hot day. It’s not just newer models, either — cars more than a decade old have this function, too.

Even if you’re not planning on using this feature, you should at least figure out how not to do it by accident.

Sarah Anne Lloyd
Writer

Sarah Anne Lloyd is a freelance writer whose work covers a bit of everything, including politics, design, the environment, and yoga. Her work has appeared in Curbed, the Seattle Times, the Stranger, the Verge, and others.

Original photo by Suwan Wanawattanawong/ Shutterstock

Imagine a world without cameras — it’s almost impossible. Film, television, history, news, and even our memories are influenced by a technology that has been around for less than two centuries. Here are 10 facts that explore the amazing history of photography, and how it grew from a quirky laboratory experiment to redefining the human experience.

Johann Heinrich Schulze, German anticipator of photography.
Credit: Science & Society Picture Library via Getty Images

The Story of Photography Begins a Century Before the First Photo

“View from the Window at Le Gras,” taken in 1826 or 1827 by the French inventor Nicéphore Niépce, is widely considered the oldest surviving photograph taken with a camera — but that’s not the beginning of the story of photography. To understand the technology’s origin story, you need to go back about a century and explore the work of German scientist and polymath Johann Heinrich Schulze. Although medieval alchemists often experimented with silver chloride, and even Leonardo da Vinci conceptualized the camera obscura, Schulze (who was actually an anatomist) conducted the first serious experiments in 1717 that established how silver salts react to light — the very basis of what would become photography. Schulze’s rudimentary images eventually faded to black because no method yet existed to fix them against further exposure, but the experiments astounded those who saw them, with Schulze eventually writing, “… many who were curious about the experiment but ignorant of its nature took occasion to attribute the thing to some sort of trick.” But it was no trick — only science that had yet to be fully understood.

Photo believed to be the earliest photograph showing a living person ,1838.
Credit: GraphicaArtis/ Archive Photos via Getty Images

We Don’t Know the Names of the First People Ever Photographed

One day in 1838, Louis Daguerre — physicist, photography pioneer, and inventor of the daguerreotype (the earliest form of practical photography) — stood at a window overlooking Paris’ Boulevard du Temple and snapped a photograph. Since this was one of the first photographs ever taken, the image was less of a “snap” and more of a slog, as the process required around 10 minutes of exposure to gather enough light on a highly polished, silver-plated copper sheet. Because of this long exposure time, Daguerre’s photo captured what appeared to be an empty street: the hustle and bustle of passing traffic didn’t stay in place long enough to register. In fact, the only thing in the image besides immobile trees, sky, and pavement is a lone shadowy figure getting his boots shined (which explains why he stood still long enough to be fixed in the photo). Upon closer inspection, viewers can just barely make out the shoeshiner hard at work. Today, of course, no one knows the name of the man getting the shoeshine, or of the person giving it.

View of Icelandic spiral northern lights.
Credit: Mike-Hubert.com/ Shutterstock

The First True Digital Camera Was Invented to Photograph the Aurora Borealis

The advent of the digital camera was made possible by the invention of a little-known piece of technology called the charge-coupled device (CCD) in 1969. At its most basic, a CCD is a light sensor that sits behind a camera lens and effectively replaces the need for film. Eastman Kodak engineer Steven Sasson built the first digital camera prototype in 1975, but his creation was unwieldy, requiring 16 batteries and a special screen just to view the images. The first “true” digital camera came two years later, when the University of Calgary’s ASI Science Team in Canada created the Fairchild all-sky imager for snapping photos of the aurora borealis. A little more than a decade later, the technology came to consumers when Fujifilm released the FUJIX DS-1P in 1988.

Close-up of a digital camera lens mount and aperture inside.
Credit: Petr Svoboda/ Shutterstock

The World’s Largest Digital Camera Weighs 3 Tons

The world’s first digital camera scanned the night sky, and the same can be said for the largest digital camera ever made. The Legacy Survey of Space and Time (LSST) camera lies at the heart of a new telescope at the Vera C. Rubin Observatory in the Chilean mountains, but this isn’t your average DSLR. The LSST camera weighs 3 tons, contains a 3,200-megapixel sensor (by comparison, an iPhone camera has only 48 megapixels), and its lens stretches a full 5 feet across. Once installed at the end of 2024, the camera will capture about 15 terabytes of data every single night over the course of a 10-year survey and will observe an estimated 20 billion galaxies. It’s a big camera for an equally big job.

A person photographing two others by the water.
Credit: Armen Poghosyan/ Unsplash+

The Left Side of Your Face Likely Looks Better in Photos

Want to capture your “good” side in your next photo? Show off that left cheek. According to a 2012 study from Wake Forest University, the left side of a person’s face often expresses more emotion than the right, and onlookers tend to find that more aesthetically pleasing. When people were asked to rate the pleasantness of male and female profiles presenting both a left and right cheek, the participants overwhelmingly chose the left as more pleasant. One theory for this left-faced bias is that emotion and spatial awareness are largely governed by the right hemisphere of the brain, which controls the left side of the body, so emotions are expressed more intensely on the left side of the face. Interestingly, Western artists throughout the centuries have had a bias for painting portraits with subjects displaying their left cheek, especially women, with the “Mona Lisa” being a prime example.

Back side of a digital camera stock.
Credit: unomat/ iStock

In 1995, a 1-Megapixel Camera Cost $20,000

Every new technology comes with an early adopter tax. The price of the first Macintosh in 1984 comes out to about $6,000 in today’s dollars, and the first cellphone, the Motorola DynaTAC 8000X, would cost around $12,000 today (with only 30 minutes of battery life). But those costs pale in comparison to the first 1-megapixel pro camera. Released in 1995, this Fujix/Nikon hybrid camera had a 1.3-megapixel sensor and a 131 MB removable memory card (capable of storing 70 photos), all for the eye-popping price of $20,000, which is around $38,000 today. Only 12 years later, Apple — which also made the impressive QuickTake camera in the mid-’90s — introduced a 2-megapixel camera on its original iPhone for a fraction of the cost. Today, professional photographers use cameras with 24 megapixels (or more).

The first colour photograph, 1861.
Credit: Science & Society Picture Library via Getty Images

The First Color Photograph Was Taken During the U.S. Civil War

Color photography is usually associated with the 20th century, but its origins date back to the early 1860s. On May 17, 1861, Scottish physicist James Clerk Maxwell, who later that very same year began publishing his world-changing electromagnetic equations, revealed the first color photograph to the Royal Institution of Great Britain. The photo showed the multiple hues of a tartan ribbon, and Maxwell created the image by having the same ribbon photographed three times through red, green, and blue filters and then combining the results (a process known today as additive color). Maxwell first suggested this three-color method back in 1855, but it wasn’t until his collaboration with Thomas Sutton (inventor of the single-lens reflex camera), who actually snapped the images, that Maxwell’s vision finally came to life. Because the photographic plates were far less sensitive to red and green, the color wasn’t perfectly true to life, but it’s still considered the first color photo nonetheless.
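Maxwell’s additive method is simple enough to sketch in code: each filtered exposure records how much of one primary light reached the plate, and stacking the three exposures as red, green, and blue channels reconstructs full color. The pixel values below are purely illustrative, not measurements from Maxwell’s plates.

```python
# A toy 2x2 "tartan": three grayscale exposures, one per filter, on a 0-255 scale.
# (Illustrative numbers only -- not data from Maxwell's actual experiment.)
red_exposure   = [[200,  30], [ 30, 200]]  # brightness recorded through the red filter
green_exposure = [[ 40, 180], [ 40, 180]]  # ...through the green filter
blue_exposure  = [[ 10, 220], [220,  10]]  # ...through the blue filter

def combine(r, g, b):
    """Stack three single-channel images into one RGB image (additive color)."""
    return [
        [(r[y][x], g[y][x], b[y][x]) for x in range(len(r[0]))]
        for y in range(len(r))
    ]

rgb = combine(red_exposure, green_exposure, blue_exposure)
print(rgb[0][0])  # the top-left pixel comes out strongly red: (200, 40, 10)
```

Projecting the three filtered positives through matching colored lights, as Maxwell did, performs the same channel-stacking operation optically.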

A man walks past a Microsoft billboard featuring its latest software, Windows XP.
Credit: Kevin Lee/ Getty Images News via Getty Images

The Most-Viewed Photograph in History Is Probably the Windows XP Wallpaper

The most-viewed photo isn’t from the lens of a legendary photographer like Ansel Adams or Jacob Riis, but a simple picture of a field in Sonoma, California — and chances are you’ve seen it, too. The photograph, named “Bliss,” was taken in 1996 by photographer Charles O’Rear. Four years later, Microsoft paid O’Rear an undisclosed sum (reportedly north of $100,000) to use the image as the default desktop wallpaper for Windows XP. O’Rear says the image was deemed so valuable that FedEx refused to ship it, so he hand-delivered the photograph to Microsoft’s offices near Seattle, Washington. The brilliantly bright green rolling hill (now a vineyard) beneath a picturesque bright blue sky has likely been seen by billions of people around the world due to the software’s global ubiquity.

Little girl with a retro camera in an autumn park.
Credit: Sviatlana Lazarenka/ iStock via Getty Images Plus

The Red-Eye Effect Is Caused by Blood Vessels

Sometimes when you take a photo using flash, something strange happens — the image comes back with subjects sporting demonlike red eyes. What’s going on here? Well, it all has to do with the back of the human eye. There, red blood vessels embedded in the choroid (a layer of tissue that nourishes the retina) are vital to the function of the eye’s photosensitive cells and nerves. When a camera uses a flash, it’s usually to light a dim area, and in such environments, the human pupil is naturally dilated to let in more light. The pupil doesn’t have enough time to contract, so the flash illuminates the blood vessels at the back of the eye, and that red reflection is captured on the camera’s sensor. Many modern cameras include a dual-flash system in which the first flash contracts the pupil and the second lights the scene for the actual photo, thus eliminating that pesky red-eye effect.

Lots of photograph collections in one image.
Credit: BremecR/ iStock

We Probably Take More Photos Every Minute Than Were Taken in the Entire 19th Century

Two centuries ago, imaging pioneers were only beginning to tinker with ways to capture the world around them using chemicals and light; now cameras are embedded in our daily lives. We don’t know exactly how many photos were taken in the 19th century, but it was likely a few million at most. In 2014, it was estimated that humanity took a staggering 1 trillion photos that year, which means that every two minutes of 2014, we likely took more photos than were taken in the entire 19th century. Experts believe that the number of cameras in the world passed 45 billion in 2022 (that’s more than five cameras for every person). As cameras continue to get better while shrinking (sometimes to the size of a grain of sand), that number is only increasing, with estimates suggesting humans will take 2 trillion photos in 2025. At that rate, every single minute of the day will produce more photos than were taken in the entire 19th century.
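The back-of-the-envelope arithmetic behind the two-minute claim, using the 1 trillion (2014) and 2 trillion (2025) estimates quoted above:

```python
minutes_per_year = 365 * 24 * 60                   # 525,600 minutes in a year

photos_2014 = 1_000_000_000_000                    # ~1 trillion photos (2014 estimate)
per_minute_2014 = photos_2014 / minutes_per_year   # ~1.9 million photos per minute
per_two_minutes = 2 * per_minute_2014              # ~3.8 million -- more than the
                                                   # "few million" of the whole 1800s

photos_2025 = 2_000_000_000_000                    # ~2 trillion photos (2025 estimate)
per_minute_2025 = photos_2025 / minutes_per_year   # ~3.8 million photos per minute, so a
                                                   # single minute now beats the century
```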

Darren Orf
Writer

Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.

Original photo by Ridofranz/ iStock

It’s a fact of life — people grow old. While modern society tends to obsess about the negatives of aging, studies suggest that we often experience more happiness and contentment in our later years. These 12 facts investigate the phenomenon of growing old, debunk some persistent myths about aging, and explore the brighter side of those golden years.

Side view of a woman experiencing a migraine.
Credit: Mindful Media/ iStock

Say Goodbye to Migraines

As our bodies age, they naturally become more susceptible to a variety of illnesses and maladies — but migraines are a rare exception. Migraines often first develop in adolescence, and while both sexes are affected, women are three times more likely to develop migraines than men (often due to fluctuating estrogen levels). However, the frequency of migraines peaks around the age of 40 and actually improves as we enter our golden years. Stress and hormones are the most common migraine triggers, and these two factors usually affect older people with less severity. That said, pain, smoking, and alcohol can still contribute to migraines in seniors, and although migraines generally subside with age, they are still the second-most-common headache disorder in older people (after tension headaches). One in 10 older adults still experiences them about once a year.

Silver-haired mature woman from behind.
Credit: Studio Light and Shade/ Shutterstock

Hair Doesn’t Actually “Turn” Gray

One of the hallmarks of aging is that our lifelong hair color begins to turn gray, or in some cases, white. Although an entire industry is built around hiding this fact, human hair isn’t actually turning gray so much as no longer supplying the pigments necessary to produce color. This occurs when hydrogen peroxide builds up after wear-and-tear on the hair follicles. That blocks the normal synthesis of melanin, which is responsible for all shades of hair color.

A happy old people sitting in an autumn park.
Credit: Ruslan Huzau/ Shutterstock

Older Adults Are Happier Than People in Their 20s on Average

As people age, we also gain a certain calm. A study published in 2016 in the Journal of Clinical Psychiatry analyzed a random sample of 1,546 people ages 21 to 100 in San Diego. Although younger people in the survey responded positively in terms of physical health compared to older folks (as anticipated), older adults far outperformed younger generations in terms of mental well-being. Panic disorders are also reported as less common among older cohorts compared to younger people, and developing a panic disorder later in life is a rarity.

A fit senior couple exercising.
Credit: Ruslan Huzau/ Shutterstock

They Sweat Less, Too

As we age, our skin loses collagen, gets thinner, and presses our sweat glands close to the surface of our skin. This process is a bit of a double-edged sword. On the one hand, because these glands are squeezed, it’s harder for sweat to come out of our pores, meaning older people sweat less overall. This may be a check mark in the “pro” column for personal hygiene, but it does come with a few negative side effects. With a reduced ability to sweat, older adults can have trouble regulating temperature during strenuous exercise or excessive heat. Sweat also plays an important role in healing, as it helps stimulate wound closure in skin cells. Thankfully, a lifetime of physical fitness helps slow down this process so you can sweat long into your golden years.

A senior Caucasian man at a polling place.
Credit: LifestyleVisuals/ iStock

Older People Vote More Than Any Other Age Group

Older people may not feel as strong as they did in their youth, but in terms of political power, they’re as strong as ever. In 2018, 64% of people 65 and over voted in the U.S. midterm election — the highest turnout of any age group — and the 65- to 74-year-old cohort also had the highest turnout in the 2020 election. There are a couple of reasons why the older vote is particularly robust. The biggest may be that older Americans, as well as seniors in other democracies, have government programs and initiatives they rely on, such as Medicare, prescription drug pricing, and Social Security, and because these policies so directly affect them, elections tend to turn out seniors in higher numbers. (There are other factors at play, too — older folks may simply have more time on their hands.) Senior citizens also grease the wheels of democracy, as they’re the most likely age group to volunteer as poll workers on Election Day.

Close-up of the ear of an elderly person.
Credit: nafterphoto/ Shutterstock

Noses and Ears Don’t Keep Growing, But They Do Droop

While a common myth purports that our ears and nose continue to grow as we age (while the rest of us generally shrinks), that’s not entirely true. Like most other parts of our body, our ears and nose stop growing once we reach adulthood, but the constant tug of gravity over the decades causes these cartilage-filled features to droop over time. This constant pull causes the collagen and elastic fibers in our ears and nose to elongate, and this lengthening, combined with surrounding facial structures losing overall volume, often produces the illusion of growing ears and noses as we age. The elongation is a slow and steady process; studies have shown that ears can lengthen some 0.22 millimeters a year. Interestingly, the process is steady enough that researchers can estimate a person’s age just by measuring their ears.

Roman mosaic filled with old people.
Credit: Heritage Images/ Hulton Fine Art Collection via Getty Images

Old Age Isn’t a Modern Phenomenon

A common misconception about old age is that it’s a relatively modern phenomenon, as our predecessors supposedly lived brutish lives cut short by disease and war. While modern medicine has certainly expanded life expectancy, many people in the past lived as long as people do today. For example, some ancient Roman offices sought by politically ambitious men couldn’t be held until the age of 30 — not a sensible rule if people rarely lived many years beyond that. Scientists have analyzed the pelvic joints (a reliable indicator of age) in skeletons from ancient civilizations and found that many people lived long lives. One study of skeletons from Cholula, Mexico, dating between 900 and 1531 CE found that a majority of specimens lived beyond the age of 50. Low average life expectancy in ancient times reflects high infant mortality more than adults dying unusually young. Luckily, modern science has helped more humans survive the vulnerable childhood years, and average life expectancy has risen as a result.

Senior couple lying down in bed, their feet come out from under the sheets.
Credit: FilippoBacci/ iStock

Older People Requiring Less Sleep Is a Myth

Another myth about getting old is that as we age, humans need less and less sleep, somehow magically subsisting on six hours or less once we enter our senior years. The truth is that the amount of sleep a person needs changes only during childhood and adolescence, when our bodies need more energy to do the tough work of growing. Once we’re in our 20s, we require roughly the same amount of sleep per night for the rest of our lives (though the exact amount differs from person to person). In fact, the elderly are more likely to be sleep-deprived because they get lower-quality sleep, thanks to sickness, pain, medications, or a trip or two to the bathroom. This may be why napping during the day becomes more common as we grow older.

A doctor adjusting an elderly woman's shouldfer in an office.
Credit: ljubaphoto/ iStock

Some of Our Bones Never Stop Growing

The common perception of human biology is that our bones put on some serious inches in our youth, and then by the time we’re 20 or so, nature pumps the brakes and our skeleton stays static forever. While that’s true of a majority of our bones, some don’t quite follow this simplistic blueprint. A 2008 study from Duke University determined that the bones in the skull continue to grow, with the forehead moving forward and the cheekbones moving backward. Unfortunately, this imperceptible facial shift exacerbates wrinkles, because as the skull moves forward, the overlying skin sags.

The pelvis also keeps growing throughout your life. Scientists comparing the pelvic width of 20-year-olds to that of 79-year-olds found a 1-inch difference, which can translate to roughly 3 additional inches around the waistline. That means our widening in the middle as we age isn’t just about a slower metabolism.
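The jump from 1 inch of extra width to about 3 inches of waistband follows from basic circle geometry: treating the waist as roughly circular, circumference grows by π times any increase in diameter. A quick check:

```python
import math

width_increase = 1.0                            # extra pelvic width in inches (per the study above)
circumference_gain = math.pi * width_increase   # extra inches around the waist
print(round(circumference_gain, 2))             # 3.14
```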

Senior woman having her eyes examined at the optician.
Credit: gilaxia/ iStock

Pupils Get Smaller As We Age

While our hips get bigger, our pupils get smaller. The human pupil is controlled by the circumferential sphincter and iris dilator muscles, and as the years add up, those muscles weaken. Because of this loss of muscle function, pupils get smaller as we age and are also less responsive to light. Smaller pupils make it harder to see at night, and people in their 60s need three times as much light to read comfortably as people in their 20s. Reading a menu in a dimly lit restaurant? Forget about it. Other eye changes include an increased likelihood of presbyopia, or age-related farsightedness (which can often be resolved with readers), and cataracts, a clouding of the eye’s lens. In fact, half of people over the age of 80 will have experienced a cataract of some kind.

Muslim nurse taking care of a senior patient in a wheelchair.
Credit: andresr/ iStock

Older People Have a Stronger “Immune Memory”

Although the body slows down in some ways as we age, growing old isn’t all bad news. Researchers from the University of Queensland found that older people had stronger immunities than people in their 20s, as the body keeps a repository of past illnesses that can stretch back decades. This extra line of defense begins to drop off in our 70s and 80s, but until then, our bodies generally get better and better at fighting off disease thanks to biological experience. Additionally, as we age we experience fewer migraines, the severity of allergies declines, and we produce less sweat. Older people also exhibit higher levels of “crystallized intelligence” (or what some might call “wisdom”) than any other age group.

Scientist examining a molecule model.
Credit: Tom Merton/ iStock

The Atoms That Make Up All of Us Are Already Billions of Years Old

It’s true that age is just a number, and in the cosmic view of the universe, human age is pretty insignificant. The atoms that make up the human body are already billions of years old. For example, hydrogen — one of the key components of our bodies — formed in the Big Bang 13.7 billion years ago. Likewise, carbon, the primary component of all known life, formed in the fiery cauldron of stars at least 7 billion years ago. So when someone says we’re all made of “star stuff,” they’re very much telling the truth (we’re also made from various supernovae). And while we grow old on Earth, this is only the latest chapter of a story that stretches back to the beginning of everything — and it’s a story that’ll continue until the universe ends.

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.

Original photo by Tana Danyuk/ Shutterstock

A little sweet, a little sour — citrus fruits brighten up any meal, from breakfast to happy hour cocktails. Evolving over the course of 25 million years from just a few citrus species, today’s selection of grapefruits, oranges, lemons, and more has been eons in the making. These eight juicy facts about citrus may just give you a deeper appreciation for these pulp-packed fruits.

Oranges in red mesh bag.
Credit: Ugorenkov Aleksandr/ Shutterstock

Many Citrus Fruits Are Sold in Red Bags For a Reason

Citrus growers often bundle oranges in mesh bags, which you may have noticed are made from red plastic. It’s no coincidence: red netting against orange peels creates an optical illusion that makes the fruit appear more vibrantly hued and enticing. The trick works for other citrus — like mandarins, clementines, tangerines, and even some grapefruit — though not all. Yellow citrus fruits, like lemons, are often sold in yellow or green bags to create a similar color-popping effect.

View of cut and uncut oranges.
Credit: muse studio/ Shutterstock

Most Citrus Fruits Are Hybrids

Many researchers believe that all citrus fruit can be traced back to just three species: pomelos, citrons, and mandarins. Citrus trees of different species are reproductively compatible with one another; over time, cross-breeding between these “ancestor fruits” created the hybrids known as grapefruits, oranges, lemons, limes, and other citrus varieties enjoyed today.

Half an orange peeled of skin.
Credit: Lapina Maria/ Shutterstock

Orange Peels Are Packed With Vitamin C

Oranges are a go-to food for many when battling a cold, thanks to the fact that they’re packed with vitamin C — though some research suggests that the nutrient doesn’t actually prevent colds, and may only slightly shorten how long a cold lasts. Another hitch in eating oranges to fight illness is that much of the fruit’s vitamin C is found in its peel. Just one tablespoon of the outer rind contains 14% of the recommended daily dose of vitamin C, which is about three times more than what’s found in the inner flesh.

James Lind Giving Citrus Fruits to Sailors with Scurvy.
Credit: Bettmann via Getty Images

British Sailors Got Their Nickname From Limes

Regardless of the flag they sailed under, nearly all sailors of the past shared a common enemy: scurvy. The disease, caused by a vitamin C deficiency, plagued sailors who were unable to regularly consume fresh fruits and vegetables while out at sea. The British Royal Navy began supplying its sailors with lemon and lime juice in 1795, though not all countries picked up on the practice. During the War of 1812, skeptical American sailors nicknamed their British counterparts “limeys” to mock the practice — though the U.S. Navy eventually adopted it, too.

Fresh cut and whole pomelo fruits with green leaves.
Credit: Liudmila Chernetska/ iStock

The Largest Citrus Fruit Can Reach Basketball Size

Pomelos are grapefruit-like citrus native to Southeast Asia, known for their yellow rinds and pink inner flesh — not to mention their size. The jumbo citrus fruits can grow up to the size of a basketball and weigh as much as 22 pounds. In comparison, kumquats are the smallest known citrus, maturing at a max length of around 2 inches. (Kumquats are also the only citrus fruit you can easily eat without peeling.)

Ripe orange tangerines in basket.
Credit: zkolra/ iStock

Clementines and Mandarins Are Technically Different Fruits

What’s small, orange, and easy to peel? Both clementines and mandarins, which explains why these two nearly identical citrus fruits are often confused. However, botanists say there is a difference: clementines are an offshoot variety of mandarins, created by crossing mandarins with sweet oranges. That means all clementines are technically a type of mandarin, though not all mandarins are clementines.

Grapefruits at a farmers market.
Credit: Moonstone Images/ iStock

Grapefruit Can Interfere With Some Medications

Grapefruits are packed with vitamins and fiber that support heart and gut health, though people who rely on some medications are often warned away from consuming the fruits. That’s because grapefruit juice can affect how medications work. Some drugs, like those for cholesterol and high blood pressure, are metabolized in the body by the CYP3A4 enzyme found in the small intestine. Grapefruit juice can block that enzyme, which stops the medication from breaking down and causes too much to enter the bloodstream. Other drugs, like fexofenadine (Allegra) for allergies, use proteins called transporters to enter cells in the body; grapefruit juice can block this process and cause too little of the drug to circulate, rendering it ineffective.

Plate with healthy oranges and slices.
Credit: Pixel-Shot/ Shutterstock

Oranges Were Once a Luxurious Christmas Gift

Oranges are relatively inexpensive today, though 19th-century Europeans who woke to find them in their stockings on Christmas morning considered the fruits a grand gift. The tradition of receiving an orange as a holiday present dates to the 1800s, when Christmas revelers widely began hanging stockings on the mantel, and is commonly linked to the tale of St. Nicholas of Myra, a fourth-century bishop who reportedly tossed bags of gold into the drying stockings of poor maidens. Oranges — which were generally a rare and expensive fruit in Victorian times — represented St. Nicholas’ gifted gold, and became linked with the holiday.

It’s easy to lose track of items in the back of a dark pantry, which is why expiration dates can be so helpful in determining when to toss old foods. However, the “best by” dates we rely on aren’t always a true picture of how long a food is shelf-stable. Food dating is mostly a voluntary process for grocery manufacturers, who often just estimate when their products will no longer be at their best quality. Luckily, there are some foods — like the six listed below — that are safe to keep using even if their expiration date has long passed.

Woman preparing a sauce with vinegar in the kitchen.
Credit: Pixel-Shot/ Shutterstock

Vinegar

Most foods produce a noxious smell when they’ve spoiled, but vinegar always smells pretty potent, so it may be hard to use the old-fashioned sniff test to gauge its quality. Luckily, you don’t have to, since vinegar doesn’t expire. Vinegar is a fermented product, created when yeast consume sugars or starches to create alcohol; that byproduct is then exposed to oxygen and bacteria of the genus Acetobacter, which continue fermenting it into the final acidic product. That acidity makes vinegar self-preserving, which is why it generally doesn’t need to be refrigerated. Over time, vinegar can become hazy or develop sediment, particularly a gelatinous substance called “mother,” but that doesn’t mean you need to toss it — in fact, vinegar mothers (colonies of healthy bacteria that form in fermented liquids) can even be used to start a new batch of the multipurpose solution.

Rice in wooden bowl on top of rice.
Credit: surakit sawangchit/ iStock

White Rice

Comedian Mitch Hedberg once joked that rice is the perfect meal if you’re “really hungry and want to eat 2,000 of something.” It’s also a great food for long-term storage. White rice — which starts as brown rice but is milled to remove its exterior husk, bran, and germ — keeps best, so long as it’s properly stored away from moisture and pests. At temperatures under 40 degrees Fahrenheit, white rice’s life span can stretch to 25 to 30 years, and even when stored at warmer temperatures, it can last up to 10 years if packed with oxygen absorbers. However, not all rice keeps long-term; opened bags should be used within two years, and brown rice lasts only about six months at room temperature because of its naturally occurring oils, which can go rancid.

Close-up of a spoon full of sugar.
Credit: TinaFields/ iStock

Sugar

Sugar has a particularly sweet characteristic: It doesn’t really go “bad.” Granulated sugars (along with some syrups, like corn syrup) are so inhospitable for bacteria that they’re often the primary ingredient used to preserve jellies, jams, and canned fruits. However, like all long-stored pantry staples, helping sugar maintain a long shelf-life means keeping it away from any source of condensation or moisture, which is easily absorbed and can leave behind a hardened block. Even with its ability to last indefinitely, food storage experts say sugar is best consumed within two years of opening — just another reason to mix up a batch of fresh cookies.

Spilled salt and saltshaker on blue background.
Credit: gojak/ iStock

Salt

Vegetable, animal, or mineral? Salt falls in the last category, which is one reason it can enjoy an indefinite stay in your pantry without spoiling. Salt has been used to preserve foods (especially meats) for centuries because it’s so effective at inhibiting bacteria; the mineral is able to break down enzymes that help germs grow while also dehydrating food and removing the water bacteria need to thrive. Its ability to repel water keeps salt indefinitely useful, though some kinds of processed salt are more likely to deteriorate in quality over time — specifically those with additives such as iodine or anti-caking agents (these kinds are best used within five years). As for plain salt — it can last forever, especially if kept in a cool, dry place.

Bottles with aromatic extract and dry vanilla beans on a napkin.
Credit: Africa Studio/ Shutterstock

Vanilla Extract

Pure vanilla extract can be a grocery store splurge, but if your oven is known for taking a hiatus between bursts of baking, it could be worth the extra cost. That’s because real vanilla extract doesn’t spoil thanks to its high alcohol content — over time, it can actually develop a deeper flavor. Imitation vanilla extract, however, has a drastically shorter shelf-life. While real vanilla is created by soaking vanilla beans in alcohol (which acts as a preservative), the flavoring dupe is made from vanillin, a manufactured substance that replicates the sweet and syrupy flavor. On the shelf, imitation vanilla lasts just six to 12 months before beginning to degrade and losing its flavor.

Close up of honey and a honey dipper.
Credit: bymuratdeniz/ iStock

Honey

Humans have risked bee swarms for thousands of years in the hopes of collecting a little honey. Beyond its use in cooking, the substance has also been used for healing wounds and even as a natural preservative — because the insect-produced food is one of the few that rarely expires. Honey’s indefinite shelf-life is thanks to its sugar-dense composition, with less than 20% of its makeup coming from water. The nectar also has two other preserving factors: It has an acidic pH level that is unsuitable for bacteria, and its viscous state creates an oxygen barrier that prevents pathogens from growing. However, there is a catch: To maintain these properties, honey must be stored in a sealed container safe from humid conditions. Even then, the USDA suggests honey is at its best when consumed within a year.

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.


Most of us probably just enjoy our food without thinking too deeply about it. But the world of culinary delights holds many mysteries. What’s the secret history of the bagel? What does “continental breakfast” really mean? Which nut has been known to explode during transport, and which favored breakfast item is slightly radioactive? And finally, what’s the difference between sweet potatoes and regular potatoes? The following 25 facts will give you plenty of fodder for your next dinner party.

Credit: Douglas Rissing/ iStock

Green Bell Peppers Are Just Unripe Red Bell Peppers

If you’ve ever found yourself in the grocery store struggling to decide between red and green bell peppers, you may be interested to learn that they’re the very same vegetable. In fact, green bell peppers are just red bell peppers that haven’t ripened yet, while orange and yellow peppers are somewhere in between the two stages. As they ripen, bell peppers don’t just change color — they also become sweeter and drastically increase their beta-carotene, vitamin A, and vitamin C content. So while the green variety isn’t quite as nutritious as its red counterpart, the good news is that one eventually becomes the other.

Credit: graletta/ iStock

Pistachios Can Spontaneously Combust

It turns out there’s a price to pay for how tasty and nutritious pistachios are: Under the right circumstances, they can spontaneously combust. Everyone’s favorite shelled nut is especially rich in fat, which is highly flammable. Thankfully, that only becomes a problem when pistachios are packed too tightly during shipping or storage. It’s important to keep the nuts dry lest they become moldy — but if they’re kept too dry and there are too many of them bunched together, they can self-heat and catch fire without an external heat source.

Though exceedingly rare and easy to avoid if the proper instructions are followed, pistachio self-combustion is a real enough concern that the German Transport Information Service specifically advises that pistachios “not be stowed together with fibers/fibrous materials as oil-soaked fibers may promote self-heating/spontaneous combustion of the cargo.” Don’t worry, though: It won’t happen in your pantry with just a few bags, so you can indulge in the shelled snack without worrying about their flavor becoming unexpectedly smoky.

Credit: littleny/ iStock

Philadelphia Cream Cheese Isn’t Actually From Philadelphia

The City of Brotherly Love has clear-cut claims on many food origins — cheesesteaks, stromboli, and even root beer. But despite the name, Philadelphia Cream Cheese is definitely not from Philly. The iconic dairy brand secured its misleading name (and gold-standard status) thanks to a marketing ploy that’s been working for more than 150 years … and it’s all because of Pennsylvania’s reputation for impeccable dairy. Small Pennsylvania dairies of the 18th and early 19th centuries were known for using full-fat milk and cream to make rich cheeses — in contrast to New York dairies, which mostly used skim milk — and because the perishables couldn’t be easily transported, they gained a reputation as expensive luxury foods. So when upstate New York entrepreneur William Lawrence began making his skim milk and (for richness) lard-based cream cheese in the 1870s, he needed a name that would entice customers and convey quality despite it being made in Chester, New York, and not Philadelphia. Together with cheese broker and marketing mastermind Alvah Reynolds, Lawrence branded his cheese under the Philadelphia name in 1880, which boosted sales and promoted its popularity with home cooks well into the early 1900s.

Credit: LauriPatterson/ iStock

Bagels Were Once Given as Gifts to Women After Childbirth

After a woman has had a bun in the oven for nine months, presenting her with a bagel might seem like a strange choice. But some of the earliest writings on bagels relate to the idea of giving them as gifts to women after labor. Many historians believe that bagels were invented in the Jewish community of Krakow, Poland, during the early 17th century. Their circular shape echoes the round challah bread eaten on the Jewish new year, Rosh Hashanah. Enjoying round challahs is meant to bring good luck, expressing the hope that endless blessings — goodness without end — will arrive in the coming year. Likewise, in Krakow centuries ago, a bagel signified the circle of life and longevity for the child. In addition to the symbolism of the round shape, the bread was believed to bring a pregnant woman or midwife good fortune in a delivery by casting aside evil spirits. Some pregnant women even wore bagels on necklaces as protection, or ensured bagels were present in the room where they gave birth.

Credit: yipengge/ iStock

The Word for a Single Spaghetti Noodle Is “Spaghetto”

If you go into an Italian restaurant and order spaghetto, chances are you’ll leave hungry. That’s because “spaghetto” refers to just a lone pasta strand; it’s the singular form of the plural “spaghetti.” Other beloved Italian foods share this same grammatical distinction — one cannoli is actually a “cannolo,” and it’s a single cheese-filled “raviolo” or “panino” sandwich. Though this may seem strange given that these plural terms are so ingrained in the English lexicon, Italian language rules state that a word ending in -i means it’s plural, whereas an -o or -a suffix (depending on whether it’s a masculine or feminine term) denotes singularity. (Similarly, “paparazzo” is the singular form of the plural “paparazzi.”) As for the term for the beloved pasta dish itself, “spaghetti” was inspired by the Italian word “spago,” which means “twine” or “string.”

Credit: HAKINMHAN/ iStock

Ketchup Was Originally Made Out of Fish

If you asked for ketchup thousands of years ago in Asia, you might have been handed something that looks more like today’s soy sauce. Texts as old as 300 BCE show that southern Chinese cooks mixed together salty, fermented pastes made from fish entrails, meat byproducts, and soybeans. These easily shipped and stored concoctions — known in different dialects as “ge-thcup,” “koe-cheup,” “kêtsiap,” or “kicap” — were shared along Southeast Asian trade routes. By the early 18th century, they had become popular with British traders. Yet the recipe was tricky to recreate back in England because the country lacked soybeans. Instead, countless ketchup varieties were made by boiling down other ingredients, sometimes including anchovies or oysters, or marinating them in large quantities of salt (Jane Austen was said to be partial to mushroom ketchup). One crop that the English avoided in their ketchup experiments was tomatoes, which for centuries were thought to be poisonous.

Credit: laartist/ iStock

Jelly Beans Have Been to Space

What do Neil Armstrong, tortoises, and jelly beans have in common? Why, they’ve all been to space, of course. President Ronald Reagan was known for being a connoisseur of the chewy candy, so much so that he provided the astronauts aboard the Challenger shuttle with a bag full of them in 1983 — a gift that resulted in charming footage of the crew tossing the jelly beans in zero gravity before happily eating them. Reagan was also known to break the ice at high-level meetings by passing around jelly beans, even commenting that “you can tell a lot about a fella’s character by whether he picks out all of one color or just grabs a handful.”

Credit: LauriPatterson/ iStock

Eggo Waffles Were Originally Called “Froffles”

The brothers behind your favorite frozen waffles took a while to iron out the details of their signature product. Working in their parents’ basement in San Jose, California, in the early 1930s, Frank, Anthony, and Sam Dorsa first whipped up their own brand of mayonnaise. Since the base ingredient of mayonnaise is egg yolks — and the brothers took pride in using “100% fresh ranch eggs” — they christened their fledgling company “Eggo.” Despite launching the business during the Great Depression, Eggo mayonnaise sold like hotcakes, motivating the Dorsas to extend their product line. Soon, they were selling waffle batter — another egg-based product. To simplify shipping, they also whipped up a powdered mix that required only the addition of milk.

When the frozen food industry took off in the 1950s, the brothers wanted to take advantage of the rush to the freezer aisle. Frank Dorsa (a trained machinist) repurposed a carousel engine into a rotating device that could anchor a series of waffle irons, each cooking a breakfast treat that was flipped by a factory employee. The machine allowed Eggo to prepare thousands of freezer-bound waffles per hour. These debuted in grocery stores in 1953 under the name Froffles, a portmanteau of “frozen” and “waffles.” Customers referred to them simply as “Eggos,” and the Froffles moniker was dropped within two years.

Credit: Dorin_S/ iStock

Canada Has a Global Strategic Maple Syrup Reserve

A rainy-day cache of sweet, sticky maple syrup may seem more like a luxury than a necessity, but it’s a big deal to Canada, which produces more than 70% of the world’s supply from maple trees grown in the province of Quebec. As such, the Federation of Quebec Maple Syrup Producers (QMSP) founded the Global Strategic Maple Syrup Reserve in 2000 to help regulate the profitable business. Covering an area of 267,000 square feet across three facilities, the reserve has endured poor sugaring seasons and the dastardly theft of some $20 million worth of barrels in 2012. And even when the COVID-19 pandemic forced many families to fulfill their pancake cravings at home, the QMSP promised to keep pace by announcing that it would release more than half of its 100 million-pound reserve in 2022.

Credit: eclipse_images/ iStock

Pineapples Were Once So Valuable People Rented Them for Parties

In the 1700s, party hosts and guests looking to make a statement were in the rental market for a special kind of accessory: pineapples. The message they were trying to send? That they were extravagantly wealthy. Prior to the 20th century, when pineapple plantations made the fruit widely available, pineapples were incredibly expensive imports to Europe (and most other places). In the 18th century, a single fruit bought in Britain could cost upwards of $8,000 in today’s money.

Christopher Columbus is credited with introducing pineapples to Europe in the 1490s after voyaging to the Americas. Just one survived his return journey, and the bromeliad quickly had an impact. Dubbed the “king of fruits,” the pineapple became a symbol of opulence and royalty because of its scarcity. Pineapples were featured in paintings of kings, printed on linens and wallpaper, and even carved into furniture. Obtaining a rare pineapple meant the buyer had money and status — and for that reason, the fruit was also often featured as decor at parties and events. Eventually, European botanists learned to grow pineapples in greenhouses and reduce their cost. But until the fruits were widely available, many partygoers in Britain would seek out a pineapple for just one night, renting the fruit for a fraction of its full price and sometimes even carrying it around at the party as the ultimate (uneaten) accessory.

Credit: Werner Schneider/ iStock

Carrots Weren’t Originally Orange

Today carrots are practically synonymous with the color orange, but their familiar hue is a relatively recent development. When the carrot was first cultivated 5,000 years ago in Central Asia, it was often a bright purple. Soon, two different groups emerged: Asiatic carrots and Western carrots. Eventually, yellow carrots in this Western group (which may have developed as mutants of the purple variety) developed into their recognizable orange color around the 16th century, helped along by the master agricultural traders of the time — the Dutch.

A common myth says the Dutch grew these carrots to honor William of Orange, the founding father of the Dutch Republic, but there’s no evidence of this. What’s more likely is that the Dutch took to the vegetable because it thrived in the country’s mild, wet climate. (Although the orange color may have first appeared naturally, Dutch farmers made it the predominant hue by selectively growing orange roots — scholars say these carrots likely performed more reliably, tasted better, and were less likely to stain than the purple versions.) The modern orange carrot evolved from this period of Dutch cultivation, and soon spread throughout Europe before making its way to the New World. Today, there are more than 40 varieties of carrots of various shapes, sizes, and colors — including several hues of purple.

Credit: Adam Smigielski/ iStock

Bananas Are Slightly Radioactive

Mentions of radioactivity can send the mind in a dramatic direction, but many ordinary items are technically radioactive — including the humble banana. Radioactivity occurs when elements decay, and for bananas, this radioactivity comes from a potassium isotope called K-40. Although it makes up only 0.012% of the atoms found in potassium, K-40 can spontaneously decay, which releases beta and gamma radiation. That amount of radiation is harmless in one banana, but a truckload of bananas has been known to fool radiation detectors designed to sniff out nuclear weapons. In fact, bananas are so well known for their radioactive properties that there’s even an informal radiation measurement named the Banana Equivalent Dose, or BED.

So does this mean bananas are unhealthy? Well, no. The human body always stores roughly 16 mg of K-40, which technically makes humans 280 times more radioactive than your average banana. Although bananas do introduce more of this radioactive isotope, the body keeps potassium in balance (or homeostasis), and your metabolism excretes any excess potassium. A person would have to eat many millions of bananas in one sitting to get a lethal dose (at which point you’d likely have lots of other problems).
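The “280 times” comparison is easy to sanity-check with a back-of-the-envelope calculation. The sketch below uses the article’s 0.012% K-40 figure plus two ballpark assumptions not taken from the article: roughly 140 grams of potassium in an adult body and about 0.5 grams of potassium per banana.

```python
# Sanity check of the body-vs-banana radioactivity comparison.
# Assumed figures (not from the article): ~140 g of potassium in an
# adult human body, ~0.5 g of potassium in one banana.
K40_FRACTION = 0.00012  # ~0.012% of potassium is radioactive K-40

body_potassium_g = 140.0
banana_potassium_g = 0.5

# Convert each potassium amount to milligrams of K-40.
body_k40_mg = body_potassium_g * K40_FRACTION * 1000    # ~16.8 mg, matching "roughly 16 mg"
banana_k40_mg = banana_potassium_g * K40_FRACTION * 1000

ratio = body_k40_mg / banana_k40_mg
print(round(body_k40_mg, 1), round(ratio))  # 16.8 280
```

Because the K-40 fraction is the same in all potassium, the ratio of radioactivity reduces to the ratio of total potassium, so the body-to-banana comparison holds regardless of the exact isotope fraction used.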

Credit: dogayusufdokdok/ iStock

Cheese Is the World’s Most-Stolen Food

Each year, about 4% of the world’s cheese supply is stolen — making it the most-stolen food in the world. Cheese, after all, is big business: Global sales exceeded $114 billion in 2019. In Italy, Parmesan is so valuable it can be used as loan collateral, according to CBS News. Consequently, the black market for cheese is thriving. From 2014 to 2016, organized crime was responsible for stealing about $7 million of Parmesan. And dairy-based crime definitely isn’t limited to Italy: In 2009, a duo of cheese thieves in New Zealand led police on a high-octane car chase — and tried to throw off the pursuit by tossing boxes of cheddar out the window.

Credit: Liliia Bila/ iStock

The Ancient Romans Thought Eating Butter Was Barbaric

Our friends in ancient Rome indulged in a lot of activities that we would find unseemly today — including and especially gladiators fighting to the death — but they drew the line at eating butter. To do so was considered barbaric, with Pliny the Elder going so far as to call butter “the choicest food among barbarian tribes.” In addition to a general disdain for drinking too much milk, Romans took issue with butter specifically because they used it for treating burns and thus thought of it as a medicinal salve, not a food.

The Greeks also considered the dairy product uncivilized, and “butter eater” was among the most cutting insults of the day. In both cases, this can be partly explained by climate — butter didn’t keep as well in warm southern climates as it did in northern Europe, where groups such as the Celts gloried in their butter. Instead, the Greeks and Romans relied on olive oil, which served a similar purpose.

Credit: alvarez/ iStock

Sweet Potatoes Aren’t Potatoes

Sweet potatoes and common potatoes share part of a name and the spotlight at Thanksgiving meals, but the two are entirely different plants — and sweet potatoes aren’t even potatoes. While both root vegetable species are native to Central and South America, they’re classified as unrelated. Sweet potatoes belong to the Convolvulaceae family, a group of flowering plants that’s also called the morning glory family. Potatoes belong to the nightshade (Solanaceae) family, and are cousins to peppers, tomatoes, and eggplants. Both species get their name from an Indigenous Caribbean term, batata, which eventually morphed into the English “potato.” By the 1740s, “sweet” was added to the orange-fleshed tuber’s name to differentiate the two root crops.

Meanwhile, yams are biologically unrelated to either sweet potatoes or common potatoes. These tubers belong to the Dioscoreaceae family, a group of flowering plants usually cultivated in tropical areas. Luckily, you don’t have to know their scientific classification to distinguish between the two non-spuds at the grocery store: Sweet potatoes have tapered ends and relatively smooth skin, while true yams are generally larger with rough bark and a more cylindrical shape. At most U.S. grocery stores, what you’re seeing labeled as a yam is probably actually a sweet potato.

Credit: seanrmcdermid/ iStock

Nutmeg Is a Hallucinogen

Today, nutmeg is used in the kitchen to add a little zing to baked goods and cool-weather drinks, though at various times in history it’s been used for fragrance, medicine … and its psychotropic properties. That’s possible thanks to myristicin, a chemical compound found in high concentrations in nutmeg, but also produced in other foods like parsley and carrots. Myristicin is able to cause hallucinations by disrupting the central nervous system, causing the body to produce too much norepinephrine — a hormone and neurotransmitter that transmits signals among nerve endings. While the idea of conjuring illusions of the mind might sound intriguing, nutmeg intoxication also comes with a litany of unpleasant side effects, including dizziness, confusion, drowsiness, and heart palpitations, so don’t try this at home.

Credit: Hiob/ iStock

Honey Never Expires

As long as it’s stored properly, honey will never expire. Honey has an endless shelf life, as proven by the archaeologists who unsealed King Tut’s tomb in 1923 and found containers of honey within it. After performing a not-so-scientific taste test, researchers reported the 3,000-year-old honey still tasted sweet.

Honey’s preservative properties have a lot to do with how little water it contains. Some 80% of honey is sugar, and only about 18% is water. Having so little moisture makes it difficult for bacteria and microorganisms to survive. Honey is also so thick that little oxygen can penetrate it — another barrier to bacteria’s growth. Plus, the substance is extremely acidic, thanks to a special enzyme in bee stomachs called glucose oxidase. When mixed with nectar to make honey, the enzyme produces gluconic acid and hydrogen peroxide, byproducts that lower the sweetener’s pH level and kill off bacteria. In most cases, honey can be safely stored for years on end — just make sure it’s in a sealed container (and check out these five other foods that almost never expire).

Credit: carterdayne/ iStock

Misting Produce Is a Clever Way To Make You Buy More

Many grocery stores display produce in open cases fitted with tiny jets to periodically bathe the veggies in a cool mist. (Some supermarkets even pipe in the sound of thundering rain to add to the rainy vibe.) The purpose behind misting is not to keep produce clean or extend its shelf life — it’s a clever way for grocers to make the fruits and vegetables look fresher and healthier so consumers purchase more. Water clinging to leafy greens also adds weight, which increases revenue for the store when vegetables are sold by the pound.

Ironically, misting actually shortens produce’s shelf life because water allows bacteria and mold to take hold. Misted veggies will likely not last as long in your fridge as those that weren’t misted in the produce aisle — which is another, perhaps sneakier, way to get you to buy produce more often.

Credit: jenifoto/ iStock

Plant Milks Have Been Around for 5,000 Years

For years, dairy producers have sued alternative milk companies for using the word “milk” on their packaging — but history is not on their side. Evidence suggests the Romans applied the word “milk” broadly: the word “lettuce” itself derives from the Latin root “lact-” (as in “lactate”), a nod to the plant’s milky sap. Many medieval cookbooks make reference to almond milk, and the earliest mention of soy milk can be found on a Chinese stone slab from around the first to third century CE. However, coconut milk has the longest history; archaeologists have recovered coconut graters among relics from Madagascar and Southeast Asia that date back to around 3000 to 1500 BCE.

Credit: kirin_photo/ iStock

“Continental Breakfast” Is a British Term for Breakfast on the European Continent

Many hotels offer guests a free “continental” breakfast with their stay, but what exactly makes a breakfast “continental”? The term originated in mid-19th-century Britain as a way to distinguish the hearty English breakfast — typically consisting of eggs, bacon, sausage, toast, and beans — from the lighter fare found in continental Europe, in places like France and the Mediterranean countries. A continental breakfast typically consists of pastries, fruits, toast, and coffee served buffet-style. As American breakfasts also tended to feature outsized helpings of protein and fruits, the “continental” moniker proved useful for hotels on the other side of the Atlantic as well.

Credit: monticelllo/ iStock

Chickens Might Be Among the Closest Living Relatives of the Tyrannosaurus Rex

Dinosaurs still live among us — we just call them birds. Today, scientists consider all birds a type of dinosaur, descendants of creatures who survived the mass extinction event at the end of the Cretaceous period. And yes, that even includes the chicken. In 2008, scientists performed a molecular analysis of a shred of 68 million-year-old Tyrannosaurus rex protein, and compared it to a variety of proteins belonging to many different animals. Although proteins from alligators were relatively close, the best match by far belonged to ostriches — the largest flightless birds on Earth — and the humble chicken.

Following the initial 2008 study, further research has proved that a chicken’s genetic lineage closely resembles that of its avian dinosaur ancestors. Scientists have even concluded that a reconstruction of T. rex’s chromosomes would likely produce something similar to a chicken, duck, or ostrich. Meanwhile, some archaeological evidence supports an idea that the earliest human-raised chickens may not have been eaten, but instead revered and possibly even used as psychopomps, aka animals tasked with leading the deceased to the afterlife.

Credit: elenaleonova/ iStock

Many “Nuts” Aren’t Actually Nuts

Botanically speaking, a nut is a fruit with a hard shell containing a single seed. The true nuts you might encounter in the produce aisle include hazelnuts and chestnuts. Many of the products sold as “culinary nuts” belong to other botanical classifications. Cashews, almonds, and pistachios are drupes, a type of fruit with thin skin and a pit containing the seed. (Peaches, mangos, cherries, and olives are also drupes.) And the jury is still out on whether walnuts and pecans fall into the nut or drupe category, since they have characteristics of both; some botanists call them drupaceous nuts.

Credit: Denisfilm/ iStock

Tomatoes Have More Genes Than Humans

We humans have somewhere between 20,000 and 25,000 genes — a sizable number to be sure, but still considerably fewer than the 31,760 in everyone’s favorite nightshade, the tomato. Though scientists still aren’t sure why tomatoes have such a complex genome, an emerging theory relates to the extinction of the dinosaurs. Around the time those giant creatures disappeared from Earth, the nightshade family (Solanaceae) tripled its number of genes. Eventually the superfluous copies of genes that served no biological purpose disappeared, but that still left a lot of functional ones; some believe the extra DNA helped tomatoes survive during an especially perilous time on the planet, when it was likely still recovering from the aftereffects of a devastating asteroid.

Humans, meanwhile, have two copies of every gene: one from their mother and one from their father. But the number of genes doesn’t necessarily imply biological sophistication; what matters is how an organism “manages its cells’ affairs.” Simply put, humans make more efficient use of the genes they have.

Credit: Floortje/ iStock

Gummy Bears Owe Their Shape to Dancing Bears

In 19th-century Europe, it wasn’t uncommon to see trained bears frolicking down the streets in celebration of a parade or festival. Called “dancing bears,” these animals would skip, hop, whirl, twirl, and perform an array of tricks. Fast-forward to the 1920s, when German candymaker Hans Riegel was searching for a clever way to sell his gelatin-based confections to children. Recalling the two-stepping bears of yore, Riegel decided to make an Ursus-shaped candy called Tanzbär (literally “dancing bear”). The snacks were a huge success. Today, you probably know Riegel’s company as Haribo.

Credit: LauriPatterson/ iStock

Egg Creams Contain Neither Eggs nor Cream

Foods tend to get their names from their appearance or ingredients, though not all are so clear-cut. Take, for instance, the egg cream, a beverage that has delighted the taste buds of New Yorkers (and other diner patrons) since the 1890s. But if you’ve never sipped on the cool, fizzy drink known for its chocolate flavor and foamy top, you should know: There are no eggs or cream in a traditional egg cream drink.

According to culinary lore, the first egg cream was the accidental invention of Louis Auster, a late-19th- and early-20th-century candy shop owner in New York’s Lower East Side. Auster’s sweet treat arrived in the 1890s, at a time when soda fountains had started selling fancier drinks, and it was a hit — the enterprising inventor reportedly sold upwards of 3,000 egg creams per day by the 1920s and ’30s. However, Auster kept his recipe well guarded; the confectioner refused to sell his formula, and eventually took his recipe to the grave. The origins of the drink’s name have also been lost to time. Some believe the name “egg cream” came from Auster’s use of “Grade A” cream, which could have sounded like “egg cream” with a New York accent. Another possible explanation points to the Yiddish phrase “echt keem,” meaning “pure sweetness.”

Feature image credit: Original photo by kate_sept2004/ iStock


Original photo by yipengge/ iStock

According to Ethiopian legend, the first beings to get a java jolt were a herd of goats who nibbled on the fruit of a coffee shrub; a goat herder named Kaldi quickly followed their lead. While it’s unclear whether Kaldi and his caprines truly discovered coffee, today the brewed beverage is one of the most widely consumed drinks in the world. Introduced to America in the mid-17th century, coffee quickly replaced heavily taxed tea as a more patriotic staple during the fight for independence. In the time since, soldiers have relied on coffee to boost morale overseas, children of U.S. Presidents have founded their own coffeehouses, and American coffee brands have expanded across the globe. Can’t get enough coffee? Discover 12 amazing facts you might not know about this beloved morning beverage.

Young woman holding pink cup of coffee.
Credit: Svitlana Hulko/ Shutterstock

Regularly Consuming Coffee May Have Health Benefits

Coffee was once believed to be an unhealthy (and possibly dangerous) indulgence, but newer research suggests it actually has health benefits that could extend your life. Researchers have found that people who drink moderate amounts of coffee — about two to five cups a day — have lower risks of developing Type 2 diabetes, Parkinson’s disease, heart disease, and some cancers. Both regular and decaf coffee offer these perks; it’s not the caffeine that helps, but likely the polyphenols, plant compounds found in coffee that work as antioxidants.

View of an espresso machine about to pour espresso into white cup.
Credit: Henrique S. Ruzzon/ Unsplash

Espresso Was Invented to Speed Up Coffee Breaks

At least three Italian inventors played a role in creating espresso. Angelo Moriondo was the first; his “steam machinery for the economic and instantaneous confection of coffee beverage” was patented in 1884, though it could only brew large batches of coffee and never became commercially available. Nearly two decades later, Luigi Bezzera created his own espresso machine, which cut down brewing time from several minutes to just 30 seconds (much better for workers on coffee breaks). With the help of inventor Desiderio Pavoni, Bezzera’s reworked machine — which most closely resembles espresso machines used in coffee shops today — debuted at the 1906 World’s Fair in Milan, where the duo sold “caffè espresso.”

Chicory drink in a white mug with chicory flowers.
Credit: Kabachki.photo/ Shutterstock

New Orleans Is Known for an Herby Coffee Alternative

Coffee drinkers in France have long blended their brews with chicory, a blue-flowered herb native to Europe and Asia with roots that offer a coffee-like flavor when roasted. French settlers brought the practice to Louisiana, helping chicory coffee become a mainstay in times of conflict when real coffee imports were hard to come by — such as during the Napoleonic Wars in the early 1800s, and later during the Civil War. However, amid some conflicts, like World War I, chicory was in so much demand that the once-cheap substitute cost more than actual coffee. Today, many coffee drinkers in New Orleans continue to enjoy their java with chicory included.

19th Century illustration of a Capuchin monk.
Credit: Bettmann via Getty Images

The Word “Cappuccino” Was Inspired by Monks

Italy is often called the world’s coffee capital, so it’s no wonder that many of the words we use to describe a cup of joe come from Italian. Take, for example, the cappuccino (a drink made from espresso and steamed milk), which gets its name from a 16th-century order of Italian monks. The Capuchin friars were known for helping those experiencing poverty; as such, they renounced wealth themselves and wore simple brown robes, each with a long, pointed hood known as a “cappuccio.” The earliest cappuccino drinks, which emerged around the 1700s, were nicknamed after these religious figures because adding milk to espresso resulted in a color similar to that of the monks’ attire.

The man is pouring hot coffee from the coffee pot into a white coffee.
Credit: snapper8S8/ Shutterstock

Black Coffee Cools Faster Than Coffee With Cream

Adding a little cream to your cup of coffee can improve the flavor and possibly help it stay warm for longer. Some food scientists believe that coffee with cream cools about 20% more slowly than plain black coffee, thanks to three rules of physics. Darker colors emit heat faster than lighter colors, so adding cream to lighten the drink’s hue may slow down heat loss. Hotter surfaces also radiate heat faster, so plain coffee will cool faster than a cup that’s been slightly chilled by adding cold cream. Viscosity is also a factor: Cream thickens coffee, making a steamy cup evaporate more slowly. Since evaporation causes heat loss, the less there is, the more time you’ll have to enjoy coffee before it’s too cold.
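The reasoning above can be sketched with a toy model of Newton’s law of cooling. Every number in this sketch (the cooling constants, the mixing ratio, and the temperatures) is an illustrative assumption rather than a measurement, with the creamed cup’s cooling constant simply set about 20% lower per the estimate mentioned above:

```python
# Toy Newton's-law-of-cooling sketch of why creamed coffee can stay
# warmer. The cooling constants, mixing ratio, and temperatures below
# are illustrative assumptions, not measured values.

def cool(temp, env, k, minutes, dt=0.01):
    """Step temperature forward using dT/dt = -k * (T - env)."""
    for _ in range(int(minutes / dt)):
        temp -= k * (temp - env) * dt
    return temp

ROOM = 20.0      # ambient temperature, deg C
BLACK_K = 0.05   # assumed cooling constant for black coffee (per minute)
CREAM_K = 0.04   # ~20% slower heat loss, per the estimate above

# Black coffee starts hot and sheds heat quickly.
black = cool(90.0, ROOM, BLACK_K, minutes=20)

# Adding a splash of 5 deg C cream (1 part cream to 10 parts coffee)
# drops the starting temperature immediately...
creamed_start = (10 * 90.0 + 1 * 5.0) / 11
# ...but the cooler, lighter-colored, more viscous cup then loses
# heat more slowly.
creamed = cool(creamed_start, ROOM, CREAM_K, minutes=20)

print(f"after 20 min: black {black:.1f} C, creamed {creamed:.1f} C")
```

In this toy run the creamed cup, despite starting cooler, ends up a couple of degrees warmer after 20 minutes; with a shorter wait or a smaller gap between the cooling constants, black coffee can still come out ahead, which is why the real-world effect is modest.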

A glass of coffee jelly.
Credit: kaorinne/ iStock

Coffee Jelly Originated in 19th-Century England

Home cooks of the early 1800s could try their hand at making coffee jelly, a dessert that originated in England and later spread to the Eastern U.S. Coffee jelly was promoted as an alternative to the hot beverage for those who didn’t like the taste or whose stomachs didn’t agree with the acidity. The jiggly dessert was considered a healthy option for people who were sick, or could be eaten as an after-dinner curative for people who drank too much alcohol at mealtimes. While coffee jelly is now a rarity in the United States, it is commonly found in Japan, where it’s a popular treat.

A worker processing the coffee berries by putting them in a machine to be washed.
Credit: grandriver/ iStock

Coffee Beans Aren’t Actually Beans

It turns out that the name you’re familiar with for those tiny pods that are ground and brewed for a fresh cup of joe is a misnomer. Coffee “beans” are actually the seeds found within coffee cherries, a reddish fruit harvested from coffee trees. Farmers remove the skin and flesh from the cherry, leaving only the seed inside to be washed and roasted.

Coffee farming is a major time investment: On average, a tree takes three or four years to produce its first crop of cherries. In most of the Coffee Belt — a band along the equator where most coffee is grown that includes the countries of Brazil, Ethiopia, and Indonesia — coffee cherries are harvested just once per year. In many countries, the cherries are picked by hand, a laborious process.

Black coffee on a black background with inscription decaf made with white chalk.
Credit: Alena A/ Shutterstock

Decaf Coffee Is Still a Tiny Bit Caffeinated

Decaf has helped coffee drinkers enjoy the taste of their favorite brew without (much of) the jolting effects of caffeine, but its creation was entirely accidental. According to legend, around 1905 German coffee merchant Ludwig Roselius received a crate of coffee beans that had been drenched with seawater. Trying to salvage the beans, the merchant roasted them anyway, discovering that cups brewed with the beans retained their taste (with a little added salt) but didn’t have any jittery side effects. Today, the process for making decaf blends remains relatively similar: Beans are soaked in water or other solvents to remove the caffeine, then washed and roasted. However, no coffee is entirely free of caffeine. An estimated 97% of the caffeine is removed during preparation, but a cup of decaf still contains about 2 milligrams of caffeine, compared to regular coffee’s 95 milligrams.

Johann Sebastian Bach, a German musician and composer, playing the organ.
Credit: Rischgitz/ Hulton Archive via Getty Images

Bach Wrote a Comic Cantata About Coffee

Johann Sebastian Bach is remembered as one of the world’s greatest composers, known for orchestral compositions such as the Brandenburg Concertos. But one of Bach’s lesser-known works is Schweigt stille, plaudert nicht (“Be Still, Stop Chattering”) — a humorous ode to coffee popularly known as the Coffee Cantata. Written sometime in the 1730s, Bach’s comic cantata makes light of fears at the time that coffee was an immoral beverage entirely unfit for consumption. In the 18th century, coffee shops in Europe were known to be boisterous places of conversation, unchaperoned meeting places for young romantics, and the birthplaces of political plots. A reported lover of coffee, Bach wrote a 10-movement piece that pokes fun at the uproar. The cantata tells the story of a father attempting to persuade his daughter to give up her coffee addiction so that she might get married, but in the end, she just becomes a coffee-imbibing bride.

Drinking coffee at breakfast.
Credit: yipengge/ iStock

The First Webcam Was Invented For a Coffee Pot

We can credit coffee-craving inventors for creating the first webcam. In the early 1990s, computer scientists working at the University of Cambridge grew tired of trekking to the office kitchen for a cup of joe only to find the carafe in need of a refill. The solution? They devised a makeshift digital monitor — a camera that uploaded three pictures per minute of the coffee maker to a shared computer network — to guarantee a fresh pot of coffee was waiting the moment their mugs emptied. By November 1993, the in-house camera footage made its internet debut, and viewers from around the globe tuned in to watch the grainy, real-time recording. The world’s first webcam generated so much excitement that computer enthusiasts even traveled to the U.K. lab to see the setup in real life. In 2001, after the camera was finally switched off, the coffee pot sold at auction for nearly $5,000.

US President Bill Clinton and British Prime Minister Tony Blair confer over cups of coffee.
Credit: PAUL J. RICHARDS/AFP via Getty Images

Coffee Has Long Been a Staple in the Oval Office

Coffee has a long political history in the U.S. — colonists who tossed heavily taxed tea into Boston Harbor switched to drinking the caffeinated brew as part of their rebellion. But even after the Revolutionary War’s end, American leaders held an enduring love for the beverage. George Washington grew coffee shrubs at his Mount Vernon estate (though, given the climate, they likely never produced beans), while Thomas Jefferson loved coffee so much that he estimated using a pound per day at Monticello during his retirement. Similarly, Theodore Roosevelt reportedly consumed an entire gallon of coffee each day, and George H.W. Bush was known for imbibing up to 10 daily cups.

Aerial view of 3 cups of coffee.
Credit: Nathan Dumlao/ Unsplash

Your Genes Might Determine How Much Coffee You Drink

If you can’t get through the day without several cups of coffee, you may have your genes to blame. A 2018 study suggests inherited traits determine how sensitive humans are to bitter compounds like caffeine and quinine (found in tonic water). Researchers found that people with genes that allow them to strongly taste bitter caffeine were more likely to be heavy coffee drinkers (defined as consuming four or more cups daily). It seems counterintuitive that people more sensitive to bitter tastes would drink more coffee than those with average sensitivity — after all, bitter-detecting taste buds likely developed as the body’s defense against poisoning. But some scientists think human brains have learned to bypass this warning system in favor of caffeine’s energizing properties. The downside? Constant coffee consumers are at higher risk of developing caffeine addiction.


Original photo by Sergey Kirsanov/ iStock

Fries without ketchup, pancakes without syrup — what would your favorite dishes be like without a little sauce? Condiments can make or break a meal; the word, after all, comes from the Latin condimentum, meaning “seasoning” or “spice.” Take a moment to appreciate all the taste bud sensations that sidekick sauces can provide with these eight facts.

Pouring Soy Sauce into bowl.
Credit: ffolas/ Shutterstock

Soy Sauce Was Originally Made From Meat

The soy sauce you find at grocery stores today typically contains just four simple ingredients — soybeans, wheat, salt, and water — which are blended and fermented over several months or years to give the sauce its umami flavor. However, the oldest known types of soy sauce used meat in place of legumes. Called jiang, the flavoring was a thick and pasty blend of meat, a fermenting agent made from millet, and salt that fermented for about 100 days; it was ready when the meat had entirely dissolved. Food historians believe Chinese soy sauce makers eventually ditched using meat and switched to soybeans about 2,000 years ago.

Close-up of a small bowl of mayonnaise.
Credit: Sara Cervera/ Unsplash

Mayonnaise May Have Been Named After a Port City

The origins of mayonnaise are heavily debated among food historians, particularly regarding the question of whether the creamy spread was invented by the Spanish or the French. One commonly told tale dates back to 1756 during the Seven Years’ War, when French forces laid siege to Minorca’s Port Mahon (then ruled by the British). After the battle, a French chef working for the invading forces reportedly blended egg and oil together for a celebratory meal, calling the finished product “mahonnaise” after the region. However, some researchers believe residents of Port Mahon had already been making and using mayonnaise (their version was called salsa mahonesa). Regardless of who created it, mayo became linked with French cooking by the early 19th century, and the multipurpose dressing reached American menus by the 1830s.

Close-up of several Heinz ketchup packets.
Credit: Jacob Rice/ Unsplash

White House Staff Kept Ketchup on Hand for One President’s Breakfast

Among White House staff, Richard Nixon’s love of cottage cheese was well known. During his time in the Oval Office, the 37th President regularly enjoyed a breakfast of fruit, wheat germ, coffee, and cottage cheese topped with ketchup. (His last meal in office nixed the condiment, but did include a tall glass of milk and cottage cheese atop pineapple slices.)

A bowl of peanut butter next to sliced apples.
Credit: Natalie Behn/ Unsplash+

Nearly All American Shoppers Buy Peanut Butter

There’s one condiment you’ll have a good chance of finding in pantries across the country: peanut butter. In 2023, 90% of U.S. households included the smooth and creamy spread on their grocery lists, and Americans consumed about 4.4 pounds of peanut butter per capita that year. The culinary craving first became popular during World War I, when peanut butter was an inexpensive and easily accessible protein during wartime rationing.

Tapping a Maple Tree for Syrup.
Credit: brm/ Shutterstock

Syrup-Producing Trees Have a Special Name

Making pure maple syrup is a time-intensive endeavor that starts in “sugarbushes,” aka groves of maple trees. Syrup farmers can wait up to 40 years for a maple tree to grow large enough to be tapped, and even then, a tree typically produces just 10 gallons of sap per tap hole per season. After the excess water is boiled off, that’s enough to make about 1 quart of maple syrup.

Close-up of Tabasco brand Louisiana style hot sauce.
Credit: Smith Collection/Gado/ Archive Photos via Getty Images

There’s a Hot Sauce-Themed Opera

Not many foods are the stars of an opera, but one hot sauce is. Boston composer George Whitefield Chadwick debuted Tabasco: A Burlesque Opera in 1894. It tells the story of an Irish traveler lost at sea who washes ashore in Morocco, finds work as a chef, and creates spicy dishes (his secret ingredient: Tabasco). Chadwick’s opera was partially financed by the McIlhenny Company — the maker of Tabasco. In its first week, it turned a profit of $26,000.

Ernest Hemingway standing looking out into nature.
Credit: Fotosearch/ Archive Photos via Getty Images

Ernest Hemingway’s Burger Recipe Used Tons of Condiments

One of Ernest Hemingway’s lesser-known creations wasn’t a novel, but a hamburger. His recipe included a smattering of condiments inside the mixture rather than on top. The author’s technique called for wine, garlic, and sometimes ground almonds, but also several different spice blends and relishes. His recommendation for getting the meat perfectly ready for the grill? Let it “sit out of the icebox for 10 to 15 minutes while you set the table and make the salad.”

Garum fish sauce.
Credit: VadimZakirov/ iStock

Historians Have Recreated a 2,000-Year-Old Condiment

You can get a taste of how ancient Romans and Greeks once ate with a little dash of garum, a fish sauce that was popular about 2,000 years ago. Historians relied on surviving recipes for instructions, which included steps like leaving fish to break down in open containers for three months. But it wasn’t until clay pots from a garum-making shop in Pompeii were unearthed that researchers found residue of the sauce itself, which could be analyzed for additional ingredients, such as dill, fennel, and coriander, that help the salty, umami-flavored sauce shine.

Nicole Garner Meeker
Writer

Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.

Original photo by bymuratdeniz/ iStock

It’s hard to resist the call of break room doughnuts, or the allure of a leisurely stroll down the grocery store snack food aisle. While it may not have much nutritional value, sometimes a good helping of a crispy, crunchy, sweet, salty, or sour snack is just what we need to remedy a bad mood or a tough day. Grab a snack and unwrap these eight facts about popular junk foods.

Making deep fried doughnuts.
Credit: nycshooter/ iStock

Doughnuts Cook Better Because of Their Holes

Ever wondered why doughnuts have holes? Historians aren’t certain why (or when) the doughy centers disappeared, but one theory suggests it may have been to help the pastries cook more evenly. According to food lore, American sailor Hanson Gregory created the doughnut’s modern shape around 1847 while at sea; by his account, doughnuts of the time were twisted or diamond-shaped and often cooked faster on the outsides than in the centers. Removing the dense middles helped create uniformly cooked treats that fried quickly and didn’t absorb as much oil.

Cake stuffed with cream, like a Twinkie.
Credit: Phacharason Mongkhonwikuldit/ iStock

Twinkies Got Their Name From a Shoe Advertisement

The spongy, cream-filled cakes we call Twinkies were first created in 1930 in an attempt to put unused bakery pans back into production. Creator James Dewar was a manager at the Continental Baking Company outside Chicago, where he noticed the factory’s strawberry shortcake-making equipment sat idle once strawberry season ended. Dewar used the pans to bake small cakes injected with cream fillings, naming his invention Twinkies after seeing a billboard for Twinkle Toe Shoes.

Putting a straw in a Big Gulp soda from 7-Eleven.
Credit: San Francisco Chronicle/Hearst Newspapers via Getty Images

7-Eleven and Coca-Cola Teamed Up To Create the Big Gulp

Supersized drinks are just one of the junk food finds you can pick up at 7-Eleven, thanks to a partnership between the convenience store chain and soda manufacturer Coca-Cola. Representatives from the beverage brand approached 7-Eleven leadership in 1976 about upgrading cup sizes from 20 ounces to 32 ounces; after successful market testing at locations in Southern California, 7-Eleven rolled out its larger cups nationally in 1980. However, the supersizing didn’t stop there — the chain introduced its 44-ounce Super Gulp six years later, and launched the 64-ounce Double Gulp in 1989.

Close-up of potato chips in a bag.
Credit: Akarat Thongsatid/ Shutterstock

Potato Chips Were Nearly Discontinued During World War II

In the midst of World War II, the U.S. War Production Board was tasked with making the most of limited materials for the war effort, pausing manufacturing of noncritical foods and items. One of the items on the chopping block: potato chips. The snack was initially considered “nonessential,” a move that would stop factories from producing potato chips until the war ended. However, chip manufacturers lobbied to rescind the ruling and even secured contracts to produce chips for troops overseas and workers in manufacturing plants. One such company — Albany, New York’s Blue Ribbon potato chip brand — chipped in about 7 million pounds of crisps to the war effort in just nine months.

A teenage girl drinking a cold slushie.
Credit: Rick Rudnicki/ Alamy Stock Photo

The Spoon Straw Is in the Museum of Modern Art’s Collection

Spoon straws make it easier to gulp down frosty drinks, but usually get little thought once the Slurpee is done. That’s exactly why the Museum of Modern Art keeps one in its collection. In 2004, a single spoon straw was featured as part of the museum’s “Humble Masterpieces” exhibit, which highlighted around 120 simple, everyday items. The spoon straw’s inventor — engineer Arthur Aykanian — held more than 40 patents, some straw-related, and others not so much, like medical tools used in skin cancer treatment.

Poptart toaster pastries with icing on top.
Credit: Cook Shoots Food/ Shutterstock

Pop-Tarts Originally Had a Different Name

The first toaster pastries — called Country Squares — hit grocery store shelves in 1963, created by Post Cereals. Kellogg Company released its own version six months later, called the Fruit Scone. After further workshopping, Kellogg changed the name to Pop-Tarts (a play on the Pop Art movement of the 1950s and ’60s), and produced the treats in four flavors: strawberry, blueberry, apple-currant, and brown sugar cinnamon. However, the iconic hard icing didn’t top the toaster treats until 1967, four years after the snacks debuted.

A spread of junk food snacks.
Credit: Flotsam/ Shutterstock

Eating Junk Food Rewires Our Brains

There’s a reason ordering your favorite fast-food snack or indulging in a candy bar feels good. It all has to do with dopamine, the “feel-good” neurotransmitter in our brains that influences our moods, behaviors, and motivation. Eating junk food — especially items packed with sugar — triggers the brain’s reward system to release large amounts of dopamine, which makes us feel happy. Over time, our brains adapt to these dopamine rushes, blunting the reward response so that larger amounts of junk food are needed to produce the same satisfying feeling, a cycle that fuels junk food cravings.

A spread of junk food and desserts.
Credit: happy_lark/ iStock

You Can Celebrate National Junk Food Day Each Summer

Hot dogs aren’t the only food with their own day of celebration in July — the ever-expanding junk food category is honored in the same month. National Junk Food Day lands on July 21, giving you the chance to celebrate by indulging in all your favorite sweet and savory treats.
