Original photo by andzher/ Shutterstock

One of the last places on Earth to be settled by humans, Iceland is in a world of its own. From its Martian-like geology to its stories of invisible stone-dwelling elves, the country has always been just a little bit mysterious. If you’re lucky, a visit to Iceland will reveal the magical glow of the northern lights and the majesty of wild reindeer herds. But if travel to Reykjavík, the world’s northernmost national capital, or its environs isn’t on your agenda, these nine facts about Iceland might just tide you over.

Map of Iceland, travel destination.
Credit: AustralianCamera/ Shutterstock

It Was Almost Called “Snow Land”

Early historical tales called sagas (more on those below) record that Iceland – or Ísland in the local language – was not the original name for the place. According to one account, the first explorer to reach the island, blown off course on his way to the Faroe Islands, called it Snæland (“Snow Land”) because of all the snow. Later, a Viking who was trying to reach the Hebrides stumbled on Iceland and named it Garðar’s Isle, after himself.

Hearing about the wonders of this new land, a Viking named Flóki Vilgerðarson decided to settle there. After a brutal winter in which his livestock died, Flóki supposedly climbed a mountain hoping to spot better land. Seeing only a fjord full of icebergs, he dubbed the place “Iceland” and, after he returned to Norway, told everyone that it was a terrible place. Fortunately, people didn’t believe him, and some soon returned to settle there permanently, likely sometime in the late ninth century.

Leif Eriksson Discovers America.
Credit: Hulton Archive via Getty Images

Icelandic Sagas Mention North America

Historical information about early Iceland comes from its sagas, narratives that deal with everything from family feuds to the push to convert Iceland to Christianity. Likely originally passed down orally, the sagas were written down around the 13th and 14th centuries. One Icelandic saga even recounts Norse explorer Leif Eriksson’s journey to North America around 1000 CE, where he encountered Indigenous people possibly as far south as New England.

Altarpiece from Ogur, West Iceland.
Credit: Universal History Archive/ Universal Images Group via Getty Images

Iceland Once Mandated Christianity

The oldest parliament in the world, Iceland’s Alþingi (Althing or Assembly) decreed in 1000 CE that the country would become Christian. At the time, most of the settlers worshiped the Norse gods. Due to pressure from Norwegian King Olaf (who was Christian), and a growing divide within Iceland between pagans and Christians, a civil war threatened. The Althing decided to mandate conversion to Christianity — but with the understanding that certain pagan customs, such as eating horse meat, would not be prosecuted (at least initially) if done behind closed doors.

Hákarl, a culinary adventure from Iceland, showcases fermented shark meat.
Credit: Vimukthi avishka/ Shutterstock

Fermented Shark Is the National Dish

Hákarl is shark meat that has been fermented and cured for several months. Typically served as small, cheese-like cubes, hákarl reportedly tastes fishy and has an intense ammonia smell. For less adventurous eaters, puffin meat is a traditional Icelandic alternative. It is legal to hunt puffin in Iceland, and they often show up on restaurant menus, smoked and gamey. But the most popular Icelandic street food is pylsur, a hot dog made from lamb, pork, and beef and topped with a slew of chopped and fried onions.

Silfra Iceland underwater.
Credit: Hoiseung Jung/ Shutterstock

You Can Touch Europe and North America at the Same Time

Geographically, the rift between the North American and Eurasian continental plates runs straight through the middle of Iceland. One water-filled fissure in the rift, known as Silfra, can be visited within Thingvellir, a World Heritage Site that was also the location of the first meeting of the Althing parliament. Since the tectonic plates are drifting away from one another at about 2 centimeters per year, touching two continents at once requires diving down into the clear, cold glacial water at Silfra, where the plates are closer together.

Reynisfjara black sand beach on a sunny day, Iceland.
Credit: Georgina Burrows/ iStock

Iceland Has a Martian Landscape

With geological formations uncommon in much of the rest of the world — such as geysers, glaciers, and fjords — Iceland is a geologist’s dream. At the black sand beach near the town of Vik, for example, eagle-eyed beachcombers might spot a tiny bit of natural olivine (peridot) among the grains of sand. It’s a training ground for astronauts, too; Neil Armstrong and Buzz Aldrin went to Iceland in the 1960s to practice before their moon landing. Recent research shows that Iceland can also help scientists learn about Mars, particularly a fjord called Eyjafjörður in the north, which may be similar to the ancient Eridania Basin on Mars.

Myvatn Nature Baths near Lake Myvatn in Iceland.
Credit: Sylvia_Adams/ iStock

There’s Cheap, Unlimited Hot Water

Thanks to its abundant geothermal activity, the majority of Iceland’s energy use is renewable. While the Blue Lagoon is probably the most famous hot spring, the country’s geothermal water is generally put to more mundane uses, from residential hot water at the turn of a tap to central heating to melting ice on sidewalks and parking lots.

Huldufolk dwelling in Vik, Iceland.
Credit: jurassicjay / Stockimo/ Alamy Stock Photo

Icelanders Believe That “Hidden Folk” Live in the Lava Fields

Icelanders have a long tradition of believing in huldufólk, “hidden people” described as elves who live in large rocks found among the lava fields and elsewhere. Huldufólk are usually considered to be peaceful creatures, living in a parallel world but helping humans at times. Many contemporary Icelanders have small elf shrines in their yards and stories about odd coincidences that they attribute to the huldufólk. While this elf tradition likely dates back to the pre-Christian settlement of Iceland, it has been brought into the present by superstitious Icelanders who leave out treats for the huldufólk on Christmas Eve.

A variety of books along with Christmas decorations.
Credit: Thomas Bethge/ Shutterstock

There’s a “Book Flood” on Christmas

When the house is cleaned and the huldufólk are happy, Christmas in Iceland is also a time to curl up with a good book. The tradition, called Jólabókaflóðið or “Christmas book flood,” began during World War II when most luxury items — but not paper — were rationed. These days, Icelandic booksellers send out a catalog each year in mid-November so that people can choose their favorite books to give or receive. There’s a lot to choose from: Iceland has more authors per capita than any other country.

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.

Original photo by Allstar Picture Library Limited/ Alamy Stock Photo

Humphrey Bogart’s parents wanted him to be a doctor. It didn’t work out that way, to the benefit of moviegoers everywhere. Instead, he became a top box-office attraction in the 1940s and 1950s, playing tough guys opposite actresses such as Lauren Bacall (whom he married in 1945), Ingrid Bergman, and Katharine Hepburn. Today, he’s often remembered for his role as Morocco nightclub owner Rick Blaine in the all-time classic Casablanca. Read on for a few fascinating facts about Bogie’s childhood, his favorite pastime, and the star who unwittingly helped pave his way.

Full-length studio portrait of American actor Humphrey Bogart as a toddler in overalls.
Credit: Hulton Archive via Getty Images

Humphrey Bogart Was a Christmas Baby

Bogart was born on December 25, 1899, in New York City. His father, a surgeon and heart and lung specialist, descended from New York’s first Dutch colonial settlers. As an adult, Bogart displayed the family coat of arms on his wall. His mother, known as “Lady Maud” for her imperious manner, was a suffragette who stood on street corners selling balloons bearing the slogan “Votes for Women.” She worked as an illustrator and a portrait painter, and later as a magazine art director.

Humphrey Bogart at age ten months.
Credit: Bettmann via Getty Images

His Mother Dressed Him in Elaborate Clothing as a Child

Lady Maud liked to dress her son in Little Lord Fauntleroy suits she made herself. The outfit, named for a character in a novel, included velvet jackets and matching pants with a fancy blouse and a lace or ruffled collar. In his early teens, he wore white kid gloves and patent-leather pumps while dancing at formal parties. His mother used him as a model for her drawings, but reportedly was not affectionate, and he was mainly taken care of by servants.

Actor Humphrey Bogart poses for a portrait circa 1940 in Los Angeles, California.
Credit: Donaldson Collection/ Moviepix via Getty Images

Bogart Was Expelled From Prep School

Young Bogart attended the elite Trinity School in New York City, where he earned poor grades and didn’t participate in social activities. For his last year of high school, his parents sent him to Phillips Academy in Andover, Massachusetts, a prep school his father had also attended. His parents hoped he would next study medicine at Yale. But Phillips expelled him for his poor academic performance and all-around bad attitude, and Bogart joined the U.S. Naval Reserve instead.

Actor Humphrey Bogart and his wife, actress Lauren Bacall, pictured playing chess.
Credit: Archive Photos via Getty Images

Bogart Loved Chess

Bogart famously plays chess in Casablanca, and the scenes may have been written into the script to please him. In real life, as a young man, he was said to hustle players for dimes and quarters in New York parks and at Coney Island. Bogart was also a chess tournament director, and active in a Hollywood chess club. In a June 1945 interview, he said that he played chess almost daily, and described the game as one of his main interests.

American actor Humphrey Bogart as Sam Spade in The Maltese Falcon.
Credit: Silver Screen Collection/ Moviepix via Getty Images

Bogart Lived in the Shadow of Another Actor

A better-known actor at the time rejected the scripts for Dead End, High Sierra, and The Maltese Falcon (he passed on Double Indemnity, too), giving Bogart the chance to develop the roles in those future classics. The man who overshadowed Bogart back then? George Raft, hardly a household name today. At one point, Raft refused to accept Bogart as his co-star in the 1941 film Manpower. Ironically, Raft, unlike Bogart, knew the world of tough guys firsthand. He was also born in New York, but in Hell’s Kitchen, then a violent slum. It’s even sometimes said that Raft turned down the chance to play Rick Blaine in Casablanca, but it’s more likely that the studio never offered him the role — despite his campaigning for it.

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.

Original photo by simoly/ Shutterstock

Getting outside to see, hike, and sleep in the great outdoors is a classic summer activity, one that’s been popular among wilderness enthusiasts and nature novices for nearly 200 years. While camping has waxed and waned in popularity over the decades, the call of the wild beckoned more than 50 million Americans outdoors in 2020 and 2021, a pandemic-inspired trend that hasn’t let up. And with more than 130 national park sites offering campgrounds — plus thousands of state and local parks with their own overnight accommodations — there’s ample space to park an RV or set up a tent just about anywhere. Read on for five more facts about camping.

Camping tent behind union soldiers.
Credit: MPI/ Archive Photos via Getty Images

The Civil War Helped Popularize Camping in the U.S.

For Union and Confederate soldiers, camping wasn’t the fun activity we consider it today — it was a necessity of the conflict. Campaigns required soldiers on both sides to march long distances, carrying everything they needed to eat and sleep until they reached their next encampment (one possible origin for the word “camping”). While many Civil War soldiers did settle for longer periods of time in cabins and forts (especially during the freezing winter months), camping was a common occurrence.

At the time, sleeping under the stars wasn’t seen as glamorous, but that changed after the war’s end. In the years following the Civil War, camping slowly transformed from being a primitive military necessity to a romanticized activity. According to historian Phoebe S. K. Young, the idea of sitting around a campfire with friends, just like soldiers had, was one way the country tried to reframe the war’s impact during the tumultuous time of Reconstruction. (In other words, maybe parts of the war hadn’t been that bad, or so the idea went.) Campers of the later Victorian era set off into nature to test their survival skills, looking to get away from the creature comforts of (then) modern society, and promoting camping as a vacation from the rigidity of daily life — an idea that’s stuck around ever since.

Group of expediters in sleeping bags.
Credit: UniversalImagesGroup via Getty Images

Early Sleeping Bags Had a Different Name

Bed rolls and other camp bedding have been around as long as humans have been trying to get comfortable z’s while dozing on the ground; some of the oldest surviving sleep sacks were made from warm animal hides. But in 1876, Welsh inventor Pryce Jones rolled out his version of the sleeping bag, which most closely resembles the ones we pack on our camping trips today. It had a different name, though: the Euklisia Rug.

Made from wool, the Euklisia Rug was essentially a blanket that could be folded over its occupant and fastened closed to keep them warm; the original design even included a pocket for an inflatable pillow. Jones’ invention was initially picked up by the Russian army, which bought his design in bulk; 60,000 of his so-called rugs were purchased for troops during the Russo-Turkish War, though not all would be delivered. The inventor was stuck with 17,000 after Russia canceled its order during the conflict. He sold them through his mail-order business, which helped the product catch on.

R.R. Conklin's double-decker auto bus.
Credit: HUM Images/ Universal Images Group via Getty Images

The First RVs Appeared in 1910

Just two years after Henry Ford unveiled his Model T car, eager outdoor enthusiasts were looking for ways their automobiles could get them out into nature, and sleep there, too. In 1910, Pierce-Arrow’s Touring Landau debuted in Madison Square Garden, complete with many of the amenities modern recreational vehicles have today. The Touring Landau featured a foldable back seat that transformed into a bed, a sink that folded from the chauffeur’s seat, a telephone to communicate with the driver, and a toilet. The car wouldn’t be the last of its kind; by 1915, New York inventor Roland R. Conklin rolled out his upgraded version, a bus that could hold 11 people and had a shower, a kitchen, and a hidden bookcase (although the vehicle was for his personal use only).

RV manufacturers continued to expand their portable campers, adding more of the comforts of home through the late 1920s, until the Great Depression caused RV sales to drop. (However, some savvy Americans turned the campers into inexpensive mobile homes.) During World War II, RVs became the framework for mobile hospitals and other forms of war effort transportation, though they eventually returned to their original purpose — camping and vacationing — in the 1950s and beyond.

A group of Girl Scouts sits in a circle in the wilderness.
Credit: Historical/ Corbis Historical via Getty Images

You Can Thank Girl Scouts for S’mores

S’mores are the stuff of culinary legend — almost everyone enjoys them, but hardly anyone knows how they became so popular. Turns out the gooey, chocolatey treat dates back to around the 1920s, when they were called “Some-mores.” One of the first s’mores recipes appeared in 1927’s Tramping and Trailing with the Girl Scouts, a scouting guide that instructed brigades of campers on how to set up camp, hike safely, and build fires (a necessity for melting marshmallows). By the 1970s, Girl Scout manuals updated the name to the shortened “s’mores,” arguably a bit easier to say with a mouth full of sticky dessert. In the decades since, s’mores have become traditional campground fare, even honored with their own holiday on August 10.

"Leave No Trace, Take Your Waste" sign.
Credit: Photobyt/ Alamy Stock Photo

Seven Principles Can Help You Be a Superb Camper

Most hikers and outdoor explorers head outside for a chance to reconnect with nature, an experience that can be restoring and enjoyable. Unfortunately, the impact of humans on our natural world can sometimes dampen the adventure. That’s the motivation behind Leave No Trace, an outreach program that works to preserve natural spaces by educating the public about minimizing our recreational footprint. Emerging around the 1960s and ’70s when backpacking and camping boomed in popularity, Leave No Trace introduced seven principles, supported by national parks and other conservation groups, that help keep landscapes pristine and enjoyable for all. Most of the guidelines now seem like no-brainers: Properly dispose of trash where it belongs, respect wildlife by giving animals space, and plan ahead for your outdoor adventure to stay safe, for example. But the list of outdoor ethics also provides tips for keeping campfires forest-friendly and picking the perfect campsite without disturbing local flora and waterways. Familiarizing yourself with the long-standing outdoor code can help make your time at camp more enjoyable — now, and for years to come.

Nicole Garner Meeker
Writer

Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.

Original photo by Andrii Zastrozhnov/ iStock

Neuroscience was once considered a subject so technical that even other kinds of scientists avoided it. But as it became a leading discipline, it sparked wide public interest, fed by books promising that it would solve age-old problems like how to get rich (Your Money and Your Brain: How the New Science of Neuroeconomics Can Help Make You Rich), whether we have souls (The Spiritual Brain: A Neuroscientist’s Case for the Existence of the Soul), and why men and women act differently (The Female Brain).

While brain science hasn’t yet solved every human puzzle (be patient!), it has the potential to increase our insight. At the same time, some popular myths have created misunderstandings about how our brains work. Is the brain really divided into two halves? Are left-handed people more intelligent? Do men and women truly have different brains? Find out below.

Human anatomy model of the left and right brain.
Credit: Bangkoker/ Shutterstock

True: The Brain Is Divided in Half

The human brain has two hemispheres that look like mirror images and are connected by a bundle of nerve fibers called the corpus callosum. The right side controls the left side of the body, and vice versa.

Beginning in the 1960s, research on “split-brain” patients established differences in the functions of the two sides. In these patients, the corpus callosum had been surgically severed as a treatment for epilepsy. Early experiments showed, for example, that the left side does most of our language processing. In an intact brain, however, the corpus callosum allows for much cooperation between the two sides. The right side perceives humor and intonation in speech, and in people who suffer strokes on the left side, the right side can compensate, picking up language functions.

Close up of brown spiral notepad with a colorful sketch of the right brain.
Credit: Peshkova/ Shutterstock

False: Right-Brained People Are More Creative

There’s no evidence that anyone is “right-brained” or “left-brained.” Brain imaging studies of more than 1,000 children and adults through age 29 failed to find evidence that one side was stronger or more active than the other. People use their right and left sides depending on what they’re doing.

As for which side is “more creative,” there’s no good reason to give that status to the right. The left side of the brain is the storyteller, filling in the blanks when we have incomplete information — definitely creative!

Left hand holding a pen and writing text in a notebook.
Credit: vetre/ Shutterstock

False (Probably): Left-Handers Are More Intelligent

People tend to use their left hands for some tasks and their right hands for others, so whether you are left-handed or right-handed is a matter of degree. There may be some differences in how left-handers and right-handers process information, but this area is largely unstudied.

There is a little bit of evidence that people who are inconsistent about which hand they use may be more flexible thinkers. But in large meta-analyses using the classifications “left” and “right” for handedness, there was no serious difference in IQ.

Silhouette of a man and woman with working brains in blackboard style.
Credit: T. L. Furrer/ Shutterstock

True: Male and Female Brains Are Different

Women usually have slightly smaller brains than men do, even after adjusting for overall body size. (It’s worth noting that human brain size does not correlate with intelligence; Albert Einstein had a smaller-than-average brain, for example.) The volume of certain regions in the brain also differs between men and women, perhaps because of how genes and hormones play a role in brain development. For example, women have more volume in the prefrontal cortex, and men have more in the occipital region.

Differences between male and female brains are worth taking seriously, as they may help explain why men and women are not equally vulnerable to some mental illnesses. Women are more likely to be diagnosed with depression and anxiety, for example, while men suffer from more substance abuse.

Group of toy wooden color blocks.
Credit: Sergey Novikov/ Shutterstock

False: Men Have Better Spatial Awareness

For decades, it has been thought that men perform better on tests of “spatial cognition,” most often a mental rotation test, which asks people to recognize a shape that has been rotated. Ability on that test seems to predict better performance in math and science.

However, a 2020 study of students at the University of Limerick, Ireland, found that men and women approached the task differently but performed just as well.

Books in bookshelf in human brain form inside a head.
Credit: adventtr/ iStock

False: Your Brain Stores Memories

Your computer has files that you can pull up as needed. That’s not how the brain works. Instead, it reconstructs memories when called upon, starting with the big picture and then filling in details. Each time you ask it to remember an event, the reconstruction will occur differently and likely will have a slightly different result.

We need to trust our memories to function, though it’s worth staying open-minded about inaccuracies. Some research suggests that if you can quickly produce a memory and feel confident about it, you are more likely to be accurate.

Serotonin written on a pink paper below a drawing of a human brain.
Credit: evan_huang/ Shutterstock

False: Depression Involves Lack of Serotonin

Prozac became available to Americans in late 1987, followed by Zoloft in 1992. Pfizer advertised Zoloft as addressing a “chemical imbalance” in the brain, paving the way for a number of selective serotonin reuptake inhibitor (SSRI) drugs that would be prescribed to people with symptoms of depression and anxiety.

The problem: No one knows what the correct levels for serotonin or other neurochemicals should be, and depressed or anxious people do not consistently show any difference in their serotonin function, as journalist and neuroscientist Christian Jarrett explains in Great Myths of the Brain. L-tryptophan, which boosts serotonin, doesn’t reliably give depressed people a boost, and tamping down healthy people’s levels of serotonin depresses some but not others. One antidepressant that is not approved in the United States but is used in Europe, tianeptine, actually reduces circulating serotonin. It is also not clear whether these widely popular antidepressants are more powerful than a placebo. When they are, one possibility is that the medications are, in fact, addressing a brain problem — just not a lack of serotonin.

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.

Original photo by michellegibson/ iStock

Allergies are a pain — and can sometimes even be fatal. For those with food allergies, obsessive label-checking is a way of life, to seek out sneaky ingredients like soy or wheat. Some allergens have to be prominently displayed on packaging, but people with less common allergies have to be extra-careful.

Where do food allergies come from, and how do they work? How are they different from intolerances? Can seasonal allergies affect how we react to food? These nine fast facts about food allergies could help shed some light on something millions of people struggle with.

A jar with peanut butter on a peanuts background.
Credit: AtlasStudio/ Shutterstock

Nine Foods Cause the Most Food Allergy Reactions

Just about any food can cause an allergic reaction, but nine foods, known as the “big nine,” are responsible for around 90% of food allergies: milk, soy, eggs, wheat, peanuts, tree nuts, sesame, fish, and shellfish. For children, milk is the most common allergy; for adults, it’s shellfish.

 Sesame dessert with caramel on a linen tablecloth.
Credit: SedovaY/ Shutterstock

Sesame Allergies Are on the Rise

Until very recently, the “big nine” was the “big eight” — sesame wasn’t included until a few years ago. Because sesame allergies affect around 1 million people, however, the United States declared sesame a major allergen in 2021. As of 2023, the Food and Drug Administration (FDA) requires clear labeling of all foods that contain sesame.

Tired woman with allergy looking at food on table.
Credit: LightField Studios/ Shutterstock

Food Intolerances Are Fundamentally Different From Food Allergies

Allergic reactions are caused by the immune system. The body identifies a food — say, shrimp — as an invader, even though it’s not actually harmful, and produces antibodies to defend against it. Those antibodies spread to other cells in the body, which release chemicals that cause an allergic reaction. A serious, sudden reaction is called anaphylaxis, and it can be life-threatening.

Intolerances are very real problems, but they work differently. When someone has an intolerance, they’re not able to properly digest certain foods, whether it’s because of a sensitivity, an enzyme deficiency, or a condition such as irritable bowel syndrome. This can cause gastrointestinal distress or other symptoms, although reactions are generally less severe than they are with allergies.

A young woman's hands with a soap bar.
Credit: Yauheniya Julia/ Shutterstock

Nonfood Items Can Trigger Food Allergies

Food allergies can apply to more than just food, and people with food allergies may have to avoid lotions, soaps, or even medication. Someone with an almond allergy may have a reaction to a lotion with almond oil in it, for example. Food additives show up in unexpected ways; milk or egg can show up in shampoo, and sesame oil is a common ingredient in lotions and soaps.

Celery stems and leaves on wooden cutting board.
Credit: Redmond135/ Shutterstock

Food Can Trigger Pollen Allergies

Are you prone to hay fever? Some fruits and vegetables may trigger a mild reaction when you eat them — usually itching, tingling, or swelling of your lips, mouth, and throat. This is sometimes called pollen food allergy syndrome, and it’s not usually serious. Not everybody with seasonal allergies gets these reactions, but allergens tend to be associated with certain foods; if you have a grass allergy, for example, you’re more likely to react to celery, melons, oranges, peaches, or tomatoes.

Woman scratching arm with allergy symptoms.
Credit: New Africa/ Shutterstock

33 Million People Have Food Allergies in the United States

There are around 333 million people in the United States, and around 33 million of them have some kind of food allergy, according to the advocacy group Food Allergy Research and Education (FARE). That’s about 1 in 10 adults and 1 in 13 children. Every year, around 200,000 people have medical emergencies related to food allergies.

Several brown eggs on tablecloth.
Credit: Sea Wave/ Shutterstock

People With Egg Allergies Don’t Have to Avoid Egg-Containing Vaccines

It’s a common misconception that people with egg allergies have to avoid certain vaccines — you’ve probably been asked if you have an egg allergy before getting one. This is because most flu vaccines have some egg protein in them. But according to recent research, it’s unlikely to trigger an allergic reaction, even among people with severe egg allergies. (Still, it’s always good to check with your doctor beforehand.)

Legs of little child with red rashes, closeup.
Credit: Africa Studio/ Shutterstock

Some Kids Outgrow Food Allergies

Childhood allergies aren’t necessarily forever, although the likelihood that a kid will outgrow them depends on what they’re allergic to and how severe the allergy is. About 60% to 80% of children with milk or egg allergies will outgrow them by age 16, especially if they’re able to eat these items in a baked good. Only 20% of kids with peanut allergies will outgrow them, though — and it’s even less likely to outgrow a tree nut or shellfish allergy.

A woman have allergy reactions to shrimp or seafood.
Credit: Doucefleur/ Shutterstock

Food Allergies Can Develop at Any Age

Allergies are often thought of as a childhood issue or something you’re born with — but the reality is that you can develop an allergy to anything at any point in your life, even foods that you’ve enjoyed for years. Fish and shellfish are the most common allergies to develop after childhood. In a 2019 survey, around half of adults with allergies reported developing them as adults.

Sarah Anne Lloyd
Writer

Sarah Anne Lloyd is a freelance writer whose work covers a bit of everything, including politics, design, the environment, and yoga. Her work has appeared in Curbed, the Seattle Times, the Stranger, the Verge, and others.

Original photo by LaylaBird/ iStock

They say beauty only runs skin deep, but learning even a little bit about your largest organ may give you new appreciation for the external. From literally holding us in to protecting us from the ravages of the world without, our skin is pretty incredible. Here are six beautiful facts you might not know about human skin.

Dermatologist examining patient for signs of skin cancer.
Credit: kali9/ iStock

Skin Really Is Your Biggest Organ

It can be tempting to think of skin as the scientific equivalent of the frosting on your body’s cake, but it’s an organ as complex and important as the heart or liver. It’s also really heavy: The scientific consensus is that the entire, three-layer organ makes up about 16% of a human’s weight — equal to about 20 pounds for a 125-pound person. That’s the equivalent of four bricks or a miniature schnauzer. The outer surface of the skin alone accounts for about 6% of a person’s body weight.

Your next-heaviest organs — the liver (3 pounds), brain (3 pounds), and lungs (2.2 pounds) — don’t even compare. Thanks to its vast surface area and triple-layered composition, your skin efficiently keeps your other organs from getting out and other stuff from getting in, all while flexing, stretching, and renewing itself continuously.

Microorganisms such as bacteria, viruses and fungi on skin surface.
Credit: Artur Plawgo/ iStock

Skin Is a Habitat All Its Own

You may have heard of the gut microbiome, but did you know your skin has its own microbiome, too? Your skin plays host to millions of microorganisms — tiny microbial communities that hitch a ride on, and even help, the exterior of your body.

The types of microorganisms that live on your skin vary depending on what type of terrain they encounter: moist or oily, exposed or enfolded, hairy or bare. They fall into four categories: viruses, bacteria, fungi, and mites. And they don’t just live there rent-free: Some of the critters crawling on your skin right now are thought to perhaps even play a part in teaching your T cells how to respond to harmful invaders. Others, like the common S. epidermidis, actually help your body defend itself against water loss and other damage.

Your skin colonies also change as you age. In fact, baby skin is thought to be sterile until the moment it encounters the world outside the womb. That’s when an important time for skin microbiome growth begins — a developmental heyday for your skin’s immune system. As a result, the skin microbiomes of babies and adults are thought to vary significantly, though research on both is still in its infancy.

Immune boosting fruit and vegetables for good health.
Credit: marilyn barbone/ Shutterstock

Your Diet Can Influence Your Skin’s Color

Folktales about carrots improving your night vision are mere propaganda. But there’s truth to the old line that you “are what you eat” — and it can be found on your skin’s surface, which can actually change color when you consume lots of the pigments found in red, yellow, and orange veggies like carrots. Known as carotenoids, these pigments impart what scientists call an “attractive yellow-orange color to skin.”

Sound like a ploy by Big Carrot? It isn’t. Carotenoid pigments can build up in the skin, producing a yellowish hue that is associated with a healthier body. That’s where the attraction part comes in: In a variety of studies, researchers have shown that people prefer the appearance of people who eat lots of carotenoids, likely because they signal a person has a healthy diet and higher perceived health. Both indicate the person is a desirable mate — all the more reason to grab a carrot and chow down.

American astronaut Joseph Tanner waves to the camera during a space walk.
Credit: NASA/ Hulton Archive via Getty Images

If We’re Going to Mars, We’re Going to Need Better Skin Care

Astronauts returning to Earth don’t just have to readjust to gravity: Many contend with skin issues. In fact, researchers note that skin problems are the most common health conditions experienced by astronauts, far outpacing skin ailments on Earth. Astronauts’ skin has endured everything from irritation due to on-board equipment to dryness and infections on space flights and at the International Space Station. Research has even shown that the complex skin microbiome undergoes changes while in space.

Microgravity, radiation, and the harsh environment of spacecraft seem to be the culprit, but scientists are still learning more about how space travel affects skin. With long-term space missions to Mars and elsewhere on the horizon, finding out how space affects skin has become a priority for researchers and private industry alike. As a result, a Colgate-Palmolive skin care company staged the first-ever in-space skin care experiment in 2022, and in 2023 another private sector experiment on the International Space Station is expected to test how lab-grown skin tissue grows in space. Insights from those studies and future inquiries could lead to the development of new skin protectants or inform skin care products down here on Earth, helping the planet-bound protect themselves against the ordinary ravages of aging.

Tattooist demonstrates the tattoo process on a hand.
Credit: Belyjmishka/ iStock

Tattoos Are Permanent Thanks to Dead Immune Cells

If your mom warned you that your ink is forever when you announced your intention to get a tattoo, she was right. But not for the reason you might think. While common wisdom has it that tattoo ink bypasses the permeable top layers of the skin and remains embedded in the dermis, the second skin layer, the truth is a bit more complicated — and more interesting. Recent research has revealed that macrophages, a type of white blood cell that specializes in gobbling up invasive pathogens, mistake tattoo ink for an infectious cell and flock to the scene to protect the body from the foreign substance. They show up, encircle the ink, and process it.

If macrophages clean up the ink, and the immune cells aren’t immortal, then why do tattoo markings last so long? Scientists asked the same question, and mouse studies led to an eerie answer: Once the macrophages die off, even more macrophages are thought to swoop in, eating the ink their dead brethren once contained. This generational turnover means tattoo ink can last for years with minimal fading — and keep Mom’s name intact for a lifetime.

Body oil in a bottle, being held up.
Credit: Natalia44/ Shutterstock

The Bible Contains Ancient Skin-Care Advice

The Bible may be a religious text, but it’s also packed with information and stories about skin care. Researchers consider the Book of Job’s description of the long-suffering Job’s chronic skin boils an accurate and early depiction of an actual genetic disease called AD-HIES (loss-of-function, autosomal dominant hyper-IgE syndrome). Nicknamed Job syndrome, the genetic disorder causes, you guessed it, boils all over the body.

A slightly lovelier tale can be found in the Old Testament story of Esther, which refers to a 12-month-long beauty and purification ritual undergone by would-be concubines of King Ahasuerus of Persia. Along with her fellow applicants to the king’s harem, Esther spends six months applying oil of myrrh to her body, and another six months putting on perfumes and other cosmetics. Only after the women have spent months doing the biblical equivalent of a makeover are they qualified for the king’s romantic consideration.

But what is myrrh, and why was it part of the elaborate beauty ritual? The name comes from Commiphora myrrha, a spiny, squat tree with fragrant sap that was used in religious rituals, to perfume cosmetic oils, and even as medicine to treat achy muscles and wounds. It wasn’t just placebo: The prized resin is stuffed full of substances that have anti-inflammatory and antimicrobial properties, which may also have improved the look of the skin.

Erin Blakemore
Writer

Erin Blakemore is a Boulder, Colorado journalist who writes about history, science, and the unexpected for National Geographic, the Washington Post, Smithsonian, and others.

Original photo by Matejay/ iStock

There are few household supplies more useful than salt. It’s not just a mandatory ingredient in the kitchen; it’s also a garden helper, skin exfoliator, brass polisher, ice melter, and much more. We started using salt to keep our food fresh eons before refrigeration existed. Many spiritual traditions even use it to banish or ward off evil spirits. This is all to say: We would be absolutely lost without it, or at least out of jerky. Let these eight facts about salt add a bit of seasoning to your day.

Coarse sea salt in a jar and spoon.
Credit: Cozy Home/ Shutterstock

You Need Salt to Live

Sodium helps your body manage water; without it, your cells can start to swell. This condition is called hyponatremia, and it can cause serious medical problems. It’s generally a good idea to watch your salt intake to make sure that you don’t get too much — but sodium is really about balance, and it’s possible to not get enough.

Hyponatremia is often caused by medication and certain underlying health problems, but it can also be caused by drinking too much water (this is very hard to do) or alcohol.

Woman walks on salt farm in the morning.
Credit: Munkie Dang/ Shutterstock

Humans Have Been Harvesting Salt for Around 8,000 Years

Salt is both delicious and essential, so it shouldn’t be too surprising that humans have been collecting it for thousands of years. In 2004, researchers in Romania found a salt collection well that was later carbon-dated to the early Neolithic period, somewhere between 6050 and 5500 BCE.

Mummification in Egypt.
Credit: R1F1/ Shutterstock

Egyptians Used Salt to Preserve the Dead

Ancient Egyptians used a mummification process to preserve dead bodies, now known as mummies. Specially trained priests removed all excess moisture to prevent decay, and were so successful that we can still see their work thousands of years later. After removing the organs, these priests would pack the body in natron — a sodium salt compound also used in cooking and medicine — inside and out, and wait for it to dry out before washing off the salt and wrapping the body in linen.

Pink Himalayan salt, in a bowl and scattered.
Credit: Alexander Ruiz Acevedo/ Shutterstock

Himalayan Pink Salt Gets Its Color From Iron (and Other Minerals)

Himalayan salt is harvested from salt mines in Pakistan. Unlike standard table salt, or even sea salt, it has a rosy color that’s highly sought-after for both lamps and kitchen tables, but its cult following isn’t because of the color alone — it’s what causes the color, too. The salt contains trace minerals such as calcium, potassium, magnesium, and iron. But while it looks pretty, there’s no evidence that it’s any healthier (or unhealthier) than table salt, especially since there are far more abundant sources of those minerals in most people’s diets.

Riot against corrupt taxation (especially salt).
Credit: Sovfoto/ Universal Images Group via Getty Images

People Have Gone to War Over Salt

Throughout history, salt has been considered a precious commodity — so much so that several wars have broken out over salt mining, selling, and taxation. A few notable examples include a 14th-century war between Venice and Padua over a Venetian salt monopoly, a 16th-century revolt against the Papal Army by the city of Perugia over salt pricing, and a 19th-century conflict between Mexican Americans, who had long been using a salt flat as communal property when it was part of Mexico, and a cadre of white American businessmen who decided to lay claim to it.

Mahatma Gandhi on his famous March to the Sea to make salt.
Credit: Bettmann via Getty Images

Gandhi Led a Massive Nonviolent Protest About Salt

Mahatma Gandhi is famous for his use of nonviolent protest against British rule in India. One of his biggest efforts was the Salt March in March and April of 1930. Through a series of laws, Great Britain had made it illegal for Indian people to sell or even produce their own salt, forcing them to buy expensive and heavily taxed salt from Britain. Gandhi started his 240-mile walk from his ashram near Ahmedabad with a group of followers on March 12, stopping at different villages and picking up more people on the way to Dandi, a town on the Arabian Sea, where he intended to make salt from the seawater. By the time they reached their destination, the crowd had grown to tens of thousands. The coastline was full of naturally occurring salt deposits, and police, knowing the crowd was coming, had stomped them into the mud — but Gandhi picked a small lump from the beach, which was enough to break the law. Salt-making as civil disobedience spread throughout India, and around 60,000 people were arrested by British authorities as a result.

Salt is scattered and dollars.
Credit: Photosiber/ Shutterstock

“Salt” and “Salary” Have the Same Root Word

“Salt” in Latin is sal, which eventually grew into both “salt” and “salary” in English. Roman soldiers were given an allowance for salt purchase — a salārium. That eventually made its way into Anglo-French as salarie, which was, in turn, borrowed into English as “salary.” Just a fun fact to remember the next time you use your salary to buy salt.

Turtle nesting on beach.
Credit: GKlps/ Shutterstock

Sea Turtles Cry Out Excess Salt

If you look closely, you may notice a sea turtle crying when it comes ashore. It’s not because the turtle is sad — sea turtles take in more salt from the sea than they can excrete in their urine. A gland near each eye excretes the excess salt into their tears. The glands are always working, but the tears are only noticeable once a turtle is out of the water. Some butterflies in the western Amazon, low on reliable sources of sodium, gather around river turtles and drink their tears, too.

Sarah Anne Lloyd
Writer

Sarah Anne Lloyd is a freelance writer whose work covers a bit of everything, including politics, design, the environment, and yoga. Her work has appeared in Curbed, the Seattle Times, the Stranger, the Verge, and others.

Original photo by bhofack2/ iStock

There’s more to autumn than pumpkin spice — it’s also filled with pumpkin pie, pumpkin patches, and even a semi-obscure sport known as punkin chunkin (not to mention other non-squash-related customs). If you’ve ever wondered why you have the sudden urge to wander through a corn maze in the fall, or what it is about October that’s so conducive to bobbing for apples, read on — here are the surprising origins of eight autumn traditions.

Aerial View of a Corn Maze in Eastern South Dakota during October.
Credit: Jacob Boomsma/ Shutterstock

Corn Mazes

In 1993, Joanne Marx and Don Frantz created the “Amazing Maize Maze,” a 1.92-mile labyrinth on three acres of land at Annville, Pennsylvania’s Lebanon Valley College. Though they and other corn maze makers could draw on a long tradition of hedge mazes for inspiration, the newer form distinguished itself in notable ways — namely, corn mazes are usually cut into fun shapes or interesting images meant to be seen from above, rather than basic geometric patterns. That, and they often have stacks of hay and piles of pumpkins around for pictures.

Scenic view of a car driving on a highway through a beautiful autumn forest.
Credit: CHEN MIN CHUN/ Shutterstock

Leaf Peeping

This one goes back more than 1,200 years, which is another way of saying it didn’t originate in America. Rather, it appears we have Japan to thank for the custom. The Japanese version, which carries the considerably more evocative name of momijigari (“autumn leaf hunting”), dates back to at least the Heian Era of 794-1185. A renaissance of sorts, that epoch brought about both visual art that celebrated the vibrant colors of fall and the endlessly influential Tale of Genji, which explicitly mentions “an imperial celebration of autumn foliage.”

As for how it became an American tradition, a professor of Asian art history has a theory: Japan and New England were connected via shipping routes, resulting in New Englanders being exposed to Japanese lacquerware featuring a maple-leaf motif that made them more inclined to seek out gorgeous leaves without traveling halfway across the world.

Waiter with traditional costume serving beers.
Credit: rawf8/ Shutterstock

Oktoberfest

Beginning in the third weekend of September and lasting until the first Sunday in October, Oktoberfest has long served as an excuse for revelers to do as the Germans do and wet their whistle at the local beer hall (lederhosen optional). The first Oktoberfest was a wedding reception: On October 12, 1810, the citizens of Munich gathered at the city’s gates to celebrate the marriage of Bavarian Crown Prince Ludwig to Princess Therese of Saxony-Hildburghausen. The event (known locally as d’Wiesn) was so popular that it took place again the following year — and the year after that, and so on and so forth until it became the world-famous festival of Bavarian culture that it is today.

A senior black man voting at a voting booth.
Credit: adamkaz/ iStock

Election Day

Though rarely thought of in the same way as apple cider and leaf-peeping, American elections take place in autumn for a reason. Out of consideration for farming schedules, Congress chose November (when the harvest was finished but it hadn’t usually begun to snow yet) in its 1845 decree establishing the date. As for Tuesday? Weekends were a no-go due to church, and Wednesdays were off the table because farmers usually went to the market to sell their goods. Thus, Tuesday emerged as a sort of compromise, and the tradition stuck.

High angle closeup of a metal tub filled with water and apples for the Halloween.
Credit: Steve Cukrov/ Shutterstock

Bobbing for Apples

It may not be as popular now as it was a century ago, but bobbing for apples persists as an autumnal activity, especially on Halloween. Long before kiddos dressed up on October 31, however, British singles played the game as a sort of courting ritual. Each apple represented a different eligible bachelor and, if the young woman bobbing for said apple bit into it on her first try, the two would live happily ever after; succeeding on the second attempt meant that the two would be together for a time but the romance would fade; and not getting it right until the third try foretold doom.

 A pumpkin purees as it lands hard on the pavement after being tossed.
Credit: MediaNews Group/Orange County Register via Getty Images

Punkin Chunkin

What exactly is Punkin Chunkin? For decades, “chunkers” have created slingshots, trebuchets, and even pneumatic cannons to hurl pumpkins as far as possible. The World Championship Punkin Chunkin Contest has taken place in Bridgeville, Delaware, every November since 1986, with First State native Bill Thompson claiming credit for inventing the sport.

Group of people tailgating and drinking beer.
Credit: Capuski/ iStock

Tailgating

Tailgating is now a year-round activity at sporting events and concerts, but it’s always been especially popular at football games. One theory posits that it dates all the way back to the first college football game, a contest between Rutgers and Princeton that took place in 1869, when some in attendance sat at their horses’ “tail end” while grilling sausages before the game began. Another theory centers around the Green Bay Packers, whose fans are said to have coined the term “tailgating” when the cheeseheads first began supporting the team in 1919. Ever industrious, they positioned their trucks around the field and sat in the beds for comfortable viewing while enjoying their food and drinks.

Colorful candy corn for Halloween on a background.
Credit: bhofack2/ iStock

Candy Corn

It may be the year’s most polarizing candy, but its history is long and sweet. Candy corn dates back to the 1880s, when a confectioner at the Wunderle Candy Company began producing it under the even-less-appetizing name of Chicken Feed. The corn-shaped sugar molds were then manufactured by the Goelitz Confectionery Company, which made the product famous (you may now know Goelitz as Jelly Belly). More than 35 million pounds (or nine billion individual pieces) of candy corn are produced every year, so someone must like the stuff.

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.

Original photo by RaspberryStudio/ Shutterstock

Your home is full of quotidian mysteries. Why can’t I get Wi-Fi in this one room? Why am I constantly getting shocked when I touch a door handle? Why do my hands feel slimy after I just washed them? Some of these questions have relatively simple scientific answers, while others involve understanding a little physics. Read on to discover the science behind some everyday household enigmas — plus tips for how to deal with them.

Man plugs Ethernet cable into a router.
Credit: Proxima Studio/ Shutterstock

Why Does Wi-Fi Have Dead Zones?

Wi-Fi (a trademarked name often glossed as “wireless fidelity,” though the term is really just a brand name) uses radio waves to send signals, and just as with regular radio broadcasts, those signals can be disrupted or impeded — mainly by thick walls and metal barriers. They also degrade over distance.

While AM and FM radio broadcasts cover the ranges of kilohertz and megahertz, respectively, Wi-Fi transmits in gigahertz (or a billion cycles per second). That allows Wi-Fi signals to carry an immense amount of information while severely limiting their range. Many Wi-Fi routers come with two network frequencies — 2.4 GHz and 5 GHz. The 2.4 GHz band carries a lower bandwidth (aka speed) but reaches farther, whereas 5 GHz is faster but can’t travel as far. If you’re experiencing Wi-Fi dead zones, try switching to the farther-reaching 2.4 GHz band (if available), or invest in a Wi-Fi extender that can boost these data-heavy signals.
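For a rough sense of that trade-off, the textbook free-space path-loss formula (signal loss grows with both distance and frequency) can be worked out in a few lines of Python. This is only a back-of-the-envelope sketch: the 10-meter distance is an arbitrary assumption, and real homes add walls and interference on top of it.

```python
import math

def free_space_path_loss_db(distance_m: float, frequency_hz: float) -> float:
    # FSPL (dB) = 20*log10(d) + 20*log10(f) + 20*log10(4*pi / c)
    c = 299_792_458  # speed of light, m/s
    return (20 * math.log10(distance_m)
            + 20 * math.log10(frequency_hz)
            + 20 * math.log10(4 * math.pi / c))

# Same 10 m path, two Wi-Fi bands: the 5 GHz signal arrives noticeably weaker
# before walls are even considered.
print(round(free_space_path_loss_db(10, 2.4e9), 1))  # about 60 dB of loss
print(round(free_space_path_loss_db(10, 5.0e9), 1))  # about 66 dB of loss
```

That roughly 6-decibel gap means the 5 GHz signal arrives at about a quarter of the power over the same stretch of open air, which is why the faster band tends to fade first.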

Interestingly, microwave ovens operate on a frequency very similar to Wi-Fi networks, at around 2.412 GHz to 2.472 GHz, so sometimes Wi-Fi can be disrupted when you’re warming up your dinner. Speaking of which …

Woman putting plate of rice with vegetables into a microwave oven.
Credit: Pixel-Shot/ Shutterstock

How Do Microwaves Work? (And Why Shouldn’t You Microwave Metal?)

You might think the science behind microwaves is relatively simple — the machine produces microwaves (as its name suggests), which in turn warm up your food. However, that’s only partially correct. While it’s true that microwaves warm up food, it’s not the food itself that’s being warmed, but the water inside the food. Microwaves, along with X-rays, radio waves, gamma rays, and visible light, are part of the electromagnetic spectrum. But microwaves have an interesting property — they’re readily absorbed by water, fats, and sugars. These materials convert the absorbed energy into molecular motion, essentially setting the water molecules in food vibrating, which in turn produces heat. Of course, this is only part of the microwave’s convenient food-warming equation, as this slice of the electromagnetic spectrum isn’t absorbed by plastic, glass, and most ceramics, making it a convenient way to heat up food without heating up its container.
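To put that slice of the spectrum in perspective, the wavelength follows directly from the frequency (wavelength = speed of light ÷ frequency). A minimal sketch, assuming the roughly 2.45 GHz frequency commonly cited for consumer ovens:

```python
# Wavelength = speed of light / frequency
c = 299_792_458  # speed of light, m/s
f = 2.45e9       # Hz; a commonly cited consumer-oven frequency (assumed for illustration)
print(f"{c / f * 100:.1f} cm")  # prints about 12.2 cm
```

At around 12 centimeters, those waves are far shorter than AM or FM broadcast waves, which is roughly where the “micro” in microwave comes from.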

But have you ever accidentally put a piece of metal in a microwave? Yikes. What’s happening to create this explosive situation is that the microwaves push around the free electrons in the conductive metal’s surface, which causes the object to reflect microwaves. If the metal is smooth all over, such as a spoon, this likely won’t be a problem, but charge concentrates at sharp edges like the tines of a fork, producing sparks that can set flammable material inside the microwave on fire. A strong enough buildup can also trigger dielectric breakdown of the surrounding air, creating an electrical arc that can punch a hole in the interior wall and render the microwave too unsafe to use. Long story short, just don’t put metal in the microwave — ever.

A child raises their hair whilst exploring static electricity with an inflated orange balloon.
Credit: Raylui321/ Shutterstock

What Causes Static Electricity?

Static electricity was the doorway through which humanity first explored the phenomenon known as electromagnetism. The ancient Greek thinker Thales of Miletus, occasionally considered the world’s first scientist, rubbed a piece of amber against hemp and cat fur to study its effects (the name for “amber” in Greek is actually elektron). Today scientists call Thales’ discovery the triboelectric effect, or the process of producing electricity by rubbing. This effect sometimes occurs when two insulators (materials that inhibit the flow of electricity) rub together. One insulating object, such as a wool carpet, will lose electrons, while another, maybe a rubber-soled shoe, gains electrons. After the two insulators rub together, your body is left carrying a small electric charge. Because those rubber soles keep the charge from draining away, it stays put until your body touches a conductor (such as a metal door handle) and you get an electric shock. Because dry air is also an insulator, the triboelectric effect is particularly prominent during the cold and dry winter months.

Want to circumvent those nasty little shocks? Avoid wearing wool socks and rubber-soled shoes in the house, and run a humidifier or two. Moisture in the air lets static charge leak away gradually, so it never builds up enough on your body to deliver a jolt.

Close-up of water flowing from a faucet.
Credit: VVVproduct/ Shutterstock

What Is Hard Water?

Not all water is the same, though it may look similar at first glance. Towns and cities throughout the U.S. have varying degrees of “hard” or “soft” water. Despite the more common uses of these adjectives, “hard” and “soft” don’t refer to the actual texture of water, but instead serve as shorthand for the amount of calcium, magnesium, and other dissolved metals found in a particular water supply. Hard water, for example, has high levels of these two minerals, at around 121 to 180 mg/L, whereas soft water comes in at less than 60 mg/L.
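Those mg/L bands translate neatly into a simple lookup. Here’s a minimal sketch in Python using the figures cited above, plus the commonly used “moderately hard” and “very hard” categories for the gaps; the exact cutoffs vary a little from one agency’s chart to another, so treat them as illustrative.

```python
def classify_water_hardness(minerals_mg_per_l: float) -> str:
    """Classify water by dissolved calcium/magnesium content (mg/L)."""
    if minerals_mg_per_l < 60:
        return "soft"
    if minerals_mg_per_l <= 120:
        return "moderately hard"
    if minerals_mg_per_l <= 180:
        return "hard"
    return "very hard"

print(classify_water_hardness(45))   # soft
print(classify_water_hardness(150))  # hard -- the range where soap scum and scale become noticeable
```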

There are a few easy ways to tell if your home has hard or soft water. The most common sign of hard water is a filmy residue left on your hands after washing with soap. That’s because soap reacts with the calcium in the water to create a thin layer of soap scum. Hard water can also leave residue on plates and cutlery run through the dishwasher, decrease a home’s water pressure due to mineral deposits in pipes, and affect the performance of appliances such as electric water heaters and coffee makers.

Luckily, hard water isn’t considered hazardous to a person’s health (and can even supply a healthy dose of calcium and magnesium). However, due to some of its annoying side effects, some homeowners choose to install water-softening systems that use a porous plastic resin to replace calcium and magnesium ions with less disruptive sodium ions.

A hand plugging in a charger to a surge protector.
Credit: nipastock/ Shutterstock

How Do Surge Protectors Work?

According to a 2022 survey, the average American home has 22 connected devices, and many of those gadgets, along with others not connected to Wi-Fi, need to be protected from transient voltage, otherwise known as a power surge. Lightning strikes are the most dramatic example of a power surge, but they’re actually among the least common causes (and a surge protector can’t withstand the millions of volts delivered by a lightning strike anyway). Far more common are power-hungry devices such as hair dryers, refrigerators, and air conditioners, which disrupt the flow of electricity in the home with their voracious energy demands. That’s why surge protectors are always a good idea, especially when you need to plug a lot of things into a single outlet, such as for a media center or office.

Most outlets in U.S. homes are rated for 120 volts. If an unexpected surge sends voltage higher than that threshold for even a few nanoseconds, the results can be disastrous. Surge protectors serve as a kind of pressure valve protecting your gadgets from an electronic meltdown. Any time a surge protector detects voltage above the 120-volt threshold, it uses a component called a metal oxide varistor (MOV) to divert the excess to its own grounding wire, protecting all your plugged-in devices in the process. And if you think you can rely on your home’s circuit breaker to protect your stuff, think again: circuit breakers are only designed to protect against current overload, not surges and spikes.
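As a toy model of that “pressure valve” behavior, here’s a short Python sketch that clamps any voltage sample above a threshold and tallies the excess that would be shunted to ground. It’s a deliberate simplification of what the varistor does (not a circuit simulation), and the sample values are invented for illustration.

# Toy model of a surge protector: voltages at or below the clamping threshold
# pass through unchanged; anything above it is clamped, with the excess
# counted as diverted to ground. Sample values are invented for illustration.

def clamp_surge(samples, clamp_volts=120.0):
    """Return (voltages seen by devices, total excess diverted to ground)."""
    passed = []
    diverted = 0.0
    for v in samples:
        if v > clamp_volts:
            diverted += v - clamp_volts
            passed.append(clamp_volts)
        else:
            passed.append(v)
    return passed, diverted

line_samples = [118.0, 120.0, 410.0, 950.0, 121.5, 119.0]
safe, excess = clamp_surge(line_samples)
print("Devices see:", safe)
print("Excess volts diverted to ground:", excess)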

Darren Orf
Writer

Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.

Original photo by Pictorial Press Ltd/ Alamy Stock Photo

Since the character was first introduced in Ian Fleming’s 1953 spy novel Casino Royale, James Bond has only grown in popularity, serving as the protagonist of one of the most beloved film franchises of all time. From Sean Connery to Daniel Craig and every actor in between, those who have portrayed Bond have done so with a charisma that has become synonymous with Agent 007. In honor of the franchise’s 70 years of history, here are six not-so-secretive facts about James Bond.

Man wildlife researcher makes field observation in the wilderness.
Credit: Евгений Харитонов/ iStock

The Character Is Named After a Real Ornithologist

Though many of us know James Bond as a debonair spy, the character’s name was actually inspired by an unassuming real-life ornithologist. The true James Bond was an American researcher and scientist known for writing the 1936 book Birds of the West Indies. The work was a trusted resource for avid bird lovers, and it was relied upon by Bond creator Ian Fleming, who lived in Jamaica. While developing his most famous character in 1952, Fleming co-opted the name “James Bond.” (He later said, “I wanted a really flat, quiet name.”)

As the character grew in popularity during the early 1960s, the real Bond and his wife began receiving phone calls asking to speak to the super spy. Concerned, the couple reached out to Fleming regarding the use of the name. Fleming admitted that he had been directly inspired by the birding book and invited the couple to his Jamaican estate to discuss the matter further. They arrived one day in 1964, and the meeting was an amicable one. As the couple departed, Fleming gave them a signed first edition of his new novel, You Only Live Twice, with the inscription: “To the real James Bond from the thief of his identity, Ian Fleming, Feb. 5, 1964 (a great day!).”

Scottish actor Sean Connery relaxes on the set of the James Bond film 'Diamonds Are Forever'.
Credit: Anwar Hussein/ Hulton Archive via Getty Images

Sean Connery Earned a Record Paycheck for “Diamonds Are Forever”

After appearing in the first five Bond films, Sean Connery departed the titular role. The character was taken over by George Lazenby, whose only Bond film proved to be 1969’s On Her Majesty’s Secret Service. Connery returned to play James Bond in 1971’s Diamonds Are Forever, but not before negotiating a lucrative contract. Connery was lured back in part by a record paycheck that saw the actor earn $1.25 million. That number was good enough to be enshrined in Guinness World Records as the highest salary earned for a single acting role at the time.

Connery also earned 12.5% of the movie’s gross, and negotiated the right to not have to interact with the film producers (with whom he was feuding). It wasn’t entirely about getting a major payday for Connery, however: The leading man ended up donating his salary to a charity he founded, known as the Scottish International Education Trust.

Roger Moore at the wheel of a speed boat in a scene from the film 'Live And Let Die'.
Credit: Archive Photos/ Moviepix via Getty Images

Three James Bond Theme Songs Have Won an Oscar

Not only are the Bond movies renowned for their intense action sequences and complicated characters, but the films’ soundtracks have earned critical acclaim as well. Seven James Bond-related films have earned Oscar nominations for Best Original Song, though oddly enough, the first such nomination came for the theme song of a 1967 parody of the novel Casino Royale, starring David Niven. That song was titled “The Look of Love,” composed by Burt Bacharach and Hal David, and sung by Dusty Springfield.

The first official Bond film to be nominated for Best Original Song was Live and Let Die, whose title track was written by Linda and Paul McCartney. Later, the 1977 Bond movie The Spy Who Loved Me earned a musical nomination for Carly Simon’s “Nobody Does It Better,” and the 1981 film For Your Eyes Only was also nominated for its title song, sung by Sheena Easton.

The Bond franchise finally took home a Best Original Song Oscar for the 2012 movie Skyfall, an award won thanks to a haunting tune performed by Adele. The series followed that up with yet another win in 2015 for Sam Smith’s “Writing’s on the Wall” from the movie Spectre, and most recently in 2021 for Billie Eilish’s performance of the title track from No Time to Die. Eilish not only became the youngest artist to record a Bond song at just 17 years old, but was also the first person born in the 21st century to win an Academy Award.

Actors Pierce Brosnan and Desmond Llewelyn, in a scene from a James Bond film.
Credit: Keith Hamshere/ Moviepix via Getty Images

Desmond Llewelyn Appeared in the Most James Bond Films

No individual has appeared more frequently as James Bond than Roger Moore, who starred as the spy in seven official films. But in terms of total appearances by any character, actor Desmond Llewelyn has Moore beat. Beginning with 1963’s From Russia With Love, Llewelyn portrayed Q — the quartermaster of the MI6 lab known for coming up with inventive gadgets and his humorous interactions with Bond — on 17 different occasions. Llewelyn continued to play Q through 1999’s The World Is Not Enough, when he retired from the role.

Though Llewelyn tops the list of most acting appearances in the Bond franchise, Lois Maxwell also reached double digits. Maxwell showed up as Moneypenny, a secretary at MI6, in 14 different films, first portraying the role in the very first Bond movie, 1962’s Dr. No. Maxwell reprised the character for each canonical Bond film starring Sean Connery, George Lazenby, and Roger Moore, before both she and Moore finally retired after 1985’s A View to a Kill. In doing so, Maxwell became the last actor from the original film to depart the franchise.

Daniela Bianchi And Sean Connery In 'James Bond: From Russia With Love'.
Credit: Archive Photos/ Moviepix via Getty Images

“From Russia With Love” Was the Last Movie John F. Kennedy Ever Saw

President John F. Kennedy was among the Bond franchise’s fervent early supporters, having been gifted a copy of the novel Casino Royale while recovering from back surgery in 1954, long before Bond became widely popular. Kennedy was such a fan that he even invited Ian Fleming to his house during the 1960 presidential campaign, where the pair discussed foreign affairs. In 1961, shortly after JFK’s inauguration, the new President told reporters that Fleming’s novel From Russia With Love was among his favorite books. The remark caused James Bond to spike in popularity.

After Kennedy’s comments, Bond books began flying off the shelves, which in turn helped convince Eon Productions to produce a film version of one of Fleming’s novels. The result was 1962’s Dr. No, which was screened at the White House shortly after its release. During that event, Kennedy commented that they should turn his favorite novel — From Russia With Love — into a movie, which producers did the following year. On November 20, 1963, JFK was shown a rough cut of the film From Russia With Love, just one day before he left on his fateful trip to Dallas.

Novelist and screenwriter Roald Dahl, 1976.
Credit: Tony Evans/Timelapse Library Ltd/ Hulton Archive via Getty Images

Children’s Author Roald Dahl Wrote “You Only Live Twice”

Roald Dahl is one of history’s most prolific children’s authors, responsible for literary works including Charlie and the Chocolate Factory and James and the Giant Peach. Dahl’s work didn’t just cater to youths, however, as the author also embarked on a screenwriting career that saw him produce the script for 1967’s You Only Live Twice.

Dahl coincidentally lived a life akin to James Bond in many ways, having served during World War II in a New York-based branch of the British intelligence service known as the British Security Co-ordination. According to a 2010 Dahl biography, the author was viewed by his peers as a cunning flirt who successfully seduced high-profile American women into supporting the British wartime effort. Given Dahl’s experience, it only made sense that he was brought aboard to adapt Fleming’s 1964 novel into a film. During pre-production, Dahl scrapped much of the book’s original story line, though he retained villainous characters such as Blofeld, who became a mainstay of the Bond franchise. Dahl was also inspired by the sci-fi elements of Goldfinger (the only Bond movie he had seen to that point) and wrote a story centered on space-age technology that became a smash hit.

Bennett Kleinman
Staff Writer

Bennett Kleinman is a New York City-based staff writer for Optimism Media, and previously contributed to television programs such as "Late Show With David Letterman" and "Impractical Jokers." Bennett is also a devoted New York Yankees and New Jersey Devils fan, and thinks plain seltzer is the best drink ever invented.