Original photo by Moviestore Collection Ltd/ Alamy Stock Photo

Readers love a rags-to-riches story — which could be why “Cinderella” has such a cultural hold on us, even centuries after the tale was first recounted. Most versions of the famous fairy tale follow the same pattern: A destitute girl yearning for a better life makes a magical friend and gets a boost into better circumstances thanks to a shoe. But not every detail of the fictional servant-turned-queen’s background is predictable — here are six fascinating facts you might not know about the “Cinderella” folktale and movie.

Detail Rhodopis on Plinth by Charles Francis Fuller.
Credit: Frank Nowikowski/ Alamy Stock Photo

The First Cinderella Story May Have Come From Ancient Greece

The ball-gown-bedazzled Cinderella we know today is far from her origins, which may have been in ancient Greece. Some researchers point to the tale of Rhodopis, a story recorded by Greek geographer Strabo around the first century BCE, as a possible beginning. In that account, Rhodopis is a courtesan whose shoe is stolen by an eagle and dropped into the lap of an Egyptian pharaoh. Seeing the shoe as an omen from the gods, the royal sends soldiers throughout the kingdom to track down the shoeless woman, who eventually becomes his wife. However, not everyone agrees that the tale of Rhodopis is truly the first “Cinderella” story. Some historians say that Strabo’s brief description of the tale is only similar to today’s version in that it hinges on a shoe; the centuries-old version lacks a fairy godmother, cruel stepmother, and other key components we now think of as standard.

View of a Cinderella poster.
Credit: swim ink 2 llc/ Corbis Historical via Getty Images

There Are More Than 700 Versions of the Story

Whether or not Rhodopis was the first Cinderella, she certainly wasn’t the last. Fairy tales with similar shoe-based plots have cropped up worldwide — some librarians count more than 500 versions found in Europe alone, while global counts are as high as 700.

Culture has played a heavy role in each story’s details. One Italian rendition renames the princess “Zucchettina” because she was born inside a squash. In the Danish tale, Cinderella (there called “Askepot”) wears rain boots, a detail well suited to Denmark’s rainy climate. The version most familiar today, however, was first penned by French author Charles Perrault in 1697: his “Cendrillon” is eventually found by her prince thanks to a glass slipper — the first telling of the story to include such a delicate shoe.

Cinderella's glass slipper at Disney's "Cinderella" Library of Congress National Film Registry Ball.
Credit: Kris Connor/ Getty Images Entertainment via Getty Images

The Famed Glass Slipper May Have Been a Political Statement

Perrault’s choice to cast Cinderella’s sparkling shoes from glass may have been less about fashion and more about politics, according to some academic researchers. Historian Genevieve Warwick at the University of Edinburgh believes the detail was meant in part to poke fun at Louis XIV, king of France from 1643 to 1715. During his reign, Louis XIV (who was responsible for developing Versailles into a lavish palace) was known for donning extravagant clothing, particularly shoes. Perrault, who worked as a secretary overseeing construction at Versailles — including its famed glasswork, such as the Hall of Mirrors — and the Louvre, may have added the glass slipper as a bit of satire mocking the increasingly ostentatious and impractical French fashions of the time; after all, it would be incredibly difficult to actually dance in shoes made of glass.

Yet there may also have been a layer of economic nationalism: Perrault was in charge of setting up a royal glassworks for France, freeing the nation from its dependence on the glassmakers of Venice. Warwick thinks contemporary readers may have seen Cinderella’s transformation as a metaphor for France’s newfound self-determination and its ability to make the king’s beloved luxury products for itself.

Walt Disney working in his studio.
Credit: Bettmann via Getty Images

Walt Disney Sketched His First Cinderella Nearly 30 Years Before the Feature Film

Disney’s feature-length adaptation of “Cinderella” premiered in 1950, though the illustrator actually began tinkering with the story some three decades earlier. At Laugh-O-Gram, Disney’s first studio in Kansas City, the young artist honed his animation skills on fairy tales. In 1922, he produced a silent, seven-minute version of “Cinderella” in which the heroine’s only friend was a cat who helped with housework, and her fairy godmother sent her off to the ball in flapper attire and a car instead of a pumpkin. That same year, Disney also put out cartoon shorts of “Little Red Riding Hood” and “Beauty and the Beast” (a tale the company would successfully return to in 1991).

Cinderella screen-grab from 1950.
Credit: LMPC via Getty Images

“Cinderella” Saved Walt Disney From Bankruptcy

Cinderella was Walt Disney’s sixth full-length animated film (following Snow White and Bambi, among others), but it was the project that finally solidified his studio’s success. Disney and a team of animators spent six years developing Cinderella before its 1950 premiere, and the production wasn’t just a major investment of time — it was a huge financial gamble. World War II had slowed the studio’s projects, and Disney had racked up nearly $4 million in debt to keep the business running; Cinderella cost around $2 million to produce and likely would have shuttered the studio if it flopped. Luckily, the film grossed more than $4 million at the box office and earned three Oscar nominations for its sound and music, helping usher in a new era for Disney’s studio.

Actress Julie Andrews poses with a glass slipper in the role of Cinderella, circa 1957.
Credit: Silver Screen Collection/ Moviepix via Getty Images

Rodgers and Hammerstein’s Adaptation Was Their Only TV Musical

Broadway superstars Richard Rodgers and Oscar Hammerstein II wrote 11 musicals during their partnership, though the duo created only one specifically for television viewers: Cinderella. The 90-minute production featured actress Julie Andrews in the leading role, to glowing reviews. Rodgers and Hammerstein’s sole TV musical debuted on March 31, 1957, and drew more than 100 million viewers — more than 60% of American households tuned in. Like the everlasting story, Rodgers and Hammerstein’s version has been remade for TV and stage time and again in the decades since it aired.

Nicole Garner Meeker
Writer

Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.

Original photo by Try_my_best/ Shutterstock

We spend more than a third of our lives unconscious, sleeping in beds (or elsewhere) to prepare our minds and bodies for the day ahead. Although this activity takes up a significant portion of daily life, scientists are still discovering fascinating attributes of the human sleep-wake cycle, developing a more nuanced understanding of dreams, and coming to grips with the devastating effects of sleep deprivation and disorders. These five facts delve into the science of sleep.

Man comfortably sleeping in his bed at night, with a dreaming cloud above.
Credit: Fer Gregory/ Shutterstock

12% of People Dream in Black and White

Dreams are an important mechanism of the human mind. What seems like a series of random thoughts and events is actually the brain trying to make sense of the day, remembering things that are important, forgetting things that aren’t, and overall preparing our biological computers for tomorrow. While most people dream in full color, around 12% of the population is tuned to TCM (so to speak), and often experiences dreams in only black and white. The analogy to television is an apt one, as researchers discovered in 2008 that people under the age of 25 almost never dreamed in monochrome, while members of the boomer generation and older had dreams devoid of color roughly a quarter of the time. Although it is difficult to prove definitively that TV is to blame, the number of people who reportedly dream in grayscale has slowly fallen over subsequent decades.

Portrait of tired young man sleeping while sitting at dining table in kitchen.
Credit: Prostock-Studio/ iStock

Poor Sleep Reduces a Human’s Pain Threshold

Having a poor night’s sleep comes with a multitude of real-world side effects, including sluggishness, irritability, and poor concentration. Over the long term, things get even more dire, as poor sleep can contribute to obesity, high blood pressure, and an overall weaker immune system. Sleep can also have a surprising correlation with how much pain a human can withstand. In 2015, a National Sleep Foundation poll discovered that two out of every three people experiencing chronic pain also suffered from sleep deprivation. Statistics like this inspired scientists from UC Berkeley to figure out how exactly sleep is entwined with pain tolerance. After studying two dozen healthy young adults, the researchers realized the neural mechanisms that evaluate pain signals and activate appropriate relief measures are disrupted when someone doesn’t get enough sleep. Just another reason (among many) that you should always try to get a good night’s rest.

Above view of smiling woman sleeping in bed.
Credit: skynesher/ iStock

Not Every Person Needs the Same Amount of Sleep

Some people seem to tick along just fine on five hours of sleep, while others can’t think straight on anything less than nine. That’s because the common recommendation of eight total hours of sleep is really an average — not a rule. A common indicator of how much sleep you need is age (for example, kids need more sleep than adults because they’re still growing), but differences also occur from person to person. Scientists have even identified a small subset of people who require less than six hours to feel well rested: These sleep champions carry a mutated gene that codes for certain receptors affecting the sleep-wake cycle, giving them higher-quality sleep that takes up less time than the average person needs to spend getting shut-eye.

An alarm clock, sleeping pills, an eye mask and a black board reading rem sleep.
Credit: Ben Gingell/ Shutterstock

Your Muscles Are Paralyzed During REM Sleep

Dreaming occurs during a process known as REM (rapid eye movement) sleep. The name comes from the physical movement of our eyes while experiencing dreams. During these bouts of REM sleep, of which there are four to six per night, brain activity changes and causes paralysis in our muscles. This normal effect of REM sleep is what’s known as muscle atonia, and it’s designed to keep humans from injuring themselves in their sleep. However, sometimes a person’s muscles still retain function during REM sleep and can cause a person to act out their dreams. This is known as REM sleep behavior disorder, and can be a real danger to the dreamer, or in some cases, the dreamer’s partner.

The reverse is also possible, as sleep paralysis occurs when someone wakes from REM sleep only to discover that they can’t move their body or speak. Both of these sleep disorders (along with many others) are types of parasomnias.

A stressed women sitting next to her bed.
Credit: PonyWang/ iStock

Extreme Sleep Deprivation Can Lead to Psychosis

While being a poor sleeper can have serious side effects, getting no sleep at all can be downright deadly. Throughout the day, our bodies burn energy and create a byproduct in the brain known as adenosine. The buildup of this nucleoside is what causes us to feel sleepy. In fact, caffeine works by blocking adenosine from binding to its receptors, making us more alert as a result. While we sleep, a waste clearance system known as the “glymphatic system” essentially flushes away this buildup of adenosine, using cerebrospinal fluid to clear toxic byproducts throughout the central nervous system. After sleeping the required eight (or so) hours, the brain is refreshed and ready for the day ahead. However, if someone puts off going to sleep for a long period of time, adenosine builds up in the brain and eventually disrupts our visual processing system, which in turn triggers hallucinations and, in rare cases, even death. In other words, spending one-third of our lives in bed may seem like a waste of time, but sleeping may be the most important thing we do every day.

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.

Original photo by Michelle Bridges/ Alamy Stock Photo

Some have days of the week named after them and others are Marvel superheroes, but many Norse gods haven’t been thought about much outside of academic circles since the Icelandic poet and historian Snorri Sturluson wrote about them in the 13th century. That’s a shame, since the pantheon of Norse mythology extends far beyond the likes of Thor and Odin — and includes such deities as the crone who brought the god of thunder to one knee in a wrestling match. (Snorri wrote the Prose Edda, which, alongside the Poetic Edda, whose author is unknown, remains the foundational text for modern understanding of Norse mythology.) Here are five lesser-known mythological figures Snorri wrote about, and why they’re worth knowing about — or perhaps even worth a movie or comic book of their own.

Vintage illustration of Thor being defeated by Elli.
Credit: Michelle Bridges/ Alamy Stock Photo

Elli

Few mythical figures, whether gods or otherwise, can claim to have held their own against Thor. Even fewer can say they beat him, but the giantess Elli is one of those who can rightfully make the boast.

Admittedly, there’s some trickery involved in the story. The giantess, considered the personification of Old Age, is said to beat Thor in a wrestling match while the god of thunder visits the giant king Utgard-Loki in his castle. As part of a series of tests of strength, Thor agrees to wrestle Utgard-Loki’s nurse — a challenge he accepts without realizing his opponent’s true identity. Thor struggles throughout the contest until Elli forces him to one knee, at which point Utgard-Loki declares the match over, and commends Thor for faring as well against old age as he did.
The tale is recounted in the Gylfaginning section of the Prose Edda, and sadly marks Elli’s only mention in the text.

Heart is locked in a cage, similar to Norse mythology Lofn.
Credit: oatintro/ iStock

Lofn

Norse mythology tends to evoke images of strength, battle, and violence. One exception is Lofn, a kind of matchmaker who specializes in forbidden love affairs. She’s described by Snorri as “so gracious and good to call on that she gets permission from Alfodr [Odin] or Frigg for the intercourse of people, men and women, although otherwise it would be banned or forbidden.” Also known as “The Comforter,” the goddess of love and gentleness has a special fondness for small and/or helpless beings. “Lof,” meaning “praise,” is derived from her name.

Víðarr on horseback.
Credit: Historic Images/ Alamy Stock Photo

Víðarr

Sometimes known as the Silent God, Víðarr (also anglicized as Vidar and Vithar) is the son of Odin and the jötunn (giantess) Gríðr — making him Thor’s half-brother. He’s often associated with vengeance, and with good reason: Odin’s ultimate fate is to be killed by the wolf Fenrir during Ragnarök, the “Twilight of the Gods” that’s essentially Norse mythology’s end of the world; Víðarr’s destiny, meanwhile, is to avenge his father by slaying Fenrir. Víðarr is also one of the few gods who survives Ragnarök (at least in some accounts), yet beyond his role in these cataclysmic events, little is written about him other than a mention of his status as the second-strongest god after Thor.

Víðarr is mentioned in both the Prose Edda and Poetic Edda, with the latter describing his most important deed thusly:

“Then comes Sigfather’s | mighty son,

Vithar, to fight | with the foaming wolf;

In the giant’s son | does he thrust his sword

Full to the heart: | his father is avenged.”

The giantess Angrboda.
Credit: Ivy Close Images/ Alamy Stock Photo

Angrboða

With a name that’s been translated as “she who brings sorrow” and “grief-bringer,” Angrboða has a lot to live up to. For better and (mostly) for worse, she does. A giantess (jötunn) and one of the trickster god Loki’s lovers, she ultimately gives birth to three monsters: Fenrir, the wolf fated to kill Odin during Ragnarök; Hel, who rules over the dead; and Jörmungandr, the serpent who encircles the entire world and is Thor’s archnemesis. The mother of monsters is indirectly responsible for some of Norse mythology’s most catastrophic events, though there’s no indication that Angrboða herself is evil — after birthing that terrible trio, she’s mostly known to reside in Jötunheim (the land of the giants) on her lonesome without any contact with either Loki or the monstrous spawn they had together. Some people’s children, as they say.

Drawing of Hoenir.
Credit: Alto Vintage Images/ Alamy Stock Photo

Hoenir

Hoenir — whose name is spelled several different ways (Hönir is also common) — works alongside Odin and Loki to create the first humans, Ask and Embla, by imbuing two pieces of driftwood with “essential gifts” whose exact properties remain a matter of debate centuries later. Here’s how the moment is described in Völuspá (Prophecy of the Seeress), the first poem in the Poetic Edda:

“They had no breath,

they had no soul,

they had neither hair nor voice,

nor a good appearance.

Odin gave them breath,

Hoenir gave them a soul,

Lodur / Loki gave them hair

and a good appearance.”

Here’s where it gets confusing. Hoenir’s gift imbues the humans with óðr, an untranslatable Old Norse word that can encompass everything from understanding to poetic inspiration to frenzy on the battlefield. But since óðr is the root of Odin’s name, and another Norse tale suggests that humans derive their óðr from Odin himself, some consider this mention of Hoenir to be merely an extension of Odin. Hoenir remains important not despite this ambiguity but because of it — much of Norse mythology is murky, and few figures embody that quality quite like he does.

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.

Original photo by Juanmonino/ iStock

Plenty of dishes have names that have nothing to do with their ingredients: No frogs are harmed in the making of toad in the hole, sweetbreads are neither sweet nor baked, and puppies are definitely not included in hot dogs. Boston cream pie is delicious, but not pie — and Welsh rabbit (aka “rarebit”) is vegetarian. The culinary misdirections continue when it comes to dishes containing place names. Here are five foods with names that are miles from the places where they actually originated.

Aerial view of a Hawaiian pizza slice.
Credit: Vasin Lee/ iStock

Hawaiian Pizza

While the war over pineapple as a pizza topping divides the world, the controversy originated nowhere near the Aloha State. Hawaiian pizza, the savory pie combining the salty umami of ham (or Canadian bacon) with the sweetness of pineapple, was the product of a Greek immigrant operating a restaurant in Ontario, Canada. Sam Panopoulos added Hawaiian-brand canned pineapple as a novelty topping in 1962, and the combination (along with the ’60s fascination with all things “tiki”) slowly gained popularity. In 1999, Hawaiian even became the most popular pizza style in Australia, accounting for 15% of sales.

Meatballs in a pan with cream sauce.
Credit: Yulia_Kotina/ iStock

Swedish Meatballs

Springy and savory, these meatballs are practically synonymous with Sweden — but everyone’s favorite IKEA offering is likely based on a dish from the Ottoman Empire. King Charles XII of Sweden was impressed by the entree while in exile in what is now Moldova during the early part of the 18th century. The meatballs, called köttbullar in Sweden, may be derived from the spiced lamb and beef recipe for köfte, a signature dish in Turkish cuisine. The Swedes substituted pork for lamb, and the dish is traditionally served with a silky sour cream-based gravy atop a bed of mashed potatoes or egg noodles and accompanied with tangy lingonberry jelly.

Baked Alaska Ice Cream Cake.
Credit: Katheryn Moran/ Shutterstock

Baked Alaska

In 1867, the U.S. bought 375 million acres from Russia, land that would become Alaska. The purchase also inspired Delmonico’s chef Charles Ranhofer in New York to create a confection he dubbed “Alaska, Florida.” Spice cake was topped with a dome of banana ice cream — an expensive and exotic luxury at the time — then crowned with a layer of meringue toasted to a golden brown. A simplified version called “Alaska Bake” showed up in a Philadelphia recipe book in 1886, and within a few years “baked Alaska” was being offered on several menus around New York. Since then, baked Alaska has become a celebratory sweet, and the fancy dessert is a favorite for birthdays and other special occasions.

Meat cutlet with boiled egg, pieces on a dark wooden background.
Credit: Iaroshenko Maryna/ Shutterstock

Scotch Eggs

The pub food and picnic staple known as a Scotch egg is a popular snack across the U.K., but its origins may lie far from the British Isles. Along with curries and chutney, British soldiers returning from the occupation of India may have imported nargisi kofta — a dish of shelled hard-boiled eggs wrapped in spiced ground lamb, deep-fried, and served with an aromatic tomato sauce. Iconic department store Fortnum & Mason claims to have invented the British version in 1738, but the northern England county of Yorkshire maintains that the “Scotch” in the name came from eatery William J Scott & Sons, where the original version was wrapped in fish paste and the treats were nicknamed “Scotties.” Modern versions are usually coated in sausage and rolled in breadcrumbs before being deep-fried.

A set of fresh sushi rolls with salmon, avocado and black sesame seeds.
Credit: Andrei Iakhniuk/ Shutterstock

California Roll

Many Americans’ first introduction to sushi comes in the form of a California roll, but the approachable offering probably doesn’t come from Japan via the Golden State (although a couple of Los Angeles chefs do claim credit, and the origin is somewhat uncertain). Chef Hidekazu Tojo studied in Osaka before emigrating to Vancouver, B.C., in 1971. Noting that his new customers were intimidated by raw fish and seaweed, Tojo reversed the traditional roll process, encasing the unfamiliar ingredients inside a layer of rice. The “inside-out” rolls were popular with guests from California and also included avocado — popular in dishes from the state — which led to the name. At Tojo’s own restaurant, they’re simply known as “Tojo rolls.”

Cynthia Barnes
Writer

Cynthia Barnes has written for the Boston Globe, National Geographic, the Toronto Star and the Discoverer. After loving life in Bangkok, she happily calls Colorado home.

Original photo by Science History Images/ Alamy Stock Photo

Louis Armstrong changed the face of jazz in the 20th century, with enduring hits such as “West End Blues,” “Hello, Dolly!” and “What a Wonderful World.” Born in 1901, the influential trumpeter and vocalist started playing gigs as a child in New Orleans, and, long after his death in 1971, he remains a giant of the genre.

Satchmo (as he was lovingly nicknamed) had a long and rich career, but was he always a singer? Which enduring hit went unnoticed for decades? And how did he revolutionize the trumpet? Take a journey through the “wonderful world” of Louis Armstrong with these seven amazing facts about his life.

Publicity photo of American jazz trumpeter Louis Armstrong.
Credit: JP Jazz Archive/ Redferns via Getty Images

Louis Armstrong’s Childhood Nickname Was “Dippermouth”

Long before “Satchmo” came along, Armstrong was known in his childhood home in the Storyville district of New Orleans as “Dippermouth,” or “Dipper” for short. He supposedly got the moniker from his wide smile as a child, although the nickname later came to be associated with his embouchure (the way a player applies their mouth to an instrument’s mouthpiece).

Armstrong’s mentor, King Oliver — a fixture in the Storyville jazz scene during Armstrong’s youth — recorded a song in 1923 called “Dippermouth Blues,” which he co-wrote with Armstrong. Dipper himself would later go on to record his own version in 1936.

Armstrong in the band at the Colored Waifs Home in New Orleans.
Credit: Pictorial Press Ltd/ Alamy Stock Photo

Armstrong Honed His Skills in a “Waif’s Home”

After firing off six blanks at a New Year’s Eve party in New Orleans in 1912, 11-year-old Armstrong was arrested and sent to the Colored Waif’s Home for Boys, a facility that was part juvenile detention facility, part orphanage, and part reform school. It was his second stay at the home — according to recently uncovered records, Armstrong did a brief stint there when he was only nine, after he and five of his friends were arrested for being “dangerous and suspicious characters,” a charge used often at the time to detain people without cause.

By the time of his second stay, the Waif’s Home had hired a music teacher and started a band program. Under the tutelage of instructor Peter Davis, Armstrong learned the bugle and the cornet, and spent some time as the bandleader. Early on, he showed a skill for harmonizing and improvising that seemed beyond his years. It was far from his first exposure to the instruments, but it was the first time he received proper training. Armstrong started playing gigs after his release from the home in 1914.

Famed jazz trumpeter Louis "Satchmo" Armstrong.
Credit: Bettmann via Getty Images

Armstrong Didn’t Start Out as a Trumpet Player

While he is remembered today for his distinctive voice and rich trumpet solos, the trumpet wasn’t Armstrong’s original instrument of choice — even years into his career. Satchmo rose to prominence in his mid-teens playing the cornet, which is similar to a trumpet but smaller and with a few subtle differences.

For example, a trumpet is a cylindrical brass instrument, meaning the tube stays the same diameter throughout, but a cornet’s tube tapers off on its way to the mouthpiece, giving it a mellower tone. From the 1800s to the mid-1900s, the cornet was a standard part of a brass or jazz ensemble, as well as a popular solo instrument. While trumpets were also played, they weren’t typically solo instruments.

Armstrong, however, has been credited with reinventing the trumpet in the public consciousness. In the mid-1920s, as Armstrong tells it, the bandleader of an orchestra he played with said he “looked funny… with that stubby cornet.” The band’s other brass player played the trumpet, and Armstrong thought two trumpets sounded better together. He started playing the trumpet as he would the cornet, with extensive improvisation and crowd-pleasing solos. He wasn’t the only jazz musician doing this, but as he rose to national prominence, his inventive style helped change public opinion about what a trumpet could sound like.

Group portrait of American jazz musician Louis Armstrong and his orchestra.
Credit: Charles Peterson/ Archive Photos via Getty Images

He Used to Play in a Silent Movie Orchestra

In the mid-1920s, Armstrong played with Erskine Tate’s Vendome Orchestra — the same band that inspired him to pick up a trumpet. The Big Band ensemble was one of the early players on Chicago’s jazz scene and performed at the Vendome Theatre in Chicago, providing accompaniment and intermission entertainment for silent films. Armstrong not only played jazz solos, but also performed operatic arias.

Louis Armstrong pictured moisturizing his lips while traveling on a train.
Credit: Pictorial Parade/ Archive Photos via Getty Images

Lip Injuries Were a Common Ailment for Armstrong

There was one bad habit Armstrong picked up at the Waif’s Home: a poor embouchure that put his lips at constant risk. Bad form is especially dangerous with brass instruments such as cornets and trumpets, because shaping a melody requires near-constant shifts of the embouchure — and thus of the lips and tongue.

According to Armstrong, the damage started early in his career. “In my teens, playing in that honky tonk all night long, I split my lip wide open,” Armstrong recalled in a 1966 Life interview. “Split my lip so bad in Memphis, there’s meat still missing. Happened many times. Awful. Blood run all down my shirt.”

While some of his peers sought professional help and even plastic surgery, Satchmo treated his lips with home remedies. He had a special salve he’d apply, and when calluses built up, he’d shave them down himself with a razor and take some time away from performing. One particularly nasty split in 1935 took him offstage for a year. While embouchure overuse syndrome can be common among brass players, it’s perhaps associated with Armstrong more than any other musician — some doctors even use the term Satchmo syndrome for a tear in the lip muscle.

Jazz trumpeter Louis Armstrong performs at the Newport Jazz Festival in 1970.
Credit: Tom Copi/ Michael Ochs Archives via Getty Images

Armstrong Insisted on Adding Singing to His Act

Armstrong is almost as well known today for his distinctive, gravelly singing voice as he is for his trumpet skill. While he formed a vocal quartet with other kids in his neighborhood and sang in a choir at the Waif’s Home, Satchmo built his early career on the cornet and later the trumpet, not singing.

In 1924, he joined the Fletcher Henderson Orchestra, then a big name in the New York City music scene, for an engagement at the Roseland Ballroom. Armstrong asked repeatedly to sing, yet recalled that Henderson wasn’t interested. But according to jazz drummer Kaiser Marshall, Satchmo found a way to sneak it in anyway: Roseland would host a Thursday revue of amateur performers (similar to an open mic), and one night Armstrong went on stage and performed “Everybody Loves My Baby,” both on cornet and vocals. Marshall recalled that “the crowd surely went for it … from then on they used to cry for Louis every Thursday night.”

Cover of vinyl album What A Wonderful World by Louis Armstrong.
Credit: EyeBrowz/ Alamy Stock Photo

“What A Wonderful World” Took 20 Years to Reach the U.S. Charts

Armstrong’s most popular song, “What a Wonderful World,” topped the British music charts upon its 1967 release, staying at No. 1 for 13 weeks. The inspiring tune was a hit elsewhere in Europe and South Africa, too, but because the president of Armstrong’s record company, Larry Newton, disliked the song, the record was never actually promoted in the United States. According to the song’s co-writer and producer Bob Thiele, it didn’t even crack 1,000 copies in the U.S. after its initial release.

But in 1987, 16 years after Armstrong’s death, “What a Wonderful World” was featured in the film Good Morning, Vietnam. Only then did the song reach the Billboard Hot 100, where it peaked at No. 33. The original album was re-released and certified gold.

Armstrong was drawn to the song because it reminded him of Corona, Queens, where he and his last wife Lucille settled down permanently in 1942. “Lucille and I, ever since we’re married, we’ve been right there in that block,” Armstrong said in 1968, according to the Louis Armstrong House. “And everybody keeps their little homes up like we do and it’s just like one big family. I saw three generations come up on that block. And they’re all with their children, grandchildren, they come back to see Uncle Satchmo and Aunt Lucille. That’s why I can say, ‘I hear babies cry/I watch them grow /they’ll learn much more/than I’ll never know.’ … So when they hand me this ‘Wonderful World,’ I didn’t look no further, that was it.” Since then, the song has become a timeless classic, featured in many other films and shows and covered by artists such as Rod Stewart and Stevie Wonder.

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.

Original photo by John Gaffen 2/ Alamy Stock Photo

On October 16, 1923, Walt Disney formally agreed to send a new series of short films to a New York distributor, thereby launching the Disney Brothers Cartoon Studio. Needless to say, a few things besides the company name have changed since then, as Disney has gone from a bare-bones operation to the creator of groundbreaking talking funnies, the steward of iconic characters and franchises, and, finally, the overlord of a sweeping enterprise with interests all over the globe. With 100 years of movie magic in the rearview mirror, here’s a look at six facts about all things Disney.

a sketch by Ub Iwerks of Mickey Mouse.
Credit: JIM WATSON/ AFP via Getty Images

Animator Ub Iwerks Was Integral to Disney’s Early Success

Disney’s history is filled with the hard work of unsung geniuses, but none was as integral to the company’s foundational success as Ub Iwerks. Indispensable to Walt Disney since their days together at Kansas City’s Laugh-O-Gram Studio, Iwerks joined his friend in Hollywood in the 1920s to produce a groundbreaking live-action/animated series of short films called the Alice Comedies. Iwerks remained loyal to Disney after a distributor stole their creation, Oswald the Lucky Rabbit, and hired away the studio’s animators. He’s credited with sketching the very first Mickey Mouse, and single-handedly animated the landmark 1928 Mickey cartoon Plane Crazy, with an output that reached 700 drawings in a single day. Although personal and creative differences prompted Iwerks to branch out on his own in 1930, he returned to the fold 10 years later as a special effects expert, and went on to bolster the studio’s animation capabilities with his innovations in optical printing and xerography.

FROZEN II scene in 2019.
Credit: TCD/Prod.DB/ Alamy Stock Photo

Disney Has Created More Than 800 Films

More than 800 feature films have been made under the Disney banner since Snow White and the Seven Dwarfs hit theaters in 1937. The studio’s first full-length, live-action feature was Treasure Island, in 1950. Its first R-rated flick was Down and Out in Beverly Hills, developed under the then-recently inaugurated Touchstone Pictures subsidiary in 1986. Disney’s highest-grossing entry was (unsurprisingly) a Marvel movie, 2019’s Avengers: Endgame, while its highest-grossing animated feature also arrived that year with Frozen II.

DER FUEHRER'S FACE.
Credit: RGR Collection/ Alamy Stock Photo

Disney Essentially Served as a Media Branch of the U.S. Military During World War II

After the attack on Pearl Harbor led to the requisitioning of Disney’s Southern California studio as an anti-aircraft base in late 1941, the company turned its focus to supporting the war effort. Several films produced during this time were used to train Army and Navy personnel; others, like Der Fuehrer’s Face (1943), were propaganda fare that portrayed stereotyped and inept versions of enemy leaders such as Adolf Hitler and Benito Mussolini. Additionally, the studio designed more than 1,200 insignia for various military units and helped raise funds by permitting its characters to appear on war bonds. All told, Disney was devoting more than 90% of its output to war-related material by 1943, enabling the studio to weather lean financial times and survive to deliver the next wave of classics, which included Cinderella (1950) and Peter Pan (1953).

Crowds walking around the Disneyland theme park in Anaheim, California.
Credit: Archive Photos/ Archive Photos via Getty Images

Disneyland’s Disastrous Grand Opening Was Dubbed “Black Sunday” by Employees

Although Walt Disney’s long-gestating dream of a theme park was realized with the televised grand opening of Disneyland in Anaheim, California, in July 1955, the disaster that unfolded was better suited for a nightmare. Most attractions remained unopened despite the rushed construction, and the sweltering heat transformed the fresh asphalt of Main Street, USA, into a sticky mess. Meanwhile, overcrowding from thousands of counterfeit tickets contributed to a 7-mile backup on the Santa Ana Freeway, and resulted in the park’s restaurants running out of food. But Disney remained unbowed by what was internally dubbed “Black Sunday,” and apparently so did the paying public: Disneyland surpassed 1 million in attendance just seven weeks later, and the company eventually doubled down on the theme park experience with the unveiling of Florida’s Walt Disney World in October 1971.

Promotional portrait of cast members of “The Mickey Mouse Club” television show.
Credit: Pictorial Parade/ Archive Photos via Getty Images

Disney Kick-Started the Careers of Numerous Celebrities

The House of Mouse has nurtured an impressive roster of young talents since Annette Funicello emerged as an original Mouseketeer in 1955. A teenaged Kurt Russell became a Disney film regular in the 1960s, before subsequent incarnations of The Mickey Mouse Club fueled the rises of mega pop stars Britney Spears, Justin Timberlake, and Christina Aguilera, along with A-list actor Ryan Gosling. Miley Cyrus and Olivia Rodrigo both starred on their own Disney shows before becoming chart-topping singers, while fellow Disney alums Zac Efron, Demi Lovato, Selena Gomez, and Zendaya achieved fame as musicians, actors, or both. And then there’s Steve Martin, who didn’t appear in a Disney feature until 1991’s Father of the Bride, but nevertheless learned to perform in public as a Disneyland employee from ages 10 to 18.

American film production label owned by Disney & Marvel Studios.
Credit: SOPA Images/ LightRocket via Getty Images

Disney Is a Very, Very Big Business

It’s been a long time since Disney was merely a studio of ink-stained animators and noisy voice actors, but even its visionary founder would likely be staggered by its multifaceted presence across numerous businesses today. Along with resorts in Paris, Tokyo, Hong Kong, and Shanghai, the Mouse Kingdom oversees a line of cruise ships, Hollywood Records, the Adventures by Disney travel company, and the venture capital firm Steamboat Ventures. Among its media subsidiaries, Disney owns 20th Century Studios, ABC, National Geographic, Lucasfilm, and the massive cash cow that is Marvel Studios. Altogether, the century-old conglomerate was valued at just under $150 billion as of September 2023.


Original photo by INTERFOTO/ Alamy Stock Photo

More than five centuries after the Aztec empire’s fall to Spanish conquistadors in 1521, history buffs can’t seem to learn enough about this legendary civilization. In fact, secrets are still being unearthed below the streets of Mexico City. Here are some fascinating facts about daily life in this once-thriving society.

Aztec civilization, 15th century.
Credit: DEA / G. DAGLI ORTI/ De Agostini via Getty Images

They Didn’t Call Themselves Aztecs

As with many ancient societies, much of what we know about the Aztecs comes from written accounts from outside their culture — in this case, descriptions from Spanish conquistadors who arrived in modern-day Mexico around 1519. However, the community that modern historians call “the Aztecs” actually referred to themselves as the Mexica or Tenochca people. Both names come from the region where the empire once flourished — southern and central Mexico, along with the capital city of Tenochtitlan (modern-day Mexico City). The Aztec name likely comes from the Mexica origin story describing their homeland of Aztlan (the location of which remains unknown).

Nahuatl language historic sculpture.
Credit: Jeffrey Isaac Greenberg 18+/ Alamy Stock Photo

The Aztec Language Is Still Alive Today

At the height of the Aztec empire’s reign, Nahuatl was the primary language used throughout Mexico, and had been for centuries. Colonists arriving from Spain around the early 16th century introduced Spanish, which would eventually replace Nahuatl. But the Indigenous language isn’t at all dead; more than 1.5 million people speak Nahuatl in communities throughout Mexico, plus there are efforts in the southern U.S. to teach and revive the language. Spanish and English speakers who’ve never heard Nahuatl still know a few words with Aztec origins, such as tomato (“tomatl”), coyote (“coyōtl”), and chili (“chīlli”).

Manuscript of Aztec texts.
Credit: DE AGOSTINI PICTURE LIBRARY/ De Agostini via Getty Images

The Aztec Empire Had Vast Libraries

Surviving accounts from Spanish colonists describe the voluminous libraries of the Aztecs, filled with thousands of books on medicine, law, and religion. But early historians didn’t give the Aztecs enough credit when it came to written language, once dismissing the hieroglyphic style used by scribes as primitive. Few written documents have survived the centuries since the Aztec empire’s fall, most of them destroyed by Spanish conquistadors. But more recent evaluation of the last remaining texts shows that the Mexica people had a sophisticated writing system, on par with Japanese, that may have been the most advanced in the early Americas.

Engraving print of Aztec women.
Credit: Historical Picture Archive/ Corbis Historical via Getty Images

The Mexica People Were Highly Educated

Aztec society had a rigid caste system dividing communities into four main classes: nobility, commoners, laborers, and enslaved people. Regardless of social standing, every child in the community attended school to receive specialized education, often for a role that was chosen at birth. Schools were divided by gender and social standing, though all Mexica children learned about religion, language, and acceptable social behavior. Children of nobility often received law, religion, and ethics training to prepare them for future leadership positions, and schools for commoners taught trade skills like sculpting, architecture, and medicine. Because Aztec culture centered on expansion and advancement through military strategy, all teenage boys received military and combat training, while girls were educated in cooking, domestic tasks, and midwifery.

Aztec Tonalpohualli calendar.
Credit: Science History Images/ Alamy Stock Photo

Aztecs Used Two Calendars

Mesoamerican calendars have long fascinated modern audiences, especially those who speculate about astrological events or end-of-the-world scenarios. But the calendars used by the Aztecs weren’t too dissimilar from our own. The Mexica people relied on two simultaneous calendars: a 365-day solar calendar called the Xiuhpōhualli and a 260-day religious almanac called the Tōnalpōhualli. The solar calendar consisted of 18 months of 20 days each, every month named for a significant festival or event. The religious calendar dictated auspicious times for weddings, crop plantings, and other events; its 260 days cycled through 20 named signs (animals and natural elements) paired with the numbers 1 through 13.
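The interplay of the two cycles can be sketched with a little arithmetic. This is a simplified illustration, not a historical conversion: the day count, epoch alignment, and sign indices are arbitrary, and only the cycle lengths (365, 260, 20, and 13) come from the description above.

```python
from math import lcm

XIUHPOHUALLI_DAYS = 365   # solar year: 18 months of 20 days, plus 5 extra days
TONALPOHUALLI_DAYS = 260  # ritual cycle: 20 day signs paired with numbers 1-13

def calendar_position(day_count):
    """Return (solar month, solar day, ritual number, ritual sign index)
    for a day count measured from an arbitrary starting point."""
    solar = day_count % XIUHPOHUALLI_DAYS
    month, day = divmod(solar, 20)   # months 0-17; "month" 18 holds the 5 extra days
    number = day_count % 13 + 1      # ritual numbers cycle 1 through 13
    sign = day_count % 20            # ritual signs cycle through 20 names
    return month, day, number, sign

# Because 13 and 20 share no factors, a (number, sign) pair repeats only
# every 13 * 20 = 260 days; the two calendars realign only every
# lcm(365, 260) = 18,980 days, the roughly 52-year "calendar round."
print(lcm(XIUHPOHUALLI_DAYS, TONALPOHUALLI_DAYS))  # 18980
```

The 52-year realignment falls out directly from the least common multiple of the two cycle lengths, which is why both calendars were needed to date an event unambiguously within a lifetime.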

Shoes of an Aztec chief.
Credit: Chris Hellier/ Corbis Historical via Getty Images

Aztecs Wore the First Rubber-Soled Shoes

Centuries before rubber became an everyday mainstay in modern products, the ancient Mexica people were harvesting and collecting rubber tree sap for a variety of uses. Archaeological digs throughout Mesoamerica have excavated rubber balls likely used in ceremonial games or for religious offerings, but historians in the early 2000s found that Aztecs also created rubber soles for more comfortable and protective shoes. Researchers believe that Mexica artisans blended and heated rubber tree sap and extract from plants to create the rubber mixture, which could then be shaped and used for shoes, rubber bands, statues, and more.

 Construction of the city of Tenochtitlan with floating fields.
Credit: DEA / G. DAGLI ORTI/ De Agostini via Getty Images

Farmers Created Floating Fields

Constructing the city of Tenochtitlan was no small feat for the early Aztec settlers, mostly because the city was built on water. Centered on an island in Lake Texcoco, the city was linked to the shore by causeways, while aqueducts and canals supplied Tenochtitlan with fresh water. Farmland was scarce on an island home to more than 400,000 people, leading Mexica farmers to create floating fields called chinampas. Gardens were constructed by weaving tree branches, reeds, and sticks between poles to create an anchored base covered with mud and dead plants that broke down into nutritious soil. Chinampas doubled as a sanitation system using human waste as fertilizer, which helped crops grow vigorously while protecting drinking water from contamination.

Nicole Garner Meeker
Writer

Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.

Original photo by Vladimir Vladimirov/ iStock

Even those unfamiliar with the details of the civil rights movement of the 1950s and ’60s can point to Martin Luther King Jr. and Rosa Parks as key figures of the era. A deeper dive will reveal names including A. Philip Randolph, James Farmer, Whitney M. Young, and Medgar Evers — leaders who earned their place among the luminaries of the period for spurring legal and social progress.

However, not everyone earned due recognition for their contributions, whether because of personality clashes or deep-seated prejudices that went beyond matters of race. Here are five lesser-known civil rights influencers who helped to change the course of history.

American civil rights activist Bayard Rustin.
Credit: Edward A. Hausner/ Archive Photos via Getty Images

Bayard Rustin

Well before the nation watched the struggle for Black equality unfold on television, Bayard Rustin was at the forefront of a previous generation of activists as a co-founder of the Congress of Racial Equality (CORE). CORE’s main objective was to use “nonviolent direct action” while fighting for civil rights. Rustin later helped Martin Luther King launch the Southern Christian Leadership Conference (SCLC), and is credited as a primary organizing force behind the 1957 Prayer Pilgrimage for Freedom and the 1963 March on Washington.

But Rustin was also an openly gay man, and as such, was always in danger of being marginalized despite his obvious brilliance as an adviser and strategist. He was forced out of the SCLC after a congressman threatened to spread rumors about an affair between King and Rustin, and while he returned to pull together the March on Washington, internal opposition forced him to accept a lesser public role in the proceedings.

Rustin later served as president and co-chair of the A. Philip Randolph Institute, and continued his push for economic progress even as the wider public movement lost steam. By the time of his death in 1987, Rustin was something of a historical footnote, despite having his fingerprints all over the major civil rights victories of his day.

American Civil rights activist Claudette Colvin.
Credit: The Washington Post via Getty Images

Claudette Colvin

Nine months before Parks was arrested for refusing to surrender her bus seat to a white passenger in Montgomery, Alabama, the same thing happened to 15-year-old Claudette Colvin. So why was the Parks incident the one that ignited the Montgomery bus boycott and transformed the issue into a national story? As Colvin herself later conceded, the then-42-year-old Parks, a secretary for the NAACP, was considered by some to be a more respectable symbol for the boycott, particularly after it was discovered that the unwed teenager had become pregnant.

Nevertheless, Colvin wound up playing a crucial role as events unfolded, as she was named a plaintiff in the 1956 Browder v. Gayle case that challenged the constitutionality of Alabama’s segregated buses and provided the legal backbone for the boycott’s triumph. Colvin left Alabama soon after and spent most of the following decades living anonymously in New York City, though her contributions have finally earned some long-overdue recognition in recent years.

portrait of civil rights leader Fannie Lou Harmer.
Credit: Afro Newspaper/Gado/ Archive Photos via Getty Images

Fannie Lou Hamer

If King served as the face and eloquent voice of the civil rights struggle, then Fannie Lou Hamer represented its rank-and-file members who were sparked to action because they were “sick and tired of being sick and tired.” Born into a Mississippi family of sharecroppers, Hamer was fired after attempting to register to vote in 1962. She used that experience to fuel a tireless dedication to voting rights and launch the Mississippi Freedom Democratic Party (MFDP) in 1964.

That summer, Hamer entered the national spotlight with a powerful speech before the Democratic National Convention’s credentials committee in which she recalled being subjected to a brutal beating in jail. But her presence also underscored the limitations of her position in the pecking order; President Lyndon B. Johnson dismissed her as an “illiterate woman,” and even ostensible ally Roy Wilkins of the NAACP said she was “ignorant.”

Still, Hamer kept up the fight for equal rights even as she struggled to summon the respect she deserved. She later spearheaded the foundation of the Freedom Farm Cooperative in 1969 and the National Women’s Political Caucus in 1971.

James Meredith relaxing at Howard House located within Dillard University.
Credit: Bettmann via Getty Images

James Meredith

Even when compared to other activists who overcame intimidation and violence to participate in demonstrations, James Meredith stands out for his astonishing displays of courage. In the fall of 1962, the 29-year-old Air Force veteran integrated the University of Mississippi. His mere presence at the university caused an uproar and ignited a massive riot that drew 30,000 U.S. troops, federal marshals, and national guardsmen into the fray. Four years later, Meredith embarked on a solo “March Against Fear” out of Memphis, Tennessee, but was shot before he could complete the planned 220-mile walk to Jackson, Mississippi.

While he drew praise from King, most notably in the famed “Letter from Birmingham Jail,” Meredith was never one to conform to the expectations of others. In 1967, he raised eyebrows by endorsing the reelection campaign of former Mississippi Governor Ross Barnett, who once vehemently opposed Meredith’s entry into the state’s flagship university. Two decades later, after several failed attempts to run for office, Meredith supported the Louisiana gubernatorial campaign of former KKK grand wizard David Duke.

Today, a statue commemorating Meredith’s achievement stands on the Ole Miss campus, though the rest of his complicated story is often omitted from history lessons.

Portrait of Pauli Murray.
Credit: Everett Collection Historical/ Alamy Stock Photo

Pauli Murray

Pauli Murray was enormously influential as a lawyer, writer, and teacher. She became California’s first Black deputy attorney general in 1945, as well as the first African American to earn a Doctor of Juridical Science from Yale Law School two decades later. Additionally, the acclaimed scholar saw her legal arguments used in the groundbreaking cases of Brown v. Board of Education (1954), which struck down segregation in public schools, and Reed v. Reed (1971), which extended the rights under the 14th Amendment’s Equal Protection Clause to women.

Publicly critical of the sexism rife within the ranks of the civil rights movement, Murray helped launch the National Organization for Women (NOW) in 1966. Eventually, she found herself out of step with its leadership and stepped away. On her own once again, Murray resigned from her teaching post and entered New York’s General Theological Seminary, en route to one final historic achievement in 1977 as the first African American woman to be vested as an Episcopal priest.

Tim Ott
Writer

Tim Ott has written for sites including Biography.com, History.com, and MLB.com, and is known to delude himself into thinking he can craft a marketable screenplay.

Original photo by ANNA GRANT/ Shutterstock

It’s a shame that we don’t remember being babies, because infants have a fundamentally different way of existing in the world. Their brains are wired differently from adult brains to help them survive as brand-new humans, and they experience everyday things for the first time. While we might not know a lot about babies’ lived experiences, we do know a lot of fascinating and weird things about their bodies and minds — including these seven facts.

Orthopedist examining a baby’s bones.
Credit: New Africa/ Shutterstock

Babies Have Nearly 50% More Bones Than Adults

The typical adult human body has 206 to 213 bones. Babies, on the other hand, have closer to 300 when they’re born. Many human bones start out as multiple bones and fuse into one as flexible, connective tissue called cartilage hardens. In infants, the skull is actually six bones; they overlap to help make birth a little easier. As the baby gets older, the skull stays flexible to allow the brain to grow.

Adults do, however, have at least two bones that babies don’t: kneecaps. When babies are born, those future bones are completely made from cartilage.

The birth of a child in a maternity hospital.
Credit: Katya Tsiganok/ Shutterstock

Newborn Babies Have No Tears

Crying is very closely associated with babies, but oddly enough, newborns can’t make tears, just a lot of noise. The lacrimal gland, which produces tears, isn’t fully developed until about two weeks after birth. It takes even more time for the gland to produce enough liquid for tears to actually be noticeable. Babies get their “real tears” at 2 or 3 months old.

Close-up of a crying baby girl.
Credit: morrowlight/ Shutterstock

Babies May Cry With an Accent

It seems babies soak up the language around them starting before birth. In a 2009 study, researchers recorded 60 French and German babies crying and found that subtleties in their cries mimicked each language. French cries had a slight lilt, while German babies abruptly started and dropped off at the end. This means that not only can babies recognize and mimic the musicality of their parents’ language, but they likely pick it up while still in the womb.

A baby hand grasping on an adult thumb.
Credit: Jsnow my wolrd/ Shutterstock

Infants Have Reflexes That Adults Don’t

We all have reflexes, like putting our hands out when we fall or kicking when a doctor hits our knee with a mallet. These movements happen involuntarily. Babies are born with a ton of them, which eventually go away. One is the Moro reflex, more commonly known as the startle reflex. If a baby is spooked by something or their head moves rapidly, they’ll respond by flailing their arms out really quickly. Babies will also automatically grasp when the palm of their hand or the sole of their foot is touched. Some of these eventually become learned behaviors — like rooting when their face is touched, which helps them find a nipple — but most newborn reflexes are gone within a year, and some last only a couple of months.

A baby girl learning to walk with fathers support.
Credit: mae_chaba/ Shutterstock

Babies Can Sense Rhythm

In 2009, a group of researchers in the Netherlands played a rock drum beat for 14 newborn babies hooked up to an EEG, which measures electrical activity in the brain. Sometimes the researchers skipped a beat without disrupting the rhythm; other times, they dropped a beat in a way that broke the rhythm entirely. When the rhythm broke, the babies showed a brain response consistent with that of an adult control group. Basically, babies expected the next beat to come along in time, much like an adult would.

4 Months old baby girl lying on colorful play mat.
Credit: Ekaterina Pokrovsky/ Shutterstock

Babies See Red First

Newborns who can see aren’t colorblind, but their brains don’t perceive colors the same way that older children and adults do. They can also see only about 8 to 10 inches in front of their faces at first. This is why babies tend to enjoy high-contrast, black-and-white images — they’re easy for them to see. A few weeks after birth, red comes into focus, followed by green. Infants can see a full range of color by about 5 months old, although still not quite as vividly as adults.

Detail of a newborn baby ear.
Credit: Roman Sorkin/ Shutterstock

Infants May Experience Multiple Senses at Once

Up to 4% of adults experience synesthesia, in which two or more senses are tied together; for example, sounds may evoke colors. (Notable names with the ability include painter Wassily Kandinsky, writer Vladimir Nabokov, and composer Franz Liszt.) Scientists have long suspected that infant senses are completely tied together, but since babies can’t describe their senses the way adults can, the hypothesis is difficult to prove. One study from 2009, however, supports it: Researchers found that young infants associate shapes (a stand-in for graphemes, or written language) with colors — the most common kind of synesthesia in adults. The association wasn’t as strong with 8-month-olds, and was absent in an adult control group.

Sarah Anne Lloyd
Writer

Sarah Anne Lloyd is a freelance writer whose work covers a bit of everything, including politics, design, the environment, and yoga. Her work has appeared in Curbed, the Seattle Times, the Stranger, the Verge, and others.

Original photo by Alones/ Shutterstock

When it comes to human spaceflight, NASA and other space agencies around the world have their sights set on Mars. Humans have gazed upon this small pinprick of red in the night sky for millennia, and in that time, ancient astronomers, Enlightenment philosophers, and high-tech robots have learned a lot about our planetary neighbor. Named after the Roman god of war, Mars lies some 33.9 million miles from Earth at its closest possible approach. It’s during this precious moment of planetary alignment, which occurs roughly every two years, that NASA sends its scientific cargo toward the red planet. These six fascinating facts are the result of centuries of tireless research and scientific discovery, even as they hint at other mysteries yet to be answered.
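That “roughly every two years” launch cadence falls out of simple orbital arithmetic: Earth and Mars realign for a close approach once per synodic period, computable from the two planets’ orbital periods. A quick sketch (the period values are the standard published figures, rounded):

```python
# Synodic period: 1/S = 1/P_inner - 1/P_outer, with periods in days.
P_EARTH = 365.25  # Earth's orbital period, in days
P_MARS = 687.0    # Mars' orbital period, in days

synodic_days = 1 / (1 / P_EARTH - 1 / P_MARS)
print(round(synodic_days))  # 780 days between close approaches
print(round(synodic_days / P_EARTH, 2))  # 2.14 years, i.e. "roughly every two years"
```

This is why Mars launch windows cluster about 26 months apart, and why missions that miss a window must wait two years for the next one.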

Red planet with arid landscape, rocky hills and mountains.
Credit: Artmim/ Shutterstock

Mars Isn’t Really Red

“The red planet” is a slight misnomer. Martian rocks are filled with iron, and much like on Earth, if you leave iron outside in the elements it’ll eventually rust. The dust from these oxidized rocks gets kicked up into the atmosphere, creating the red hue stargazing humans see. But over the millennia, we’ve crept closer to our planetary neighbor for a better look — even dropping a few robotic rovers to do some poking around — and scientists have discovered that the surface of Mars is more yellowy-brown, sort of like butterscotch. In fact, Mars is a vibrant palette of gold, tan, brown, and even some green. NASA’s Curiosity rover also discovered in 2015 that if you dig only a few inches beneath the oxidized outer layer of the Martian surface, the soil is actually bluish-gray — not red at all.

Mars with its two cratered moons, Phobos and Deimos.
Credit: Elena11/ Shutterstock

Mars’ Moons Are Nothing Like Earth’s Moon

Ever since their discovery in 1877, the moons Phobos and Deimos — named after the Greek gods of fear and dread, respectively — have been something of a curiosity. Phobos orbits only 3,700 miles above Mars (compared to our moon’s 238,855 miles). Deimos, meanwhile, is relatively tiny at some 6.8 miles in diameter, making it one of the smallest moons in the solar system.

However, the biggest mystery about Mars’ moons is where they came from. One theory suggests the two moons formed from debris blasted into orbit by impacts on the Martian surface, much as Earth’s moon likely formed. Alternatively, they could be asteroids themselves, captured in orbit by Mars’ gravitational pull. Unfortunately, neither moon will be around forever. Phobos is slowly being pulled toward Mars and will eventually (in 50 million years or so) break apart, either forming a ring around Mars or impacting the surface. Deimos, on the other hand, is slowly escaping Mars’ gravitational clutches and will one day be flung into space.

A full disk view of the north polar ice cap of Mars.
Credit: Historical/ Corbis Historical via Getty Images

Mars Also Has Four Seasons

Seasons might seem like a feature exclusive to planet Earth, but Mars also experiences four distinct seasons. Because the Martian year is nearly twice as long as Earth’s, its seasons are also roughly double in length — stretching from 142 days in autumn to nearly 200 days in spring in its Northern Hemisphere. (Days on Mars are 24 hours, 39 minutes, 35.244 seconds.)

Mars’ ice caps grow during its winter period and recede, almost disappearing entirely, when spring turns to summer. Summer on Mars can be tumultuous: Because the Red Planet is closest to the sun when the Southern Hemisphere is tilted toward it, Martian summers in the Southern Hemisphere are much hotter than summers in the Northern Hemisphere, and this temperature difference creates strong storms. Martian summers are also far from hospitable, as lows reach -284 degrees Fahrenheit. However, summer highs can reach a balmy 68 degrees if you’re willing to brave those chilly nights.

Crater Water Ice on Mars.
Credit: NG Images/ Alamy Stock Photo

Mars Has Liquid Water

Earth is a water planet — 71% of its surface is covered with the stuff. Mars, on the other hand, has more in common with the Mojave Desert, but that doesn’t mean it isn’t sporting some H2O of its own.

Scientists have known for a while that water flowed on Mars in its distant past, but until recently, many believed that any water currently on the planet was locked up in its frozen ice caps or in Martian rocks. But in 2018, the European Space Agency’s Mars Express mission used ground-penetrating radar to explore Mars’ southern ice cap, and scientists were astounded to find evidence of a subglacial lake of liquid water about a mile beneath the surface. Although temperatures there are far below water’s typical freezing point, salt deposits keep the water in liquid form. These pools beneath the icy surface are similar to Lake Vostok in Antarctica, and their discovery opens up an exciting new area for exploration.

Sample scoop and arm, Viking 1 Mission to Mars.
Credit: Heritage Images/ Hulton Archive via Getty Images

Mars’ Soil Is Poisonous

With a nearly nonexistent atmosphere, freezing temperatures, and scarce water, Mars isn’t a place you want to stay if you’re a living, breathing organism. Even microbes can’t survive on the surface, because Mars’ soil is poisonous.

For more than 20 years, Mars rovers have analyzed soil samples in different parts of the planet and have found a ubiquitous compound known as perchlorate, a substance toxic to humans. Usually, microbes love perchlorates, but Mars’ particular conditions — especially its high abundance of UV light — turn the perchlorates into a toxic cocktail. In 2017, scientists recreated Martian conditions in a lab and found that UV rays broke down perchlorates into hypochlorite and chlorite, a mixture that’s fatal to bacteria. Within 30 seconds, all microbes exposed to this Martian soil facsimile were sterilized.

This photograph shows the Vehicle System Test Bed (VSTB) rover.
Credit: Photo 12/ Universal Images Group via Getty Images

Mars Is the Only Planet Entirely Inhabited by Robots

Scientists haven’t found life on Mars (yet), but that doesn’t mean Mars is a boring place. On July 4, 1997, NASA’s Pathfinder lander touched down on the red planet and deployed the first Mars rover, Sojourner, and in the quarter-century since, NASA has sent four more rovers — Spirit, Opportunity, Curiosity, and its latest robotic addition, Perseverance (which landed in 2021) — to follow in its footsteps (or in this case, treads). The European Space Agency also hopes to send its own rover, the Rosalind Franklin, to Mars by 2028. In addition to these rovers, Mars is also populated by robotic landers such as NASA’s InSight, several orbiters from space agencies around the world, and even a pint-sized robot helicopter. Mars might be devoid of life, but until humans put boots on Martian soil, the planet will continue to be a playground for one thing: robots.

Darren Orf
Writer

Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.