Three Midwestern babies born in 1958 grew up to change the course of pop music and culture forever. Prince Rogers Nelson was born on June 7 in Minneapolis, Minnesota; Madonna Louise Ciccone was born on August 16 in Bay City, Michigan; and Michael Joseph Jackson was born on August 29 in Gary, Indiana. Each of the future genre-busting superstars was welcomed into a big family: Eventually, Prince had eight siblings, Madonna had seven, and Jackson had 10. Jackson was the first to begin his performing career, at age 5. Six years later, he made his television debut as the youngest member of the Jackson Five when the group sang “It’s Your Thing” at the Miss Black America Pageant. Prince debuted at 21 on American Bandstand, singing “I Wanna Be Your Lover” and “Why You Wanna Treat Me So Bad?”; Madonna made one of her earliest TV appearances on the same show four years later, performing “Holiday,” and telling Dick Clark of her wish to “rule the world.”
In addition to exhilarating music, era-defining music videos, and various business ventures, all three made films. The precocious Jackson was the first of the three to appear in a movie: 1978’s The Wiz, a straight-from-Broadway reimagining of The Wizard of Oz featuring an all-Black cast (Jackson played the Scarecrow). In 1985 — the year Prince and Madonna briefly dated — Prince won an Oscar for Best Music, Original Song Score for his first film, the semi-autobiographical Purple Rain. Meanwhile, shortly after Madonna (in a Marilyn Monroe-inspired look) took Jackson as her plus-one to the 1991 Academy Awards, she shot A League of Their Own, a tribute to the all-female baseball teams that entertained fans during World War II. (The so-called Queen of Reinvention has also appeared in dozens of other films over the years.)
Madonna holds the world record for the most costume changes by a character in a film.
Costume designer Penny Rose dressed Madonna in 85 outfits when she portrayed Argentinian actress, activist, and First Lady Eva Perón in the 1996 film Evita. Those clothes were accessorized with 45 pairs of shoes, 56 pairs of earrings, and 39 hats. In a savvy branding tie-in, Bergdorf Goodman began selling Perón-inspired hairpieces, and Madonna — pregnant with her first child, Lourdes Leon, during filming — even channeled her alter ego for the cover of Vogue. Madonna won a Golden Globe for her performance in director Alan Parker’s musical biopic (an adaptation of the Andrew Lloyd Webber musical), and Rose received a BAFTA nomination for her costumes.
Interesting Facts
Editorial
Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.
Advertisement
top picks from the Inbox Studio network
Interesting Facts is part of Inbox Studio, an email-first media company. *Indicates a third-party property.
Beloved film character Mary Poppins is known for sweetly singing that “a spoonful of sugar makes the medicine go down.” While it works wonders on-screen, the trick didn’t start with the fictitious nanny; healers, doctors, and pharmacists have relied on sugar to help patients choke down unsavory medications for thousands of years. But at one time, the sweet stuff wasn’t just an add-in — it was often the featured ingredient in healing remedies believed to cure all kinds of ailments. Sugar was used to treat sickness and injury as far back as the first century, when Middle Eastern practitioners prescribed it for dehydration, kidney issues, failing eyesight, and more. During the 11th century, English monks noted sugar’s ability to soothe upset stomachs and digestive issues, and by the Middle Ages doctors tried treating bubonic plague with concoctions of hemp, sugar, and more unpleasant ingredients. As recently as the 1700s, pharmacists recommended a glass of lemon juice and sugar water for asthma attacks.
Mary Poppins’ “A Spoonful of Sugar” was inspired by the polio vaccine.
Disney’s 1964 film is known for its song urging sick children to swallow bitter medicine, but the ditty takes its inspiration from children’s vaccines. Songwriter Robert Sherman penned the tune after hearing how his son received the oral polio immunization on a sugar cube.
Part of sugar’s allure — and perhaps perceived medicinal benefits — may have been connected to its former rarity. Some historians believe sugarcane originated in Southeast Asia, where farmers may have grown it as early as 8000 BCE, but refining began around 2,500 years ago in India — a process that made sugar shelf-stable and allowed it to spread to other regions. With far to travel, the sweetener was expensive by the time it reached medieval Europe, and for centuries was mostly reserved for the wealthy. But in 1747, German chemist Andreas Sigismund Marggraf discovered a way to produce sugar that didn’t require the sweltering climates in which sugarcane plants grow. Instead, sugar could be harvested in colder regions from the sugar beet, a root vegetable that grows in about three months. Over the next 100 years, sugar beet factories sprang up across Europe and then America, driving down the price of sugar and eventually giving people of all means a chance to savor a little sweetness — with their medicine or otherwise.
Sugar was once considered a spice, not a sweetener.
Sugar has been found in space.
Granulated, brown, powdered, pearl, cubed — there’s a lot of sugar on Earth. And surprisingly, there’s sugar in space, too. Researchers first discovered evidence of glycolaldehyde, a type of simple sugar, in 2000 while looking for molecules in space that could support life. Glycolaldehyde is much less complex than cultivated Earth sugars, with only eight atoms compared to cane sugar’s 45. Still, researchers believe the stuff could play an important role in jump-starting life beyond our planet. That’s because glycolaldehyde can combine with a chemical called propenal to make ribose, a component of ribonucleic acid, which is similar to DNA and found in all living things. So far, glycolaldehyde has been found in just two places beyond Earth: the interstellar gas cloud at the Milky Way’s center and the gases surrounding a young star 400 light-years away.
Nicole Garner Meeker
Writer
Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.
By the time most of us reach age 20 or so, the bones in our body are pretty much done growing. The growth plates that cause us to put on inches in our youth are now hardened bone, and in fact, adults tend to drop an inch or two in height as worn-out cartilage causes our spines to shrink over time. However, there are a few bones that buck this biological trend. Skulls, for example, never fully stop growing, and the bones also shift as we age. A 2008 study from Duke University determined that as we grow older, the forehead moves forward, while cheekbones tend to move backward. As the skull tilts forward, overlying skin droops and sags.
The largest skull ever discovered belongs to a T. rex.
The world’s largest skull does belong to a dinosaur — just not a T. rex. The record-breaking skull, standing some 10 feet, 6 inches tall, belongs to a Cretaceous Period Pentaceratops, currently on display at the Sam Noble Oklahoma Museum of Natural History.
The skull isn’t the only bone that has a positive correlation with age. Human hips also continue to widen as the decades pass, meaning those extra inches aren’t only due to a loss of youthful metabolism. In 2011, researchers from the University of North Carolina School of Medicine discovered that hips continue to grow well into our 70s, and found that an average 79-year-old’s pelvic width was 1 inch wider than an average 20-year-old’s. So while it’s mostly true that humans stop growing after the age of 20, nature always likes to throw in a few exceptions to the rule.
The pirate flag bearing a skull and crossbones is famously known as the Jolly Roger.
The idea that your ears and nose never stop growing is a myth.
It’s a common misconception that our ears and noses continue to grow throughout our lifetime — though they sometimes do appear to take on almost cartoonish proportions toward the end of our lives. So what’s going on here? Turns out, it’s not really the cartilage in our nose and ears that’s to blame. Instead, the culprit is decades of experiencing gravity. This constant gravitational pull — along with general degradation due to age — causes the collagen and elastic fibers in our ears and nose to droop and elongate. The surrounding skin supporting these structures also breaks down and droops over time while simultaneously losing volume. Studies have shown that the average human ear elongates approximately 0.22 millimeters a year. It’s a process that’s so reliable, ears can actually be used as a tool for determining a person’s age.
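The cited rate implies a simple linear model of ear elongation. As a rough sketch (the ~0.22 mm/year figure is the only input taken from the text; the 50-year span is an arbitrary example, not a value from the studies):

```python
# Back-of-envelope: cumulative ear elongation at the ~0.22 mm/year
# average rate cited above. Purely illustrative; real age estimation
# from ear length relies on population-level data, not this alone.
RATE_MM_PER_YEAR = 0.22

def ear_elongation_mm(years):
    """Millimeters of elongation accumulated over a span of adult years."""
    return RATE_MM_PER_YEAR * years

# Over 50 adult years, that adds up to about 11 mm -- roughly a centimeter:
print(round(ear_elongation_mm(50), 1))  # 11.0
```

A centimeter is enough to be visible, which is why elongated ears read as a marker of age even though the cartilage isn't actually growing.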
Darren Orf
Writer
Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.
If Tater Tots are your favorite fast-food side, you have the ingenuity of two brothers — Golden and Francis Nephi Grigg — to thank. However, when the pair invented the crispy potato composites in the 1950s, they didn’t set out to change snack food history. Instead, their potato creation came from a quest to reduce the amount of food waste produced at their frozen foods plant.
Before becoming successful spud salesmen, Golden and Francis sold frozen corn. Around 1949, they decided to diversify into other fruits and vegetables, and converted a factory in Ontario, Oregon (on the border with Idaho), into a potato-processing plant they were later able to purchase. In 1952, the Griggs launched the Ore-Ida brand, which became popular for its frozen french fries. The crispy potato spears were a hit among home cooks at a time when prepared meals and frozen foods were becoming more widely available thanks to postwar technology.
“Tater Tot” is not a generic term — Ore-Ida trademarked the name in 1957. In the years since, competitors have come up with creative alternatives such as “potato gems,” “potato crunchies,” and “spud puppies.”
The downside to booming french fry sales, however, was the waste left behind. Initially, the Griggs sold vegetable byproducts to farmers as livestock feed, but they soon looked for a way to nourish humans instead. They began experimenting with chopping up the potato scraps, mixing them with flour and spices, then shaping the result into a rectangle with the help of a simple, homemade plywood mold. The first Tater Tots — named, by one account, after an employee won a contest by suggesting “tater” for potato and “tot” for small — debuted in 1956. At first, shoppers seemed skeptical of the inexpensive scrap-based snack, but after prices were raised slightly to suggest an air of sophistication, Tater Tots quickly found a permanent home in frozen food aisles, where they continue to reign today.
Tater Tot producer Ore-Ida’s name is a nod to Oregon and Idaho, two potato-growing states.
Miners traded gold for potatoes during the Klondike gold rush.
Between 1848 and 1855, an estimated 300,000 people made their way to California, hoping to strike it rich by mining the supposedly plentiful gold just beneath the Earth’s surface. Unfortunately, many miners at the time faced a common foe: malnutrition. Food costs were often inflated in remote mining towns, and nutritious fresh food was generally hard to come by, meaning many miners had limited diets of shelf-stable goods like bread, salt pork, and beans. For gold hunters who trekked farther north to Alaska for the Klondike gold rush, which kicked off in 1896, that often meant an increasing risk of scurvy — a vitamin C deficiency that can cause fatigue and tooth loss, among other effects. Scurvy could be remedied with potatoes, a vegetable that Klondike miners could more easily source than many other fresh produce items. However, shortages and unscrupulous peddlers increased the price of potatoes, forcing many prospectors to trade their hard-won gold for spuds in an effort to ward off the effects of the illness.
Nicole Garner Meeker
Writer
Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.
While books are a fixture of today’s libraries, humans long constructed great centers of learning without them. That includes one of the oldest known significant libraries in history: the Library of Ashurbanipal. This library, established in modern-day Mosul, Iraq, by the Assyrian King Ashurbanipal in the seventh century BCE, contained nothing we would recognize today as a book. Instead, it was a repository of 30,000 clay tablets and writing boards covered in cuneiform — the oldest writing system in the world. Much like your local public library, this royal collection covered a variety of subjects, including legislation, financial statements, divination, hymns, medicine, literature, and astronomy.
The ancient Greeks can be thanked for a great many things, but paper isn’t one of them. On March 11, 105 CE, court official Cai Lun presented his papermaking process to Emperor He of the Han dynasty in China. It’d take many centuries, but the idea transformed the world.
While we know when this library flourished, defining the appearance of the earliest book is slightly murkier. The Egyptians, for example, are known to have written on papyrus scrolls; when the Library of Alexandria in Egypt burned in the first century BCE, 40,000 priceless scrolls were lost. By about the second century CE, Romans began using bound codexes, a kind of proto-book that consisted of papyrus or parchment sheets between wooden covers. The arrival of Christianity made the codex immensely popular, and it basically replaced the scroll by the sixth century CE. Block-printed books showed up in China around 700 CE, although Europe didn’t see anything similar until Johannes Gutenberg invented mechanical movable type around 1448. So while libraries haven’t always housed books, they have been repositories of human knowledge — in whatever form it might take.
The oldest surviving literary work is thought to be “The Epic of Gilgamesh,” composed 4,000 years ago.
The Library of Congress has burned twice.
The U.S. Library of Congress contains the largest collection of published books in world history — but it wasn’t an easy journey getting there. During the War of 1812, the British set fire to the Capitol building, which contained the 3,000-volume Library of Congress. It was completely consumed in the conflagration. To reestablish the library, Congress purchased former President Thomas Jefferson’s book collection — the largest in the U.S. at the time — for $23,950. Sadly, a second fire, on Christmas Eve 1851, consumed a large portion of that library, including many of Jefferson’s original volumes. Two years later, Congress constructed a new library, this time made from flame-resistant cast iron.
Darren Orf
Writer
Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.
The Library of Congress has one of the largest collections of books in human history — and that includes comic books. With more than 12,000 titles stretching across 140,000 issues in its collection, the library holds the largest public collection of comic books in the United States. There’s a little bit of everything — Golden Age superheroes like Superman and Captain America, translated reissues of Japanese manga, foreign translations of American comics, underground comics from the ’70s and ’80s, bande dessinées (French comics) from the ’60s, and much more. The library also has many older comics available on microfiche, including the always-elusive Action Comics #1 (the first appearance of Superman), and it regularly adds comics electronically as they become available.
The Library of Alexandria is one of the Seven Wonders of the Ancient World.
The Egyptian city of Alexandria does make the list, but it’s the city’s famous lighthouse that nabs the accolade — not the library. Both were built around 280 BCE. The lighthouse was a ruin by the 14th century, but its remains were rediscovered in the 20th century.
Of course, this isn’t a collection you can simply peruse at your leisure. Due to the fragility of the comics (the oldest issue is nearly 90 years old, after all), these specimens are only available to “researchers under special conditions” — but that includes people such as scriptwriters, pop culture historians, avid collectors, and graphic artists. The Library of Congress has also put on several public comic-centric exhibitions to showcase some of the most intriguing artifacts in its collection.
The first comic book is often said to be 1897’s “The Yellow Kid in McFadden’s Flats.”
The Library of Congress used to have a book-delivering conveyor belt.
In 1897, the Library of Congress moved out of the Capitol building and into new digs across the street. Congressmen grumbled at having to cross the street to get books, so the Army Corps of Engineers created a tunnel between the two buildings, and Boston’s Miles Pneumatic Tube Company designed a “book conveying apparatus.” Requests for books came to the library via pneumatic tube, after which the requested edition was whisked away in a tray that rode the rails at a speed of about 10 feet per second. Once at the Capitol, the book would travel up three stories until it arrived near the House floor. The whole trip covered about a quarter-mile and took only five minutes. The library built updated conveying systems in the following years to connect annex buildings constructed in 1939 and 1980, but the original tunnel conveyor system was dismantled in the 2000s to make way for a subterranean visitor center, and the other belts have since been abandoned due to high repair costs. Today, though most senators and representatives just use the internet to get the books they need, a Library of Congress staffer can still sometimes be seen hand-delivering books to the Capitol.
Darren Orf
Writer
Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.
More than 200 years after the 10th President of the United States was born, one of his grandsons is still alive. As impossible as that may seem, the math — and biology — checks out. John Tyler, who was born in 1790 and became President in 1841 after William Henry Harrison died in office (possibly of pneumonia), had a son named Lyon Gardiner Tyler in 1853. This son was born to the then-60-something Tyler and his second, much younger, wife, Julia Gardiner. Lyon then had two sons of his own in his 70s (also with a much younger second wife), one of whom — Harrison Ruffin Tyler, born in 1928 — is still gracing the Earth in his 90s.
George H.W. Bush and George W. Bush were the only father and son to both become President.
The Bushes were preceded in this exceedingly rare feat by John Adams (1797–1801) and John Quincy Adams (1825–1829), the second and sixth Presidents, respectively.
It may make this feat slightly less surprising to know that Tyler had 15 children, more than any other POTUS in U.S. history. Tyler’s actual presidency is less remarkable than this biographical oddity, alas — he was referred to as “His Accidency” upon assuming office and wasn’t renominated in the following election. (He was also an enslaver whose profitable plantation ran on the labor of 40 to 50 enslaved people.) Though his grandsons haven’t had major political aspirations, you might say it was in Tyler’s blood to seek office: His father, John Tyler Sr., was roommates with Thomas Jefferson at the College of William and Mary and later served as the 15th governor of Virginia.
Theodore and Franklin Roosevelt were fifth cousins.
It’s well known that the two commanders in chief were related, but how they were related is less common knowledge. FDR assumed the presidency 24 years after his fifth cousin left the Oval Office, and their mutual political aspirations were no coincidence: The longest-serving President in U.S. history greatly admired his distant relative and intentionally followed in his footsteps. Teddy largely approved: Upon Franklin’s engagement to Eleanor — Franklin’s fifth cousin once removed — the younger Roosevelt received a note from the elder saying he was “as fond of Eleanor as if she were my daughter; and I like you, and trust you, and believe in you.”
Michael Nordine
Staff Writer
Michael Nordine is a writer and editor living in Denver. A native Angeleno, he has two cats and wishes he had more.
It’s likely that at some time in your life, a certain smell, whether the earthy aroma of freshly cut grass or the unmistakable fragrance of your grandparents’ house, triggered a powerful and strangely detailed memory. Well, there’s a biological reason for that. Unlike our four other best-known senses, whose electrical impulses are first routed through our thalamus before being relayed to memory-related areas such as the hippocampus, our sense of smell takes a different route. Instead of being sent to the thalamus, scents go straight to the smell center known as the olfactory bulb. The fact that this bulb is directly connected to the hippocampus and the amygdala (which is responsible for emotional processing) is likely why smell evokes such powerful memories compared to our other senses. These memories can be extremely distinct, and they’re often linked to our childhood, likely because they were first stored when we experienced the scent at a young age.
Most of our sense of taste comes from our sense of smell.
Smell is by far the most important facet of taste. The gustatory nerve cells located in taste buds can only sense sweet, sour, bitter, salty, and umami (savory). Because smell accounts for 80% of taste, without it, humans would be limited to only those five basic tastes.
According to a 2017 study, a brain region in the olfactory system known as the piriform cortex is what allows certain smells to be deposited in our long-term memory, though it requires other parts of the brain to pull this off. The piriform cortex essentially consults our orbitofrontal cortex, an area of the brain responsible for higher-level taste and smell functions, about whether a smell should be stored in long-term memory.
Companies are very aware that smell can be a powerful trigger of memory and emotion, which is why some of them have even trademarked certain scents (yes, you can do that). Verizon, for example, owns the rights to its “flowery musk scent” used in its stores, and olfactive branding companies work with clients like Nike to leverage the power of smell. When it comes to unlocking human emotions and memories, smell might be the strongest sense we have.
A heightened sense of smell, sometimes causing nausea, is called hyperosmia.
Humans can smell fear.
Human communication is usually associated with sight, hearing, and touch. But our sense of smell also plays an unrecognized role in how we interact, and several scientific studies have discovered a latent human ability to detect chemical signals in human sweat. One study in 2008 collected sweat from novice skydivers about to jump out of a plane, as well as sweat from the same participants when they were running on a treadmill. Then, another batch of participants sniffed the “fear” and “non-fear” sweat pads while inside a brain scanner. Although the subjects could not consciously differentiate between the smells of the two pads, the scanners detected increased activity in the amygdala, the region of the brain associated with emotional processing, when sniffing the “fear” sweat pad. Many other studies have since reproduced similar conclusions, but scientists remain stumped as to what chemical is triggering this response. While some people have what’s called a “vomeronasal organ,” which in other animals detects the scent of prey and sex pheromones, the organ is vestigial in humans (meaning it once played a role in our primate past but is no longer functional, sort of like the appendix). Despite this, some scientists argue that the ability to sense chemosignals provides significant advantages for animals that thrive in groups (like humans), and could also explain the mostly noncommunicative bond between a mother and newborn. In other words, there’s still some lingering mystery around what the nose knows.
Darren Orf
Writer
Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.
The Earth has been around for a while — about one-third as long as the universe itself. By comparison, Homo sapiens are the new kids on the block. Earth’s story began at the outset of the Hadean eon, about 4.6 billion years ago. It took 600 million years just for the Earth’s crust to take shape, another 300 million years for the first signs of microbial life to pop up, and about 3.2 billion years after that for life to really get going thanks to the evolutionary burst known as the Cambrian explosion. Several mass extinction events and some 465 million years later, mammals finally took center stage, but modern humans didn’t enter the biological limelight for another 65 million years. With the first Homo sapiens appearing around 300,000 years ago, humans have only been on planet Earth for roughly 0.0065% of its existence.
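That closing percentage is simple division. A minimal check using the round numbers cited above (4.6 billion years and 300,000 years; the last digit shifts depending on which Earth-age estimate you plug in):

```python
# Fraction of Earth's history occupied by Homo sapiens, using the
# round figures cited above. Treat the output as an order-of-magnitude
# estimate: a 4.5- vs. 4.6-billion-year Earth age nudges the last digit.
EARTH_AGE_YEARS = 4.6e9
HUMAN_SPAN_YEARS = 300_000

pct = HUMAN_SPAN_YEARS / EARTH_AGE_YEARS * 100
print(f"{pct:.4f}%")  # 0.0065%
```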
Pangea is the only supercontinent in Earth’s history.
Before Pangea, there was Rodinia (which formed about 1 billion years ago), and geologic evidence suggests that another supercontinent called Nuna formed around 1.6 billion years ago. Scientists believe supercontinents occur in cycles stretched across hundreds of millions of years.
In those 300,000 years, humans have been pretty busy. For most of that time, we harnessed fire and lived a nomadic existence, until around the fourth millennium BCE, when the very first civilizations began to take shape. Since then, humans have been on a meteoric trajectory, going from hunter-gatherer to spacefarer in less than 6,000 years. Carl Sagan famously displayed the universe’s history on a 365-day calendar, with the Big Bang on January 1 and the present moment at the stroke of midnight on December 31. On that timeline, it’s only at 10:30 p.m. on December 31 that humans first appear, and all of recorded history is squeezed into just a few seconds — but what a few seconds it’s been.
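Sagan’s calendar is just a proportional rescaling: an event’s share of the universe’s age maps to the same share of one calendar year. A minimal sketch, assuming the modern ~13.8-billion-year age estimate (a value not stated in the text):

```python
# Map "years before the present" onto Sagan's Cosmic Calendar, where
# the universe's entire history is compressed into one 365-day year.
# The 13.8-billion-year universe age is an assumed modern estimate.
UNIVERSE_AGE_YEARS = 13.8e9
SECONDS_PER_CALENDAR_YEAR = 365 * 24 * 3600  # 31,536,000

def seconds_before_midnight(years_ago):
    """Calendar seconds between an event and midnight on December 31."""
    return years_ago / UNIVERSE_AGE_YEARS * SECONDS_PER_CALENDAR_YEAR

# All of recorded history (~6,000 years) occupies the final moments:
print(round(seconds_before_midnight(6_000)))         # ~14 seconds
# Homo sapiens (~300,000 years ago) arrive minutes before midnight:
print(round(seconds_before_midnight(300_000) / 60))  # ~11 minutes
```

The same proportion puts the "few seconds" of recorded history on solid footing: 6,000 years is such a tiny sliver of 13.8 billion that it compresses to well under a minute.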
Scientists think the moon formed when the Earth collided with a planet named Theia 4.5 billion years ago.
A growing number of scientists want to declare a new epoch because of humanity’s impact on the Earth.
Geologists divide the life of the Earth into various categories of time. First, there are eons, which stretch for millions and sometimes billions of years. Then come eras, followed by periods, epochs, and finally ages. For the past 11,700 years, since the end of the last ice age, the rise of humanity has coincided with the Holocene Epoch (which is part of the Quaternary Period of the Cenozoic Era). For most of this epoch, the world’s climate remained stable, but with the rise of modern society, the Earth has undergone rapid changes in a very short time. That’s why many scientists believe that a new epoch, called the Anthropocene (meaning “recent age of humans”), should be adopted, beginning around 1950 and the dawn of the nuclear age. For the Anthropocene to become official, both the International Commission on Stratigraphy and then the International Union of Geological Sciences need to sign off. Despite support within these groups, the epoch has yet to be officially recognized.
Darren Orf
Writer
Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.
Stage names are hardly uncommon in Hollywood, but false initials are rarer, though not unheard of. To wit: Michael J. Fox’s middle name doesn’t start with “J.” The Back to the Future star’s middle name is actually Andrew, but there was already a Michael A. Fox in the Screen Actors Guild when Fox wanted to join it. So why the “J”? The letter is an homage to Michael J. Pollard, a character actor Fox admires. Pollard had more than 100 acting credits to his name by the time he died in 2019, and received Academy Award, BAFTA, and two Golden Globe nominations for his role as gas station attendant-turned-accomplice C.W. Moss in 1967’s Bonnie and Clyde.
Fox wasn’t originally cast in “Back to the Future.”
John Cusack, Charlie Sheen, Ralph Macchio, and many others all auditioned for the role of Marty McFly, but Eric Stoltz was cast. It wasn’t until six weeks into production that director Robert Zemeckis let Stoltz go, feeling he wasn’t right for the part, and Fox got the role instead.
Some stage names are so successful that most people don’t realize they’re stage names. Sir Elton John was born Reginald Kenneth Dwight, for instance, while Jamie Foxx’s real name is Eric Marlon Bishop, and Whoopi Goldberg’s is Caryn Elaine Johnson — to name just a few. Fox, who was diagnosed with Parkinson’s disease in 1991 and announced his condition in 1998, retired from acting in 2020. He founded the Michael J. Fox Foundation for Parkinson’s Research in 2000 and remains devoted to finding a cure for the disease.
“Back to the Future” was almost named “Spaceman From Pluto.”
Middle names date back to ancient Rome.
Well, kind of. Many Romans had three names, but their second name wasn’t quite a middle name. There was the praenomen (personal name), nomen (family name), and cognomen, which indicated which branch of a family you were from. (For instance, Julius Caesar’s full name was actually Gaius Julius Caesar.) There was also a hierarchical element to the Roman naming system, as women generally only had two names and enslaved people often had only one. Middle names as we know them today arose in the Middle Ages, a time when faithful Europeans were torn between giving their children a family name and the name of a saint. Eventually deciding that both would be preferable to one, they began the tradition of a child receiving a given name, baptismal name (saint’s name), and surname. That custom eventually reached America along with the people who emigrated there, with secular middle names becoming more common over time.
Michael Nordine
Staff Writer
Michael Nordine is a writer and editor living in Denver. A native Angeleno, he has two cats and wishes he had more.