When you think of wine, places such as Bordeaux, Tuscany, and Napa Valley tend to come to mind first. One place you probably don’t think of is Antarctica, and yet vino is indeed made on the world’s coldest, windiest continent.
Fittingly, it’s an ice wine, a dessert wine made from grapes that freeze naturally while still on the vine, and it’s made by just one person: James Pope, whose McMurdo Dry Valleys vineyard sits on the side of the continent nearest New Zealand. The high saline content of the “soil” (which is closer in texture to sand) gives the wine a uniquely salty flavor.
The Antarctic Polar Desert, which accounts for the vast majority of the continent’s landmass, has an area of 5.5 million square miles and receives as little as 50 millimeters of precipitation per year.
Because that soil remains frozen for much of the year, Pope’s wine is cultivated in the summer, with vines placed at least 60 feet apart — any closer and they wouldn’t get enough nutrients. Some of those nutrients come from Adélie penguin droppings, though it may take a sommelier to properly describe the effect that has on taste. Though it isn’t produced on a large scale and can’t exactly be bought at your local wine store, the ice wine’s mere existence is a testament to the scientific — and culinary — ingenuity on display in Antarctica.
Antarctica wasn’t always a place where the temperature could plunge to minus 133.6 degrees Fahrenheit, as it sometimes does today. During the Cretaceous Period, which lasted from 145 million to 66 million years ago, the continent was ice-free and blanketed by forests — and inhabited by dinosaurs.
Among the dinos that roamed Antarctica were the carnivorous Cryolophosaurus and the armored, aptly named Antarctopelta, neither of which was immortalized in the Jurassic Park franchise. Antarctica began freezing about 34 million years ago, when the greenhouse climate that had held steady since the dinosaurs went extinct drastically cooled, creating the icehouse phase the continent is still in today.
Michael Nordine
Staff Writer
Michael Nordine is a writer and editor living in Denver. A native Angeleno, he has two cats and wishes he had more.
It seems there’s an unwritten rule that history best remembers the biggest names, with everyday people’s stories lost to time. But some historians believe that the first recorded name may have belonged to an average person who was likely an accountant. Discovered in what is now Iraq, a 5,000-year-old clay tablet that once recorded barley storage appears to be signed by “Kushim,” who archaeologists believe may have been responsible for counting the crop. Some historians say it’s not surprising to see the name of an ordinary person predate references to royalty, artisans, or ancient celebrities, though not all agree that “Kushim” is the oldest such record; a few researchers believe the name could have been a job title. Other tablets dated to around 3100 BCE — which list the names of two enslaved people and the enslaver — also compete for the record of the world’s oldest known names.
Rolling Stones frontman Mick Jagger attended accounting school.
The singer was enrolled at the London School of Economics before his big break in music, studying finance and accounting during the week and playing gigs on weekends.
The region where Kushim’s tablet was found is also credited as the birthplace of written language, which emerged around 3500 BCE. The earliest known writings were scrawled in pictographs — images used for a word or phrase that generally resembled their meaning. Eventually, writing systems used more and more abstract symbols, evolving into cuneiform, which represented a word’s spoken sound and meaning. Surviving tablets have allowed historians to piece together how communication evolved, along with clues about daily life for people who lived thousands of years ago. Some extant tablets have included recipes, receipts for boat rentals, and records of court disputes, suggesting ancient Sumerians may have been just like us.
The oldest recipes, inscribed in clay, are for bread and stew.
Bubble gum was invented by an accountant.
Humans have been chewing gum for thousands of years. In Latin America, people in Guatemala and Mexico chomped on chicle, a type of tree sap. In the U.S., Americans were introduced to chicle around the 1870s, though early gum had some faults — it was known for being particularly sticky. That’s why Walter Diemer, an accountant at a Philadelphia candy company, and his co-workers were encouraged to tinker with the formula in their free time; the bubble gum we chew today comes from Diemer’s accidental invention. Diemer was just 23 years old when he created his first batch, popular for its stretchiness and pink color (the only dye color available at the company lab). The Fleer Chewing Gum Company named the concoction Dubble Bubble and sold each piece for a penny. Diemer, who rarely chewed gum himself, went on to be a lifelong judge of bubble-blowing contests as well as the company’s senior vice president.
Nicole Garner Meeker
Writer
Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.
Dairy Queen makes a lot of popular frozen treats — Blizzards, sundaes, and cones, to name a few — but none of them are technically ice cream. The company’s soft serve products, though delicious, don’t meet the Food and Drug Administration regulation requiring that “ice cream contains not less than 10% milk fat.”
Because Dairy Queen’s products are made with only 5% milk fat, they’re required to be called something else. That’s why you won’t actually see the words “ice cream” at your local DQ or on the website, which is careful to use specific wording.
Dennis the Menace used to be Dairy Queen’s mascot.
Everyone’s favorite little menace served as DQ’s “spokestoon” from 1971 to 2002, when the company chose not to renew the license — presumably because Dennis was no longer as recognizable among children.
Soft serve and similar confections made with lower milk fat used to be classified as “ice milk” by the FDA, but new regulations in 1995 resulted in three other categories instead: reduced-fat, light, and low-fat ice cream. Dairy Queen products fall under the banner of “reduced-fat ice cream,” which is legally distinct from “ice cream” proper — and isn’t the catchiest term when trying to sell frozen desserts. Frozen yogurt, meanwhile, is made of yogurt rather than cream and hasn’t been sold at Dairy Queen since the chain discontinued the frozen yogurt-based Breeze in 2000.
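For those keeping score at home, the labeling question comes down to a single number. Here’s a minimal sketch of that one quoted rule in Python — purely illustrative, since the FDA’s full standards of identity involve many more criteria than milk fat alone, and the function name is ours, not the agency’s:

```python
# A toy check of the single rule quoted above: "ice cream contains
# not less than 10% milk fat." The real FDA standards of identity
# include additional requirements (total solids, weight per gallon,
# and reference-food comparisons for "reduced-fat"), so treat this
# as a sketch, not a regulatory tool.

def can_be_labeled_ice_cream(milk_fat_percent: float) -> bool:
    """Return True only if the product meets the 10% milk fat floor."""
    return milk_fat_percent >= 10.0

# Dairy Queen soft serve, at 5% milk fat, falls short of the threshold:
print(can_be_labeled_ice_cream(5.0))   # False
print(can_be_labeled_ice_cream(10.0))  # True
```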
Thomas Jefferson was the first known American to record an ice cream recipe.
When he wasn’t busy writing the Declaration of Independence or serving as the third president of the United States, Thomas Jefferson was otherwise occupied eating ice cream. After first encountering the treat in France (and apparently enjoying it), he helped popularize ice cream in America.
Jefferson not only served the dessert at parties throughout his life, including during his eight years as president, but also was the first known American to write down a recipe for it. In addition to a simple list of ingredients (“2 bottles of good cream, 6 yolks of eggs, 1/2 lb. sugar”), he included such instructions as “put the cream on a fire in a casserole, first putting in a stick of Vanilla” and “open it to loosen with a spatula the ice from the inner sides of the Sabotiere.” According to those who’ve made it, Jefferson’s recipe is quite tasty — and incredibly rich.
Michael Nordine
Staff Writer
Michael Nordine is a writer and editor living in Denver. A native Angeleno, he has two cats and wishes he had more.
After spending eight days, three hours, 18 minutes, and 35 seconds in space — including the 21 hours two of them spent on the lunar surface — Neil Armstrong, Buzz Aldrin, and Michael Collins splashed down 920 miles southwest of Hawaii. The three NASA astronauts had achieved the seemingly impossible on a mission that was the very definition of “otherworldly.” But once back on Earth, they were back in the clutches of human bureaucracy — because after they landed, the Apollo 11 heroes had to fill out a U.S. customs form.
During the BBC’s broadcast of the Apollo 11 landing, Pink Floyd performed a seven-minute jam.
Known for its spacey vibes, Pink Floyd provided the atmospheric and meandering 12-bar jam for Apollo 11’s historic landing. “There was a panel of scientists on one side of the studio, with us on the other,” guitarist David Gilmour wrote in 2009. “The song was called ‘Moonhead.’”
Later posted on the U.S. Customs and Border Protection website in honor of the flight’s 40th anniversary in 2009, the strait-laced form belies the decidedly unearthly information written on the page. Flight number? Apollo 11. Layover? Moon. Cargo? Moon rock and moon dust samples. Anything that could lead to the spread of disease? TBD. NASA has confirmed that the form is authentic, though one spokesperson described it as “a little joke” played on the astronauts upon their return. Today, astronauts still go through customs on their way to and from the International Space Station. Canadian astronaut Chris Hadfield described passing through customs in Kazakhstan — after glimpsing the entire world through a small window only hours before — as “a funny but necessary detail of returning to Earth.”
Discovered on the moon, the rock armalcolite (ARMstrong, ALdrin, and COLlins) is named after all three astronauts of the Apollo 11 mission.
NASA transferred the Apollo 11 astronauts from Hawaii to Houston in a “quarantine trailer.”
Uncertain whether the lunar surface was rife with unknown pathogens, NASA took every precaution to avoid a biological disaster. So the moment the hatch closed for Apollo 11’s return trip home, the quarantine began. After the astronauts splashed down near Hawaii, they exited the Command Module wearing biological isolation garments before being sealed inside a retrofitted Airstream trailer called the Mobile Quarantine Facility (MQF). Complete with a kitchen, sleeping area, and bathroom, the MQF had its air pressure set low (in case of a leak), and all air exiting the Airstream was meticulously filtered. The astronauts spent 88 hours in the quarantine trailer as they made the journey — tucked in the cargo hold of a C-141 aircraft — from Hawaii to the Lunar Receiving Laboratory in Houston, Texas, where they enjoyed a more spacious quarantine facility. They were released after a total of 21 days, once physicians confirmed they didn’t have moon plague.
Darren Orf
Writer
Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.
The act of high-fiving a friend in celebration may seem like it’s been around forever, but in fact, the gesture originated even more recently than cellphones or email. In the absence of any earlier reputable reports, the most widely accepted origin story goes as follows: The high-five was first used during a baseball game between the Los Angeles Dodgers and Houston Astros on October 2, 1977. After hitting his 30th home run of the season, left fielder Dusty Baker was greeted by his teammate Glenn Burke, who excitedly offered a raised hand in celebration, which Baker slapped in return. Burke then hit a home run of his own, and the pair repeated the motion.
The players celebrated another homer hit three days later with their new gesture, and that moment was photographed by the Los Angeles Times, which ran the image on the front page of the October 6 edition with the caption “GIVE HIM A HAND.” It wasn’t until 1980 that the term “high-five” was definitively coined and began appearing in print, with its first such appearance in a March 25 Boston Globe article, according to the Oxford English Dictionary.
The thumbs-up is considered offensive in certain cultures.
Despite its positive connotation in Western culture, the thumbs-up gesture is akin to giving the middle finger in other parts of the world. It remains taboo in parts of West Africa, the Middle East, Australia, Greece, and elsewhere, though younger generations may not consider it as offensive.
Another oft-cited report attributes the creation of the high-five to the 1978-1979 University of Louisville men’s basketball team. During practice, Wiley Brown offered his teammate Derek Smith a low-five — a knee-level gesture that was commonly used by African Americans as a symbol of unity. At that moment, as reported in The Week, Smith responded, “No. Up high,” thus giving literal rise to a new gesture. This was cited as the origin of the high-five in a New York Times article on September 1, 1980. However, this event postdates the Baker-Burke story, which makes the NYT’s claim suspect (assuming the reported timelines are indeed accurate).
Before calling Los Angeles home, the Dodgers played in Brooklyn.
Handshakes date back to ancient Mesopotamia.
Handshakes were used by people in ancient Mesopotamia no later than the ninth century BCE. One of the earliest examples is a stone relief from that era depicting the kings of Babylon and Assyria shaking hands to commemorate a pact. The gesture was later mentioned several times by Homer in the “Iliad” and “Odyssey” as a way to convey trust between two parties.
Shaking hands as a greeting was popularized, in part, by Quakers in the 17th century. Many Quakers weren’t particularly fond of greeting people with traditional bows or curtsies, as those gestures reinforced an unequal, hierarchical structure. Instead, they began using handshakes as a sign of equal respect.
Bennett Kleinman
Staff Writer
Bennett Kleinman is a New York City-based staff writer for Inbox Studio, and previously contributed to television programs such as "Late Show With David Letterman" and "Impractical Jokers." Bennett is also a devoted New York Yankees and New Jersey Devils fan, and thinks plain seltzer is the best drink ever invented.
The Appalachian Mountains aren’t the world’s tallest mountains. And though they stretch from Canada to Alabama, they aren’t even the world’s longest range (that honor goes to the mid-ocean ridge, a chain 40,389 miles long). However, the Appalachian chain does stand among the world’s oldest mountains, with some of its rocks dating back 1.2 billion years — an age that makes these peaks older than the Atlantic Ocean.
Earth’s mountains don’t just reach into the sky; they also take up much of the ocean floor. These underwater mountains — aka seamounts — are created by deep-sea volcanoes, and scientists estimate they account for more than 11 million square miles of the planet’s surface.
The oldest parts of the Appalachian Mountains began to rise when our planet looked much different. At the time of their creation, North America was still attached to Europe and most of Asia, making up the supercontinent Laurasia. However, a collision between Laurasia and Gondwana — the massive continental fusion that included Africa, India, South America, Australia, and Antarctica — would eventually create Pangaea, and the first Appalachian peaks along with it. As Pangaea formed around 320 million years ago, the earliest Appalachian mountains began to grow, reaching far higher into the sky than they do today; initially, the southern subrange we call the Blue Ridge Mountains had the highest peaks in the world. However, Pangaea eventually broke apart, leaving a rift that would become the Atlantic Ocean about 150 million years ago, as the continents separated.
Today, around 3 million people a year hike along the Appalachian Trail, a feat that wouldn’t be possible at all had the mountain range remained as high as the Himalayas. Thankfully for backpackers, millions of years of erosion have worn the still-stunning chain down to a more manageable average of 3,000 feet above sea level.
Nicole Garner Meeker
Writer
Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.
Spend enough time at Disneyland and you’ll see them. Maybe you’ll spot one snoozing in the bushes near the Jungle Cruise or observing you warily as you ride the tram, but one thing is certain: However many cats you see, there are more out of sight. About 200 feral cats roam the Happiest Place on Earth, where they earn their keep by helping to control the rodent population. The felines were first seen not long after Disneyland opened in 1955, when they took up residence in Sleeping Beauty Castle, and it soon became evident that keeping them around had more advantages than trying to escort them off the premises.
Though the park ended up being built in Anaheim, Walt Disney originally proposed constructing it just down the street from Walt Disney Studios in Burbank. Anaheim was chosen in part because there was more land available to accommodate Disney’s expanding vision.
The mutually beneficial alliance even includes permanent feeding stations for the cats, as well as spaying or neutering and vaccinations. Though not official cast members, these adept hunters — who mostly come out at night — have earned a devoted following of their own, complete with websites, Instagram feeds, and YouTube videos. They’re not quite as popular as the actual rides at Disneyland, of course, but for cat lovers, they’re an attraction all their own.
The first Disneyland ticket was bought by Walt Disney’s brother.
A train station in Disneyland plays a message in Morse code.
Next time you find yourself on the Disneyland Railroad, listen closely when the train pulls into its second station. New Orleans Square, which houses a telegraph office, plays a secret message in Morse code paraphrased from Walt Disney’s opening-day speech: “To all who come to Disneyland, welcome. Here age relives fond memories of the past, and here youth may savor the challenge and promise of the future.” There are also many other secrets in the park, from the optical illusion that makes Sleeping Beauty Castle look bigger to Walt Disney’s favorite chili recipe at the Carnation Cafe.
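For the curious, here’s roughly how a phrase gets turned into Morse code. The snippet below is a minimal illustration using the standard International Morse alphabet (letters only, for brevity) — not a claim about how Disneyland’s telegraph loop was actually produced:

```python
# A minimal sketch of text-to-Morse encoding using the standard
# International Morse alphabet. Letters only, for brevity; this is
# purely illustrative, not Disneyland's actual audio source.

MORSE = {
    "A": ".-",   "B": "-...", "C": "-.-.", "D": "-..",  "E": ".",
    "F": "..-.", "G": "--.",  "H": "....", "I": "..",   "J": ".---",
    "K": "-.-",  "L": ".-..", "M": "--",   "N": "-.",   "O": "---",
    "P": ".--.", "Q": "--.-", "R": ".-.",  "S": "...",  "T": "-",
    "U": "..-",  "V": "...-", "W": ".--",  "X": "-..-", "Y": "-.--",
    "Z": "--..",
}

def to_morse(text: str) -> str:
    """Encode letters as Morse; separate words with ' / '."""
    words = text.upper().split()
    return " / ".join(
        " ".join(MORSE[ch] for ch in word if ch in MORSE)
        for word in words
    )

print(to_morse("To all who come to Disneyland welcome"))
# - --- / .- .-.. .-.. / .-- .... --- / -.-. --- -- . / - --- / ...
```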
Michael Nordine
Staff Writer
Michael Nordine is a writer and editor living in Denver. A native Angeleno, he has two cats and wishes he had more.
Our friends in ancient Rome indulged in a lot of activities that we would find unseemly today — including and especially gladiators fighting to the death — but they drew the line at eating butter. To do so was considered barbaric, with Pliny the Elder going so far as to call butter “the choicest food among barbarian tribes.” In addition to a general disdain for drinking too much milk, Romans took issue with butter specifically because they used it for treating burns and thus thought of it as a medicinal salve, not a food.
Rome was founded by twin brothers Romulus and Remus.
It’s a great story, but it’s just that — a story. The twin siblings, said to have been nursed by a she-wolf after being set adrift on the Tiber River in a basket, have long been a key part of Roman mythology.
They weren’t alone in their contempt. The Greeks also considered the dairy product uncivilized, and “butter eater” was among the most cutting insults of the day. In both cases, this can be partly explained by climate — butter didn’t keep as well in warm southern climates as it did in northern Europe, where groups such as the Celts gloried in their butter. Instead, the Greeks and Romans relied on olive oil, which served a similar purpose. To be fair, though, Romans considered anyone who lived beyond the Empire’s borders (read: most of the world) to be barbarians, so butter eaters were in good company.
The bestselling butter brand in America is Land O’Lakes.
Nero didn’t actually fiddle while Rome burned.
It would have been impossible for him to do so, as the fiddle didn’t exist yet. That’s not to say that Nero was a good emperor (or person), however. In addition to murdering his mother, first wife, and possibly his second wife as well, Nero may have even started the infamous fire that burned for six days in 64 CE and destroyed 70% of the city so that he could make room for his Golden Palace and its nearby gardens. (Or at least, that’s what some of the populace and some ancient writers suspected.) For all that, Rome’s fifth emperor wasn’t entirely reviled during his time — and it’s been suggested that his cruelty was at least somewhat exaggerated by later historians who were looking to smear his dynastic line, known as the Julio-Claudians. And he was a gifted musician who played the cithara, an ancient stringed instrument similar to a lyre — just not the fiddle.
Michael Nordine
Staff Writer
Michael Nordine is a writer and editor living in Denver. A native Angeleno, he has two cats and wishes he had more.
Like many classic Hollywood stars, Joan Crawford was known by a stage name rather than her real name. Born Lucille Fay LeSueur, the future Oscar winner made her silver-screen debut in 1925’s Lady of the Night under her birth name. Metro-Goldwyn-Mayer, which had signed her to a $75-a-week contract, saw potential in the starlet but feared her name would be a hindrance; Pete Smith, the head of publicity at MGM, thought her surname sounded too much like the word “sewer.”
So the top brass at MGM landed on a novel solution: a contest run in the fan magazine Movie Weekly, which offered between $50 and $500 for coming up with a new name for “a beautiful young screen actress.” The perfect name, according to MGM, “must be moderately short and euphonious. It must not imitate the name of some already established artiste. It must be easy to spell, pronounce, and remember. It must be impressive and suitable to the bearer’s type.”
Though biographers know her birthday was March 23, the year has been listed as 1904, 1905, 1906, and 1908 by various sources.
The winner, as fate would have it, wasn’t Joan Crawford; it was Joan Arden, which was already the name of an extra who threatened to sue MGM. And so the second-place winner was chosen instead, not that the new Joan Crawford was happy about it — she initially hated the name before making the most of it.
Crawford’s fourth husband was chairman of the board at Pepsi-Cola.
Crawford accepted her Oscar from bed.
After a string of hits in the late 1920s and early ’30s, Crawford’s fortunes reversed so sharply that TIME magazine deemed her “box-office poison” by the end of the decade. Her comeback wasn’t fully solidified until she took the title role in 1945’s Mildred Pierce, which resulted in her sole Academy Award — not that she was expecting to win.
Believing Ingrid Bergman would take home the Oscar for The Bells of St. Mary’s, Crawford was disinclined to attend any ceremony where she wouldn’t be victorious and opted to feign illness. Upon learning she’d won, however, she put on her makeup, invited members of the press to her bedroom, and accepted the statuette from the comfort of her own bed.
Michael Nordine
Staff Writer
Michael Nordine is a writer and editor living in Denver. A native Angeleno, he has two cats and wishes he had more.
The world’s largest coffeehouse chain, Starbucks, almost had a very different name. According to a 2008 Seattle Times interview with the company’s co-founder Gordon Bowker, the famous java chain was once “desperately close” to being called “Cargo House,” a name meant to tie the first store (in Seattle’s Pike Place Market) to the idea of beans coming from far away. In search of a more pleasing moniker, Bowker worked with a brand consultant, who mentioned that words starting with “st” felt especially strong. Bowker ran with the idea, listing every “st” word he could think of. The breakthrough came when the consultant brought out some old maps of the Cascade Mountains and Mount Rainier — both close to the company’s hometown of Seattle — and Bowker stumbled across an old mining town named “Starbo.” The name lit up a literary reference embedded in his mind: Starbuck.
The musician Moby is related to Herman Melville, the author of “Moby-Dick.”
Born Richard Melville Hall but nicknamed “Moby” as a baby, the musician says he’s the great-great-great-nephew of author Herman Melville. In 2016, Moby followed in his ancestor’s publishing footsteps and came out with a memoir titled “Porcelain.” He has since released a second memoir.
The name comes from Herman Melville’s 1851 masterpiece Moby-Dick; or, The Whale. In the novel, Starbuck is a Quaker and trusty first mate of Captain Ahab, and serves as the voice of reason aboard the whaling ship Pequod (another name the Starbucks co-founders considered). Melville himself likely got the name Starbuck from a real whaling family that lived on the Massachusetts island of Nantucket in the late 18th and early 19th centuries. Bowker readily admits that the character has nothing to do with coffee, but the moniker stuck, and the company doubled down on the nautical theme by introducing a mythological siren, likely influenced by a seventh-century Italian mosaic, as its now-famous green-and-white logo.
In Homer’s “The Odyssey,” the siren has the head of a woman and the body of a bird.
Coffee beans are not actually beans.
Two species of flowering shrub from the family Rubiaceae, Coffea robusta and Coffea arabica, account for most of the coffee consumed in the world. These plants produce a sweet, reddish-yellow, cherry-like fruit, and its seeds, or pits — roasted anywhere from light to dark — become the coffee beverage we know and love today. However, calling these seeds “beans” is a misnomer, since a “bean” technically refers to an edible seed from the plant family Fabaceae (also called Leguminosae), which includes foods such as soybeans, peas, chickpeas, and peanuts. Coffee seeds look much like a typical bean, but from a strict botanical perspective, they’re not beans at all. In fact, since coffee cherries are fruits, you might argue that your usual cup of joe has more in common with a smoothie than with any legume-heavy delicacy.
Darren Orf
Writer
Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.