Other than being members of the class Mammalia, humans and elephants might seem to have little in common. But these seemingly disparate creatures, separated by 80 million years of evolution, have some striking similarities. One of the most intimate (and adorable) is a behavior shared between newborn human babies and elephant calves. Just as a human infant sucks its thumb, a newborn elephant will do the same with its trunk, and for the same reason — comfort.
Elephants are known for their trunks, but plenty of other animals have them too, such as anteaters, shrews, and even a species of antelope. The most prominent example is the tapir, which looks like a pig with a trunk, though it’s more closely related to horses and rhinos.
During the first six months of life, human infants are biologically wired to suck on things, since that’s the primary way they receive sustenance from their mothers. Thumb-sucking is also a way for babies to self-soothe during times of stress. For elephants, it’s a very similar situation. Since sucking is associated with food and their mothers, elephant calves will suck their trunks much like a natural pacifier — a pacifier with more than 40,000 muscles. An elephant calf also sucks its trunk to learn how to subtly manipulate this immensely important protuberance, and uses the technique as an enhanced form of smelling. So while much has changed since humans and elephants parted ways during the Late Cretaceous, there’s at least one stunning (and very cute) similarity.
The country with the largest population of elephants is Botswana.
Elephants have the longest gestation period of any mammal.
Humans have a relatively long gestation period for mammals (especially compared to the Virginia opossum, which is pregnant for only 12 days), but a few animals outlast even us Homo sapiens. Manatees remain pregnant for 13 months, and giraffes can carry their young for two months beyond that, but all mammals pale in comparison to the African elephant, which has a gestation period of 22 months. There are two reasons for this nearly two-year-long pregnancy — one obvious, the other less so. The first is size. The African elephant is the largest land-dwelling mammal on Earth, and it takes time to grow such an enormous creature from a small clump of cells into a calf that weighs more than an average adult man. The second reason relates to an elephant’s amazing intellect, which includes a brain that is shaped similarly to our own but is three times larger. An elephant’s brain contains some 250 billion neurons, and the temporal lobe is particularly well developed, allowing elephants to create complex mental maps stretching hundreds of miles. Without this impressive memory, elephants couldn’t find their way back to life-sustaining watering holes year after year. So while an elephant pregnancy might seem incredibly long, it’s definitely time well spent.
Darren Orf
Writer
Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.
Foods tend to get their names from their appearance or ingredients, though not all are so clear-cut. Take, for instance, the egg cream, a beverage that has delighted the taste buds of New Yorkers (and other diner patrons) since the 1890s. If you’ve never sipped on the cool, fizzy drink known for its chocolate flavor and foamy top, you should know: There are no eggs or cream in a traditional egg cream drink.
Frosty milkshakes are diner standards, served with a side of burgers and fries, though the earliest version of the drink didn’t include ice cream. Invented in the late 1800s, the first milkshakes were a blend of eggs, cream, and whiskey.
According to culinary lore, the first egg cream was the accidental invention of Louis Auster, a late-19th- and early-20th-century candy shop owner in New York’s Lower East Side. Auster’s sweet treat arrived in the 1890s, at a time when soda fountains had started selling fancier drinks, and it was a hit — the enterprising inventor reportedly sold upwards of 3,000 egg creams per day by the 1920s and ’30s. However, Auster kept his recipe well guarded; the confectioner refused to sell his formula, and eventually took his recipe to the grave. The origins of the drink’s name have also been lost to time. Some believe the name “egg cream” came from Auster’s use of “Grade A” cream, which could have sounded like “egg cream” with a New York accent. Another possible explanation points to the Yiddish phrase “echt keem,” meaning “pure sweetness.”
Regardless of the misleading name, egg creams are once again gaining popularity in New York, though you don’t have to be a city dweller to get your hands on the cool refreshment. Egg creams can be easily made at home with just three ingredients: milk, seltzer, and chocolate syrup.
Servers at soda fountains of the early 20th century were called “soda jerks.”
Chocolate syrup was once marketed as a health tonic.
Centuries before it became a dessert, chocolate was employed medicinally. In Mesoamerica, where chocolate originated, cacao was used among Indigenous communities to treat indigestion, fatigue, and even some dental problems. Europeans of the 17th century also consumed chocolate for health purposes, hoping to cure a variety of ailments. By the late 1800s, pharmaceutical publications widely advertised chocolate powders and syrups, promoting them as healthful aids that also masked the bitter flavors of other medications. Brands like Hershey’s began marketing their syrups and chocolates to everyday consumers as health tonics that were wholesome and nutritious — even “more sustaining than meat.” Eventually, however, regulations against dubious health claims and patent medicines, combined with equipment improvements and declining sugar prices, set the stage for chocolate to be considered more treat than tonic, even as some health claims for it have endured.
Nicole Garner Meeker
Writer
Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.
Although crying to make yourself happier seems counterintuitive, shedding some tears can be one of the best ways to restore your emotional equilibrium. A 2014 study found that emotional crying activates the parasympathetic nervous system, which regulates the body’s “rest and digest” actions. Crying also elevates levels of endorphins and oxytocin, which help dull both emotional and physical pain. And the physical act of crying — taking in big gulps of air — cools the brain and helps regulate your mood. All in all, “having a good cry” can actually be good for you.
Humans are the only animals that cry emotional tears.
Shedding emotional tears is a solely human characteristic. Although other animals produce tears, scientists believe it’s only for providing moisture or clearing the eyes of irritants.
Of course, whether crying makes you feel better can also depend on the situation. Tears carry interpersonal benefits, signaling to others that you’re in need of support. Unsurprisingly, studies have shown that people who receive support after crying are more likely to feel happier than if they’re shamed for crying. So while the physical act of crying can help our bodies return to an emotional homeostasis, it’s the support of friends and loved ones that makes those good feelings stick.
Chemicals that irritate the eyes and make humans cry are called lachrymators.
Humans tear up when laughing because it’s physiologically similar to crying.
Although crying is often associated with sadness, tears are actually a complex biological response — after all, humans also shed tears of joy. Evidence suggests that the same part of the brain controls both laughing and crying; for instance, studies have shown that patients with pseudobulbar affect (PBA), a condition caused by lesions on a specific part of the brain, experience uncontrollable bouts of both crying and laughter. Although scientists aren’t 100% certain why people cry when they’re laughing, one prevailing theory is that in both instances the body is attempting to regulate a high emotional state and simply doesn’t discriminate between immense sadness and immense joy.
Darren Orf
Writer
Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.
Octagon, octopus… most words that begin with the Latin prefix “Oct” have some connection to the number eight. But what about October? While the modern calendar considers the autumn month to be the 10th of the year, it wasn’t always that way. For the ancient Romans, who created the earliest form of the calendar we use now, October was originally the eighth month.
Cultures around the globe track time differently; take, for example, the Ethiopian Ge’ez calendar, which gives each month 30 days and runs about seven years behind the Gregorian calendar. After the 12th month, the year’s five or six remaining days create a mini month called “Pagume.”
Today’s calendar follows a 12-month cycle, though the earliest iterations only had 10 months. In ancient Rome, the year began in March and ran through December, with the first four months named for Roman deities. The next six months had more straightforward, numerical names that referenced their place in the year. The remaining weeks of winter (which would eventually become January and February) were largely ignored on paper; when the harvest season ended, so did the calendar, until the next spring planting season rolled around.
Over time, the calendar expanded by two months; January and February were added around 700 BCE, and by about the middle of the fifth century BCE, they had become the starting months of the year. When Julius Caesar introduced the Julian calendar, he didn’t adjust the number-named months to more appropriate places, though later Roman emperors tried, using names that didn’t stick. Domitian, who ruled from 81 to 96 CE, called October “Domitianus” after himself, and decades later, Commodus dubbed the month “Herculeus” after one of his own titles. Some historians believe the attempts to rename October (and the other months of the year) were widely disregarded because the leaders themselves were generally disliked, though another theory might explain it best: Like many people today, Romans of the past just weren’t fans of change.
World Octopus Day, celebrating eight-armed cephalopods, is on October 8.
There was a year when October only had 21 days.
Eager trick-or-treaters counting down to Halloween know October has 31 days, though there was a time in history when the month ran 10 days short. In 1582, Pope Gregory XIII introduced the Gregorian calendar, an upgrade from the Julian calendar that had fallen 10 days out of sync and was thus messing with the timing of religious holidays. Switching to the new calendar fixed the issue, but it required a one-time drop of 10 days to get back on track. The pope decreed the calendar would skip them in October, the month with the fewest holy days. After October 4, the calendar jumped to October 15, omitting the days in between and causing a flurry of issues: Some citizens in Frankfurt rioted against the change, many countries delayed or refused to swap to the new calendar, and participating regions had to recalculate rents and wages for the shortened month. Over the next few centuries, most countries around the globe adopted the Gregorian calendar, though some held out longer than others. Greece became the last European country to officially adopt the calendar, in 1923.
Nicole Garner Meeker
Writer
Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.
On March 5, 1973, several hundred people gathered at a farm in tiny Ossineke, Michigan, to witness a burial they would remember for the rest of their lives. One local grocery store closed its doors so employees could attend; even Michigan Governor William G. Milliken dropped by to pay his respects. Was this a funeral for a native son who made good, or perhaps a beloved civic leader? No, it was a ceremony to bid arrivederci to some 30,000 frozen pizzas that may have been harboring dangerous toxins.
Hawaiian pizza (with pineapple and ham toppings) originated in Hawaii.
This particular combination of toppings was the brainchild of Greek immigrant Sam Panopoulos, who introduced the Hawaiian pizza at his restaurant in Ontario, Canada, in 1962.
This bizarre scene stemmed from the discovery of swollen mushroom tins at Ohio's United Canning Company two months earlier. After FDA tests revealed the presence of bacteria that causes botulism, calls to United Canning's extended network of customers eventually reached frozen-pizza maker Mario Fabbrini. When two test mice croaked after eating his mushroom pizza, Fabbrini believed he had no choice but to recall his wares from store shelves and swallow the estimated $60,000 in losses. Attempting the pizza equivalent of turning lemons into lemonade, he announced intentions for a grand "funeral," and arranged for a series of pickup trucks to dump his 30,000 unwanted mushroom pies into an 18-foot hole. After placing a flower garland on the grave — red gladioli to symbolize sauce, white carnations for cheese — Fabbrini served fresh (mushroom-free) pizza to anyone brave enough to partake.
Further tests later showed that the mice had died not from botulism, but from peritonitis, and it was unclear whether their deaths were pizza-related casualties. Sadly, the $250,000 Fabbrini later won in a lawsuit against United Canning and two other defendants wasn’t enough to fully revive his business, and Fabbrini sold the company in the early 1980s. Nevertheless, much like that sauce stain that never entirely disappears from your shirt, the story of the Great Michigan Pizza Funeral endures for those who know where to look.
Mushrooms are the only nonanimal food product that serves as a significant source of vitamin D.
Atari once buried truckloads of its inventory, including a notoriously awful "E.T." video game.
Video gamers of a certain age may remember the disaster that was “E.T. the Extra-Terrestrial,” an Atari 2600 game based on the blockbuster Steven Spielberg movie. Given just five weeks to have the game in stores by the 1982 Christmas season, designer Howard Scott Warshaw developed an ambitious but deeply flawed product, resulting in a poorly reviewed title that contributed to the company’s $536 million in losses in 1983. That September, Atari deposited 13 truckloads of various game cartridges and computer equipment into the city landfill at Alamogordo, New Mexico. Although contemporary newspapers reported on the event, the legend that lingered was that of Atari secretly dumping its unsold “E.T.” inventory under the cover of darkness to bury the memory of what some called the worst video game ever made. In April 2014, the landfill was excavated as part of the making of the documentary Atari: Game Over; it was called “the first excavation of video games in the history of humanity.” Cartridges of the “E.T.” game were found alongside other titles, such as “Pac-Man” and “Centipede,” as well as decrepit computer parts. Yet unlike Mario Fabbrini and his mushroom pizzas, this story has a happy ending: The sale of items retrieved in the landfill helped raise more than $100,000 for the city of Alamogordo, and Warshaw earned a measure of redemption by receiving a standing ovation after Atari: Game Over screened at Comic-Con that year.
Tim Ott
Writer
Tim Ott has written for sites including Biography.com, History.com, and MLB.com, and is known to delude himself into thinking he can craft a marketable screenplay.
From Spencer Tracy and Katharine Hepburn to Brad Pitt and Angelina Jolie, the entertainment industry is rife with tales of co-stars who fell in love while performing together. Given the sweet feelings their famous characters consistently displayed to one another, it's not surprising that the same fate befell longtime Mickey and Minnie Mouse voice actors Wayne Allwine and Russi Taylor.
At the time Taylor beat out approximately 200 competitors to claim the voice role of Minnie in 1986, both she and Allwine (by then already established as Mickey for almost a decade) were married to other people. But their rapport as co-workers and friends soon blossomed into genuine affection, especially after each obtained a divorce, and they were married in Hawaii in 1991. The couple refused to talk publicly about their romance, preferring to keep the focus on the iconic characters they were tasked with portraying, although the cartoon hearts they radiated in one another’s presence were clear to all. According to one former colleague, Allwine would bring a ukulele to joint interviews with Taylor, and while he would launch into song as Mickey to serenade Minnie, "You knew it was Wayne talking to Russi."
Although there's been no official on-screen ceremony, Walt Disney attested to Mickey and Minnie's marital status in a 1933 interview: "In private life, Mickey is married to Minnie. … In the studio we have decided that they are married already."
After Allwine died in 2009, Taylor naturally had a difficult time returning to work with Bret Iwan, the new Mickey. Yet she pulled it together to continue with Minnie's various big- and small-screen adventures, even earning her first Primetime Emmy nomination in 2018, before joining her beloved in the great soundbooth in the sky the following year.
The first Disney character to earn a full-length feature film was Snow White.
Walt Disney provided the original voices for both Mickey and Minnie Mouse.
Long before Wayne Allwine and Russi Taylor were making eyes at each other in the recording studio, it was Walt Disney himself supplying the voices for what became the Magic Kingdom’s first couple. Of course, the “speaking” in early Disney shorts largely consisted of yelps, whistles, and other noises, and by the time the studio settled into a groove with sound synching in the 1930s, Minnie’s parts were being delivered by Marcellite Garner. Yet Disney insisted on retaining the voice of Mickey for himself; according to Neal Gabler’s Walt Disney: The Triumph of the American Imagination, the chief was “often embarrassed” to perform as the mouse protagonist, but felt that his version offered “more pathos” than that of his would-be replacements. It wasn’t until 1947 that he finally relinquished the voice to sound effects man Jimmy MacDonald (Allwine’s immediate predecessor), though Disney reclaimed the role for himself when the Mickey Mouse Club TV series began airing in 1955.
Tim Ott
Writer
Tim Ott has written for sites including Biography.com, History.com, and MLB.com, and is known to delude himself into thinking he can craft a marketable screenplay.
Teddy Roosevelt is often thought of as one of America’s toughest presidents, and for good reason. In addition to delivering a speech immediately after getting shot (“It takes more than that to kill a bull moose,” he said during his remarks) and fighting in the Spanish-American War as part of the Rough Riders, he was also a skilled martial artist who received an honorary black belt in judo (and, according to some, was the first American to receive a brown belt). He accomplished the latter under the tutelage of Yamashita Yoshitsugu, also known as Yamashita Yoshiaki, a Japanese judoka who holds the distinction of being the first person to receive a 10th-degree red belt (jūdan).
The elder Roosevelt was 42 when he became president, making him the youngest person ever to hold the office. Next on the list are JFK (43), Bill Clinton (46), Ulysses S. Grant (46), and Barack Obama (47).
Even more impressively, he did all this while serving as president. Already a skilled boxer and wrestler, he first encountered judo on a trip to Japan and sought to study it further upon his return stateside. Yamashita described the president as “his best pupil” but also “very heavy and very impetuous” in a way that “cost the poor professor many bruisings, much worry, and infinite pains.” As in most aspects of his life, Roosevelt was extremely enthusiastic about this endeavor — sometimes in a way that others struggled to keep up with.
Teddy Roosevelt’s vice president was Charles W. Fairbanks.
Teddy Roosevelt was the first U.S. president to win the Nobel Peace Prize.
Roosevelt received this honor in 1906 “for his role in bringing to an end the bloody war recently waged between two of the world’s great powers, Japan and Russia.” Prior to his intervention, the Russo-Japanese War had gone on for more than a year and a half and led to significant casualties on both sides. Not everyone was pleased about Roosevelt winning — Swedish newspapers suggested that Alfred Nobel, the prize’s namesake, was “turning in his grave” — but defenders have pointed to Roosevelt’s role in settling a dispute between France and Germany over Morocco. As of 2024, three other U.S. presidents and one vice president have received the Nobel Peace Prize: Woodrow Wilson (1919), Jimmy Carter (2002), Al Gore (2007), and Barack Obama (2009).
Michael Nordine
Staff Writer
Michael Nordine is a writer and editor living in Denver. A native Angeleno, he has two cats and wishes he had more.
To most grocery shoppers, there’s nothing particularly exciting about cauliflower. While the dense and crunchy stalks of this cruciferous plant are great as a side dish or snack, they’re consumed far less than the most popular produce-aisle picks (potatoes, tomatoes, and onions). However, some farmers might say that cauliflower has at least one unique and unexpected property that’s worth your attention: If you listen closely, you can hear it growing.
In the U.S., nearly all cauliflower is grown in one state.
There’s a good chance that the majority of cauliflower you’ve ever eaten in the U.S. has come from one state: California. The Golden State is a produce powerhouse, and its farmers produce around 90% of the country’s cauliflower supply. (Arizona and Oregon also contribute.)
While most plants are silent, cauliflower is able to eke out a barely audible sound thanks to how quickly it grows. The vegetable can add as much as 1 inch per day under the right growing conditions. That rapid expansion means the florets of the plant’s popcorn-like heads often rub against one another as they grow, creating a noise many farmers call “cauliflower creak.” Some agriculturalists describe the tone as a soft squeak, while others say it’s best described as the faint popping noise made by Rice Krispies cereal when doused in milk. However, there are occasions when cauliflower fields reach a more detectable decibel, like in 2015, when British farmers were graced with optimal weather conditions for their cauliflower harvests. That year, some cauliflower cultivators alerted vegetable enthusiasts to what they believed would be the loudest cauliflower creak in decades.
The edible head of a cauliflower plant is called a “curd.”
Vienna is home to a group of vegetable musicians.
The Austrian city of Vienna is often called the “capital of classical music” — after all, it’s where some of history’s most prominent composers (such as Mozart and Beethoven) spent much of their time. It also happens to be a spot where modern experimental artists, like the Vegetable Orchestra, perform regularly. Founded in 1998 as a joke, the group of nearly a dozen musicians builds its own instruments from fresh produce purchased at nearby markets, fashioning drums from pumpkins, recorders from carrots, and more than 150 other produce contraptions. Each concert requires around 70 pounds of vegetables, which are made into instruments over two to three hours and last only one performance. However, audiences who attend Vegetable Orchestra concerts don’t just hear their veggies; they get a chance to eat them, too, since the band’s produce scraps are crafted into a soup that is served after every performance.
Nicole Garner Meeker
Writer
Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.
Humans are masters of navigation. Over the course of history, we’ve developed tools to help us explore not only Earth but other planets. Yet strip away all those tools, blindfold us, and tell us to walk in a straight line, and inevitably we become a directional mess, turning in tight loops. Many studies in the past century — though mostly informal — have cataloged this phenomenon again and again. Without some form of reference, such as a mountain, a building, or even the sun, humans are incapable of walking in a straight line, no matter how hard we try. It happens whether we’re blindfolded or just lost in the forest. So what’s going on?
The compass wasn’t originally used for navigation.
The Chinese created the first compass — made with a lodestone — in the third century BCE. Early compasses were used for divination and other spiritual purposes rather than navigation, but eventually their useful wayfinding attributes won out.
We don’t know for sure, but scientists have been able to rule out some popular go-to explanations. Researchers from the Max Planck Institute for Biological Cybernetics in Germany discovered that body asymmetries (different-sized legs, right-handedness vs. left-handedness, etc.) didn’t account for such vast misdirection. Additionally, the idea that people can’t correctly calculate the movement of their legs doesn’t explain the tight-looped pattern. The Max Planck scientists theorize that with every blindfolded step, a very small directional discrepancy from a straight line is introduced, which then compounds with every additional step. Without the aid of visual references to unconsciously correct for these discrepancies, blindfolded people are poor at navigating a straight line, and will inevitably begin walking in tight-looped circles. While this theory explains why humans do this, scientists aren’t sure of the biological how (though they think errors in the inner ear may be to blame). For now, this straight-line conundrum remains one of the many mysteries of the human brain and body.
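For readers who like to tinker, here is a minimal sketch of that compounding-error idea in Python. It is not the Max Planck team's model; the step length and the size of the per-step heading error are arbitrary illustration values, chosen only to show how tiny, uncorrected errors accumulate until a "straight" walk curls back on itself.

```python
import math
import random

def blindfolded_walk(steps=2000, step_length=1.0, heading_error_deg=3.0, seed=1):
    """Simulate a walker whose heading picks up a small uncorrected error each step.

    The per-step error size and step length are arbitrary illustration values,
    not measurements from any study.
    """
    random.seed(seed)
    x, y, heading = 0.0, 0.0, 0.0  # start at the origin, facing "straight ahead"
    for _ in range(steps):
        # Each step nudges the heading slightly; with no landmark to correct it,
        # those nudges compound instead of being cancelled out.
        heading += math.radians(random.gauss(0.0, heading_error_deg))
        x += step_length * math.cos(heading)
        y += step_length * math.sin(heading)
    return x, y

steps = 2000
x, y = blindfolded_walk(steps=steps)
print(f"Distance walked: {steps} steps")
print(f"Net displacement from start: {math.hypot(x, y):.0f} steps' worth")
# A perfectly straight walker would end up 2000 steps from the start;
# the simulated walker typically ends up only a small fraction of that away.
```

Set the heading error to zero and the simulated walker goes perfectly straight; make it larger and the net displacement collapses, which is the compounding effect the researchers describe.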
For centuries, sailors used an instrument called a mariner’s astrolabe to navigate the seas.
The Earth’s North Star won’t always be Polaris.
When it comes to navigation, the North Star — known to astronomers as Polaris — is an important one. Because the star sits roughly above the Earth’s North Pole, being able to pick out Polaris from the tapestry of the night sky can be useful in finding your way. For centuries, seafarers measured the angle of the North Star from the horizon to determine latitude and position in the Northern Hemisphere. But although Polaris has been humanity’s navigational friend for many centuries, the North Star won’t always be Polaris — in fact, it’s only held the position since 500 CE. That’s because the Earth’s rotation wobbles in a roughly 26,000-year-long cycle known as axial precession. In about a thousand years, the Earth’s North Pole will instead point to Errai (Gamma Cephei), followed by a variety of other stars, until Polaris once again becomes the North Star some 24,000 years from now.
Darren Orf
Writer
Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.
Umbrellas have been around for a long time — at least 3,000 years, according to T.S. Crawford's A History of the Umbrella — but they were used by only select segments of the population for much of that history. Ancient Egyptians used them to shade their pharaohs, setting the tone for an association with royalty and nobility that would also surface in China, Assyria, India, and other older civilizations. Meanwhile, they were deemed effeminate by ancient Greeks and the Romans who assumed many of their cultural habits. It should be noted that these early umbrellas protected against the sun, not rain, and were generally used by women to shield their complexions. The association between women and umbrellas persisted through much of Europe for centuries, and stubbornly remained into the 18th century, even after the first waterproof umbrellas had been created (around the 17th century in France).
Baltimore was once the umbrella capital of the world.
The nation’s first umbrella factory opened in Baltimore in 1828. The Beehler Umbrella Factory motto was “Born in Baltimore, Raised Everywhere!” By the 20th century, Baltimore factories made 1.5 million umbrellas annually, and the city was deemed the umbrella capital of the world.
In England, at least, the man credited with ushering in a new age of gender-neutral weather protection was merchant and philanthropist Jonas Hanway. Having spotted the umbrella put to good use during his many travels, Hanway took to carrying one through rainy London in the 1750s, a sight met with open jeering by surprised onlookers. The greatest abuse apparently came from coach drivers, who counted on inclement weather to drive up demand for a dry, comfy ride. But Hanway took the derision in stride. Shortly after his death in 1786, an umbrella advertisement surfaced in the London Gazette, a harbinger of sunnier days to come for the accessory’s reputation as a rain repellant for all.
The English word "umbrella" originates from the Latin term "umbra," meaning shade/shadow.
Umbrella champion Jonas Hanway was a pronounced opponent of tea drinking.
You’d expect a man bold enough to navigate the hardscrabble streets of 18th-century London with a dainty umbrella to possess a certain determination to get things done, and indeed, Jonas Hanway was a champion of many causes. He founded the Marine Society to recruit cadets for the navy, served as an executive for the Foundling Hospital children’s home, advocated for the safety of chimney sweepers, and even attempted to convince a nation of tea drinkers that they were doing something profoundly wrong. This last effort came by way of his 1756 work “An Essay on Tea,” which summed up the evils of the seemingly benign refreshment with the subtitle: “Considered as pernicious to health; obstructing industry; and impoverishing the nation.” As with his umbrella crusade, his zealous stance on tea drinking was met with a degree of public mockery, most notably in a response by famed British writer and lexicographer Samuel Johnson in Literary Magazine. But there was no posthumous redemption to be earned in this particular arena, as his countrymen and women went right on indulging themselves with the beverage, and continue to do so to the tune of 100 million cups daily.
Tim Ott
Writer
Tim Ott has written for sites including Biography.com, History.com, and MLB.com, and is known to delude himself into thinking he can craft a marketable screenplay.