Collective nouns are nouns that refer to groups of people, objects, or — our fun for today — animals. We’re all familiar with “a herd of sheep” and “a flock of birds,” but here are six fun and unexpected collective nouns for other animals. Many of these terms originated (or at least were first recorded) in a book from 1486 called The Book of Hawking, Hunting and Blasing of Arms, by Juliana Berners — thought to be the first woman published in the English language.
These outrageously pink birds basically define “flamboyance,” with their gracefully curved necks, dramatic habit of perching on one long leg, and their Barbie-pink or scarlet plumage. So it should come as no surprise that both their collective noun (“flamboyance”) and their name (“flamingo”) derive from French and Spanish (respectively) forms of the Latin word flamma, which means “flame.” Other fun bird words: A group of owls is known as a “parliament” or a “looming” (eerie!), and a gathering of peacocks is an “ostentation.”
The proper term for a group of bears is a “sleuth” or a “sloth.” Though bears aren’t related to the lethargic South American mammal, the words for the collective noun and the permanently smiling creature do share the same root. “Sloth” is derived from the Middle English word for “slow.” (There’s also an ursine species known as the “sloth bear.”) For no reason except fun, a group of pandas is known as an “embarrassment,” and a party of polar bears is called a “celebration.”
Americans call them “buffalo,” but the shaggy species that once covered the Great Plains is properly known as “bison.” And sure, we could just refer to a group of these large and stubborn creatures as a “herd,” but it’s much more fun to address them by their other collective noun — an “obstinacy.” Why an obstinacy? Ask anyone who has ever had their car blocked by them at Yellowstone, and you’ll have your answer.
There’s not a fancy scientific explanation as to how a herd of these black-and-white striped safari favorites came to be known as a “zeal.” (They’re also sometimes referred to as a “dazzle.”) But the term is, like many collective nouns, simply fun. The name even made it into the title of the book A Compendium of Collective Nouns: From an Armory of Aardvarks to a Zeal of Zebras. And speaking of safaris, anyone who’s witnessed the chaos of East Africa’s great migration will understand why a group of wildebeest is referred to as a “confusion.”
They’re big and they seem to float, so let’s call a group of hippos a “bloat”! Although they may look rather comical, Hippopotamus amphibius (which don’t actually float but can nap underwater) are extremely aggressive. These rotund natives of sub-Saharan Africa are one of the largest — and deadliest — mammals on the planet.
While a group of kittens born to the same mother is most commonly referred to as a “litter,” an assemblage of unrelated puffballs is called a “kindle” or, more rarely, an “intrigue.” Intriguing! Children will love the 1979 illustrated book A Kindle of Kittens, written by Rumer Godden and illustrated by Lynne Byrnes, while language enthusiasts may enjoy the origin of “clowder” — the term for a group of adult cats — which began as a variant of the word “clutter.”
Cynthia Barnes
Writer
Cynthia Barnes has written for the Boston Globe, National Geographic, the Toronto Star and the Discoverer. After loving life in Bangkok, she happily calls Colorado home.
There are few creatures on the planet as cuddly, loyal, and beloved as dogs. Many people pamper their pooches and treat them as members of the family, and in return, dogs provide unconditional love and companionship. In some cases, they can even be trained to sniff out diseases, detect explosives, or assist people with disabilities. But no matter their breed or purpose, they’re incredible animals — with a fascinating history to boot. Here are five faithful facts about dogs.
President Harding’s Dog Had His Own Seat at Cabinet Meetings
These days, presidential pets are almost as famous as their commanders in chief — but that wasn’t always the case. The first White House animal to really achieve celebrity status was President Warren G. Harding’s pup, an Airedale terrier named Laddie Boy, who lived in Washington, D.C., during the Harding administration from 1921 to 1923.
Laddie Boy was a fixture at the president’s side from the very beginning. In fact, on March 5, 1921, one day after taking office, Harding interrupted his first official Cabinet meeting to introduce the dog, who had just arrived from Ohio. After that, Laddie Boy became a regular at Cabinet meetings, and even had his own chair at the table. As part of Harding’s attempt to appeal to the average person and capitalize on his campaign slogan, which promised a “Return to Normalcy,” Laddie Boy also accompanied the president on the golf course, helped welcome foreign delegates, and once participated in the annual White House Easter Egg Roll. The pooch was beloved not only by Harding but by the press as well; newspapers would publish pretend interviews with fictitious quotes from Laddie Boy, much to the delight of the public. After Harding’s death, more than 19,000 newsboys donated pennies to be turned into a copper statue of Laddie Boy, which now belongs to Washington’s Smithsonian Institution.
The Dog Who Played Toto Had a Prolific Hollywood Career

Few dogs are more famous than the cairn terrier who played Toto in 1939’s The Wizard of Oz. Terry, as she was known off-screen, was born in 1933 and rescued by Carl Spitz after being abandoned by her original owners. Despite a tumultuous start to her life, she went on to have a prolific career in Hollywood that even many human actors would envy.
Spitz ran his own Hollywood Dog Training School, and although he initially took in Terry just as a pet, she quickly became his biggest star. His technique used silent hand signals, which gave him (and his dogs) an edge over other trainers who had to vocally call out their commands. With his help, Terry landed an uncredited role in 1934’s Ready for Love, and later that year starred alongside Shirley Temple in Bright Eyes. After appearing in several other films, Terry ascended to superstardom when she booked the role of Toto in The Wizard of Oz, earning $125 a week for portraying Dorothy’s trusted companion — more than some of her human castmates made.
The Beatles’ “A Day in the Life” Features a Whistle Only Dogs Can Hear
Back in 2013, the Beatles’ Paul McCartney revealed a little canine-related trick the band had included on their seminal 1967 album, Sgt. Pepper’s Lonely Hearts Club Band. At the end of the album’s final track, “A Day in the Life,” the Fab Four added a tone pitched at 15 kilohertz, making the sound audible to dogs but difficult to hear for most humans. The tone was reportedly added at the request of John Lennon, who asked producer George Martin to dub in the high-pitched frequency. “We’d talk for hours about these frequencies below the sub that you couldn’t really hear and the high frequencies that only dogs could hear,” McCartney explained. “If you ever play Sgt. Pepper, watch your dog.”
Spiked Dog Collars Originated in Ancient Greece

Though they serve a more decorative purpose in modern times, spiked dog collars had an important use in ancient Greece, where they were originally conceived as protection for dogs patrolling farms, as those dogs were susceptible to wolf attacks. Inspired by standard dog collars that had been developed in the ancient Egyptian city of Naucratis — with which the Greeks frequently traded — Greek dog owners designed a couple of spiked options to defend against predators. One type of collar was made of metal, forming a kind of chain-link guard with spikes, whereas the other was made of leather with metal spikes poking through the material and secured by rivets. In both cases, the spikes were meant to protect the dog’s throat and potentially injure the attacking wolf.
Artistic depictions suggest the collars could also be symbols of status, perhaps ornamented with engravings. Dogs were important creatures within Greek society, and had a profound cultural impact as well. In his epic poem the Odyssey, for example, Homer wrote of Argos, the faithful dog who waits 20 years for the return of the hero Odysseus.
The Saluki Is Considered the World’s Oldest Dog Breed

There’s some debate around the oldest breed of domesticated dog, but if you go by Guinness World Records, that title belongs to the saluki, which is believed to have originated circa 329 BCE (though some experts posit it may date back even further, to around 7000 BCE). Sometimes called the royal dog of Egypt, the saluki was heralded in ancient Egyptian society; the dogs were even honored with mummification after death, much like pharaohs at the time. In many Arab cultures throughout the Middle East, hunters used salukis — which boast incredible speed — to track and take down gazelle. The breed eventually made its way to England by the mid-1800s, and was finally recognized by the American Kennel Club in 1927.
Bennett Kleinman
Staff Writer
Bennett Kleinman is a New York City-based staff writer for Optimism Media, and previously contributed to television programs such as "Late Show With David Letterman" and "Impractical Jokers." Bennett is also a devoted New York Yankees and New Jersey Devils fan, and thinks plain seltzer is the best drink ever invented.
Would a Pepsi by any other name taste as sweet? Although many of the world’s most famous brands may seem inseparable from their current names, a number started out with very different labels. Read on for some of the strange and surprising stories behind the names of your favorite products and companies, from Google’s slightly uncomfortable former moniker to the household salve originally called “Wonder Jelly.”
Google Was Originally “BackRub”

Search engines and massage therapy are usually separate spheres. But when Larry Page and Sergey Brin started working together from their dorm rooms at Stanford in the mid-1990s, they built a search engine that used “back links” to determine the relative importance of pages on the web. Thus, they called the search engine BackRub.
To the relief of everyone who uses the internet today, the name didn’t last long. It was soon switched to Google, a riff on the mathematical term “googol,” which refers to the number 1 followed by 100 zeros. According to the company, their new moniker reflected the team’s mission to “organize the world’s information and make it universally accessible and useful.” The term “googol” itself was coined by the American mathematician Edward Kasner, who used it in a 1940 book as an example of a number so large it baffles the imagination. Kasner came up with the term around 1920 with the help of his 9-year-old nephew, who told him that such a silly number required a suitably silly name.
Nike Started Out as Blue Ribbon Sports

Nike was founded on a 1964 handshake between Bill Bowerman, then a University of Oregon track-and-field coach, and his former student Phil Knight. At first, they named themselves Blue Ribbon Sports, and served only as the U.S. distributor for Japanese running shoes made by Onitsuka Tiger (now known as Asics).
Then, in 1971, Bowerman and Knight decided to make their own shoes. Their famous swoosh logo actually came first, designed by Portland State University graphic design student Carolyn Davidson. The name choice didn’t happen until the eleventh hour, just before the first shipment of shoes was set to go out. Earlier options included “Dimension Six” (possibly a play on Knight’s love for the music group The 5th Dimension), “Peregrine” (a type of falcon), and “Bengal” (inspired by the brand Puma). But Jeff Johnson, the company’s first employee, had read a magazine article noting that successful brand names were often short with punchy or “exotic” letters like “Z,” “X,” or “K.” He came up with Nike, as in the Greek winged goddess of victory. Knight went with it begrudgingly — but it stuck.
Amazon Was Almost “Relentless”

When Jeff Bezos moved to the Seattle area in 1994 to start an online bookstore, he wanted to call the company “Relentless.” In fact, to this day, typing Relentless.com into your browser will take you to Amazon’s site. Another option he considered was Cadabra, as in “abracadabra,” an idea that was squashed when Bezos’ lawyer misheard it as “Cadaver.”
The name was changed to “Amazon” in part because the world’s largest river (by volume) suggested a sense of scale; the company’s initial tagline was “Earth’s biggest bookstore.” It was also handy to have a name that began with “A,” because back then, websites were often listed alphabetically on search engines.
Snapple Began as Unadulterated Food Products

Unadulterated Food Products doesn’t have quite the same ring to it as Snapple, but that was the company’s name when it started out in 1970s New York, originally selling juice to health food stores. Presumably, the name was a nod to the products’ purity and wholesomeness. The company’s current moniker came about in 1980, inspired by a carbonated apple juice that had a “snappy apple taste.”
Starburst Was Originally “Opal Fruits”

This chewy, fruity candy originated in the United Kingdom in 1960 as “Opal Fruits.” A few years later, in 1967, the treats debuted in the United States as Starburst, supposedly because they’re “unexplainably juicy.” The reason for the “star” reference isn’t entirely clear, although it may have been an attempt to capitalize on the Space Race of the time, when anything otherworldly was cool. (The U.K. name changed to Starburst in 1998, although it changed back temporarily for a nostalgia-tinged reissue of Opal Fruits in 2020.)
Vaseline Was Originally “Wonder Jelly”
Vaseline has numerous uses, from soothing chapped lips to preventing diaper rash, so it may not be a surprise that it was originally called “Wonder Jelly.” The company got its start in 1859, when a chemist named Robert Chesebrough traveled to Titusville, Pennsylvania, and noticed that oil workers were using rod wax (unrefined petroleum jelly) on their burns and abrasions. After a series of experiments, the young chemist produced a lighter, clearer jelly suitable for household use. The product debuted in 1870 as Wonder Jelly. But in 1872 it was rebranded as Vaseline, a combination of the German word “wasser” (water) and the Greek word “oleon” (oil).
Interesting Facts
Editorial
Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.
Not all countries are agriculturally equal, and a handful of places — China, the U.S., India, and Brazil — dominate global food production and exports. Yet smaller players also contribute to stocking the shelves with our favorite foods, thanks to native plant species, environmental factors, and infrastructure investments. Take, for instance, Canada’s abundance of lentils or Peru’s booming quinoa industry. Here are 13 foods and their top-producing countries. Do you know where your favorite snack comes from?
Turkey: Hazelnuts

Turkey is the world’s leading hazelnut producer, by a large margin. The transcontinental country, which straddles Asia and Europe, accounts for about 73% of the total global supply. By comparison, Italy, the second-highest hazelnut-producing country in the world, yields just 20% of the world’s supply each year. About 60% of Turkey’s crop comes from the Eastern Black Sea region, where persistent rainfall, moderate temperatures, and hospitable soil on the steep hills create the perfect growing conditions for the nut. It’s likely you’ve sampled Turkey’s supply without even realizing it: Companies like Nestlé, Ferrero, and Godiva primarily source the hazelnuts for their candy bars, Nutella spread, and decadent chocolates from the region.
Indonesia: Coconuts
Coconuts have created a heated agricultural competition between Indonesia and the Philippines over the past several years. In 2019, Indonesia edged out the Philippines as the top producer in the world, growing around 19 million tons versus the Philippines’ 14 million tons. (The Philippines, however, remains the world’s top producer of coconut oil.) The coconut is a resilient fruit, and while the palm tree it grows on doesn’t require a specific soil, a high amount of rainfall is needed to properly sustain growth. The trees thrive in humid coastal areas; India, Brazil, Sri Lanka, and Thailand also rank among the world’s major coconut producers. The coconut is extremely versatile; everything from the tree’s leaves and wood to the fruit’s water, meat, and shell can be used, giving the palm tree its nickname, the “Tree of Life.”
Madagascar: Vanilla

If you love the smell and taste of vanilla, you can thank Madagascar. Though the plant originated in Mexico, 80% of the enduringly popular spice is now grown in the island nation off Africa’s southeastern coast. Anyone who has ever sought natural vanilla extract or beans knows that the prices are not always consumer-friendly, but it’s for good reason: Vanilla isn’t an easy crop to grow. Vines take anywhere from two to four years to mature, pollination is done artificially by hand (the flowers open for only one day a year, and the plant’s natural pollinator, the Melipona bee, is found only in Mexico), and the beans take nine months after pollination to be ripe for picking. It then takes many more months of preparing and drying the vanilla beans in the sun for their aromatic appeal to be just right, meaning the process from pollination to shipment takes about one year.
Costa Rica: Pineapples
Although pineapples are native to South America, Costa Rica leads the world in pineapple production and exports. The small Central American country leans heavily on the crop, which generates an estimated $1 billion USD a year for its economy. While the harvests are bountiful for the country (as well as for Brazil and the Philippines), the plants require a significant amount of time and effort to produce fruit — one plant typically yields only one or two pineapples every 18 to 24 months. In an effort to speed up growth, some producers have used artificial fertilizers, though not without criticism and concern over their toxicity in the famously environmentally forward country. In response, the Costa Rican pineapple industry is working toward implementing regulations to ensure more sustainable practices.
Mexico: Avocados

Avocados have become so ubiquitous in food culture that their consumption was once cited as a reason for Millennials not being able to buy homes. But before they became a so-called luxury grocery item for hip young people, avocados were a long-running staple of the Mexican diet, and to this day, Mexico is the leading avocado producer and exporter in the world. Avocados weren’t always so popular outside of their native land, though — it wasn’t until a PR campaign and Super Bowl commercial in the early 1990s that guacamole became a game-day staple. Today, a staggering 87% of the U.S. supply comes from Mexico, where the avocado industry provides 40,000 jobs and 70,000 seasonal jobs during harvest.
Peru: Quinoa
In the mid-to-late 2000s, quinoa enjoyed a huge popularity surge in Europe and the U.S., where it was touted for its health benefits. Since 2015, Peru — the native region of the Andean plant — has emerged as the largest quinoa producer and exporter in the world. The “superfood” is a grain crop whose edible seeds are high in protein, amino acids, fiber, iron, and antioxidants. The ancient grain is so revered that it even received a special honor from the United Nations General Assembly, which named 2013 the International Year of Quinoa. The popularity and production boom has been financially beneficial to Peruvian farmers, who previously grew quinoa primarily for their own families’ use.
Canada: Lentils

When you think of Canada, wheat, canola, or maple syrup might be some of the top agricultural exports that come to mind. (And rightfully so — our neighbors to the north produce 85% of the world’s maple syrup.) But it might be more surprising to learn that Canada is the world’s top lentil producer as well. The western prairie province of Saskatchewan grows 65% of the world’s lentils and supplies the vast majority — 95% — of Canada’s own stock. India, even with steep import duties, remains the top international importer of Canadian lentils. Canada’s lentil industry, which started in the 1970s, is relatively young; there are now more than 5,000 lentil farmers in the country.
Nigeria: Yams
Yams have been cultivated in Africa for more than 11,000 years, and the majority of the world’s supply (over 60%) comes from Nigeria. Yams are not only a primary agricultural commodity and staple food for the country; they also carry high cultural capital. In Nigeria, where it is often said that “yam is food and food is yam,” traditional dancing, drumming, and costumes accompany the harvest months of August and September; yams are also present at marriages and play a role in fertility ceremonies. Several other African countries celebrate the starchy root vegetable: Nigeria — together with Ghana, Benin, Ivory Coast, the Central African Republic, Cameroon, and Togo — produces over 94% of the world’s yams.
Brazil: Oranges

You may think that Florida, “The Sunshine State,” could be a contender for one of the “orangest” places in the world. And at one time, it was the world’s top producer. But Brazil actually holds the title now, growing 30% of the world’s supply, with 94% of that production concentrated in São Paulo. The shift started after the 1960s, when a series of frosts devastated Florida crops. The U.S. citrus industry then took repeated hits throughout the 1970s and 1980s as Florida’s growing conditions proved too volatile, paving the way for Brazil to take the top spot in the late 1970s. The orange industry in Brazil has been a boon to the economy, generating more than 200,000 jobs in 300-plus cities and bringing in export revenue of up to $2.5 billion USD annually.
Germany: Cheese
Of all the cheese-loving countries in Europe, France and Switzerland are no match for Germany’s 2.2 million tons produced in 2019. Most of the product stays within the EU; among buyers outside the European market, the U.S. was the top cheese importer, accounting for 17% of exports beyond the bloc. Japan, Switzerland, South Korea, and Saudi Arabia are the other top importers. So, what kind of cheese is Germany making the most of? Fresh cheeses are incredibly popular, from Schichtkäse to mozzarella, but the semi-hard, all-time favorite Gouda is the most-produced cheese in Germany.
Vietnam: Black Pepper

Black peppercorns are one of the most essential spices for any household pantry, and Vietnam is to thank for nearly half of the world’s output. The black pepper fruit, which originated in India, thrives in tropical climates. It grows on a woody vine, and resembles a small berry before it’s picked and dried into its familiar hardened, shriveled shell. Pepper farms, which are easy to spot across the landscape with their tall, leafy green columns, have started to replace the country’s aging coffee farms over the past decade. (Vietnam is, however, still a leading coffee producer.) The country’s pepper cultivation area tripled between 2013 and 2018, and Vietnam is now looking ahead to ensure that as the industry grows, it does so sustainably.
Italy: Artichokes
Italy’s contributions to the agricultural industry on a global scale are now modest, but it is the world’s top producer of this classic (if divisive) pizza topping. Artichokes are native to the Mediterranean region and are among the oldest cultivated vegetables in the world. The layered plant is actually a thistle in the sunflower family; the pointed, edible portion you’re likely familiar with is made up of flower buds before they bloom. (Once bloomed, the violet-blue flower can measure up to seven inches in diameter and becomes coarse and inedible.) Despite Italy’s dominance as an artichoke grower, Spain is actually the world’s leading exporter of the vegetable.
India: Mangoes

It’s probably not a huge surprise that India takes the top spot on the list of the world’s biggest mango producers — the tropical fruit is native to the region and accounts for 50% of the world supply. But what you might not know is that mangoes from India only became available as a U.S. import in the late aughts, when President George W. Bush reportedly allowed the import in exchange for India allowing Harley-Davidson motorcycles into the South Asian country. Still, India is not a top mango provider to the U.S., which instead imports most of its supply from Mexico, Peru, and Brazil. Of the approximately 1,500 varieties of mangoes grown in India, the most popular — at home and abroad — is the Alphonso, also known as the “King of Mangoes.”
We savor food in many forms, from succulent meats and crumbly cheeses to creamy spreads and crispy crusts. Food nourishes, satisfies, and, most importantly, gives us life.
But not everything is gravy when it comes to culinary consumption, as some of what we eat (and drink) comes packed with surprises. A few foods fuel bizarre reactions from the moment we take a bite, while others combine with outside forces to turn our dining experiences sour. Here are seven such food-fueled oddities that can flummox the senses.
Why Does Cilantro Taste Like Soap to Some People?

To be clear, many gourmands enjoy topping their fish, salads, and soups with a smattering of this herb. However, others feel like they’re biting into a bar of Irish Spring. The reason appears to be a matter of genetics. One 2012 study showed that people equipped with certain olfactory receptor genes are more prone to detecting cilantro’s aldehydes, compounds also commonly found in household cleaning agents and perfumes. While the percentage of the population that suffers this fate tops out at about 20%, the resulting taste is apparently awful enough to spark passionate responses of the sort found on Facebook’s I Hate Cilantro page, which has more than 26,000 likes.
Why Does Orange Juice Taste Terrible After I Brush My Teeth?
Most of us have endured this unpleasant situation at least once. The culprit is a toothpaste ingredient called sodium lauryl sulfate (SLS), which produces the foam that builds during vigorous brushing. Unfortunately, SLS also temporarily blocks the tongue’s sweet receptors, while simultaneously destroying the compounds in saliva that suppress our bitter receptors. The result is a double-whammy for our sensitive taste buds, which leaves us to taste only the unsavory citric acid from what would otherwise be a refreshing drink.
Why Does Spinach Make My Mouth and Teeth Feel Strange?
While experts ranging from celebrity chef Jamie Oliver to Popeye the Sailor Man have praised the nutritional benefits of spinach, few warn about the “chalky” feel that can come with munching on these leafy greens. The effect, known as “spinach tooth,” comes from the oxalic acid and calcium present in the vegetable; ground together in our mouths, they produce easily detectable crystals of calcium oxalate. These crystals are potentially problematic for some people, as they dissolve poorly in water and may contribute to the formation of kidney stones. The rest of us can simply boil, steam, or apply lemon juice to spinach to offset the unpleasant mouthfeel that accompanies our daily supplies of iron, fiber, and vitamin C.
Why Does Asparagus Make Urine Smell Strange?

Not to be outdone by its fellow healthy side dish, asparagus comes with the unfortunate side effect of producing strong-smelling urine. This comes from the asparagusic acid present solely in this particular vegetable, which breaks down into sulfur byproducts upon digestion and surfaces in urine as soon as 15 minutes after eating. Not everyone is genetically capable of detecting this odor. One study published in 2016 found that roughly 60% of participants reported nothing funky in the bathroom after ingesting asparagus. Regardless, for the people who do experience the aroma, it’s perfectly natural.
Why Do Salty Snacks Make My Fingers Swell?

Even the most disciplined among us occasionally give in to the temptation to down a bag of salty snacks, for which we may be punished with noticeably swollen fingers, toes, or lips. Officially known as edema, this puffiness stems from the uptick in sodium and our body’s response of pumping more water into the bloodstream, which results in fluid-bloated tissue. Edema can also be a sign of more serious health problems, but those who simply enjoy a few too many fries during a weekend lunch with friends can beat back the swelling by drinking lots of water, eating high-potassium foods, and sweating it out in the gym.
Why Do Pine Nuts Leave a Metallic Aftertaste?

Sometimes sprinkled on salads and almost certainly found in pesto-flavored dishes, pine nuts have drawn attention in recent years for producing a metallic aftertaste that can linger for up to two weeks. After reports of “pine nut syndrome” or “pine mouth” first surfaced in Belgium early in the new millennium, investigators followed the trail to the Far East, with seeds of the Chinese white pine (Pinus armandii) fingered as the likely source of this unusual but harmless affliction. It’s still unclear what exactly causes the metallic taste, although one professor at the University of Idaho suggested that the seeds stimulate a hormone that increases the production of bitter-tasting bile.
Why Does Citrus Residue Make Skin Blister in the Sun?

This one isn’t the result of consuming a particular food, but an oft-unforeseen outcome of the food’s residue lingering on hands and arms. Citrus fruits such as lemons and limes contain chemicals called furanocoumarins, which can produce poison ivy-like effects of discoloration, inflammation, and blistering when exposed to the sun’s ultraviolet rays. Technically called phytophotodermatitis, the condition is also known as “bartender dermatitis” for the unfortunate souls who experience it after preparing citrus-infused drinks in outdoor locales. And while prevention isn’t as simple as wiping off errant juice with a towel — a more thorough soap-and-water scrubbing is required — the good news is that these rashes are usually treatable with cold compresses and topical creams.
If you know one thing about how women’s clothing tends to differ from menswear, it’s that garments made for women are often sorely lacking in pockets. On dresses? Usually nonexistent. Pants? So small as to be functionally useless. According to one study, the disparity is even more severe than you might expect: On average, the pockets in women’s jeans are 6.5% narrower and 48% shorter than those on men’s jeans. A number of companies now seek to correct this oversight, although many legacy brands have yet to get with the times.
But why is this lack of pockets a problem in the first place? And while we’re at it, what’s up with that tiny pocket on everyone’s jeans, or the V-shaped stitching on many sweatshirts? Here are the answers to a few common questions about the clothes in your closet.
Why Don’t Women’s Clothes Have Pockets?

In the late 1600s, women didn’t have pockets in their clothing at all — they had belts with attached pockets that they usually wore under their skirts and accessed via small slits that were meant to be essentially invisible. These were spacious enough to carry everything from fruit to gloves, and often as stylish as the purses of today. Purses themselves became more fashionable (and functional) as dresses got smaller and less conducive to covert storage. It wasn’t until the late 18th century that pockets were regularly sewn directly into women’s clothing; for a time, most of them were even larger than men’s pockets.
Then the same thing happened to pants and other garments that had happened to dresses: Smaller, more form-fitting variants came into vogue, making it more difficult to accommodate large pockets. The line of thought was that pockets ruined the female silhouette, which brings us to perhaps the crux of this issue: gender inequality.
Women have long entreated the fashion industry to elevate function to the same level as form. The Rational Dress Society was founded in 1881 to push back against corsets and other constricting garments in favor of clothing that was more comfortable and useful, but it wasn’t until World War II that this really happened en masse — and even then it was only because women were performing jobs that had previously been the sole province of men. If you’ve seen A League of Their Own, you already know what happened once the war ended: Things went back to the way they were. Small steps have been made since then, of course, but by and large women are still forced to deal with tiny pockets.
What’s the Purpose of the V-Shaped Stitching on Sweatshirts?

Ever notice how some of your sweatshirts have V-shaped stitching under the collar? Known by some as a V-insert and by others as a Dorito (yum!), this curious little detail seems like it doesn’t do anything, so far as most of us can tell, and some might find it an odd design choice. However, the V-stitch can serve not one, but two purposes (and you thought it was pointless!).
The first has to do with the structural integrity of the sweatshirt. As these garments are worn by placing one’s noggin directly through the collar, they’re prone to stretching. V-inserts originally included elastic ribbing that promoted stretch and prevented the material from losing shape. The second reason has to do with sweat, which has a way of permeating crewnecks and letting the world see how much that last workout raised your heart rate. Ribbed V-stitches absorb some of this perspiration, keeping us looking fresh even when we aren’t feeling that way.
While it’s true that many V-inserts you’ll see today are purely decorative, as they aren’t ribbed, some uphold the traditions of yore and keep our sweatshirts looking like they did the day we bought them. Thanks, V-stitch.
What’s That Tiny Pocket on Jeans For?

Ever notice the tiny pocket-within-a-pocket in your jeans? As a kid you may have put small change in there, whereas most adults tend to forget it even exists. Despite all the names it’s had throughout time — frontier pocket, coin pocket, and ticket pocket being just a few — it originally had a specific purpose that didn’t pertain to any of those objects: It was a place to put your watch.
Originally called waist overalls when Levi Strauss & Co. first began making them in 1873, the company’s jeans have always had this dedicated spot for pocket watches — especially those worn by miners, carpenters, and the like. The jeans only had three other pockets (one on the back and two on the front) at the time, making the watch pocket especially prominent. As for why it’s stuck around, the answer seems to be a familiar one: People were used to it, and no one felt inclined to phase it out.
What About That Loop on the Back of My Button-Down?
If you were to go pick a button-down shirt out of your closet and examine the back of it, you might find something surprising: a small loop of fabric an inch or two below the collar. The origin of locker loops, as they’re known, involves sailors, the Ivy League, and the mid-20th century. Having just heard their name, you can likely guess why they exist: Hanging shirts is a handy, efficient way to store them.
Locker loops are believed to have first appeared on the uniforms of East Coast sailors, whose ships tended to have lockers rather than closets, and their function was twofold: They saved space and prevented wrinkles that might arise from clothes being folded. Locker loops were then incorporated into the button-down shirts made by Gant Shirtmakers, Yale’s official clothing brand at the time, helping develop an aesthetic that would now be described as preppy.
Have you ever wondered why people give bags of Jordan almonds as wedding favors? Or why circus peanuts taste like bananas? Pop an after-dinner mint and settle in as we dive into these and other common candy questions.
Why Is Bubble Gum Pink?

In 1928, an accountant for the Fleer Chewing Gum Company began toying with new recipes. At the time, chewing gum was extremely sticky. But this accountant, a man named Walter Diemer, found a recipe that was less gluey and more stretchy, qualities that allowed him to do something unprecedented — blow bubbles. The color of the gum (the original Dubble Bubble) was supposedly born out of necessity: A diluted red dye was the only food coloring Diemer had available, which thankfully turned the grayish concoction pink.
Why Are Gummy Bears Shaped Like Bears?

In 19th-century Europe, it wasn’t uncommon to see trained bears frolicking down the streets in celebration of a parade or festival. Called “dancing bears,” these animals would skip, hop, whirl, twirl, and perform an array of tricks. Fast-forward to the 1920s, when German candymaker Hans Riegel was searching for a clever way to sell his gelatin-based confections to children. Recalling the two-stepping bears of yore, Riegel decided to make an Ursus-shaped candy called Tanzbär (literally “dancing bear”). The snacks were a huge success. Today, you probably know Riegel’s company as Haribo.
Why Do Restaurants Offer After-Dinner Mints?

Invented in Great Britain, the “curiously strong” Altoid has been freshening mouths since the 1780s. But foul-smelling breath isn’t the reason candied mints became a mainstay at restaurants: Peppermint oil has long been touted as a digestive aid. In the early 20th century, sprigs of mint were offered to diners at the end of meals; eventually, restaurants began offering buttermints, scotch mints, polo mints, and After Eights with the bill. (The creators of Altoids, however, were ahead of the pack. They had been marketing the mints as a “stomach calmative to relieve intestinal discomfort” for decades.)
Why Are Jordan Almonds Given as Wedding Favors?

Jordan almonds are a type of dragée, a French confection made by coating a treat in a hard decorative shell. Their name has nothing to do with the Middle Eastern country. Rather, according to the Oxford English Dictionary, the word is a descendant of the French and Spanish words for garden: jardin and jardín. (Centuries ago, a “jardyne almaunde” referred to a specific variety of almond grown in the yard.) Eventually, a sweetened variety became popular at Italian weddings. According to The Knot, “fresh almonds have a bittersweet taste, which represents life; the sugarcoating is added with the hope that the newlyweds’ life will be more sweet than bitter.” Greek wedding guests are often given gift bags with odd numbers of Jordan almonds in them to represent indivisibility, while Italian guests receive five almonds representing five wishes: health, wealth, happiness, fertility, and longevity.
Why Do Circus Peanuts Taste Like Bananas?

The circus peanut is like a Zen koan: The more you think about it, the more your brain hurts. After all, it’s an orange peanut-shaped marshmallow with a taste reminiscent of banana. While the peanut’s origins are murky, the Wall Street Journal suggests that the “peanut-shaped marshmallows … were actually supposed to taste like peanuts … but the flavoring wasn’t stable. So they used banana oil instead, which was inexpensive and didn’t degrade.” There’s also a rumor, shared by Bizarre Foods host Andrew Zimmern, that the odd flavoring was the result of a “freak banana-oil accident.”
What’s Odd About the Term “Black Licorice”?

It’s redundant, for one. The extract from the licorice root, which comes from an herbaceous shrub grown in balmy climates, is naturally black. (Other confections with a “licorice” identity — like red licorice, Twizzlers, and other rope candies — don’t contain licorice at all.) Real licorice contains a natural sweetener called glycyrrhizin, which is significantly sweeter than sugar, making it a favorite of candymakers for centuries.
What Is the Origin of the Idea of a “Sweet Tooth”?
Back in the late 14th century, English speakers began using the word “tooth” as a way to say “has a taste for.” (Such as: “Jim has a tooth for steak.”) This is where we get the word “toothsome,” used to describe a pleasant meal. It’s also the origin of “sweet tooth.” The first known written usage came in 1390, when author John Gower included the phrase in his lengthy poem Confessio Amantis: “Delicacie his swete toth Hath fostred.”
The official purpose of Christmas is to celebrate the birth of Jesus Christ, whom the globe’s roughly 2.2 billion Christians worship as the Son of God. The unofficial purpose is to spend quality time and exchange presents with loved ones, preferably near a fireplace. Based on the fact that Christmas (short for “Christ Mass”) takes place on December 25, one might reasonably assume that this is the date of Jesus’ birth. The truth is a little more complicated.
First of all, no one can say with certainty precisely when Jesus was actually born. Many scholars believe he most likely wasn’t born on December 25, and in fact may have been a spring baby. The Bible does not mention a specific day, month, or year for his birth.
The timing of Christmas is further complicated by the discrepancies between the Julian and Gregorian calendars, which is why Christmas isn’t universally celebrated on December 25. January 6 is the preferred date for Armenian Apostolics, while most Oriental and Eastern Orthodox churches observe it on January 7.
Is the Date of Christmas Based on the Winter Solstice?
One of the most widely accepted theories for the date of Christmas involves the winter solstice, which in ancient Rome took place on December 25. According to the fourth-century theologian Augustine of Hippo, Jesus chose to be born on the shortest day of the year: “Hence it is that He was born on the day which is the shortest in our earthly reckoning and from which subsequent days begin to increase in length. He, therefore, who bent low and lifted us up chose the shortest day, yet the one whence light begins to increase.” This interpretation was later supported by Isaac Newton.
The December 25 date may also have been chosen by the Roman Catholic Church in an attempt to co-opt the pagan festival of Saturnalia, which was dedicated to the Roman deity Saturn.
Was the Date of Christmas Calculated From Passover?

The earliest theologians who discussed Jesus’ birthday mentioned that he was likely divinely conceived during Passover, the Jewish holiday on which he was also later crucified. They calculated Passover in the year of Jesus’ death as March 25, and arrived at December 25 (nine months later) as a likely date for his birth. Eastern Christian communities used a different calendar, which calculated the date of Passover as April 6; that’s how we got January 6 as Christmas in some parts of the world.
Another theory centers more specifically around the Annunciation, or the day that the Archangel Gabriel told Mary she would give birth to the Son of God. The Annunciation is observed on March 25 — again, exactly nine months before December 25.
When Did Christmas Start Being Celebrated on December 25?
Whatever the case, Christmas taking place on December 25 is hardly a new phenomenon. Christmas probably started being commemorated around the second century, and the church decreed that it be held on December 25 in 336 CE. However, Christmas did not become a really significant Christian holiday until the ninth century. In the end, some theologians argue that the precise date of the celebration doesn’t matter a great deal, as long as the spirit of the day is preserved.
We look at the sky almost every day, but if we stare at it for a little longer than usual, it can start raising a lot of questions. While we generally know the answers to a lot of the immediate queries (Where does the sun rise and set? What is the brightest star in the sky?), many phenomena remain a mystery. Our atmosphere is a big science fair, and once you start digging in, you’ll find demonstrations of color waves, states of matter, and the speed of light. Here’s an overview of some of the most pressing questions about the sky above us.
Why Is the Sky Blue?

To answer this question, think about the atmosphere as a prism. In a prism, white light refracts through its polished surfaces and separates into the colors of the rainbow. The sun produces white light, so when its light travels through the atmosphere, it separates into a rainbow of colors.
But then why do we mostly see blue? Each color corresponds to an electromagnetic wave. While red has the longest wavelength (and the lowest frequency), blue and violet move in quick, short waves. As these colors pass through the atmosphere, they set the charged particles in air molecules like oxygen and nitrogen oscillating. Blue and violet light are scattered in all directions at around 10 times the efficiency of red light, so they get the widest coverage in our sky. Our eyes are more sensitive to blue than violet, which is why we see the sky as blue.
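That “10 times” figure can be sanity-checked with some quick math. The strength of Rayleigh scattering, the mechanism at work here, varies with the inverse fourth power of wavelength. Assuming representative wavelengths of roughly 700 nanometers for red light and 400 nanometers for violet (illustrative round numbers, not figures from this article):

$$I_{\text{scattered}} \propto \frac{1}{\lambda^{4}}, \qquad \left(\frac{700\ \text{nm}}{400\ \text{nm}}\right)^{4} \approx 9.4$$

So the shortest visible wavelengths scatter roughly 10 times as strongly as the longest ones, which is where that efficiency gap comes from.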
By contrast, on the moon, the sun just looks like a glowing disc traveling through a dark sky. This is because the moon doesn’t have an atmosphere, so there’s nothing to scatter the sun’s light and reveal the individual colored wavelengths.
Why Are Sunsets and Sunrises So Colorful?

Sunsets and sunrises are colorful for a similar reason that the sky is blue: It’s about how light scatters in our atmosphere. During the day, the sun sits high overhead and its light takes a short path through the air, so we see blue in all directions. But as the sun rises and sets, there’s more atmosphere for the light to travel through. Over that longer journey, most of the blue is scattered away before it reaches us, giving the yellow, orange, and red waves, which scatter far less readily, a chance to shine. This also explains the golden, or magic, hour, when the sun covers everything in a soft, diffused glow shortly after rising and before setting.
Why Does Outer Space Look Black?

Scientists still don’t know for sure why outer space appears black, but there are a few ideas. In scientific circles, many astronomers wrestle with Olbers’ paradox: If the universe is endless and full of infinite stars, why are we not bathed in the glow from this blanket of stars on Earth? Some theorize that light from the most distant stars simply hasn’t had time to reach Earth in a way that’s visible to our eyes, because the universe is expanding faster than the speed of light.
There’s also a lot of light in space that we can’t see. Think back to the blue sky and color wavelengths: There are plenty of light waves above or below the threshold of what our eyes can detect. Long, low-frequency radio waves and short, high-frequency gamma rays aren’t visible to the naked eye. Stars give off all kinds of invisible light, including infrared, ultraviolet, and other wavelengths we can’t see with our eyes alone.
How Do Clouds Form?

First, a quick recap from science class: Water can exist as a solid, liquid, or gas. When ice melts, it turns into a liquid; when water freezes, it turns into a solid; when water evaporates, it turns into a gas that’s held in the air. Water vapor turns back into liquid through condensation, and it eventually returns to the ground as precipitation, like rain or snow.
Clouds form when the air saturates, meaning it’s holding more moisture than it can keep as invisible vapor. When this happens, condensation can occur. It’s the same phenomenon that causes the outside of a cold cup of water to become wet on a hot day, but instead of clinging to glass, that moisture binds to tiny particles in the air like dust, ash, and salt. As a result, the moisture becomes visible as clouds or fog.
The air’s capacity for water vapor shrinks as the temperature falls, so clouds can also form when air suddenly cools. Rain, snow, and hail happen when clouds become too heavy and all that moisture gives in to gravity.
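To put a rough number on how strongly temperature controls that capacity, meteorologists often use the Magnus approximation for saturation vapor pressure (a standard empirical fit, not a formula cited in this article), with temperature $T$ in degrees Celsius:

$$e_{s}(T) \approx 6.112\,\exp\!\left(\frac{17.62\,T}{T + 243.12}\right)\ \text{hPa}$$

By this estimate, the amount of water vapor air can hold roughly doubles for every 10 °C of warming, which is why a sudden cooldown can push humid air past saturation and into cloud.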
Sarah Anne Lloyd
Writer
Sarah Anne Lloyd is a freelance writer whose work covers a bit of everything, including politics, design, the environment, and yoga. Her work has appeared in Curbed, the Seattle Times, the Stranger, the Verge, and others.
Americans have been celebrating Halloween for just as long as Thanksgiving, and longer than Independence Day. But while the costume-friendly, sugar-filled holiday feels timeless, the version celebrated today — in the U.S. and around the globe — differs drastically from its Celtic origins. Once a night for honoring the dead, Halloween slowly transformed into a night of revelry and mischief with a supernatural twist. The biggest change? The focus on candy and treats, which American trick-or-treaters have made inseparable from the holiday. Americans are expected to spend $10.6 billion on costumes, candy, and other Halloween items in 2022, up from the record-high $10.14 billion spent in 2021. Whether your Halloween plans include a spooky movie marathon or hosting your own monster mash, you can prep for Halloween trivia with these commonly asked questions.
What Is Samhain?

It’s not exactly clear what ancient Celts did during Samhain, the pagan holiday we now link with Halloween, but historians have some idea thanks to a surviving bronze calendar. The first written mentions of Samhain appeared in Europe around the first century, marking winter’s swift approach and the start of the Celtic new year. Celebrated on October 31, Samhain was a time when the wall between the spirit plane and the living world was thought to be at its weakest, allowing spirits to cross the boundary with ease. In an effort to curb vandalism and mishaps from angsty ghosts, the Celts hosted welcoming bonfires and left food offerings; eventually, the practice transitioned to dressing as ghouls themselves and traveling door to door in search of refreshments and merriment. Modern Halloween has held tight to many Celtic traditions, like fortune-telling and bobbing for apples, but Roman Christian attempts to squash pagan ceremonies starting around 600 CE began the slow transition from religious festival to the spooky secular event we know today.
How Did Halloween Come to America?

Colonists in early America brought some Halloween traditions with them (telling ghost stories, pulling pranks, and sharing harvest meals), but strict social and religious rules in Puritan communities scaled back the death-centric influence of early celebrations. Halloween would gain back some of its edge around the mid-1800s, when a large influx of Irish immigrants began sharing their holiday traditions passed down from Celtic ancestors, such as carving pumpkins and donning costumes. The Halloween we’re familiar with today slowly spread across the U.S., and by the 1920s, trick-or-treaters across the country were looking forward to their one night of socially acceptable mischief and candy collecting.
What’s the Difference Between Samhain, All Hallows’ Eve, and Halloween?

Samhain, Halloween, All Hallows’ Eve — a cluster of names surrounding October 31 can make it seem like the fall celebrations are all the same despite having different roots. Samhain, which is still celebrated by pagans worldwide, remains its own holiday, one that spun off the Halloween traditions we celebrate today. All Hallows’ Eve, however, was created in an attempt to replace Samhain as Christianity spread through Europe. Pope Gregory I crafted a calendar of holy days that coincided with non-Christian holidays around the early 600s CE, co-opting the celebrations in an effort to convert new followers. All Saints’ Day was set for November 1 with the intention of honoring Christian martyrs and saints around the same time Samhain was memorializing deceased loved ones. The holiday, which also went by the name All Hallows’ Day, picked up in popularity; the night prior (October 31) was referred to as All Hallows’ Eve. The name morphed into Hallowe’en, with the apostrophe eventually being dropped altogether.
Why Are Orange and Black Halloween Colors?

Halloween decorations primarily come in orange and black, and while there’s no definitive answer to when this color palette took root, both hues are fitting for the crisp, autumnal holiday. Orange is thought to signify fall, reflecting the colors of changing leaves and the season’s most abundant crops — think pumpkins, wheat, and carrots, which dominate gardens and farms this time of year. If you’ve ever felt called to decorate with seasonal squash, know the vibrant orange hue is practically contagious; despite being inedible, brightly colored gourds have sent Americans into autumnal decorating frenzies since the 1930s.
The use of black has a clearer connection to Halloween, thanks once again to the Celts. Because Samhain was a religious festival honoring the deceased, it wasn’t unusual for mourners and celebrants to don dark clothing or veils during festivities. Black also represented the shift to longer nights and shorter days that follows the autumnal equinox. With the blazing days of summer long gone and bountiful harvests with them, black became a visual symbol of death, darkness, and rest.
How Did Candy Become a Halloween Staple?

Surprisingly, candy wasn’t always the main focus of Halloween. The Celts were known to carry treats in their pockets or bags during Samhain as a form of protection against unfriendly spirits; danger could be staved off with the bribe of a snack should a traveler encounter a particularly ill-behaved ghoul. While the rise of Christianity throughout Europe snuffed out many pagan practices associated with Samhain, the idea of exchanging food and treats remained. Following the creation of All Saints’ Day, British and Irish bakers would give away small, spiced “soul cakes” to revelers who meandered from house to house. Door knockers would promise prayers for the homeowner’s deceased family members in exchange for the raisin-topped treats.
In America, the early days of trick-or-treating in the 19th century didn’t exactly yield pillowcases full of candy either; costumed children roaming from door to door begged for money or food instead of sweets, while older kids and teens went about the business of performing pranks. It’s likely that public sentiment about vandalism is what helped candy gain more importance than Halloween hijinks. Trickery was a common part of Halloween festivities through the late 1800s, with rowdy revelers performing relatively benign pranks such as soaping windows and tipping outhouses. But by the turn of the 20th century, holiday mischief was seen less as a rite of passage for youngsters and more as vandalism and cruelty. As families moved from small communities to large cities, pranks escalated to include more costly property damage and were no longer tailored to specific victims, but to unsuspecting passersby.
Cities began hosting parties, parades, and other events to curb Halloween destruction and create a more positive holiday atmosphere. Despite those efforts, it was World War II that drastically changed the holiday’s course; pranks were characterized as a wasteful use of limited resources and a disturbance to factory workers who didn’t have time or energy for tricks. After several years of dampened festivities, communities retooled Halloween, promoting the idea of costumed trick-or-treating as an enjoyable, safe activity. With a booming generation of post-war kids who could easily demand treats from their new subdivision neighbors, the concept took off, cementing itself today as the main way to celebrate the spookiest day of the year.
Nicole Garner Meeker
Writer
Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.