Interesting Facts

While the things we see and use daily may sometimes be considered mundane, there’s more to them than you might imagine. Did you know that the stick (like the ones you may have just raked up in your yard) was inducted into the National Toy Hall of Fame? When you ate a bagel for breakfast, did you think back to its days as a gift for new mothers? Learn about these stories and more with some mind-expanding facts about everyday items collected from around our website.

Original photo by Joe Hendrickson/ iStock

A pink love seat couch with candles on the sides.
Credit: JasminkaM/ Shutterstock

Love Seats Were Originally Designed To Fit Women’s Dresses, Not Couples

The two-seater upholstered benches we associate with cozy couples were initially crafted with another duo in mind: a woman and her dress. Fashionable attire in 18th-century Europe had reached voluminous proportions — panniers (a type of hooped undergarment) were all the rage, creating a wide-hipped silhouette that occasionally required wearers to pass through doors sideways. Upper-class women with funds to spare adopted billowing skirts that often caused an exhausting situation: the inability to sit down comfortably (or at all). Furniture makers of the period caught on to the need for upsized seats that would allow women with large gowns a moment of respite during social calls.

As the 1800s rolled around, so did new dress trends. Women began shedding heavy layers of hoops and skirts for a slimmed-down silhouette that suddenly made small settees spacious. The midsize seats could now fit a conversation companion. When sweethearts began sitting side by side, the bench seats were renamed “love seats,” indicative of how courting couples could sit together for a (relatively) private conversation in public. The seat’s new use rocketed it to popularity, with some featuring frames that physically divided young paramours. While the small sofas no longer act as upholstered chaperones, love seats are still popular — but mostly because they fit well in small homes and apartments.

Bagels being put into baskets.
Credit: Jordan González/ Unsplash+

Bagels Were Once Given as Gifts to Women After Childbirth

After a woman has had a bun in the oven for nine months, presenting her with a bagel might seem like a strange choice. But some of the earliest writings on bagels relate to the idea of giving them as gifts to women after labor. Many historians believe that bagels were invented in the Jewish community of Krakow, Poland, during the early 17th century. Their circular shape echoes the round challah bread eaten on the Jewish new year, Rosh Hashanah. Enjoying round challah is meant to bring good luck, expressing the hope that endless blessings — goodness without end — will arrive in the coming year. Likewise, in Krakow centuries ago, a bagel signified the circle of life and longevity for the child.

Community records in Krakow advised that bagels could be bestowed on both expectant and new moms. They were also regarded as a thoughtful gift for midwives. In addition to the symbolism of the round shape, the bread was believed to bring a pregnant woman or midwife good fortune in a delivery by casting aside evil spirits. Some pregnant women even wore bagels on necklaces as protection, or ensured bagels were present in the room where they gave birth.

Tiny front pocket on denim pants, close up.
Credit: triocean/ Shutterstock

The Tiny Pocket in Your Jeans Was Created To Store Pocket Watches

Ever notice the tiny pocket-within-a-pocket in your jeans? As a kid you may have put small change in there, whereas most adults tend to forget it even exists. Despite all the names it’s had throughout time — “frontier pocket,” “coin pocket,” and “ticket pocket” being just a few — it originally had a specific purpose that didn’t pertain to any of those objects: It was a place to put your watch.

Originally called waist overalls when Levi Strauss & Co. first began making them in 1879, the company’s jeans have always had this dedicated spot for pocket watches — especially those worn by miners, carpenters, and the like. They only had three other pockets (one on the back and two on the front) at the time, making the watch pocket especially prominent. As for why it’s stuck around, the answer seems to be a familiar one: People were used to it and no one felt inclined to phase it out.

Close-up of three pens with caps on resting on a lined paper sheet.
Credit: Marta Nogueira/ Shutterstock

Pen Caps Have Holes for Safety Reasons

If you’ve ever gotten bored enough to study the cap of your ballpoint pen, you may have noticed that it has a hole in it. The hole wasn’t created to save on plastic or to regulate air pressure. Rather, the design is meant to prevent people — namely small children — from choking should they ever swallow a cap. This was first done by BIC, whose popular Cristal pen had a cap that proved more desirable among undiscerning children than safety-conscious parents would have liked. So while the conspiracy-minded among us tend to think that the holes are there to dry out the ink and ensure that consumers will have to continue buying pens in mass quantities, this particular design choice was actually made with public health in mind.

A card going into a vending machine.
Credit: Getty Images/ Unsplash+

The World’s First Vending Machine Dispensed Holy Water

Democracy, theater, olive oil, and other bedrocks of Western civilization all got their start with the Greeks. Even some things that might seem like squarely modern inventions have Hellenistic roots, including the humble vending machine. In the first century CE, Greek engineer and mathematician Heron of Alexandria published a two-volume treatise on mechanics called Pneumatica. Within its pages was an assortment of mechanical devices capable of all types of wonders: a never-ending wine cup, rudimentary automatic doors, singing mechanical birds, various automata, the world’s first steam engine, and a coin-operated vending machine.

Heron’s invention wasn’t made with Funyuns and Coca-Cola in mind, however: It dispensed holy water. In Heron’s time, Alexandria was part of Roman Egypt and home to a cornucopia of religions with Roman, Greek, and Egyptian influences. To stand out, many temples hired Heron to supply mechanical miracles meant to encourage faith in believers. Some of these temples also offered holy water, and experts believe Heron’s vending machine was invented to keep worshippers from taking more than their share of it. The mechanism was simple enough: When a coin was inserted in the machine, it weighed down a balancing arm, which in turn pulled a string that opened a plug on a container of liquid. Once the coin dropped off the arm, the liquid stopped flowing. It would be another 1,800 years before modern vending machines began to take shape — many of them using the same principles as Heron’s miraculous holy water dispenser.

Hands spreading butter on toasted bread.
Credit: Oksana Mizina/ Shutterstock

The Ancient Romans Thought Eating Butter Was Barbaric

Our friends in ancient Rome indulged in a lot of activities that we would find unseemly today — like gladiators fighting to the death — but they drew the line at eating butter. To do so was considered barbaric, with Pliny the Elder going so far as to call butter “the choicest food among barbarian tribes.” In addition to a general disdain for drinking too much milk, Romans took issue with butter specifically because they used it for treating burns and thus thought of it as a medicinal salve, not a food.

They weren’t alone in their contempt. The Greeks also considered the dairy product uncivilized, and “butter eater” was among the most cutting insults of the day. In both cases, this can be partly explained by climate — butter didn’t keep as well in warm southern climates as it did in northern Europe, where groups such as the Celts gloried in their butter. Instead, the Greeks and Romans relied on olive oil, which served a similar purpose. To be fair, though, Romans considered anyone who lived beyond the empire’s borders (read: most of the world) to be barbarians, so butter eaters were in good company.

Aerial view of honeycombs on a white surface.
Credit: Olivie Strauss/ Unsplash+

Honey Never Expires

As long as it’s stored properly, honey will never expire. Honey has an endless shelf life, as proven by the archaeologists who unsealed King Tut’s tomb in 1923 and found containers of honey within it. After performing a not-so-scientific taste test, researchers reported the 3,000-year-old honey still tasted sweet.

Honey’s preservative properties have a lot to do with how little water it contains. Some 80% of honey is made up of sugar, with only 18% being water. Having so little moisture makes it difficult for bacteria and microorganisms to survive. Honey is also so thick, little oxygen can penetrate — another barrier to bacteria’s growth. Plus, the substance is extremely acidic, thanks to a special enzyme in bee stomachs called glucose oxidase. When mixed with nectar to make honey, the enzyme produces gluconic acid and hydrogen peroxide, byproducts that lower the sweetener’s pH level and kill off bacteria.

Despite these built-in natural preservatives, it is possible for honey to spoil if it’s improperly stored. In a sealed container, honey is safe from humidity, but when left open it can absorb moisture that makes it possible for bacteria to survive. In most cases, honey can be safely stored for years on end, though the USDA suggests consuming it within 12 months for the best flavor.

Plain cooked spaghetti pasta on a fork.
Credit: Oko Laa/ Shutterstock

The Name for a Single Spaghetti Noodle Is “Spaghetto”

If you go into an Italian restaurant and order spaghetto, chances are you’ll leave hungry. That’s because “spaghetto” refers to just a lone pasta strand; it’s the singular form of the plural “spaghetti.” Other beloved Italian foods share the same grammatical distinction: one cannoli is actually a “cannolo,” a single stuffed pasta is a “raviolo,” and one sandwich is a “panino.” Though this may seem strange given that these plural terms are so ingrained in the English lexicon, Italian grammar holds that a word ending in -i is plural, whereas an -o or -a suffix (depending on whether the term is masculine or feminine) denotes the singular. (Similarly, “paparazzo” is the singular form of the plural “paparazzi.”) As for the term for the beloved pasta dish itself, “spaghetti” was inspired by the Italian word spago, which means “twine” or “string.”

A living room with a brown leather couch.
Credit: Clay Banks/ Unsplash+

Couches and Sofas Aren’t the Same Thing

Though usually used interchangeably, these are technically two different pieces of furniture — and the distinction lies in the words themselves. “Couch” comes to us from French, namely coucher — “to lie down” — whereas we have the Arabic word suffah to thank for “sofa.” In the most traditional sense, a sofa would be a wooden bench that comes complete with blankets and cushions and is intended for sitting. eBay’s selling guide used to distinguish between the two by defining a couch as “a piece of furniture with no arms used for lying.” Though it may be a distinction without a difference these days, purists tend to think of sofas as a bit more formal and couches as something you’d take a nap on and let your pets hang out on.

Steel railings of a community pool.
Credit: Enis Yavuz/ Unsplash

U.S. Pools Were Originally Designed to Keep the Masses Clean

Boston’s Cabot Street Bath was the nation’s first indoor municipal pool. Founded in 1868, the pool was on the bleeding edge of what would become a boom in baths designed to help the working classes clean up. The short-lived facility (it was open for only eight years) was soon followed by municipal baths and pools all over the nation, especially in cities with growing immigrant populations whose tenement apartments didn’t contain adequate bathing facilities.

In New York, starting in 1870, river water filled floating, pool-like public baths that, according to one onlooker, were as filthy as “floating sewers.” Eventually, by about the mid-20th century, the city’s river baths morphed into the indoor pools we know today — though the city does still have some seasonal outdoor pools.

Close-up of golf balls in a hole.
Credit: Kamran Abdullayev/ Unsplash+

There Are Golf Balls on the Moon

On February 6, 1971, Alan Shepard took one small shot for golf and one giant swing for golfkind. An astronaut on the Apollo 14 landing, Shepard was also a golf enthusiast who decided to bring his hobby all the way to the moon — along with a makeshift club fashioned partly from a sample-collection device. He took two shots, claiming that the second went “miles and miles.” The United States Golf Association (USGA) later put the actual distance of his two strokes at about 24 yards and 40 yards, respectively.

While not enough to land him a spot on the PGA Tour, those numbers are fairly impressive when you remember that the stiff spacesuit Shepard was wearing (in low gravity, no less) forced him to swing with one arm. And while those two golf balls remain on the moon, Shepard brought his club back, later donating it to the USGA Museum in Liberty Corner, New Jersey. Other objects now residing on the moon include photographs, a small gold olive branch, and a plaque that reads: “Here men from the planet Earth first set foot upon the Moon July 1969, A.D. We came in peace for all mankind.”

Close-up of a traffic stop sign.
Credit: Will Porada/ Unsplash

The Inventor of the Stop Sign Never Learned How To Drive

Few people have had a larger or more positive impact on the way we drive than William Phelps Eno, sometimes called the “father of traffic safety.” The New York City-born Eno — who invented the stop sign around the dawn of the 20th century — once traced the inspiration for his career to a horse-drawn-carriage traffic jam he experienced as a child in Manhattan in 1867: “There were only about a dozen horses and carriages involved, and all that was needed was a little order to keep the traffic moving,” he later wrote. “Yet nobody knew exactly what to do; neither the drivers nor the police knew anything about the control of traffic.”

After his father’s death in 1898 left him with a multimillion-dollar inheritance, Eno devoted himself to creating a field that didn’t otherwise exist: traffic management. He developed the first traffic plans for New York, Paris, and London. In 1921, he founded the Washington, D.C.-based Eno Center for Transportation, a research foundation on multimodal transportation issues that still exists. One thing Eno didn’t do, however, is learn how to drive. Perhaps because he had such extensive knowledge of them, Eno distrusted automobiles and preferred riding horses. He died in Connecticut at the age of 86 in 1945, having never driven a car.

Tiny sticks on the ground.
Credit: blair yang/ Unsplash

The Stick Has Been Inducted Into the National Toy Hall of Fame

From teddy bears to train sets, classic playthings of youth often conjure memories of a gleaming toy store, holidays, or birthdays. So curators at the Strong National Museum of Play branched out when they added the stick to their collection of all-time beloved toys. Among the most versatile amusements, sticks have inspired central equipment in several sports, including baseball, hockey, lacrosse, fencing, cricket, fishing, and pool. Humble twigs are also ready-made for fetch, slingshots, toasting marshmallows, and boundless make-believe.

Located in Rochester, New York — about 70 miles northeast of Fisher-Price’s headquarters — the Strong acquired the fledgling National Toy Hall of Fame in 2002. (It was previously located in the Gilbert House Children’s Museum in Salem, Oregon.) To date, 74 toys have been inducted, including Crayola Crayons, Duncan Yo-Yos, and bicycles. The stick was added in 2008, three years after another quintessential source of cheap childhood delight: the cardboard box.

Boxes of Eggo Waffles sit for sale.
Credit: Andrew Burton/ Getty Images News via Getty Images

Eggo Waffles Were Originally Called Froffles

The brothers behind your favorite frozen waffles took a while to iron out the details of their signature product. Working in their parents’ basement in San Jose, California, in the early 1930s, Frank, Anthony, and Sam Dorsa first whipped up their own brand of mayonnaise. Since the base ingredient of mayonnaise is egg yolks — and the brothers took pride in using “100% fresh ranch eggs” — they christened their fledgling company “Eggo.” Despite launching the business during the Great Depression, Eggo mayonnaise sold like hotcakes, motivating the Dorsas to extend their product line. Soon, they were selling waffle batter — another egg-based product. To simplify shipping, they also whipped up a powdered mix that required only the addition of milk.

When the frozen food industry took off in the 1950s, the brothers wanted to take advantage of the rush to the freezer aisle. Frank Dorsa (a trained machinist) repurposed a carousel engine into a rotating device that could anchor a series of waffle irons, each cooking a breakfast treat that was flipped by a factory employee. The machine allowed Eggo to prepare thousands of freezer-bound waffles per hour. These debuted in grocery stores in 1953 under the name “Froffles,” a portmanteau of “frozen” and “waffles.” Customers referred to them simply as “Eggos,” and the Froffles moniker was dropped within two years. Now a Kellogg’s-owned brand, Eggo serves up waffles as well as other frozen breakfast treats, with mayonnaise — and the name Froffles — but a distant memory.

A can opener to open metallic can on the table in the kitchen.
Credit: FotoDuets/ Shutterstock

Canned Food Was Invented Before the Can Opener

On January 5, 1858, Ezra J. Warner of Connecticut invented the can opener. The device was a long time coming: Frenchman Nicolas Appert had developed the canning process in the early 1800s in response to a 12,000-franc prize the French government offered to anyone who could come up with a practical method of preserving food for Napoleon’s army. Appert devised a process for sterilizing food by half-cooking it, storing it in glass bottles, and immersing the bottles in boiling water, and he claimed the award in 1810. Later the same year, Englishman Peter Durand received the first patent for preserving food in actual tin cans — which is to say, canned food predates the can opener by nearly half a century.

Though he didn’t initially know why his method of storing food in glass jars and heating them worked, years of experimentation led Appert to rightly conclude that “the absolute deprivation from contact with the exterior air” and “application of the heat in the water-bath” were key. He later switched to working with cans himself. Before Warner’s invention, cans were opened with a hammer and chisel — a far more time-consuming approach than the gadgets we’re used to. Warner’s tool (employed by soldiers during the Civil War) wasn’t a perfect replacement, however: It used a series of blades to puncture and then saw off the top of a can, leaving a dangerously jagged edge. As for the hand-crank can opener most commonly used today, that wasn’t invented until 1925.

Balled up paper and a pencil eraser marks.
Credit: Leigh Prather/ Shutterstock

Before Erasers, People Used Bread To Rub Out Pencil Marks

The very first pencils arrived around the dawn of the 17th century, after graphite (the real name for the mineral that forms a pencil’s “lead”) was discovered in England’s Lake District. But the eraser didn’t show up until the 1770s, at the tail end of the Enlightenment. So what filled the roughly 170-year-long gap? Look no further than the bread on your table. Back in the day, artists, scientists, government officials, and anyone else prone to making mistakes would wad up a small piece of bread and moisten it ever so slightly. The resulting ball of dough erased pencil marks on paper almost as well as those pink creations found on the end of No. 2 pencils today.

But in 1770, English chemist Joseph Priestley (best known for discovering oxygen) wrote about “a substance excellently adapted to the purpose of wiping from paper the marks of a black lead pencil.” This substance, then known as caoutchouc, was so perfect for “rubbing” out pencil marks that it soon became known simply as “rubber.” Even today, people in the U.K. still refer to erasers as “rubbers.” (The name “lead-eater” never quite caught on.)

View of an Apple iPhone.
Credit: Daniel Frank/ Unsplash+

The First Smartphone Debuted in 1992

On January 9, 2007, Apple CEO Steve Jobs revealed the iPhone to the world. Since then, Apple’s pricey slab of glass stuffed with technology has become more or less synonymous with the word “smartphone” (sorry, Android fans). But smartphones predate the iPhone by more than a decade. To pinpoint the smartphone’s true birthdate, look back to November 23, 1992, and the introduction of IBM’s Simon at a trade show in Las Vegas. Today, IBM is best known for supercomputers, IT solutions, and enterprise software, but in the ’80s and early ’90s the company was a leader in consumer electronics — a position it hoped to solidify with Simon.

Simon was a smartphone in every sense of the word. It was completely wireless and had a digital assistant, touchscreen, built-in programs (calculator, to-do list, calendar, sketch pad, and more), and third-party apps, something even the original iPhone didn’t have. The idea was so ahead of its time, there wasn’t even a word for it yet — “smartphone” wasn’t coined for another three years. Instead, its full name when it debuted to the larger public in 1993 was the Simon Personal Communicator, or IBM Simon for short. But there’s a reason there isn’t a Simon in everyone’s pocket today. For one thing, the phone had only one hour of battery life. Once it died, it was just a $900 brick (technology had a long way to go before smartphones became pocket-sized; Simon was 8 inches long by 2.5 inches wide). Cell networks were still in their infancy, so reception was spotty at best, which is why the Simon came with a port for plugging into standard phone jacks. In the mid-aughts, increases in carrier capacity and the shrinking of electronic components created the perfect conditions for the smartphones of today. Unfortunately for Simon, it was too late.

Three new replacement windows with green trim on front of house.
Credit: Noel V. Baebler/ Shutterstock

Britain Used To Have a Special Tax on Windows

Governments worldwide have levied taxes for thousands of years; the oldest recorded tax comes from Egypt around 3000 BCE. But England — which relied heavily on taxes to fund its military conquests — is known for a slate of fees that modern taxpayers might consider unusual. Take, for instance, the so-called “window tax,” initially levied in 1696 by King William III, which charged citizens an annual amount based on the number of windows in their homes. Some 30 years before, the British crown had attempted to tax personal property based on chimneys, but clever homeowners could avoid the bill by temporarily bricking up or dismantling their hearths and chimneys before inspections. With windows, assessors could quickly determine a building’s value from the street. The tax was progressive, charging nothing for homes with few or no windows and increasing the bill for dwellings that had more than 10 (that number would eventually shrink to seven).

Not surprisingly, homeowners and landlords throughout the U.K. resented the tax. It didn’t take long for windows to be entirely bricked or painted over (much like fireplaces had been), and new homes were built with fewer windows altogether. Opponents called it a tax on “light and air” that hurt public health, citing reduced ventilation that in turn encouraged disease. Even famed author Charles Dickens joined the fight to dismantle the tax, publishing scathing pieces aimed at Parliament on behalf of the poor citizens who were most affected by the lack of fresh air. Britain repealed its window tax in July 1851, but the architectural impact is still evident — many older homes and buildings throughout the U.K. retain their telltale bricked-up windows.

Philadelphia cream cheese packaging.
Credit: NurPhoto via Getty Images

Philadelphia Cream Cheese Isn’t Actually From Philadelphia

The City of Brotherly Love has clear-cut claims on many food origins — cheesesteaks, stromboli, and even root beer. But one thing’s for sure: Despite the name, Philadelphia Cream Cheese is definitely not from Philly. The iconic dairy brand secured its misleading name (and gold-standard status) thanks to a marketing ploy that’s been working for more than 150 years … and it’s all because of Pennsylvania’s reputation for impeccable dairy. Small Pennsylvania dairies of the 18th and early 19th centuries were known for using full-fat milk and cream to make rich cheeses — in contrast to New York dairies, which mostly used skim milk — and because the perishables couldn’t be easily transported, they gained a reputation as expensive luxury foods.

So when upstate New York entrepreneur William Lawrence began making his skim milk and (for richness) lard-based cream cheese in the 1870s, he needed a name that would entice customers and convey quality despite the product being made in Chester, New York, rather than Philadelphia. Working with cheese broker and marketing mastermind Alvah Reynolds, Lawrence branded his cheese under the Philadelphia name in 1880, which boosted sales and kept it popular with home cooks well into the early 1900s.

Plastic tags made for bread bags.
Credit: amonphan comphanyo/ Shutterstock

The Color of Your Bread Tag Has an Important Meaning

Ever wonder why the tags used to seal loaves of bread come in different colors? Far from arbitrary, the color-coded system indicates which day of the week the bread was baked. The color system is even alphabetical: Monday is blue, Tuesday is green, Thursday is red, Friday is white, and Saturday is yellow. (Traditionally, bread wasn’t delivered on Wednesday or Sunday.)
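
To make the mapping concrete, here’s a minimal lookup in Python; the color-to-day assignments are the ones described above, though codes vary by bakery, so treat this as an illustration rather than an industry standard.

```python
# The commonly cited bread-tag color code (conventions vary by bakery; illustrative only).
BAKE_DAY_BY_TAG_COLOR = {
    "blue": "Monday",
    "green": "Tuesday",
    "red": "Thursday",
    "white": "Friday",
    "yellow": "Saturday",
}

def bake_day(tag_color: str) -> str:
    """Return the weekday a loaf was baked, based on its tag color."""
    return BAKE_DAY_BY_TAG_COLOR.get(tag_color.lower(), "unknown color")

print(bake_day("Red"))  # -> Thursday
```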

Because bread rarely remains on the shelf for more than a few days, this system is more for internal use among employees than it is for customers looking to get the freshest sourdough possible. But if you favor a local bakery and get to know their system, you could either snag the best deals or the fluffiest dinner rolls in town.

Snickers chocolate bar packaging .
Credit: NurPhoto via Getty Images

The Snickers Candy Bar Was Named After One of the Mars Family’s Favorite Horses

While names like Hershey’s and 3 Musketeers (which originally included three bars) are fairly straightforward, some candy bar monikers are more elusive. Case in point: What, exactly, is a Snickers? Actually it’s a “who” — and not a human “who” at that. The candy bar was named after one of the Mars family’s favorite horses. Franklin Mars founded Mars, Incorporated (originally known as Mar-O-Bar Co.) in 1911, introducing Snickers in 1930; when it came time to name his product, he immortalized his equine friend as only a candy magnate could.

As Mars has grown into America’s fourth-largest private company, it has retained a dual focus on both candy and pets. M&M’s, Twix, and Milky Way are all Mars products, as are Iams, Pedigree, and Royal Canin. If you’ve ever wondered how M&M’s got their name, the story is slightly less interesting — it’s simply the last initials of Forrest Mars (Frank’s son) and partner-in-candy Bruce Murrie. The company is known for secrecy, with the family itself having been described as a “reclusive dynasty,” which means it’s a minor miracle that the identity of Snickers the horse was ever revealed in the first place.

Barcode scanner at check out.
Credit: hurricanehank/ Shutterstock

The First Product Scanned With a Barcode Was Juicy Fruit Gum

When Marsh Supermarket cashier Sharon Buchanan rang up a 10-pack of Juicy Fruit on June 26, 1974, and heard a telltale beep, her face must have registered relief. Buchanan’s co-workers at the grocery store in Troy, Ohio, had placed barcodes on hundreds of items the night before, as the National Cash Register Company installed the shop’s new computers and scanners. Buchanan’s “customer” for that first purchase was Clyde Dawson, the head of research and development at Marsh Supermarkets, Inc. For that fateful checkout, Dawson chose the gum, made by the Wrigley Company, because some had wondered if the machine would have trouble reading the item’s very small barcode. It didn’t. Today, one of Marsh’s earliest scanners is part of the Smithsonian Museum of American History.

A microwave surrounded by colorful cabinets.
Credit: Lissete Laverde/ Unsplash

The Microwave Was Invented by Accident, Thanks to a Melted Chocolate Bar

The development of radar helped the Allies win World War II — and oddly enough, the technological advances of the war would eventually change kitchens forever. In 1945, American inventor Percy Spencer was fooling around with a British cavity magnetron, a device built to make radar equipment more accurate and powerful. To his surprise, microwaves produced from the radar melted a chocolate bar (or by some accounts, a peanut cluster bar) in his pocket. Spencer quickly realized that magnetrons might be able to do something else: cook food.

With the help of a bag of popcorn and, some say, a raw egg, Spencer proved that magnetrons could heat and even cook food. First marketed as the Radarange, the microwave oven launched for home use in the 1960s. Today, microwave ovens are as ubiquitous as the kitchen sink — all thanks to the Allied push to win the war.

Two people reading inside a library.
Credit: Getty Images/ Unsplash+

Libraries Predate Books

While books are a fixture of today’s libraries, humans long constructed great centers of learning without them. That includes one of the oldest known significant libraries in history: the Library of Ashurbanipal. This library, established in modern-day Mosul, Iraq, by the Assyrian King Ashurbanipal in the seventh century BCE, contained nothing we would recognize today as a book. Instead, it was a repository of 30,000 clay tablets and writing boards covered in cuneiform — the oldest writing system in the world. Much like your local public library, this royal collection covered a variety of subjects, including legislation, financial statements, divination, hymns, medicine, literature, and astronomy.

Happy woman running around with an opened umbrella.
Credit: Getty Images/ Unsplash+

Umbrellas Were Once Used Only by Women

Umbrellas have been around for a long time — at least 3,000 years, according to T.S. Crawford’s A History of the Umbrella — but they were used by only select segments of the population for much of that history. Ancient Egyptians used them to shade their pharaohs, setting the tone for an association with royalty and nobility that would also surface in China, Assyria, India, and other early civilizations. Meanwhile, they were deemed effeminate by the ancient Greeks and by the Romans, who adopted many Greek customs. It should be noted that these early umbrellas protected against the sun, not rain, and were generally used by women to shield their complexions. The association between women and umbrellas persisted through much of Europe for centuries, stubbornly remaining into the 18th century, even after the first waterproof umbrellas had been created (around the 17th century in France).

In England, at least, the man credited with ushering in a new age of gender-neutral weather protection was merchant and philanthropist Jonas Hanway. Having spotted the umbrella put to good use during his many travels, Hanway took to carrying one through rainy London in the 1750s, a sight met with open jeering by surprised onlookers. The greatest abuse apparently came from coach drivers, who counted on inclement weather to drive up demand for a dry, comfy ride. But Hanway took the derision in stride. Shortly after his death in 1786, an umbrella advertisement surfaced in the London Gazette, a harbinger of sunnier days to come for the accessory’s reputation as a rain repellant for all.

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.

Original photo by lunamarina/ Shutterstock

While we’ve come a long way from being solely reliant on the sun’s rays to chart the day, the core principles of determining time remain largely the same. Nowadays, some people wear a trusty wristwatch, whereas others glance at their phone for a quick update. No matter your preferred method of tracking the hours, here are six timely facts about clocks and other timekeeping devices.

The mechanism of the Salisbury Cathedral clock in the north aisle of Salisbury Cathedral.
Credit: Folb/ Hulton Archive via Getty Images

The Oldest Working Mechanical Clock Is Located at an English Cathedral

England’s Salisbury Cathedral dates back to the 13th century, and is home to one of four surviving original copies of the 1215 Magna Carta. The cathedral is also the site of the world’s oldest working mechanical clock, a machine dating back to 1386, if not earlier.

Composed of hand-wrought iron and intertwined with long ropes that extend halfway up the cathedral walls, the Salisbury Cathedral clock is the brainchild of three clockmakers: Johannes and Williemus Vriemand, as well as Johannes Jietuijt of Delft. The clock operates thanks to a system of falling weights, which are pre-wound once each day, and the device is designed solely to denote each passing hour. It once sat in a detached bell tower before falling into disuse around 1884. Thankfully, the mechanism was rediscovered in 1929 and later restored in 1956; prior to that restoration, the clock had successfully chimed for nearly 500 years on over 500 million separate occasions. It continues to operate today.

Close-up of the clock on Big Ben in London.
Credit: Hoberman Collection/ Universal Images Group via Getty Images

Pennies Are Used To Maintain the Accuracy of Big Ben’s Clock Tower

London’s Elizabeth Tower, at the north end of the Palace of Westminster, boasts one of the most recognizable clock faces in the world. Inside the tower’s belfry hangs “Big Ben” — though many use the name to refer to the tower as a whole, it actually refers to the mechanism’s grandest and most prominent bell. Name-related confusion aside, the clock is notable for another reason, too: Its accuracy is regulated using old pennies and, on occasion, other coins.

Due to external atmospheric conditions such as air pressure and wind, the exact time depicted on the face of Elizabeth Tower can fall ever so slightly out of sync with reality. In order to right these wrongs, old pennies — coins that existed prior to England’s switch to decimal coinage in 1971 — are added to the bell’s pendulum, which in turn alters the daily clock speed by 0.4 seconds per penny. The process is a long-standing one, having been used to regulate the time as far back as 1859. In 2012, three of the 10 coins relied upon for this purpose were, for a brief time, swapped out for a five-pound crown commemorating that year’s London Olympics.
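
For a sense of the arithmetic involved, here’s a rough sketch in Python built on the 0.4-seconds-per-day-per-penny figure above. The drift values are hypothetical, and this is only an illustration of the math, not how the clock’s keepers actually record adjustments.

```python
# Rough sketch of the penny arithmetic (hypothetical drift values; illustration only).
SECONDS_PER_PENNY_PER_DAY = 0.4  # adding one old penny speeds the clock by ~0.4 s/day

def pennies_to_add(daily_drift_seconds: float) -> int:
    """Whole number of pennies that best cancels a measured daily drift.

    Positive input = clock running slow (add pennies to speed it up);
    a negative result means pennies should be removed instead.
    """
    return round(daily_drift_seconds / SECONDS_PER_PENNY_PER_DAY)

print(pennies_to_add(1.2))   # clock 1.2 s/day slow -> 3 (add three pennies)
print(pennies_to_add(-0.8))  # clock 0.8 s/day fast -> -2 (remove two pennies)
```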

Close-up of the Time Squares Ball Drop ball.
Credit: Theo Wargo/ WireImage via Getty Images

19th-Century Maritime Signals Inspired Times Square’s New Year’s Ball

The New Year’s Eve ball drop in Times Square, New York, is a beloved annual tradition, though its origins had nothing to do with revelry. In fact, the ball drop was inspired by a 19th-century timekeeping mechanism aimed at maritime crews. “Time balls” — which dropped at set times as a signal to passing ships and navigators to set their on-ship chronometers — first appeared in Portsmouth Harbor in 1829 and later at England’s Royal Observatory at Greenwich in 1833. The giant red time ball at Greenwich continues to operate today.

The balls were the culmination of an idea from Robert Wauchope, who promoted the concept of visual cues located ashore to help passing ships tell time. Wauchope originally suggested the use of flags, though orbs that moved up and down were ultimately settled upon instead. Though these time balls were initially made to help mariners keep track of time, they soon became an attraction among locals, as people would come to watch the ball fall, in a precursor to today’s New Year’s Eve crowds.

Elephant Water Clock, a masterpiece of medieval engineering.
Credit: Universal History Archive/ Universal Images Group via Getty Images

Medieval Engineer Ismael al-Jazari Invented an Elephant Clock

Throughout the 12th and early 13th centuries, few inventors pioneered more mechanisms in the world of robotics than Ismael al-Jazari, who lived and worked in what is now Turkey. Al-Jazari was so influential at the time that he’s believed to have even inspired the works of Leonardo da Vinci. Among al-Jazari’s most notable timekeeping inventions was an elephant clock, colorful illustrations of which appeared in his 1206 manuscript, The Book of Knowledge of Ingenious Mechanical Devices.

The clock was an intricate device constructed atop the back of a copper elephant, containing several moving parts as well as a scribe to denote the passing of time. The entire clock relied upon a water-powered timer, which was made up of a bowl that slowly descended into a hidden tank of water. As that bowl sank, the scribe noted the number of minutes. Furthermore, every half hour a ball would be triggered to fall and collide with a fan, which rotated the device’s dial to show how many minutes had passed since sunrise. That same ball ultimately dropped into a vase that in turn triggered a cymbal to begin the cycle anew. The whole mechanism not only incorporated this Indian-inspired timing technology, but also Greek hydraulics as well as design elements from Egyptian, Chinese, and Arabian cultures.

Aerial Photo of Boulder, Colorado.
Credit: Kent Raney/ Shutterstock

The World’s Most Accurate Clock Is Located in Boulder, Colorado

Located in the basement of a laboratory at the University of Colorado, Boulder, is a clock considered to be the world’s most accurate. Invented by scientist Jun Ye, the clock is so precise that it would take 15 billion years for it to lose a single second of time. That absurd number dwarfs the traditional 100 million years that it takes many modern atomic clocks to lose a second.

The first atomic clock was constructed in 1949 by the National Bureau of Standards, and helped scientists to accurately redefine the measurement of a second by the year 1967. Prior to that point, a second had been calculated as 1/86,400 (roughly 0.000011574) of a mean solar day, which proved inaccurate due to the irregular rotation of the Earth. Ye’s new clock optimizes the techniques of those early atomic clocks, using strontium atoms arranged in a 3D lattice that tick 1 million billion times per second. While that science-heavy explanation may not be entirely clear to the average person, Ye’s atomic clock can be summed up like this: It’s really, really accurate.
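
For readers curious where those numbers come from, here’s a quick back-of-the-envelope check in Python. The figures are the ones cited above; the only added assumption is a 365.25-day year.

```python
# Back-of-the-envelope check of the figures above (assumes a 365.25-day year).
seconds_per_day = 24 * 60 * 60             # 86,400 standard seconds in a day
print(1 / seconds_per_day)                 # ~1.1574e-05, the old "0.000011574 of a day" fraction

seconds_in_15_billion_years = 15e9 * 365.25 * seconds_per_day
print(1 / seconds_in_15_billion_years)     # ~2.1e-18: losing 1 second over 15 billion years
```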

An Antique Clock Face with gold gilding and roman numerals.
Credit: rossco/ Shutterstock

French Revolutionary Time Instituted a System of 10-Hour Days

While societies around the world may not agree on much, one thing that’s generally accepted is that each day is 24 hours long. Back in 1793, however, during the French Revolution, France broke with convention and adopted a new timekeeping concept. This decimal time system included 10-hour days, 100 minutes per hour, and 100 seconds per minute. In essence, its base-10 method of timekeeping was proposed as a simpler way to note how much time had passed on any given day.
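
To see how the conversion works in practice, here’s a small sketch in Python (not drawn from any historical source) that converts an ordinary clock reading into French decimal time, using the fact that a day holds 86,400 standard seconds but only 100,000 decimal seconds.

```python
# Sketch: convert a standard clock time to French Revolutionary decimal time.
# A decimal day = 10 hours x 100 minutes x 100 seconds = 100,000 decimal seconds,
# versus 24 x 60 x 60 = 86,400 standard seconds.

def to_decimal_time(hours: int, minutes: int, seconds: int) -> str:
    standard_seconds = hours * 3600 + minutes * 60 + seconds
    decimal_seconds = round(standard_seconds / 86_400 * 100_000)
    dh, rem = divmod(decimal_seconds, 10_000)  # decimal hours
    dm, ds = divmod(rem, 100)                  # decimal minutes and seconds
    return f"{dh}:{dm:02d}:{ds:02d}"

print(to_decimal_time(12, 0, 0))   # noon      -> 5:00:00
print(to_decimal_time(18, 30, 0))  # 6:30 p.m. -> 7:70:83 (rounded)
```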

This new timekeeping plan officially started on November 24, 1793, and was immediately met with resistance and confusion by the public. People were unwilling to change their old habits for telling time, despite French clockmakers producing new mechanisms that featured both traditional timekeeping methods and the new decimal-based technique. In the end, decimal clocks lost their official status in 1795, though the concept wasn’t quite dead yet. France tried yet again in 1897, this time proposing a variant that incorporated 24-hour-long days with 100 minutes per hour, but that proposal was scrapped in 1900.

Bennett Kleinman
Staff Writer

Bennett Kleinman is a New York City-based staff writer for Optimism Media, and previously contributed to television programs such as "Late Show With David Letterman" and "Impractical Jokers." Bennett is also a devoted New York Yankees and New Jersey Devils fan, and thinks plain seltzer is the best drink ever invented.

Original photo by vchal/ iStock

Thanks to the 1975 blockbuster Jaws, a generation of people have grown up with the mistaken belief that sharks are man-eating monsters, intent on attacking anything that moves. Scientists have worked hard to dispel such myths about the ancient creatures, which roam every ocean and vary widely in size, shape, diet, habitat, and attitude. Here are a few facts about these fascinating fish.

Schooling grey reef sharks in the Ningaloo reef of Western Australia.
Credit: Lewis Burnett/ Shutterstock

Sharks Are Older Than Trees

Sharks evolved more than 450 million years ago — long before trees or Tyrannosaurus rex appeared on Earth. The earliest sharks were probably toothless and, like today’s sharks, had cartilaginous skeletons; they may have resembled fish called chimeras that still live in the deep ocean.

The first shark that really looked shark-like appeared around 380 million years ago in the Devonian period. Just a few million years later, a major extinction wiped out many species that competed with sharks, allowing them to evolve rapidly into numerous new shapes, sizes, and ecological niches — some of which are still around. One of the oldest species living today is the bluntnose sixgill shark, which evolved between 200 million and 175 million years ago in the early Jurassic epoch.

Shark fish teeth closeup taken through the glass of aquarium.
Credit: s1murg/ iStock

We Have Learned a Lot From Shark Teeth

As cartilaginous fishes, sharks don’t leave much behind when they die. Known shark fossils consist mainly of teeth and a handful of scales, vertebrae, and impressions left in rock. Even so, paleontologists have been able to identify about 2,000 species of extinct sharks just by examining differences in fossilized teeth. For example, the oldest shark teeth ever found came from an Early Devonian fish dubbed Doliodus problematicus; bits of its fossilized skeleton showed characteristics similar to bony fishes, while its teeth and jaw were more shark-like, confirming a theory that the species was an ancient ancestor of sharks.

Collection of an hand drawn illustrations of various sharks.
Credit: KUCO/ Shutterstock

There Are More Than 500 Species of Sharks in the World

Sharks are categorized into nine taxonomic orders. To name a few of the most recognizable types, Carcharhiniformes, the order of ground sharks, encompasses over 290 species, including the bull shark, tiger shark, blue shark, hammerhead, and more. The great white shark, basking shark, and makos, as well as the aptly named goblin shark and other species, belong to Lamniformes — also known as mackerel sharks. The carpet shark order, Orectolobiformes, includes the whale shark, nurse shark, wobbegong, and others. In all, there are more than 500 species of sharks swimming the world’s waters.

size comparison between a whale shark and hammerhead shark.
Credit: bearacreative/ Shutterstock

There’s a Huge Size Difference Between the Largest and Smallest Sharks

With so many shark species swimming Earth’s oceans, there’s incredible variation in their sizes. The largest shark species living today is the whale shark (Rhincodon typus), a gentle, plankton-eating giant that can grow to 45 feet long or more and weigh 20 tons (the biggest accurately measured whale shark reached 61.7 feet!). They can be found in all of the world’s tropical seas. The smallest known shark species, meanwhile, was discovered in 1985 off the coast of Colombia in the Caribbean Sea: The dwarf lantern shark (Etmopterus perryi) averages a length of just under 7 inches. It dwells in the ocean’s twilight zone, about 1,000 feet below the surface, but sometimes feeds in the shallows and uses bioluminescent organs along its belly to camouflage itself against sunlit waters.

Close-up of the lateral line on a great white shark.
Credit: Alessandro De Maddalena via Getty Images

Sharks Have a Sixth Sense

Like all fishes, sharks have a sensory organ called the lateral line running down the length of their bodies. The lateral line system involves exterior pores and specialized cells that can detect vibrations in water, which helps sharks locate prey from hundreds of feet away. In addition to sensing water movements, sharks can perceive electric fields surrounding other animals (the fields are caused by the animals’ muscle contractions). This sixth sense, called electroreception, picks up electrical signals that sharks can use to home in on prey. Electroreception can also guide migrating sharks via Earth’s electromagnetic fields.

View of a Greenland shark in Canada.
Credit: Doug Perrine/ Alamy Stock Photo

One Shark Species Can Live for Centuries

The slow-growing, Arctic-dwelling Greenland shark (Somniosus microcephalus) is not only the longest-lived shark, but also holds the record for the longest-lived vertebrate on Earth. Because Greenland sharks’ soft cartilage lacks the growth bands scientists typically use to age other sharks, estimating their age has long been difficult. In 2016, a study in the journal Science described how a team of biologists carbon-dated eye proteins, which build up continuously during the animals’ lives, in several Greenland sharks. They found the individuals were an average of 272 years old when they died, and the results suggested that the sharks’ maximum life span could be up to 500 years.

Under the waves view of a diver with great white sharks.
Credit: solarseven/ iStock

You’re More Likely To Be Killed by a Cow Than a Shark

Your risk of suffering a shark attack is practically nil. For its 2022 global summary, the Florida Museum of Natural History’s International Shark Attack File confirmed 57 unprovoked shark bites in 2022, meaning they happened when humans were simply in the shark’s natural habitat, and 32 provoked attacks, such as when people were feeding or harassing the fish. Forty-one of the unprovoked attacks occurred in the U.S., and one was fatal. Other animals are far more likely to kill you, including cows (which kill an average of 20 Americans a year, according to CDC data), hornets, bees, and wasps (about 48 people a year), and dogs (around 19 a year).


Original photo by Its me Pravin/ Unsplash

Most features in an airplane cabin are designed for a very specific purpose. However, due to the cabin’s complex design, flight attendants don’t usually take the time to explain every detail to their passengers. (They’re more concerned with making sure everyone is safe and comfortable.) But if you’re a curious person who likes to know how things work, we’ve got you covered. Here are six things you never knew about airplane cabins.

The cabin lights dimmed inside the airplane during flight takeoff.
Credit: Wenhao Ryan/ Unsplash

Cabin Lighting Has a Purpose

Have you noticed that the cabin lights dim during takeoff and landing? It turns out that there are two very good reasons for this. According to Reader's Digest, the first reason is safety. If the lights stayed on and were to suddenly switch from bright to dark in an emergency, it would take precious seconds for passengers' eyes to adjust. With dim lighting during takeoff and landing, our eyes are already adjusted — making it easier to find an exit.

The second reason is mood. Dim lights are more relaxing than bright lights and might calm a passenger who struggles with flight anxiety. Some airlines, such as Virgin Atlantic, take this a step further by adding colored lights. Virgin Atlantic uses different shades of its brand colors for various situations, like a rosy pink for boarding and a hot magenta for drinks.

Overhead console inside a passenger aircraft.
Credit: Juanmonino/ iStock

The Temperature Is Cold for a Reason

Passengers often complain about the cold temperature in airplane cabins. Flight staff will provide passengers with a blanket, but they don't ever increase the heat. That's because the temperature on an aircraft has been set in a very intentional way — and it's for your safety.

A study by ASTM International found that people were more likely to faint on an aircraft than on the ground due to a condition called hypoxia. The pressurized environment of an airplane cabin can prevent our bodies from getting enough oxygen, which can cause fainting. The warmer the temperature onboard the aircraft, the more likely this is to happen. To prevent passengers from passing out, airlines intentionally lower the cabin temperature. While this might be slightly uncomfortable, it’s much safer for your body.

Aircraft cabin with vapor condensation due to differences of temperature.
Credit: Vajirawich Wongpuvarak/ iStock via Getty Images Plus

The Air Is Cleaner Than You Think

A common myth about air travel is that you’re sharing air — along with germs and food particles — with all the other passengers on board. Gross, right? In reality, airlines do a great job of maintaining clean air quality onboard the aircraft. They use a HEPA (High Efficiency Particulate Air) filter system. According to the International Air Transport Association (IATA), this is the same type of filter used to clean the air in hospital operating rooms. The next time you fly, don’t worry: Cabin air is cleaner than you think.

Occupied lavatory sign on a commercial airlines flight.
Credit: Artem Chekharin/ iStock

Bathrooms Can Be Unlocked From the Outside

While there is a lock inside cabin bathrooms for passengers to use, flight attendants can also quickly unlock the door from the outside. According to Aerotime Hub, this is for passenger safety. In the event of an emergency, flight attendants need to be able to access the bathroom without picking the lock or taking the door off its hinges. This is necessary if a passenger has a health scare or needs assistance while in the bathroom. It can also be used for children who are unable to unlock the door themselves. Don’t worry, though: A flight attendant would never just open the door for no reason. They respect passenger privacy and would only use the unlock option in an emergency.

A child sitting by an aircraft window and looking outside.
Credit: MNStudio/ Shutterstock

Window Blinds Must Remain Open

During takeoff and landing, most flight attendants will ask that passengers lift their window blinds. Like so many other things on an airplane, there's a real reason for this. Open blinds allow the flight staff to see any issues on the ground or on the airplane itself. Passengers might also report unusual circumstances they observe from their windows. Lifting the blinds also allows our eyes to adjust to the conditions outside quickly in case of an emergency.

Cabin windows also sometimes have triangle stickers on them to mark certain seats. According to Captain Joe, these stickers indicate which windows provide the best view of the wings, so flight attendants can quickly find the right window when they need to check the wings for safety reasons. These seats are also a good choice for passengers prone to motion sickness, thanks to the extra stability over the wings.

A view of the overhead compartment in an aircraft cabin.
Credit: Purd77/ Shutterstock

There's a Secret Handrail

Walking down the aisle of a moving airplane can be a wobbly experience — especially when there's turbulence. Most passengers end up grabbing the seats as they walk, which can disturb the people in those seats, but there's actually a better way.

If you watch the flight attendants, you’ll notice that they repeatedly reach up to the ceiling as they walk down the aisle. That’s because there’s a built-in handrail along the bottom edge of the overhead storage compartments, which can be used to steady yourself. Next time, copy the flight attendants, avoid aggravating fellow passengers, and use this secret rail instead!


Original photo by Pictorial Press Ltd/ Alamy Stock Photo

When the film version of West Side Story was released on October 18, 1961, it quickly surpassed its theatrical predecessor to become a smash hit. Audiences were blown away by the love story of Tony (Richard Beymer) and Maria (Natalie Wood) and captivated by the dancing and singing of Anita (Rita Moreno) and Bernardo (George Chakiris).

West Side Story swept the Academy Awards, winning 10 statuettes, including Best Picture and Best Supporting Actress and Actor for Moreno and Chakiris, respectively. Today, it’s still one of the most-watched and beloved films of all time. Here are six surprising facts about the movie musical.

Natalie Wood wrapped in chiffon while singing in a scene.
Credit: Archive Photos/ Moviepix via Getty Images

Wood Wasn’t Originally Tapped to Play Maria

Audrey Hepburn, one of the biggest actresses of her time, was originally asked to play the lead character of Maria. However, Hepburn was pregnant with her son Sean and had previously suffered several miscarriages, so she turned down the role so as not to overexert herself.

Despite saying no to the blockbuster, Hepburn still made a splash on the big screen that same year in Breakfast at Tiffany’s.

American actor Richard Beymer during the filming of 'West Side Story'.
Credit: Ernst Haas via Getty Images

One Big Star — and a Few Stars-to-Be — Might Have Portrayed Tony

In one account of West Side Story’s casting, Elvis Presley was in the running to play the lead role of Tony — until his manager, Colonel Tom Parker, reportedly rejected the part. And while Presley’s name may only have been bandied about and never under serious consideration, several actors who hadn’t yet had their big breaks did audition for the film. These include Warren Beatty (who was also considered for the role of Riff in the stage version), Robert Redford, and Burt Reynolds (though the interview sheet listed him as “Bert”).

Beymer eventually won the part of Tony. However, he ended up displeased with his performance. “It’s a thankless role,” he admitted in 1990. “It could have been played more street-wise, with someone other than me.”

Natalie Wood and Richard Beymer in West Side Story.
Credit: Pictorial Press Ltd/ Alamy Stock Photo

Wood and Beymer Didn’t Get Along

In West Side Story, Tony and Maria embody the instantaneous pull of young love at first sight. Away from the cameras, Wood, by far the movie’s biggest star at the time, didn’t connect with her leading man. One theory posited to explain Wood’s distant attitude was that she would have preferred acting opposite her then-husband, Robert Wagner.

According to West Side Story costar Russ Tamblyn (Riff), Wood’s dressing room contained a “hit list” of people who’d gotten on her bad side, and Beymer was one of the names on that list. When Tamblyn asked Wood what Beymer had done, she reportedly answered, “I just don’t like him.”

Natalie Wood in character during a scene in West Side Story.
Credit: United Archives/ Hulton Archive via Getty Images

Wood’s Singing Voice Was Dubbed — To the Surprise of the Actress

After accepting the lead role of Maria, Wood spent the entire production certain her vocals would be heard when the movie headed to theaters. She received intense coaching, and the music department assured Wood that her takes were wonderful. Though singer Marni Nixon also recorded Maria’s songs, Wood believed Nixon’s voice would be used only for a few high notes. (Coincidentally, Nixon was also the singing voice for Hepburn in My Fair Lady.)

It wasn’t until the end of production that Wood discovered Nixon would be singing the entire role. Wood was an actress, not a trained singer, so it’s not shocking filmmakers wanted a more skilled vocalist to perform Maria’s challenging songs. But Wood would never forgive co-director Robert Wise for keeping her in the dark for so long.

Jerome Robbins, Broadway's renowned director and choreographer.
Credit: Bettmann via Getty Images

Robbins Was Fired as the Movie’s Co-Director

Making West Side Story wouldn't have been possible without Robbins, who conceived the stage musical and created its choreography. So when Robbins wanted to direct the movie version, the producers agreed, though they installed Robert Wise as co-director.

As the film was shot, Robbins' choreography was, as always, impressive. But he demanded numerous takes, which slowed production. Once most of the big dance numbers were finished, the producers fired Robbins, and his assistants handled the remaining dance scenes. Robbins considered removing his name from the finished project but ultimately decided not to, which turned out to be a wise decision: He (along with Wise) went on to win the Oscar for Best Director.

Dancers perform in a scene during the filming of the movie musical West Side Story.
Credit: Bettmann via Getty Images

The Original Title Was “East Side Story”

When Robbins came up with the show in 1949, the plot centered on a Catholic boy and a Jewish girl living on the East Side of Manhattan, and the project was appropriately called "East Side Story." It was eventually shelved while Robbins, along with composer Leonard Bernstein and playwright Arthur Laurents, took on other projects.

The show resurfaced in 1955 — but with a plot twist. Latin gang violence in Los Angeles was making headlines, inspiring Laurents to propose moving the setting from the East Side to the then-rundown Upper West Side and centering the conflict on Puerto Rican and white gangs.

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.

Original photo by BonNontawat/ Shutterstock

Plastic is everywhere. Look around and you’re bound to immediately notice something made from the stuff. It houses the milk in our grocery carts, makes up the components in our phones, and is woven into the fibers of our clothes. Here are nine facts you might not know about one of the most common materials we interact with every day.

Alexander Parkes, inventor of the first synthetic plastic, c 1850.
Credit: Science & Society Picture Library via Getty Images

The Very First Plastic Was a Flop

Plastic may seem like a manufacturing miracle limited to the 20th century, but its earliest version actually cropped up around the mid-1800s. English chemist and inventor Alexander Parkes created the first known plastic, eponymously named Parkesine, in 1855 and exhibited it in London in 1862. Parkes' moldable material was formed from cellulose (aka plant fibers), and it wasn't cheap. It was also brittle and prone to cracking — shortcomings that kept Parkesine from gaining widespread popularity.

An ancient doll made of celluloid, a plastic material.
Credit: Diana Macias/ Shutterstock

Some Early Plastics Were Flammable

By the late 1860s, American inventor John Wesley Hyatt had created celluloid, the first plastic used in everyday products. Hyatt initially marketed celluloid as a substitute for natural materials, suggesting it was an environmentally friendly swap for the ivory used in billiard balls and the tortoiseshell harvested for jewelry and combs. However, celluloid had a major drawback: It could catch fire. Billiard balls made from the substance reportedly ran the risk of spontaneously bursting into flames, and cinema film made from celluloid was notoriously flammable.

Close-up view of colorful vintage Bakelite bangle bracelets.
Credit: Cynthia Shirk/ Shutterstock

A Plastic Invented in 1907 Is Still Used Today

Belgian chemist Leo Baekeland had already made his fortune from inventing specialty photo paper when he began experimenting with polymers, aka the molecules that make up plastics. Baekeland created a new version of plastic, called Bakelite, in 1907; historians consider it the first fully synthetic plastic made with no naturally occurring materials. Bakelite’s popularity skyrocketed, and the heat-resistant plastic was used for everything from irons, kitchen cookware handles, and telephones to smaller items like buttons and chess pieces. You can still find Bakelite used for electrical components today thanks to its superior insulating properties.

Plastic bins labeled for storage and sorting waste at home.
Credit: SeventyFour/ Shutterstock

The Word “Plastic” Is Ancient

Modern consumers began using the term “plastic” as far back as 1909, to refer to Bakelite, though the word is actually centuries old. “Plastic” has roots in Latin — from plasticus, meaning something like “fit for molding” or “moldable” — and before that the Greek plastikos, which had a similar meaning.

World War II advertisement for plastics.
Credit: Museum of Science and Industry, Chicago/ Archive Photos via Getty Images

World War II Fueled the Plastics Industry

Plastics slowly made their way into everyday products during the early part of the 20th century, though World War II had a profound impact on the industry and caused a surge in production. Lower-quality plastics replaced rationed and hard-to-find materials for consumer products, and higher-end versions were used in the war effort. Acrylic and plexiglass made their way inside bombers and fighter planes in place of glass, and nylon was created as a synthetic silk for parachutes, body armor, and ropes. Plastic technology improved during wartime and led to a boom after, with shoppers buying tons of plastic products that were marketed as durable and easy to clean.

A collection of sorted HDPE bottles from cosmetics and detergents.
Credit: JasminkaM/ Shutterstock

There Are Seven Common Types of Plastic

Technically, there are hundreds of types of plastic, though most of the kinds we interact with on a daily basis fit into seven categories. Polyethylene terephthalate (PET, or #1) plastics are the most common, found in water bottles, food containers, and polyester. Milk and laundry detergent jugs are made from high-density polyethylene (HDPE, or #2). Squeezable bottles, shopping bags, and garbage bags are made from low-density polyethylene (LDPE, or #4). Polypropylene (PP, or #5) is often used for straws and takeout containers. Plastic types #3 (polyvinyl chloride, or PVC) and #6 (polystyrene, or PS) are considered more difficult to recycle, and #7 is a catch-all category for combination plastics used in items like electronics, DVDs, and clear plastic forks.

Heating plastic container with broccoli and buckwheat in the microwave.
Credit: goffkein.pro/ Shutterstock

Some Plastics Shouldn’t Be Microwaved

Is it safe to reheat your lunch if it’s stored in a plastic container? It depends on the type of plastic used. Heating some plastics can cause the materials to release additives, aka chemicals that help them stay durable and flexible (BPA and phthalates are the most common causes of concern). Polypropylene (aka plastic #5) is generally considered the safest to microwave because it’s heat-resistant, though plastics #3, #6, and #7 should never be heated. Researchers recommend checking to see if a container is labeled as microwave-safe, and steering clear of plastics that are damaged or unlabeled.

A woman collecting and separating recyclable garbage plastic bottles into a trash bin.
Credit: Farknot Architect/ Shutterstock

Most Plastic Is Never Recycled

Nearly all plastic can be melted down and turned into something new, though most never is. Less than 10% of all plastic products ever created have been recycled, with most ending up burned, in the ocean, or in landfills. That's because collecting, sorting, and actually recycling plastic is expensive, often costing more than producing new plastic items. And plastic manufacturers say that used containers can only be recycled once or twice before the materials degrade in quality, meaning creating new containers is more reliable — though obviously not great for the environment.

Placing apples in a plastic bag.
Credit: wavebreakmedia/ Shutterstock

Plastic Shopping Bags Were Created to Save Trees

Swedish engineer Sten Gustaf Thulin created the plastic shopping bag in 1965. At the time, Thulin’s intention was to reduce the number of trees harvested to make paper bags; the plastic version he invented was sturdier and could be used over and over again. The bags, which were cheap to produce, became so popular that they nearly replaced paper bags by the end of the 1980s. Yet there were unintended consequences. In 2002, Bangladesh became the first country to ban plastic bags, a movement that’s now grown to more than 100 countries in an effort to safeguard the seas and reduce landfill waste.

Nicole Garner Meeker
Writer

Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.

Original photo by lucky-photographer/ iStock

One thing’s for sure: U.S. presidents are the stuff of legends. However, just because personal tales about the leaders are passed down from generation to generation doesn’t mean the stories are rooted in truth. In fact, many of the stories are so outlandish that it’s amazing people believed them in the first place.

From flammable teeth to ridiculous bathtub debacles, we take a look at eight of the oddest presidential myths out there — and set the record straight.

Replica of a set of dentures made for George Washington.
Credit: Science & Society Picture Library via Getty Images

Myth: George Washington Had Wooden Teeth

Cherry tree aside, one of the most persistent (and chewable) myths is that the nation's first president had a mouth full of wooden teeth. While it seems like an odd story to be linked to the founding father, a deeper dig gets to the root of the issue. Washington did indeed have terrible teeth, so much so that he had multiple sets of dentures made. Those mouthpieces were made out of ivory, gold, lead, and even human teeth, but never any wood. Wood was not used by dentists at the time, because not only could wooden dentures cause splinters, but wood is also susceptible to expanding and contracting due to moisture — not ideal for something that lives in your mouth.

The signing of the United States Constitution in 1787.
Credit: Bettmann via Getty Images

Myth: Thomas Jefferson Signed the Constitution

It seems incomprehensible that a big-name founding father like Thomas Jefferson missed out on signing the U.S. Constitution, but he never inked the deal. He was absent from the 1787 Constitutional Convention in Philadelphia because he was across the Atlantic in Paris, serving as the U.S. envoy to France.

Lincoln making his famous 'Gettysburg Address' speech.
Credit: Library of Congress/ Archive Photos via Getty Images

Myth: Abraham Lincoln Wrote the Gettysburg Address on an Envelope

There's no doubt that the 16th president was a brilliant orator. But the idea that he haphazardly scribbled one of the most important speeches in American history on the back of an envelope during a train ride sounds a little far-fetched. In reality, Abraham Lincoln toiled away at different versions of the Gettysburg Address, which he delivered on November 19, 1863. And it was anything but a solo project: He collaborated with several associates on it, and there are five original copies of the speech, not one of them on an envelope.

President William Howard Taft makes a point during an election speech.
Credit: Bettmann via Getty Images

Myth: William Howard Taft Got Stuck in a Bathtub

One of the stranger presidential myths might be chalked up to potty humor. Somehow, 27th President William Howard Taft became associated with an embarrassing story about getting stuck in a bathtub. While it's true that he was a large man, weighing in at around 350 pounds, he never had to be rescued from a tub.

That said, there is a reason he's associated with baths. During his presidency, a supersized porcelain tub that was 7 feet long, 41 inches wide, and weighed a ton was installed in the White House. It was so massive that four grown men could fit inside. In another bath incident after his presidency, Taft filled a tub at a hotel in Cape May, New Jersey, a little too high; when he stepped in, it overflowed so much that guests in the dining room below got a bit of a shower.

A Teddy Bear describing the origin of the toy and US president Theodore Roosevelt.
Credit: Hulton Archive via Getty Images

Myth: The Teddy Bear Got Its Name After Theodore Roosevelt Saved a Real Bear

Theodore Roosevelt had long been a hunter, but didn’t exactly show off his best skills on a bear hunt in November 1902. Everyone else in the group had had a fruitful hunt, so to help Roosevelt, the guide tracked a 235-pound bear to a watering hole, clubbed it, and tied it to a tree so the president could claim it. As the story goes, Roosevelt refused to shoot the bear.

The incident made its way to the Washington Post, which published a satirical cartoon about the president sparing the bear. New York City store owners Morris and Rose Michtom saw the cartoon, were inspired by the president's apparent act of mercy, and created stuffed animals in his honor, appropriately naming them "Teddy's bear."

The problem? Roosevelt didn't shoot the bear, but he didn't save it either. Seeing that it had already been so savagely mauled by dogs, he asked that it be put out of its misery with a hunting knife. Given the dark nature of the true tale, it makes sense that these details are often left out when talking about the beloved childhood toy.

John Kennedy delivering an address.
Credit: Bettmann via Getty Images

Myth: John F. Kennedy Won the Election Because of the TV Debates Against Richard Nixon

The televised broadcasts of the 1960 presidential debates between John F. Kennedy and Richard Nixon are often said to have clinched the victory for JFK, whom many viewers found more photogenic and charismatic. But the election numbers suggest the debates didn't have that big an effect on the results. The candidates were pretty much neck and neck throughout the campaign, even appearing to be tied in the polls before and after the four debates. Kennedy seemed to get a slight boost after the first debate on September 26, but Nixon performed strongly in the others, especially with his foreign policy answers in the final one. In the end, Kennedy won the popular vote by a mere 119,000 votes.

Kennedy and Nixon's September 1960 debate is often credited as the first televised presidential debate, but that, too, is a myth. In 1956, a televised debate aired on CBS' Face the Nation during the rematch between Republican President Dwight Eisenhower and Democrat Adlai Stevenson. However, neither candidate attended; both sent surrogates in their place. Eisenhower was represented by Maine Senator Margaret Chase Smith, while the Democrats went with Eleanor Roosevelt.

US President Zachary Taylor dies at home, surrounded by his wife and son and his colleagues.
Credit: MPI/ Archive Photos via Getty Images

Myth: Zachary Taylor Was Poisoned

About 16 months into his term, 12th President Zachary Taylor fell ill and died in office. For years, many thought he may have been the first president to be assassinated, since it was rumored he had been poisoned. Though he died in July 1850, it wasn't until 1991, when his remains were exhumed in Kentucky and tested, that scientists definitively ruled out arsenic poisoning. Another story, that he died after eating cherries with iced milk, may unfortunately have more truth to it. After attending a Fourth of July celebration at the Washington Monument in 1850, he had that combo as a snack and likely came down with severe gastroenteritis — an inflammation of the digestive system — dying five days later.

President Ford smiles as he acknowledges the reception given to him at a convention.
Credit: Bettmann via Getty Images

Myth: Gerald Ford Was a Total Klutz

Throughout Gerald Ford's presidency, many joked that his vice president, Nelson Rockefeller, was only a banana peel away from the presidency, since the 38th president was so often caught being clumsy. He tumbled down ski slopes, slipped in the rain, and fell while exiting Air Force One, so often that he was spoofed by Chevy Chase on Saturday Night Live. In actuality, though, Ford was quite an athlete in his younger days. He was a football star at the University of Michigan, where he lettered for three years, and he even tackled future Heisman Trophy winner Jay Berwanger in 1934. During his White House years, he swam and skied regularly and played tennis and golf, so perhaps all that falling simply made him more relatable.

Original photo by FluidMediaFactory/iStock

As America's first national park and one of its most important biosphere reserves, Yellowstone holds a unique place in our national consciousness — more than 4 million people visit the park each year. But given its rich history, there are likely plenty of facts you've never heard, even if you consider yourself a park aficionado. Here are eight fascinating Yellowstone National Park facts that will take your knowledge of America's favorite national park to the next level.

Upper Yellowstone Falls in Yellowstone National Park.
Credit: evenfh/ Shutterstock

There’s Another Grand Canyon at Yellowstone

When most people think of the Grand Canyon, they think of Arizona. But what about the Grand Canyon of the Yellowstone River? This 20-mile-long canyon, which reaches depths of more than 1,000 feet, is considered an important example of river erosion. On the canyon's rim lies Artist Point, which offers one of the most beautiful views in the park. From this spot on the trail, you can see a majestic 300-foot waterfall flowing into the canyon. If you look down, you'll see steep canyon walls in gorgeous hues of pink, orange, yellow, and red.

Rainbow near Castle geyser, Yellowstone National Park.
Credit: janaph/ Shutterstock

Half of the World’s Geysers Are in the Park

Yellowstone is home to a whopping 10,000-plus hydrothermal features, including 500 geysers — which scientists estimate is about half of the world’s geysers. The most famous is Old Faithful, which erupts around 17 times a day. Other breathtaking features, like the Beehive Geyser and Grotto Geyser, are somewhat less popular but still provide a thrilling show of geothermal action. So, if you’re worried about Old Faithful being too crowded at peak times of the year, don’t worry — you still have hundreds of other geysers to see.

Bison grazing at Yellowstone National Park
Credit: Andrew Milas/ iStock

Bison in Yellowstone Are the Oldest in America

While bison in many other grassland areas were hunted nearly to extinction, Yellowstone's herd remained intact. According to the History Channel, Yellowstone's bison population is the only herd in the United States that has existed continuously since prehistoric times. At its lowest point, poaching had reduced the herd to its last 23 members. Today, however, the park is home to about 5,500 bison, the biggest bison population in the country.

Sunny beautiful Yellowstone River landscape in Yellowstone National Park.
Credit: Kit Leong/ Shutterstock

Yellowstone Has Its Own Judicial System

For 30 years, the United States Army kept order at Yellowstone; until 1916, soldiers patrolled the park to protect its wildlife from poachers. The park spans three states — Montana, Idaho, and Wyoming — all of which have differing laws pertaining to wildlife and preservation. To resolve this decades-old jurisdictional tangle, Yellowstone officially established its own judicial system in 2006. That means if you break the law while visiting the park, you could land in the official Yellowstone jail. And your mugshot may just be the only souvenir you get to take home.

Nez Perce Creek in Yellowstone National Park.
Credit: dszc/ iStock

The Park Is One of the Few UNESCO World Heritage Sites in the U.S.

Around the world, 878 extraordinary locations have been designated as United Nations Educational, Scientific and Cultural Organization (UNESCO) World Heritage Sites. The United States has only 20 sites across the entire country, and Yellowstone is one of the most important.

UNESCO's website provides a list of reasons for Yellowstone's coveted honor, including its distinctive manifestation of geothermal forces and its vast number of rare species. These ecological features are why Yellowstone stands alongside world-renowned sites like the Great Barrier Reef and Machu Picchu.

Yellowstone's grand prismatic.
Credit: f11photo/ Shutterstock

Yellowstone Is Actually a Giant Supervolcano

Hot spots and geysers represent just a fraction of the action beneath the surface at Yellowstone. The whole park is actually a supervolcano, although it isn't expected to erupt anytime soon. How do we know? Despite its scary reputation, Yellowstone is quite safe: Its supervolcano is made up of two magma chambers. The shallower chamber contains no more than about 15% molten rock, while the deeper chamber contains only about 2%. According to Forbes, it's practically impossible for a supervolcano to erupt unless its magma chambers are at least 50% molten. So, rest easy — and don't forget to enjoy the view.

Large Cinnamon-phase Black Bear crosses the road.
Credit: Tom Reichner/ Shutterstock

The Bears Aren’t as Dangerous as You Think

In the entire history of Yellowstone, only eight people have ever been killed by bears in the park. Getting injured by a bear is a bit more common, though the National Park Service puts those odds at roughly one in 2.7 million visits. The park service cautions people to look out for falling trees instead, which kill about the same number of people (but get a lot less media attention).

Close-up of Yellowstone Sand Verbena.
Credit: Nature and Science/ Alamy Stock Photo

Hundreds of Unique Flowers Thrive in Yellowstone

An estimated 1,350 different types of flowering plants grow wild at Yellowstone, the vast majority of them native to the region. One remarkable plant that calls the park home is the Yellowstone sand verbena, a flower whose relatives normally thrive in warm environments but which has managed to grow at an altitude of about 7,700 feet inside the park. Another unique floral trademark of Yellowstone is Ross's bentgrass, which grows exclusively in hot, vapor-heavy thermal environments; the plant is a common sight in the park but found almost nowhere else in the world.

Original photo by Lia Bekyan/ Unsplash+

Who can resist the velvety richness of chocolate cake, or the melt-in-your-mouth sweetness of a chocolate bar? Not many, considering that the average American eats almost 20 pounds of chocolate each year. The appeal only increases when you consider that studies suggest dark chocolate may help reduce stress, lower blood pressure, and improve brain function. Let these eight facts about chocolate (and the cacao trees that create it) deepen your appreciation for this historic and fascinating treat.

Plethora of chocolate and beans on a red background.
Credit: Nikolay_Donetsk/ iStock

Humans Have Enjoyed Chocolate for Thousands of Years

Evidence from an archaeological dig site in the Amazon suggests that humans have enjoyed chocolate for much longer than scientists previously realized. While the origins of consuming chocolate are often linked to Mesoamerican civilizations — such as the Olmec and Maya peoples — ancient pots containing chocolate residue have also been unearthed farther south, in Ecuador. The containers were collected from Santa Ana-La Florida, an ancient village belonging to the Mayo-Chinchipe people, who researchers believe regularly consumed chocolate between 5,300 and 2,100 years ago.

Cocoa pod split in half.
Credit: EWY Media/ Shutterstock

Some Early Chocolate Was Served as a Boozy Drink

The Olmec people, who likely lived in communities throughout modern-day southern Mexico between 1200 and 500 BCE, were some of the earliest humans to enjoy chocolate, though not exactly in the same way we do today. Some researchers believe the Olmecs fermented the pulp from cacao fruit to make a hard-hitting drink with a 5% alcohol content.

Orange color cocoa pods hanging on tree in sunlight.
Credit: PixieMe/ Shutterstock

Cacao Trees Can Live for Centuries

Cacao trees are an evergreen species native to Colombia, Ecuador, Venezuela, and other countries in the northern region of South America. Maxing out at 39 feet tall, the trees can reach more than 200 years of age, though researchers aren't entirely sure of their ultimate life span. That's because cacao trees only produce fruit for about 25 years, and are often cut down and replaced with younger, more productive trees.

Milk and dark chocolate on a wooden table.
Credit: Sebastian Duda/ Shutterstock

Chocolate Was an 18th-Century Cure-All

Today, chocolate can cure little more than a bad mood, but a couple of centuries ago, many people believed it could remedy a variety of medical ailments. Early pharmacists and doctors marketed chocolate as a miracle food that could cure coughs, hangovers, and indigestion, and nourish the sick back to health. Even Benjamin Franklin — under the pen name Richard Saunders — recommended chocolate as a cure for smallpox in his 1761 Poor Richard's Almanack.

Heart shaped and chocolate sweets.
Credit: Svetlana-Cherruty/ iStock

Cadbury Created the First Heart-Shaped Chocolate Boxes

Heart-shaped boxes full of chocolates are forever linked with Valentine’s Day thanks to a marketing ploy from the mid-1800s. British chocolatier Richard Cadbury had created “eating chocolates” — small treats made from excess cocoa butter — and needed a clever way to package them. In 1861, Cadbury introduced his handmade heart-shaped boxes decorated with roses, Cupids, and other Valentine’s Day symbols. The boxes were a hit, with romantics using them to store love letters and other mementos long after the chocolates were devoured.

Broken and whole chocolate eggs.
Credit: New Africa/ Shutterstock

One Chocolate-Making Brand Created Two Iconic Candies

Chocolate bars are a candy aisle standard today, and a far cry from the earliest chocolate blocks. While bitter, naturally oily chocolate was commonly shaped into bricks during the 18th and 19th centuries, it was sold as an ingredient meant for cooking, not as a stand-alone confection. J. S. Fry & Sons, a British chocolate maker, is credited with molding the world's first chocolate bar meant for eating in 1847, sweetening the confection with sugar. Nearly three decades later, the Fry brand released the first known hollow chocolate Easter eggs.

View of colorful cacao pods and beans.
Credit: eefauscan/ iStock

It Takes a Lot of Cacao to Make Chocolate

While a single cacao tree produces thousands of blossoms per year, not every flower will develop into a chocolate-producing pod. Only 10% to 30% of the fruit — up to 70 pods per tree — will survive the five to seven months it takes to mature for harvest. Even then, the harvest doesn't go far: Cacao pods contain 20 to 60 beans each, and it takes about 400 beans (roughly seven to 20 pods' worth) to make just 1 pound of chocolate.

View of cacao beans and chocolate from Ghana.
Credit: artphotoclub/ Shutterstock

Most of the World’s Chocolate Comes From Two African Nations

While cacao trees are native to parts of South America, most of the world's commercial crop is grown in Africa. Most farms sit within 10 degrees north or south of the equator, where the finicky trees get the rainforest-like conditions they need: consistent temperatures, high humidity, and regular rainfall. Côte d'Ivoire and Ghana are the world's leading cacao producers; farmers there grow more than half of the world's cacao supply, all of which must be harvested by hand.

Original photo by Allstar Picture Library Limited/ Alamy Stock Photo

There's only one place where sunny days have been sweeping the clouds away for more than 50 years: Sesame Street. What started as a way to educate preschool kids who might not otherwise have access to schooling has turned into one of the most beloved American cultural institutions, filled with iconic characters such as Big Bird, Cookie Monster, Kermit the Frog, Grover, Elmo, Oscar the Grouch, Bert, and Ernie. Premiering on November 10, 1969, the show — which originally aired on PBS and, since 2016, has premiered on HBO and later HBO Max — continues to both entertain and educate new generations of children and remains a nostalgic favorite for adults of all ages.

Cast members of the television show, 'Sesame Street,' posing on the set.
Credit: Hulton Archive/ Archive Photos via Getty Images

The Show Idea Started at a Dinner Party

Joan Ganz Cooney, a producer at New York City's Channel 13 public television station, was hosting a dinner party in 1966 when she chatted with Lloyd Morrisett, an educator at the Carnegie Corporation. He told her that one morning he had found his 3-year-old staring at the television's test pattern, waiting for something to begin. They started discussing whether there was any way for young minds to learn from the medium, and the idea of using television to educate — and Sesame Street itself — was born. The show was first described as a preschool for children whose families couldn't afford one.

Sesame Street characters pose under a "123 Sesame Street" sign.
Credit: Astrid Stawiarz/ Getty Images Entertainment via Getty images

The Original Name Was “123 Avenue B”

While names like The Video Classroom and Fun Street were tossed around, the most serious contender was 123 Avenue B, since it fit the vibe of the show's inner-city set. But the name was abandoned because it was an actual street address — and also because of concern that viewers outside New York City might not relate. The show's writer Virginia Schone came up with the name Sesame Street, though it wasn't immediately embraced, as many worried it would be hard for young kids to pronounce. After a weekend of brainstorming produced no better options, it became the official title. "We went with it because it was the least bad title," Cooney told Sesame Workshop.

“Rubber Duckie” Was a Billboard Hit Song

Of all the catchy and memorable songs on the show, the only one to ever become a certified Billboard hit was “Rubber Duckie,” which was on the Hot 100 for seven weeks in 1970, topping out at No. 16. The tune was performed by Jim Henson himself, in character as Ernie — and was also nominated for a Grammy for Best Recording for Children that year. Little Richard covered the song in 1994, and an all-star version for National Rubber Duckie Day, featuring Tori Kelly, James Corden, Sia, Jason Derulo, Daveed Diggs, and Anthony Mackie, was released in 2018.

 The Cookie Monster performing on a stage.
Credit: Brian Killian/ WireImage via Getty Images

Cookie Monster Has a Five-Note Singing Range

Not only is "C" standing for cookie good enough for Cookie Monster; so is a five-note singing range. While he was never shy about showing off his vocals, Cookie Monster's range has always been limited. (Thus, you rarely hear a Cookie-fronted ballad!) "If Grover and Cookie are singing a duet, the whole thing sounds like 'arrggh,'" the show's musical director, Bill Sherman, said in 2019, mimicking the sounds of the monster's gargling. "Sometimes that really works."

On the Sesame Street set Big Bird puppeteer rehearses a scene with Mr. Snuffleupagus.
Credit: Ira Berger/ Alamy Stock Photo

Snuffleupagus Remained Imaginary for 14 Years

Big Bird first mentioned his imaginary friend Snuffleupagus — or Snuffy for short — in a 1971 episode. But for more than a decade, he remained a mystery, seemingly just a figment of the bird's imagination. In 1985, however, as child abuse cases started dominating the news, producers decided it was essential to teach children that adults would believe them when they spoke up. So on the show's 17th season premiere, the elephant-like creature showed up in person to help Big Bird water flowers with his trunk.

A Puppet 'Kermit the Frog' character of the famous TV series Sesame Street.
Credit: Anadolu Agency via Getty Images

Kermit the Frog Was Originally Made Out of a Coat

The very first rendition of Kermit had more of a lizard-like feel; Henson made him back in 1955 for Sam and Friends, a five-minute program that aired on a Washington, D.C., affiliate station. He was stitched together out of Henson's mother's old spring coat and pieces of Henson's own jeans, with bug eyes made of ping-pong balls. Later, he got a more saturated green hue and more frog-like features. Though not currently on display, the original Kermit is part of the National Museum of American History's collection.
