Interesting Facts

While the things we see and use daily may sometimes be considered mundane, there’s more to them than you might imagine. Did you know that the stick (like the ones you may have just raked up in your yard) was inducted into the National Toy Hall of Fame? When you ate a bagel for breakfast, did you think back to its days as a gift for new mothers? Learn about these stories and more with some mind-expanding facts about everyday items collected from around our website.

Original photo by Joe Hendrickson/ iStock

A pink love seat couch with candles on the sides.
Credit: JasminkaM/ Shutterstock

Love Seats Were Originally Designed To Fit Women’s Dresses, Not Couples

The two-seater upholstered benches we associate with cozy couples were initially crafted with another duo in mind: a woman and her dress. Fashionable attire in 18th-century Europe had reached voluminous proportions — panniers (a type of hooped undergarment) were all the rage, creating a wide-hipped silhouette that occasionally required wearers to pass through doors sideways. Upper-class women with funds to spare adopted billowing skirts that often caused an exhausting situation: the inability to sit down comfortably (or at all). Furniture makers of the period caught on to the need for upsized seats that would allow women with large gowns a moment of respite during social calls.

As the 1800s rolled around, so did new dress trends. Women began shedding heavy layers of hoops and skirts for a slimmed-down silhouette that suddenly made small settees spacious. The midsize seats could now fit a conversation companion. When sweethearts began sitting side by side, the bench seats were renamed “love seats,” indicative of how courting couples could sit together for a (relatively) private conversation in public. The seat’s new use rocketed it to popularity, with some featuring frames that physically divided young paramours. While the small sofas no longer act as upholstered chaperones, love seats are still popular — but mostly because they fit well in small homes and apartments.

Bagels being put into baskets.
Credit: Jordan González/ Unsplash+

Bagels Were Once Given as Gifts to Women After Childbirth

After a woman has had a bun in the oven for nine months, presenting her with a bagel might seem like a strange choice. But some of the earliest writings on bagels relate to the idea of giving them as gifts to women after labor. Many historians believe that bagels were invented in the Jewish community of Krakow, Poland, during the early 17th century. Their circular shape echoes the round challah bread eaten on the Jewish new year, Rosh Hashanah. Enjoying round challah is meant to bring good luck, expressing the hope that endless blessings — goodness without end — will arrive in the coming year. Likewise, in Krakow centuries ago, a bagel signified the circle of life and longevity for the child.

Community records in Krakow advised that bagels could be bestowed on both expectant and new moms. They were also regarded as a thoughtful gift for midwives. In addition to the symbolism of the round shape, the bread was believed to bring a pregnant woman or midwife good fortune in a delivery by casting aside evil spirits. Some pregnant women even wore bagels on necklaces as protection, or ensured bagels were present in the room where they gave birth.

Tiny front pocket on denim pants, close up.
Credit: triocean/ Shutterstock

The Tiny Pocket in Your Jeans Was Created To Store Pocket Watches

Ever notice the tiny pocket-within-a-pocket in your jeans? As a kid you may have put small change in there, whereas most adults tend to forget it even exists. Despite all the names it’s had over the years — “frontier pocket,” “coin pocket,” and “ticket pocket” being just a few — it originally had a specific purpose that none of those names quite captures: It was a place to put your watch.

Originally called waist overalls when Levi Strauss & Co. first began making them in 1879, the company’s jeans have always had this dedicated spot for pocket watches — especially those worn by miners, carpenters, and the like. The jeans had only three other pockets (one on the back and two on the front) at the time, making the watch pocket especially prominent. As for why it’s stuck around, the answer seems to be a familiar one: People were used to it, and no one felt inclined to phase it out.

Close-up of three pens with caps on resting on a lined paper sheet.
Credit: Marta Nogueira/ Shutterstock

Pen Caps Have Holes for Safety Reasons

If you’ve ever gotten bored enough to study the cap of your ballpoint pen, you may have noticed that it has a hole in it. The hole wasn’t created to save on plastic or to regulate air pressure. Rather, the design is meant to prevent people — namely small children — from choking should they ever swallow a cap. This was first done by BIC, whose popular Cristal pen had a cap that proved more desirable among undiscerning children than safety-conscious parents would have liked. So while the conspiracy-minded among us tend to think that the holes are there to dry out the ink and ensure that consumers will have to continue buying pens in mass quantities, this particular design choice was actually made with public health in mind.

A card going into a vending machine.
Credit: Getty Images/ Unsplash+

The World’s First Vending Machine Dispensed Holy Water

Democracy, theater, olive oil, and other bedrocks of Western civilization all got their start with the Greeks. Even some things that might seem like squarely modern inventions have Hellenistic roots, including the humble vending machine. In the first century CE, Greek engineer and mathematician Heron of Alexandria published a two-volume treatise on mechanics called Pneumatica. Within its pages was an assortment of mechanical devices capable of all types of wonders: a never-ending wine cup, rudimentary automatic doors, singing mechanical birds, various automata, the world’s first steam engine, and a coin-operated vending machine.

Heron’s invention wasn’t made with Funyuns and Coca-Cola in mind, however: It dispensed holy water. In Heron’s time, Alexandria was a Greek-speaking city under Roman rule and home to a cornucopia of religions with Roman, Greek, and Egyptian influences. To stand out, many temples hired Heron to supply mechanical miracles meant to encourage faith in believers. Some of these temples also had holy water, and experts believe Heron’s vending machine was invented to rein in acolytes who took too much of it. The mechanism was simple enough: When a coin was inserted in the machine, it weighed down a balancing arm, which in turn pulled a string that opened a plug on a container of liquid. Once the coin dropped off the arm, the liquid stopped flowing. It would be another 1,800 years before modern vending machines began to take shape — many of them using the same principles as Heron’s miraculous holy water dispenser.
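For the curious, that fixed-dose principle can be sketched in a few lines of Python. This is purely an illustration of the mechanism described above; the weight threshold, timing, and flow rate are invented for the example, not taken from Pneumatica.

# Toy model of Heron's coin-operated dispenser: a sufficiently heavy coin
# tips a balance arm, which opens a valve; once the coin slides off, the
# valve closes again. Every patron gets the same fixed dose per coin.
# All numbers are illustrative, not historical.

def dispense(coin_weight_grams: float,
             arm_trigger_weight_grams: float = 5.0,
             seconds_arm_stays_tipped: float = 2.0,
             flow_rate_ml_per_second: float = 4.0) -> float:
    """Return the volume of holy water (in ml) released for one coin."""
    if coin_weight_grams < arm_trigger_weight_grams:
        return 0.0  # too light to tip the arm; the valve never opens
    return seconds_arm_stays_tipped * flow_rate_ml_per_second

print(dispense(8.0))  # a full-weight coin -> 8.0 ml
print(dispense(1.0))  # a pebble or clipped coin -> 0.0 ml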

Hands spreading butter on toasted bread.
Credit: Oksana Mizina/ Shutterstock

The Ancient Romans Thought Eating Butter Was Barbaric

Our friends in ancient Rome indulged in a lot of activities that we would find unseemly today — like gladiators fighting to the death — but they drew the line at eating butter. To do so was considered barbaric, with Pliny the Elder going so far as to call butter “the choicest food among barbarian tribes.” In addition to a general disdain for drinking too much milk, Romans took issue with butter specifically because they used it for treating burns and thus thought of it as a medicinal salve, not a food.

They weren’t alone in their contempt. The Greeks also considered the dairy product uncivilized, and “butter eater” was among the most cutting insults of the day. In both cases, this can be partly explained by climate — butter didn’t keep as well in warm southern climates as it did in northern Europe, where groups such as the Celts gloried in their butter. Instead, the Greeks and Romans relied on olive oil, which served a similar purpose. To be fair, though, Romans considered anyone who lived beyond the empire’s borders (read: most of the world) to be barbarians, so butter eaters were in good company.

Aerial view of honeycombs on a white surface.
Credit: Olivie Strauss/ Unsplash+

Honey Never Expires

As long as it’s stored properly, honey will never expire. Honey has an endless shelf life, as proven by the archaeologists who unsealed King Tut’s tomb in 1923 and found containers of honey within it. After performing a not-so-scientific taste test, researchers reported the 3,000-year-old honey still tasted sweet.

Honey’s preservative properties have a lot to do with how little water it contains. Some 80% of honey is made up of sugar, with only 18% being water. Having so little moisture makes it difficult for bacteria and microorganisms to survive. Honey is also so thick that little oxygen can penetrate — another barrier to bacteria’s growth. Plus, the substance is extremely acidic, thanks to a special enzyme in bee stomachs called glucose oxidase. When mixed with nectar to make honey, the enzyme produces gluconic acid and hydrogen peroxide, byproducts that lower the sweetener’s pH level and kill off bacteria.
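Spelled out, that reaction looks roughly like this (in simplified form; the enzyme actually produces an intermediate compound first, which then converts to gluconic acid):

glucose + O₂ + H₂O → gluconic acid + hydrogen peroxide (H₂O₂)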

Despite these built-in natural preservatives, it is possible for honey to spoil if it’s improperly stored. In a sealed container, honey is safe from humidity, but when left open it can absorb moisture that makes it possible for bacteria to survive. In most cases, honey can be safely stored for years on end, though the USDA suggests consuming it within 12 months for the best flavor.

Plain cooked spaghetti pasta on a fork.
Credit: Oko Laa/ Shutterstock

The Name for a Single Spaghetti Noodle Is “Spaghetto”

If you go into an Italian restaurant and order spaghetto, chances are you’ll leave hungry. That’s because “spaghetto” refers to just a lone pasta strand; it’s the singular form of the plural “spaghetti.” Other beloved Italian foods share this same grammatical distinction — one cannoli is actually a “cannolo,” and it’s a single cheese-filled “raviolo” or “panino” sandwich. Though this may seem strange given that these plural terms are so ingrained in the English lexicon, Italian language rules state that a word ending in -i means it’s plural, whereas an -o or -a suffix (depending on whether it’s a masculine or feminine term) denotes singularity. (Similarly, “paparazzo” is the singular form of the plural “paparazzi.”) As for the term for the beloved pasta dish itself, “spaghetti” was inspired by the Italian word spago, which means “twine” or “string.”

A living room with a brown leather couch.
Credit: Clay Banks/ Unsplash+

Couches and Sofas Aren’t the Same Thing

Though usually used interchangeably, these are technically two different pieces of furniture — and the distinction lies in the words themselves. “Couch” comes to us from French, namely coucher — “to lie down” — whereas we have the Arabic word suffah to thank for “sofa.” In the most traditional sense, a sofa would be a wooden bench that comes complete with blankets and cushions and is intended for sitting. eBay’s selling guide used to distinguish between the two by defining a couch as “a piece of furniture with no arms used for lying.” Though it may be a distinction without a difference these days, purists tend to think of sofas as a bit more formal and couches as something you’d take a nap on and let your pets hang out on.

Steel railings of a community pool.
Credit: Enis Yavuz/ Unsplash

U.S. Pools Were Originally Designed to Keep the Masses Clean

Boston’s Cabot Street Bath was the nation’s first indoor municipal pool. Founded in 1868, the pool was on the bleeding edge of what would become a boom in baths designed to help the working classes clean up. The short-lived facility (it was open for only eight years) was soon followed by municipal baths and pools all over the nation, especially in cities with growing immigrant populations whose tenement apartments didn’t contain adequate bathing facilities.

In New York, starting in 1870, river water filled floating, pool-like public baths that, according to one onlooker, were as filthy as “floating sewers.” Eventually, by about the mid-20th century, the city’s river baths morphed into the indoor pools we know today — though the city does still have some seasonal outdoor pools.

Close-up of golf balls in a hole.
Credit: Kamran Abdullayev/ Unsplash+

There Are Golf Balls on the Moon

On February 6, 1971, Alan Shepard took one small shot for golf and one giant swing for golfkind. An astronaut on the Apollo 14 mission, Shepard was also a golf enthusiast who decided to bring his hobby all the way to the moon — along with a makeshift club fashioned partly from a sample-collection device. He took two shots, claiming that the second went “miles and miles.” The United States Golf Association (USGA) later put the actual distance of his two strokes at about 24 yards and 40 yards, respectively.

While not enough to land him a spot on the PGA Tour, those numbers are fairly impressive when you remember that the stiff spacesuit Shepard was wearing (in low gravity, no less) forced him to swing with one arm. And while those two golf balls remain on the moon, Shepard brought his club back, later donating it to the USGA Museum in Liberty Corner, New Jersey. Other objects now residing on the moon include photographs, a small gold olive branch, and a plaque that reads: “Here men from the planet Earth first set foot upon the Moon July 1969, A.D. We came in peace for all mankind.”

Close-up of a traffic stop sign.
Credit: Will Porada/ Unsplash

The Inventor of the Stop Sign Never Learned How To Drive

Few people have had a larger or more positive impact on the way we drive than William Phelps Eno, sometimes called the “father of traffic safety.” The New York City-born Eno — who invented the stop sign around the dawn of the 20th century — once traced the inspiration for his career to a horse-drawn-carriage traffic jam he experienced as a child in Manhattan in 1867: “There were only about a dozen horses and carriages involved, and all that was needed was a little order to keep the traffic moving,” he later wrote. “Yet nobody knew exactly what to do; neither the drivers nor the police knew anything about the control of traffic.”

After his father’s death in 1898 left him with a multimillion-dollar inheritance, Eno devoted himself to creating a field that didn’t otherwise exist: traffic management. He developed the first traffic plans for New York, Paris, and London. In 1921, he founded the Washington, D.C.-based Eno Center for Transportation, a research foundation on multimodal transportation issues that still exists. One thing Eno didn’t do, however, was learn how to drive. Perhaps because he had such extensive knowledge of them, Eno distrusted automobiles and preferred riding horses. He died in Connecticut at the age of 86 in 1945, having never driven a car.

Tiny sticks on the ground.
Credit: blair yang/ Unsplash

The Stick Has Been Inducted Into the National Toy Hall of Fame

From teddy bears to train sets, classic playthings of youth often conjure memories of a gleaming toy store, holidays, or birthdays. So curators at the Strong National Museum of Play branched out when they added the stick to their collection of all-time beloved toys. Among the most versatile amusements, sticks have inspired central equipment in several sports, including baseball, hockey, lacrosse, fencing, cricket, fishing, and pool. Humble twigs are also ready-made for fetch, slingshots, toasting marshmallows, and boundless make-believe.

Located in Rochester, New York — about 70 miles northeast of Fisher-Price’s headquarters — the Strong acquired the fledgling National Toy Hall of Fame in 2002. (It was previously located in the Gilbert House Children’s Museum in Salem, Oregon.) To date, 74 toys have been inducted, including Crayola Crayons, Duncan Yo-Yos, and bicycles. The stick was added in 2008, three years after another quintessential source of cheap childhood delight: the cardboard box.

Boxes of Eggo Waffles sit for sale.
Credit: Andrew Burton/ Getty Images News via Getty Images

Eggo Waffles Were Originally Called Froffles

The brothers behind your favorite frozen waffles took a while to iron out the details of their signature product. Working in their parents’ basement in San Jose, California, in the early 1930s, Frank, Anthony, and Sam Dorsa first whipped up their own brand of mayonnaise. Since the base ingredient of mayonnaise is egg yolks — and the brothers took pride in using “100% fresh ranch eggs” — they christened their fledgling company “Eggo.” Despite launching the business during the Great Depression, Eggo mayonnaise sold like hotcakes, motivating the Dorsas to extend their product line. Soon, they were selling waffle batter — another egg-based product. To simplify shipping, they also whipped up a powdered mix that required only the addition of milk.

When the frozen food industry took off in the 1950s, the brothers wanted to take advantage of the rush to the freezer aisle. Frank Dorsa (a trained machinist) repurposed a carousel engine into a rotating device that could anchor a series of waffle irons, each cooking a breakfast treat that was flipped by a factory employee. The machine allowed Eggo to prepare thousands of freezer-bound waffles per hour. These debuted in grocery stores in 1953 under the name “Froffles,” a portmanteau of “frozen” and “waffles.” Customers referred to them simply as “Eggos,” and the Froffles moniker was dropped within two years. Now a Kellogg’s-owned brand, Eggo serves up waffles as well as other frozen breakfast treats, with mayonnaise — and the name Froffles — but a distant memory.

A can opener to open metallic can on the table in the kitchen.
Credit: FotoDuets/ Shutterstock

Canned Food Was Invented Before the Can Opener

On January 5, 1858, Ezra J. Warner of Connecticut invented the can opener. The device was a long time coming: Frenchman Nicolas Appert had developed the canning process in the early 1800s in response to a 12,000-franc prize the French government offered to anyone who could come up with a practical method of preserving food for Napoleon’s army. Appert devised a process for sterilizing food by half-cooking it, storing it in glass bottles, and immersing the bottles in boiling water, and he claimed the award in 1810. Later the same year, Englishman Peter Durand received the first patent for preserving food in actual tin cans — which is to say, canned food predates the can opener by nearly half a century.

Though he didn’t initially know why his method of storing food in glass jars and heating them worked, years of experimentation led Appert to rightly conclude that “the absolute deprivation from contact with the exterior air” and “application of the heat in the water-bath” were key. He later switched to working with cans himself. Before Warner’s invention, cans were opened with a hammer and chisel — a far more time-consuming approach than the gadgets we’re used to. Warner’s tool (employed by soldiers during the Civil War) wasn’t a perfect replacement, however: It used a series of blades to puncture and then saw off the top of a can, leaving a dangerously jagged edge. As for the hand-crank can opener most commonly used today, that wasn’t invented until 1925.

Balled up paper and a pencil eraser marks.
Credit: Leigh Prather/ Shutterstock

Before Erasers, People Used Bread To Rub Out Pencil Marks

The very first pencils arrived around the dawn of the 17th century, after graphite (the real name for the mineral that forms a pencil’s “lead”) was discovered in England’s Lake District. But the eraser didn’t show up until the 1770s, at the tail end of the Enlightenment. So what filled the roughly 170-year-long gap? Look no further than the bread on your table. Back in the day, artists, scientists, government officials, and anyone else prone to making mistakes would wad up a small piece of bread and moisten it ever so slightly. The resulting ball of dough erased pencil marks on paper almost as well as those pink creations found on the end of No. 2 pencils today.

But in 1770, English chemist Joseph Priestley (best known for discovering oxygen) wrote about “a substance excellently adapted to the purpose of wiping from paper the marks of a black lead pencil.” This substance, then known as caoutchouc, was so perfect for “rubbing” out pencil marks that it soon became known simply as “rubber.” Even today, people in the U.K. still refer to erasers as “rubbers.” (The name “lead-eater” never quite caught on.)

View of an Apple iPhone.
Credit: Daniel Frank/ Unsplash+

The First Smartphone Debuted in 1992

On January 9, 2007, Apple CEO Steve Jobs revealed the iPhone to the world. Since then, Apple’s pricey slab of glass stuffed with technology has become more or less synonymous with the word “smartphone” (sorry, Android fans). But smartphones predate the iPhone by more than a decade. To pinpoint the smartphone’s true birthdate, look back to November 23, 1992, and the introduction of IBM’s Simon at a trade show in Las Vegas. Today, IBM is best known for supercomputers, IT solutions, and enterprise software, but in the ’80s and early ’90s the company was a leader in consumer electronics — a position it hoped to solidify with Simon.

Simon was a smartphone in every sense of the word. It was completely wireless and had a digital assistant, touchscreen, built-in programs (calculator, to-do list, calendar, sketch pad, and more), and third-party apps, something even the original iPhone didn’t have. The idea was so ahead of its time, there wasn’t even a word for it yet — “smartphone” wasn’t coined for another three years. Instead, its full name when it debuted to the larger public in 1993 was the Simon Personal Communicator, or IBM Simon for short.

But there’s a reason there isn’t a Simon in everyone’s pocket today. For one thing, the phone had only one hour of battery life. Once it died, it was just a $900 brick (technology had a long way to go before smartphones became pocket-sized; Simon was 8 inches long by 2.5 inches wide). Cell networks were still in their infancy, so reception was spotty at best, which is why the Simon came with a port for plugging into standard phone jacks. In the mid-aughts, increases in carrier capacity and the shrinking of electronic components created the perfect conditions for the smartphones of today. Unfortunately for Simon, it was too late.

Three new replacement windows with green trim on front of house.
Credit: Noel V. Baebler/ Shutterstock

Britain Used To Have a Special Tax on Windows

Governments worldwide have levied taxes for thousands of years; the oldest recorded tax comes from Egypt around 3000 BCE. But England — which relied heavily on taxes to fund its military conquests — is known for a slate of fees that modern taxpayers might consider unusual. Take, for instance, the so-called “window tax,” initially levied in 1696 by King William III, which annually charged citizens a certain amount based on the number of windows in their homes. Some 30 years before, the British crown had attempted to tax personal property based on chimneys, but clever homeowners could avoid the bill by temporarily bricking up or dismantling their hearths and chimneys before inspections. With windows, assessors could quickly determine a building’s value from the street. The tax was progressive, charging nothing for homes with few or no windows and increasing the bill for dwellings that had more than 10 (that number would eventually shrink to seven).

Not surprisingly, homeowners and landlords throughout the U.K. resented the tax. It didn’t take long for windows to be entirely bricked or painted over (much like fireplaces had been), and new homes were built with fewer windows altogether. Opponents called it a tax on “light and air” that hurt public health, citing reduced ventilation that in turn encouraged disease. Even famed author Charles Dickens joined the fight to dismantle the tax, publishing scathing pieces aimed at Parliament on behalf of poor citizens who were most impacted by the lack of fresh air. Britain repealed its window tax in July 1851, but the architectural impact is still evident — many older homes and buildings throughout the U.K. still bear their telltale bricked-up windows.

Philadelphia cream cheese packaging.
Credit: NurPhoto via Getty Images

Philadelphia Cream Cheese Isn’t Actually From Philadelphia

The City of Brotherly Love has clear-cut claims on many food origins — cheesesteaks, stromboli, and even root beer. But one thing’s for sure: Despite the name, Philadelphia Cream Cheese is definitely not from Philly. The iconic dairy brand secured its misleading name (and gold-standard status) thanks to a marketing ploy that’s been working for more than 150 years … and it’s all because of Pennsylvania’s reputation for impeccable dairy. Small Pennsylvania dairies of the 18th and early 19th centuries were known for using full-fat milk and cream to make rich cheeses — in contrast to New York dairies, which mostly used skim milk — and because the perishables couldn’t be easily transported, they gained a reputation as expensive luxury foods.

So when upstate New York entrepreneur William Lawrence began making his skim milk and (for richness) lard-based cream cheese in the 1870s, he needed a name that would entice customers and convey quality despite it being made in Chester, New York, and not Philadelphia. Working with cheese broker and marketing mastermind Alvah Reynolds, Lawrence branded his cheese under the Philadelphia name in 1880, boosting sales and cementing its popularity with home cooks well into the early 1900s.

Plastic tags made for bread bags.
Credit: amonphan comphanyo/ Shutterstock

The Color of Your Bread Tag Has an Important Meaning

Ever wonder why the tags used to seal loaves of bread come in different colors? Far from arbitrary, the color-coded system indicates which day of the week the bread was baked. The color system is even alphabetical: Monday is blue, Tuesday is green, Thursday is red, Friday is white, and Saturday is yellow. (Traditionally, bread wasn’t delivered on Wednesday or Sunday.)
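For anyone who wants to play along at home, the convention described above amounts to a simple lookup table. The Python snippet below is only a sketch of that commonly cited scheme; individual bakeries may use their own variations.

# Commonly cited bread-tag convention: colors run alphabetically across
# the traditional baking days. Individual bakeries may differ.
TAG_COLOR_TO_BAKING_DAY = {
    "blue": "Monday",
    "green": "Tuesday",
    "red": "Thursday",
    "white": "Friday",
    "yellow": "Saturday",
}

def baking_day(tag_color: str) -> str:
    """Return the weekday a loaf was (probably) baked, given its tag color."""
    return TAG_COLOR_TO_BAKING_DAY.get(tag_color.lower(), "unknown")

print(baking_day("red"))    # -> Thursday
print(baking_day("White"))  # -> Friday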

Because bread rarely remains on the shelf for more than a few days, this system is more for internal use among employees than it is for customers looking to get the freshest sourdough possible. But if you favor a local bakery and get to know their system, you could either snag the best deals or the fluffiest dinner rolls in town.

Snickers chocolate bar packaging .
Credit: NurPhoto via Getty Images

The Snickers Candy Bar Was Named After One of the Mars Family’s Favorite Horses

While names like Hershey’s and 3 Musketeers (which originally included three bars) are fairly straightforward, some candy bar monikers are more elusive. Case in point: What, exactly, is a Snickers? Actually it’s a “who” — and not a human “who” at that. The candy bar was named after one of the Mars family’s favorite horses. Franklin Mars founded Mars, Incorporated (originally known as Mar-O-Bar Co.) in 1911, introducing Snickers in 1930; when it came time to name his product, he immortalized his equine friend as only a candy magnate could.

As Mars has grown into America’s fourth-largest private company, it has retained a dual focus on both candy and pets. M&M’s, Twix, and Milky Way are all Mars products, as are Iams, Pedigree, and Royal Canin. If you’ve ever wondered how M&M’s got their name, the story is slightly less interesting — it’s simply the last initials of Forrest Mars (Frank’s son) and partner-in-candy Bruce Murrie. The company is known for secrecy, with the family itself having been described as a “reclusive dynasty,” which means it’s a minor miracle that the identity of Snickers the horse was ever revealed in the first place.

Barcode scanner at check out.
Credit: hurricanehank/ Shutterstock

The First Product Scanned With a Barcode Was Juicy Fruit Gum

When Marsh Supermarket cashier Sharon Buchanan rang up a 10-pack of Juicy Fruit on June 26, 1974, and heard a telltale beep, her face must have registered relief. Buchanan’s co-workers at the grocery store in Troy, Ohio, had placed barcodes on hundreds of items the night before, as the National Cash Register Company installed the shop’s new computers and scanners. Buchanan’s “customer” for that first purchase was Clyde Dawson, the head of research and development at Marsh Supermarkets, Inc. For that fateful checkout, Dawson chose the gum, made by the Wrigley Company, because some had wondered if the machine would have trouble reading the item’s very small barcode. It didn’t. Today, one of Marsh’s earliest scanners is part of the collection of the Smithsonian’s National Museum of American History.

A microwave surrounded by colorful cabinets.
Credit: Lissete Laverde/ Unsplash

The Microwave Was Invented by Accident, Thanks to a Melted Chocolate Bar

The development of radar helped the Allies win World War II — and oddly enough, the technological advances of the war would eventually change kitchens forever. In 1945, American inventor Percy Spencer was fooling around with a British cavity magnetron, a device built to make radar equipment more accurate and powerful. To his surprise, microwaves produced from the radar melted a chocolate bar (or by some accounts, a peanut cluster bar) in his pocket. Spencer quickly realized that magnetrons might be able to do something else: cook food.

With the help of a bag of popcorn and, some say, a raw egg, Spencer proved that magnetrons could heat and even cook food. First marketed as the Radarange, the microwave oven launched for home use in the 1960s. Today, microwave ovens are as ubiquitous as the kitchen sink — all thanks to the Allied push to win the war.

Two people reading inside a library.
Credit: Getty Images/ Unsplash+

Libraries Predate Books

While books are a fixture of today’s libraries, humans long constructed great centers of learning without them. That includes one of the oldest known significant libraries in history: the Library of Ashurbanipal. This library, established in modern-day Mosul, Iraq, by the Assyrian King Ashurbanipal in the seventh century BCE, contained nothing we would recognize today as a book. Instead, it was a repository of 30,000 clay tablets and writing boards covered in cuneiform — the oldest writing system in the world. Much like your local public library, this royal collection covered a variety of subjects, including legislation, financial statements, divination, hymns, medicine, literature, and astronomy.

Happy woman running around with an opened umbrella.
Credit: Getty Images/ Unsplash+

Umbrellas Were Once Used Only by Women

Umbrellas have been around for a long time — at least 3,000 years, according to T.S. Crawford’s A History of the Umbrella — but they were used by only select segments of the population for much of that history. Ancient Egyptians used them to shade their pharaohs, setting the tone for an association with royalty and nobility that would also surface in China, Assyria, India, and other older civilizations. Meanwhile, they were deemed effeminate by the ancient Greeks, as well as by the Romans who adopted many of their cultural habits. It should be noted that these early umbrellas protected against the sun, not rain, and were generally used by women to shield their complexions. The association between women and umbrellas persisted through much of Europe for centuries, and stubbornly remained into the 18th century, even after the first waterproof umbrellas had been created (around the 17th century in France).

In England, at least, the man credited with ushering in a new age of gender-neutral weather protection was merchant and philanthropist Jonas Hanway. Having spotted the umbrella put to good use during his many travels, Hanway took to carrying one through rainy London in the 1750s, a sight met with open jeering by surprised onlookers. The greatest abuse apparently came from coach drivers, who counted on inclement weather to drive up demand for a dry, comfy ride. But Hanway took the derision in stride. Shortly after his death in 1786, an umbrella advertisement surfaced in the London Gazette, a harbinger of sunnier days to come for the accessory’s reputation as a rain repellant for all.

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.