In the Northern Hemisphere, July’s arrival signals the full swing of summer. With school out and vacation on the mind, the days of our seventh month become jam-packed with barbecues, adventure, and holiday celebrations — all enjoyed while enduring some of the hottest weather of the year. But there’s more to the month than just its scorching temps, so read on for six interesting facts about the dog days of July.
The Gregorian calendar divides our year into 12 months, with July sitting in the seventh spot. But it wasn’t always this way; at one time, July had an entirely different name. Under the early Roman calendar, July was called Quintilis — from the Latin word for “fifth,” marking its place as the fifth month of the 10-month year. The month owes its current name to statesman and leader Julius Caesar: Quintilis was renamed in his honor following his assassination in 44 BCE, a tribute to his birth month. By then, the addition of January and February to the calendar had slid July out of fifth place and into its current seventh spot.
If you’ve RSVP’d to a seemingly endless stream of birthday parties in July, it’s no surprise: July kicks off one of the most popular stretches of birthdays in the year. While August reigns as the most common birth month, July comes in second, with the 12-week popularity wave ending in September (the third-most popular month).
As for specific dates, July 7 is circled as the sixth-most common birthday, with an average of 12,108 babies born on that day each year. However, not every day in the summer month is popular for new arrivals; July 4 is the fifth-least common birthday among Americans, with an average of just 8,796 babies born. The summer holiday is beaten out in unpopularity only by days that fall in winter, among them Christmas, New Year’s Day, and New Year’s Eve.
In the U.S., July’s grandest holiday is Independence Day, marked with a day of fireworks, fanfare, and food. But the culinary celebrations don’t have to end after the Fourth of July is through. The summer month hosts a handful of unofficial food-related holidays that are perfectly timed to summer cravings. Dessert lovers can celebrate the season with National Apple Turnover Day on July 5, along with National Sugar Cookie Day on July 9 and National Hot Fudge Sundae Day on July 25. National Piña Colada Day arrives on July 10, followed by Mojito Day on July 11, and both chicken wings and lasagna are honored on July 29. However, one star of the seasonal backyard barbecue gets more than just a day; sausages are honored for a full four weeks thanks to July’s designation as National Hot Dog Month. But if hot dogs aren’t your thing, we have good news: It’s also National Ice Cream Month.
The First Bikini Debuted in July
Pools, beaches, and aquatic parks are practically midway through their operating season come the dog days of July, the same month that marks the debut of a popular piece of attire often worn in water: the bikini. French designer Louis Réard unveiled his tiny two-piece swimsuit on July 5, 1946, at a Paris swimming pool. Réard’s goal was to create the smallest two-piece swimsuit possible, and it’s likely he was inspired by postwar fabric shortages; his original design used just 30 square inches of material.
The first bikini was scandalous, with Réard initially unable to find a model willing to debut his creation in public. But the small two-piece suits soon became popular in Europe, commonly seen on beaches throughout the 1950s. Within a decade, the bikini trend gained momentum and jumped across the pond to American swimmers. As for the unusual name, Réard named his swimsuit for Bikini Atoll, a coral island in the Marshall Islands that the U.S. had recently begun using as a nuclear test site — a moniker meant to suggest that his invention would make a similarly explosive impact.
Most people look to the summer night sky in anticipation of fireworks or an astronomical spectacle (like a glimpse of the planet Venus, which appears to glow its brightest in early July). However, July also offers the best odds of catching sight of an unidentified flying object.
The phenomenon dates as far back as July 1947, when New Mexico rancher W.W. Brazel sparked generations of skywatchers with his report of a downed spacecraft — an event we now call the Roswell Incident. Despite a flurry of conspiracy theories, Brazel’s account of finding debris from a fallen UFO was disproven, explained by U.S. military officials as a crashed weather balloon. But in the decades that followed, reports of UFO sightings only grew; according to the National UFO Reporting Center, which has collected data on UFO sightings since 1974, more reports are made in July than in any other month. While it’s unclear just why summer lends itself to more UFO sightings, one theory nods at the best parts of the season: Spending more time outside gives us more chances to see something unusual, while spooky summer blockbusters prime our brains to see the supernatural.
As Earth travels its constant path around the sun, there comes a time when our planet is at its farthest point from our home star — which happens in July. On an average day, Earth sits a snug 93 million miles from the sun, but because the planet’s orbit is an ellipse — in which the sun isn’t perfectly centered — our distance from the star waxes and wanes throughout the year. In early July, Earth experiences its aphelion, aka a planet’s farthest distance from the sun, winding up a mind-bending 94.5 million miles away. (Come January, Earth will reach its perihelion, aka the closest position to the sun, measuring 91.4 million miles away.) Aphelion is predictable, normally occurring around two weeks after the summer solstice. In 2023, Earth’s aphelion occurs on July 6 at 4:07 p.m. EDT.
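Those two distances follow directly from the geometry of an ellipse. Here is a minimal sketch in Python, assuming commonly cited values for Earth’s semi-major axis (about 92.96 million miles) and orbital eccentricity (about 0.0167), neither of which appears in the text above:

```python
# Earth's orbital parameters (approximate, assumed values):
SEMI_MAJOR_AXIS_MILES = 92.96e6  # mean Earth-sun distance ("a snug 93 million miles")
ECCENTRICITY = 0.0167            # how far the orbit deviates from a perfect circle

# For an ellipse, the farthest and closest points from the focus (the sun) are:
aphelion = SEMI_MAJOR_AXIS_MILES * (1 + ECCENTRICITY)
perihelion = SEMI_MAJOR_AXIS_MILES * (1 - ECCENTRICITY)

print(f"Aphelion:   {aphelion / 1e6:.1f} million miles")    # ~94.5
print(f"Perihelion: {perihelion / 1e6:.1f} million miles")  # ~91.4
```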
Nicole Garner Meeker
Writer
Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.
When you sing along to an old standard, do you ever really think about the story behind the music? Songs like “Happy Birthday to You,” “Twinkle Twinkle Little Star,” and (at least in the U.S.) “The Star Spangled Banner” feel like they’ve been with us forever, but all of them started somewhere. Here are a few fun facts about these beloved songs, from the popular (but false) story about “Twinkle Twinkle Little Star” to what “Hokey Pokey” and “Old MacDonald” sound like in other parts of the world.
“Happy Birthday to You” Was Originally “Good Morning to All”
Despite being one of the best-known songs in the English language, “Happy Birthday to You” famously went through a huge copyright battle. Sisters Patty and Mildred Hill initially published the song in 1893, and it remained under copyright, most recently held by Warner Music Group, until a judge deemed the copyright invalid in 2015. But the original song written by the Hill sisters wasn’t “Happy Birthday to You” — the melody belonged to a greeting song called “Good Morning to All.” “Happy Birthday to You” was simply a variation that popped up in the early 20th century, although it eventually became the main lyric associated with the tune.
Amateur poet Francis Scott Key wrote the U.S. national anthem, “The Star-Spangled Banner,” while witnessing the bombardment of Fort McHenry in Maryland during the War of 1812. As performed today, the anthem, originally called “Defence of Fort M’Henry,” usually starts with “O say can you see, by the dawn’s early light,” and ends with “O say does that star-spangled banner yet wave/o’er the land of the free and the home of the brave.” However, this passage is just one of four verses Key wrote, each of which ends with a refrain similar to those last two lines.
“Twinkle Twinkle Little Star” — also the tune to “Baa Baa Black Sheep” and the alphabet song — is popularly attributed to 18th-century composer and child prodigy Wolfgang Amadeus Mozart, but that’s not accurate. Mozart did, however, write some variations on the tune, possibly as an exercise for his music students. The ditty wasn’t known as “Twinkle Twinkle Little Star” at the time, but as a French folk song about candy called “Ah, vous dirai-je, Maman” (“Ah, Mother, if I could tell you”). The words to “Twinkle Twinkle Little Star” were written by poet Jane Taylor and published in 1806.
It’s “Hokey Pokey” in the U.S. and “Hokey Cokey” in the U.K.
The origins of this popular dance are murky and difficult to untangle, so it’s hard to say with certainty how it ended up this way. But the fact remains: When you put your right hand in, take your right hand out, put your right hand in, and shake it all about, you’re doing the “Hokey Pokey” in the United States and the “Hokey Cokey” in the United Kingdom.
Despite not being written for New Year’s Day, the tune “Auld Lang Syne” has become the traditional ballad of the holiday in many English-speaking countries, including the United States, where it’s sung by thousands who aren’t exactly sure what it means. It’s written in Scots, which sounds similar to English in some ways but is a distinct language. Scots is descended from Northern English, which replaced Scottish Gaelic in some portions of Scotland between the 11th and 14th centuries. The literal translation of auld lang syne is “old long since,” but it effectively means “for old times’ sake.”
“Alouette” Is About Plucking a Bird’s Feathers … and Eyes
If you don’t know the real French lyrics to “Alouette,” it sounds like a sweet French nursery rhyme with a bouncy beat. If you’re a francophone, however, you know that it gets a little dark. The chorus translates to “lark, nice lark, I’ll pluck you,” and the verses alternate different body parts — so it’s useful for teaching children about them. Examples include “I’ll pluck your beak,” “I’ll pluck your head,” and, in some versions, “I’ll pluck your eyes.”
Old MacDonald Goes by Different Names Around the World
Like many traditional songs, “Old MacDonald Had a Farm” lived a few different lives before it became standardized. One version from 1917 is about Old Macdougal with a farm in “Ohio-i-o.” Another version from the Ozarks is about Old Missouri with a mule, he-hi-he-hi-ho. The English-speaking world has pretty much settled on Old MacDonald, but in other countries, the name gets localized. In Sweden, the farmer is named Per Olsson; in Germany, he’s Onkel (Uncle) Jörg.
Sarah Anne Lloyd
Writer
Sarah Anne Lloyd is a freelance writer whose work covers a bit of everything, including politics, design, the environment, and yoga. Her work has appeared in Curbed, the Seattle Times, the Stranger, the Verge, and others.
While the things we see and use daily may sometimes be considered mundane, there’s more to them than you might imagine. Did you know that the stick (like the ones you may have just raked up in your yard) was inducted into the National Toy Hall of Fame? When you ate a bagel for breakfast, did you think back to its days as a gift for new mothers? Learn about these stories and more with some mind-expanding facts about everyday items collected from around our website.
Love Seats Were Originally Designed To Fit Women’s Dresses, Not Couples
The two-seater upholstered benches we associate with cozy couples were initially crafted with another duo in mind: a woman and her dress. Fashionable attire in 18th-century Europe had reached voluminous proportions — panniers (a type of hooped undergarment) were all the rage, creating a wide-hipped silhouette that occasionally required wearers to pass through doors sideways. Upper-class women with funds to spare adopted billowing skirts that often caused an exhausting situation: the inability to sit down comfortably (or at all). Furniture makers of the period caught on to the need for upsized seats that would allow women with large gowns a moment of respite during social calls.
As the 1800s rolled around, so did new dress trends. Women began shedding heavy layers of hoops and skirts for a slimmed-down silhouette that suddenly made small settees spacious. The midsize seats could now fit a conversation companion. When sweethearts began sitting side by side, the bench seats were renamed “love seats,” indicative of how courting couples could sit together for a (relatively) private conversation in public. The seat’s new use rocketed it to popularity, with some featuring frames that physically divided young paramours. While the small sofas no longer act as upholstered chaperones, love seats are still popular — but mostly because they fit well in small homes and apartments.
Bagels Were Once Given as Gifts to Women After Childbirth
After a woman has had a bun in the oven for nine months, presenting her with a bagel might seem like a strange choice. But some of the earliest writings on bagels relate to the idea of giving them as gifts to women after labor. Many historians believe that bagels were invented in the Jewish community of Krakow, Poland, during the early 17th century. Their circular shape echoes the round challah bread eaten on the Jewish new year, Rosh Hashanah. Enjoying round challah is meant to bring good luck, expressing the hope that endless blessings — goodness without end — will arrive in the coming year. Likewise, in Krakow centuries ago, a bagel signified the circle of life and longevity for the child.
Community records in Krakow advised that bagels could be bestowed on both expectant and new moms. They were also regarded as a thoughtful gift for midwives. In addition to the symbolism of the round shape, the bread was believed to bring a pregnant woman or midwife good fortune in a delivery by casting aside evil spirits. Some pregnant women even wore bagels on necklaces as protection, or ensured bagels were present in the room where they gave birth.
The Tiny Pocket in Your Jeans Was Created To Store Pocket Watches
Ever notice the tiny pocket-within-a-pocket in your jeans? As a kid you may have put small change in there, whereas most adults tend to forget it even exists. Despite all the names it’s had over the years — “frontier pocket,” “coin pocket,” and “ticket pocket” being just a few — it originally had a specific purpose that didn’t pertain to any of those objects: It was a place to put your watch.
Originally called waist overalls when Levi Strauss & Co. first began making them in 1879, the company’s jeans have always had this dedicated spot for pocket watches — especially those worn by miners, carpenters, and the like. They only had three other pockets (one on the back and two on the front) at the time, making the watch pocket especially prominent. As for why it’s stuck around, the answer seems to be a familiar one: People were used to it and no one felt inclined to phase it out.
If you’ve ever gotten bored enough to study the cap of your ballpoint pen, you may have noticed that it has a hole in it. The hole wasn’t created to save on plastic or to regulate air pressure. Rather, the design is meant to prevent people — namely small children — from choking should they ever swallow a cap. This was first done by BIC, whose popular Cristal pen had a cap that proved more desirable among undiscerning children than safety-conscious parents would have liked. So while the conspiracy-minded among us tend to think that the holes are there to dry out the ink and ensure that consumers will have to continue buying pens in mass quantities, this particular design choice was actually made with public health in mind.
The World’s First Vending Machine Dispensed Holy Water
Democracy, theater, olive oil, and other bedrocks of Western civilization all got their start with the Greeks. Even some things that might seem like squarely modern inventions have Hellenistic roots, including the humble vending machine. In the first century CE, Greek engineer and mathematician Heron of Alexandria published a two-volume treatise on mechanics called Pneumatica. Within its pages was an assortment of mechanical devices capable of all types of wonders: a never-ending wine cup, rudimentary automatic doors, singing mechanical birds, various automata, the world’s first steam engine, and a coin-operated vending machine.
Heron’s invention wasn’t made with Funyuns and Coca-Cola in mind, however: It dispensed holy water. In Heron’s time, Alexandria was part of Roman Egypt but remained a center of Greek culture, home to a cornucopia of religions with Roman, Greek, and Egyptian influences. To stand out, many temples hired Heron to supply mechanical miracles meant to encourage faith in believers. Some of these temples also offered holy water, and experts believe Heron’s vending machine was invented to ration how much of it acolytes could take. The mechanism was simple enough: When a coin was inserted into the machine, it weighed down a balancing arm, which in turn pulled a string that opened a plug on a container of liquid. Once the coin dropped off the arm, the liquid stopped flowing. It would be another 1,800 years before modern vending machines began to take shape — many of them using the same principles as Heron’s miraculous holy water dispenser.
The Ancient Romans Thought Eating Butter Was Barbaric
Our friends in ancient Rome indulged in a lot of activities that we would find unseemly today — like gladiators fighting to the death — but they drew the line at eating butter. To do so was considered barbaric, with Pliny the Elder going so far as to call butter “the choicest food among barbarian tribes.” In addition to a general disdain for drinking too much milk, Romans took issue with butter specifically because they used it for treating burns and thus thought of it as a medicinal salve, not a food.
They weren’t alone in their contempt. The Greeks also considered the dairy product uncivilized, and “butter eater” was among the most cutting insults of the day. In both cases, this can be partly explained by climate — butter didn’t keep as well in warm southern climates as it did in northern Europe, where groups such as the Celts gloried in their butter. Instead, the Greeks and Romans relied on olive oil, which served a similar purpose. To be fair, though, Romans considered anyone who lived beyond the empire’s borders (read: most of the world) to be barbarians, so butter eaters were in good company.
As long as it’s stored properly, honey will never expire. Honey has an endless shelf life, as proven by the archaeologists who unsealed King Tut’s tomb in 1923 and found containers of honey within it. After performing a not-so-scientific taste test, researchers reported the 3,000-year-old honey still tasted sweet.
Honey’s preservative properties have a lot to do with how little water it contains. Some 80% of honey is made up of sugar, with only 18% being water. Having so little moisture makes it difficult for bacteria and microorganisms to survive. Honey is also so thick, little oxygen can penetrate — another barrier to bacteria’s growth. Plus, the substance is extremely acidic, thanks to a special enzyme in bee stomachs called glucose oxidase. When mixed with nectar to make honey, the enzyme produces gluconic acid and hydrogen peroxide, byproducts that lower the sweetener’s pH level and kill off bacteria.
Despite these built-in natural preservatives, it is possible for honey to spoil if it’s improperly stored. In a sealed container, honey is safe from humidity, but when left open it can absorb moisture that makes it possible for bacteria to survive. In most cases, honey can be safely stored for years on end, though the USDA suggests consuming it within 12 months for the best flavor.
The Name for a Single Spaghetti Noodle Is “Spaghetto”
If you go into an Italian restaurant and order spaghetto, chances are you’ll leave hungry. That’s because “spaghetto” refers to just a lone pasta strand; it’s the singular form of the plural “spaghetti.” Other beloved Italian foods share this same grammatical distinction — one cannoli is actually a “cannolo,” a single cheese-filled pasta pocket is a “raviolo,” and one sandwich is a “panino.” Though this may seem strange given that the plural terms are so ingrained in the English lexicon, Italian language rules state that a word ending in -i is plural, whereas an -o or -a suffix (depending on whether the term is masculine or feminine) denotes the singular. (Similarly, “paparazzo” is the singular form of the plural “paparazzi.”) As for the term for the beloved pasta dish itself, “spaghetti” was inspired by the Italian word spago, which means “twine” or “string.”
Though usually used interchangeably, “couch” and “sofa” technically refer to two different pieces of furniture — and the distinction lies in the words themselves. “Couch” comes to us from French, namely coucher — “to lie down” — whereas we have the Arabic word suffah to thank for “sofa.” In the most traditional sense, a sofa is a wooden bench that comes complete with blankets and cushions and is intended for sitting. eBay’s selling guide used to distinguish between the two by defining a couch as “a piece of furniture with no arms used for lying.” Though it may be a distinction without a difference these days, purists tend to think of sofas as a bit more formal and couches as something you’d take a nap on and let your pets hang out on.
U.S. Pools Were Originally Designed to Keep the Masses Clean
Boston’s Cabot Street Bath was the nation’s first indoor municipal pool. Founded in 1868, the pool was on the bleeding edge of what would become a boom in baths designed to help the working classes clean up. The short-lived facility (it was open for only eight years) was soon followed by municipal baths and pools all over the nation, especially in cities with growing immigrant populations whose tenement apartments didn’t contain adequate bathing facilities.
In New York, starting in 1870, river water filled floating, poollike public baths that, according to one onlooker, were as filthy as “floating sewers.” Eventually, by about the mid-20th century, the city’s river baths morphed into the indoor pools we know today — though the city does still have some seasonal outdoor pools.
On February 6, 1971, Alan Shepard took one small shot for golf and one giant swing for golfkind. An astronaut on the Apollo 14 mission, Shepard was also a golf enthusiast who decided to bring his hobby all the way to the moon — along with a makeshift club fashioned partly from a sample-collection device. He took two shots, claiming that the second went “miles and miles.” The United States Golf Association (USGA) later put the actual distance of his two strokes at about 24 yards and 40 yards, respectively.
While not enough to land him a spot on the PGA Tour, those numbers are fairly impressive when you remember that the stiff spacesuit Shepard was wearing (in low gravity, no less) forced him to swing with one arm. And while those two golf balls remain on the moon, Shepard brought his club back, later donating it to the USGA Museum in Liberty Corner, New Jersey. Other objects now residing on the moon include photographs, a small gold olive branch, and a plaque that reads: “Here men from the planet Earth first set foot upon the Moon July 1969, A.D. We came in peace for all mankind.”
The Inventor of the Stop Sign Never Learned How To Drive
Few people have had a larger or more positive impact on the way we drive than William Phelps Eno, sometimes called the “father of traffic safety.” The New York City-born Eno — who invented the stop sign around the dawn of the 20th century — once traced the inspiration for his career to a horse-drawn-carriage traffic jam he experienced as a child in Manhattan in 1867: “There were only about a dozen horses and carriages involved, and all that was needed was a little order to keep the traffic moving,” he later wrote. “Yet nobody knew exactly what to do; neither the drivers nor the police knew anything about the control of traffic.”
After his father’s death in 1898 left him with a multimillion-dollar inheritance, Eno devoted himself to creating a field that didn’t otherwise exist: traffic management. He developed the first traffic plans for New York, Paris, and London. In 1921, he founded the Washington, D.C.-based Eno Center for Transportation, a research foundation on multimodal transportation issues that still exists. One thing Eno didn’t do, however, is learn how to drive. Perhaps because he had such extensive knowledge of them, Eno distrusted automobiles and preferred riding horses. He died in Connecticut at the age of 86 in 1945 having never driven a car.
The Stick Has Been Inducted Into the National Toy Hall of Fame
From teddy bears to train sets, classic playthings of youth often conjure memories of a gleaming toy store, holidays, or birthdays. So curators at the Strong National Museum of Play branched out when they added the stick to their collection of all-time beloved toys. Among the most versatile amusements, sticks have inspired central equipment in several sports, including baseball, hockey, lacrosse, fencing, cricket, fishing, and pool. Humble twigs are also ready-made for fetch, slingshots, toasting marshmallows, and boundless make-believe.
Located in Rochester, New York — about 70 miles northeast of Fisher-Price’s headquarters — the Strong acquired the fledgling National Toy Hall of Fame in 2002. (It was previously located in the Gilbert House Children’s Museum in Salem, Oregon.) To date, 74 toys have been inducted, including Crayola Crayons, Duncan Yo-Yos, and bicycles. The stick was added in 2008, three years after another quintessential source of cheap childhood delight: the cardboard box.
Eggo Waffles Were Originally Called Froffles
The brothers behind your favorite frozen waffles took a while to iron out the details of their signature product. Working in their parents’ basement in San Jose, California, in the early 1930s, Frank, Anthony, and Sam Dorsa first whipped up their own brand of mayonnaise. Since the base ingredient of mayonnaise is egg yolks — and the brothers took pride in using “100% fresh ranch eggs” — they christened their fledgling company “Eggo.” Despite launching the business during the Great Depression, Eggo mayonnaise sold like hotcakes, motivating the Dorsas to extend their product line. Soon, they were selling waffle batter — another egg-based product. To simplify shipping, they also whipped up a powdered mix that required only the addition of milk.
When the frozen food industry took off in the 1950s, the brothers wanted to take advantage of the rush to the freezer aisle. Frank Dorsa (a trained machinist) repurposed a carousel engine into a rotating device that could anchor a series of waffle irons, each cooking a breakfast treat that was flipped by a factory employee. The machine allowed Eggo to prepare thousands of freezer-bound waffles per hour. These debuted in grocery stores in 1953 under the name “Froffles,” a portmanteau of “frozen” and “waffles.” Customers referred to them simply as “Eggos,” and the Froffles moniker was dropped within two years. Now a Kellogg’s-owned brand, Eggo serves up waffles as well as other frozen breakfast treats, with mayonnaise — and the name Froffles — but a distant memory.
On January 5, 1858, Ezra J. Warner of Connecticut invented the can opener. The device was a long time coming: Frenchman Nicolas Appert had developed the canning process in the early 1800s in response to a 12,000-franc prize the French government offered to anyone who could come up with a practical method of preserving food for Napoleon’s army. Appert devised a process for sterilizing food by half-cooking it, storing it in glass bottles, and immersing the bottles in boiling water, and he claimed the award in 1810. Later the same year, Englishman Peter Durand received the first patent for preserving food in actual tin cans — which is to say, canned food predates the can opener by nearly half a century.
Though he didn’t initially know why his method of storing food in glass jars and heating them worked, years of experimentation led Appert to rightly conclude that “the absolute deprivation from contact with the exterior air” and “application of the heat in the water-bath” were key. He later switched to working with cans himself. Before Warner’s invention, cans were opened with a hammer and chisel — a far more time-consuming approach than the gadgets we’re used to. Warner’s tool (employed by soldiers during the Civil War) wasn’t a perfect replacement, however: It used a series of blades to puncture and then saw off the top of a can, leaving a dangerously jagged edge. As for the hand-crank can opener most commonly used today, that wasn’t invented until 1925.
Before Erasers, People Used Bread To Rub Out Pencil Marks
The very first pencils arrived around the dawn of the 17th century, after graphite (the real name for the mineral that forms a pencil’s “lead”) was discovered in England’s Lake District. But the eraser didn’t show up until the 1770s, at the tail end of the Enlightenment. So what filled the roughly 170-year-long gap? Look no further than the bread on your table. Back in the day, artists, scientists, government officials, and anyone else prone to making mistakes would wad up a small piece of bread and moisten it ever so slightly. The resulting ball of dough erased pencil marks on paper almost as well as those pink creations found on the end of No. 2 pencils today.
But in 1770, English chemist Joseph Priestley (best known for discovering oxygen) wrote about “a substance excellently adapted to the purpose of wiping from paper the marks of a black lead pencil.” This substance, then known as caoutchouc, was so perfect for “rubbing” out pencil marks that it soon became known simply as “rubber.” Even today, people in the U.K. still refer to erasers as “rubbers.” (The name “lead-eater” never quite caught on.)
On January 9, 2007, Apple CEO Steve Jobs revealed the iPhone to the world. Since then, Apple’s pricey slab of glass stuffed with technology has become more or less synonymous with the word “smartphone” (sorry, Android fans). But smartphones predate the iPhone by more than a decade. To pinpoint the smartphone’s true birthdate, look back to November 23, 1992, and the introduction of IBM’s Simon at a trade show in Las Vegas. Today, IBM is best known for supercomputers, IT solutions, and enterprise software, but in the ’80s and early ’90s the company was a leader in consumer electronics — a position it hoped to solidify with Simon.
Simon was a smartphone in every sense of the word. It was completely wireless and had a digital assistant, touchscreen, built-in programs (calculator, to-do list, calendar, sketch pad, and more), and third-party apps, something even the original iPhone didn’t have. The idea was so ahead of its time, there wasn’t even a word for it yet — “smartphone” wasn’t coined for another three years. Instead, its full name when it debuted to the larger public in 1993 was the Simon Personal Communicator, or IBM Simon for short. But there’s a reason there isn’t a Simon in everyone’s pocket today. For one thing, the phone had only one hour of battery life. Once it died, it was just a $900 brick (technology had a long way to go before smartphones became pocket-sized; Simon was 8 inches long by 2.5 inches wide). Cell networks were still in their infancy, so reception was spotty at best, which is why the Simon came with a port for plugging into standard phone jacks. In the mid-aughts, increases in carrier capacity and the shrinking of electronic components created the perfect conditions for the smartphones of today. Unfortunately for Simon, it was too late.
Governments worldwide have levied taxes for thousands of years; the oldest recorded tax comes from Egypt around 3000 BCE. But England — which relied heavily on taxes to fund its military conquests — is known for a slate of fees that modern taxpayers might consider unusual. Take, for instance, the so-called “window tax,” first levied in 1696 under King William III, which annually charged citizens based on the number of windows in their homes. Some 30 years before, the British crown had attempted to tax personal property based on chimneys, but clever homeowners could avoid the bill by temporarily bricking up or dismantling their hearths and chimneys before inspections. With windows, assessors could quickly determine a building’s value from the street. The tax was progressive, charging nothing for homes with few or no windows and increasing the bill for dwellings with more than 10 (a threshold that would eventually shrink to seven).
Not surprisingly, homeowners and landlords throughout the U.K. resented the tax. It didn’t take long for windows to be entirely bricked or painted over (much like fireplaces had been), and new homes were built with fewer windows altogether. Opponents called it a tax on “light and air” that hurt public health, citing reduced ventilation that in turn encouraged disease. Even famed author Charles Dickens joined the fight to dismantle the tax, publishing scathing pieces aimed at Parliament on behalf of the poor citizens who were most affected by the lack of fresh air. Britain repealed its window tax in July 1851, but the architectural impact is still evident — many older homes and buildings throughout the U.K. still bear telltale bricked-up windows.
Philadelphia Cream Cheese Isn’t Actually From Philadelphia
The City of Brotherly Love has clear-cut claims on many food origins — cheesesteaks, stromboli, and even root beer. But one thing’s for sure: Despite the name, Philadelphia Cream Cheese is definitely not from Philly. The iconic dairy brand secured its misleading name (and gold-standard status) thanks to a marketing ploy that’s been working for more than 150 years … and it’s all because of Pennsylvania’s reputation for impeccable dairy. Small Pennsylvania dairies of the 18th and early 19th centuries were known for using full-fat milk and cream to make rich cheeses — in contrast to New York dairies, which mostly used skim milk — and because the perishables couldn’t be easily transported, they gained a reputation as expensive luxury foods.
So when upstate New York entrepreneur William Lawrence began making his own cream cheese in the 1870s, using skim milk with lard added for richness, he needed a name that would entice customers and convey quality despite the product being made in Chester, New York, and not Philadelphia. Together with cheese broker and marketing mastermind Alvah Reynolds, Lawrence branded his cheese under the Philadelphia name in 1880, which boosted sales and cemented its popularity with home cooks well into the early 1900s.
The Color of Your Bread Tag Has an Important Meaning
Ever wonder why the tags used to seal loaves of bread come in different colors? Far from arbitrary, the color-coded system indicates which day of the week the bread was baked. The color system is even alphabetical: Monday is blue, Tuesday is green, Thursday is red, Friday is white, and Saturday is yellow. (Traditionally, bread wasn’t delivered on Wednesday or Sunday.)
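For illustration, the whole scheme fits in a tiny lookup table. Here’s a sketch in Python; the mapping comes straight from the list above, while the helper function and its name are our own invention:

```python
# Tag colors by bake day, per the traditional schedule described above
# (no deliveries on Wednesday or Sunday). Note the alphabetical order.
BAKE_DAY_TAG_COLORS = {
    "Monday": "blue",
    "Tuesday": "green",
    "Thursday": "red",
    "Friday": "white",
    "Saturday": "yellow",
}

def bake_day(tag_color: str) -> str:
    """Return the day a loaf was baked, given its bread-tag color."""
    for day, color in BAKE_DAY_TAG_COLORS.items():
        if color == tag_color.lower():
            return day
    raise ValueError(f"No delivery day uses a {tag_color!r} tag")

print(bake_day("red"))  # Thursday
```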
Because bread rarely remains on the shelf for more than a few days, this system is more for internal use among employees than it is for customers looking to get the freshest sourdough possible. But if you favor a local bakery and get to know their system, you could either snag the best deals or the fluffiest dinner rolls in town.
The Snickers Candy Bar Was Named After One of the Mars Family’s Favorite Horses
While names like Hershey’s and 3 Musketeers (which originally included three bars) are fairly straightforward, some candy bar monikers are more elusive. Case in point: What, exactly, is a Snickers? Actually, it’s a “who” — and not a human “who” at that. The candy bar was named after one of the Mars family’s favorite horses. Franklin Mars founded Mars, Incorporated (originally known as Mar-O-Bar Co.) in 1911, introducing Snickers in 1930; when it came time to name his product, he immortalized his equine friend as only a candy magnate could.
As Mars has grown into America’s fourth-largest private company, it has retained a dual focus on both candy and pets. M&M’s, Twix, and Milky Way are all Mars products, as are Iams, Pedigree, and Royal Canin. If you’ve ever wondered how M&M’s got their name, the story is slightly less interesting — it’s simply the last initials of Forrest Mars (Frank’s son) and partner-in-candy Bruce Murrie. The company is known for secrecy, with the family itself having been described as a “reclusive dynasty,” which means it’s a minor miracle that the identity of Snickers the horse was ever revealed in the first place.
The First Product Scanned With a Barcode Was Juicy Fruit Gum
When Marsh Supermarket cashier Sharon Buchanan rang up a 10-pack of Juicy Fruit on June 26, 1974, and heard a telltale beep, her face must have registered relief. Buchanan’s co-workers at the grocery store in Troy, Ohio, had placed barcodes on hundreds of items the night before, as the National Cash Register Company installed the shop’s new computers and scanners. Buchanan’s “customer” for that first purchase was Clyde Dawson, the head of research and development at Marsh Supermarkets, Inc. For that fateful checkout, Dawson chose the gum, made by the Wrigley Company, because some had wondered if the machine would have trouble reading the item’s very small barcode. It didn’t. Today, one of Marsh’s earliest scanners is part of the collection of the Smithsonian’s National Museum of American History.
The Microwave Was Invented by Accident, Thanks to a Melted Chocolate Bar
The development of radar helped the Allies win World War II — and oddly enough, the technological advances of the war would eventually change kitchens forever. In 1945, American inventor Percy Spencer was fooling around with a British cavity magnetron, a device built to make radar equipment more accurate and powerful. To his surprise, microwaves produced from the radar melted a chocolate bar (or by some accounts, a peanut cluster bar) in his pocket. Spencer quickly realized that magnetrons might be able to do something else: cook food.
With the help of a bag of popcorn and, some say, a raw egg, Spencer proved that magnetrons could heat and even cook food. First marketed as the Radarange, the microwave oven launched for home use in the 1960s. Today, microwaves are as ubiquitous as the kitchen sink — all thanks to the Allied push to win the war.
Libraries Predate Books
While books are a fixture of today’s libraries, humans long constructed great centers of learning without them. That includes one of the oldest known significant libraries in history: the Library of Ashurbanipal. This library, established in modern-day Mosul, Iraq, by the Assyrian King Ashurbanipal in the seventh century BCE, contained nothing we would recognize today as a book. Instead, it was a repository of 30,000 clay tablets and writing boards covered in cuneiform — the oldest writing system in the world. Much like your local public library, this royal collection covered a variety of subjects, including legislation, financial statements, divination, hymns, medicine, literature, and astronomy.
Umbrellas Were Once Used Only by Women
Umbrellas have been around for a long time — at least 3,000 years, according to T.S. Crawford’s A History of the Umbrella — but for much of that history they were used by only select segments of the population. Ancient Egyptians used them to shade their pharaohs, setting the tone for an association with royalty and nobility that would also surface in China, Assyria, India, and other early civilizations. Meanwhile, they were deemed effeminate by the ancient Greeks and by the Romans, who adopted many Greek cultural habits. It should be noted that these early umbrellas protected against the sun, not rain, and were generally used by women to shield their complexions. The association between women and umbrellas persisted through much of Europe for centuries, stubbornly remaining into the 18th century, even after the first waterproof umbrellas had been created (around the 17th century in France).
In England, at least, the man credited with ushering in a new age of gender-neutral weather protection was merchant and philanthropist Jonas Hanway. Having spotted the umbrella put to good use during his many travels, Hanway took to carrying one through rainy London in the 1750s, a sight met with open jeering by surprised onlookers. The greatest abuse apparently came from coach drivers, who counted on inclement weather to drive up demand for a dry, comfy ride. But Hanway took the derision in stride. Shortly after his death in 1786, an umbrella advertisement surfaced in the London Gazette, a harbinger of sunnier days to come for the accessory’s reputation as a rain repellant for all.
Interesting Facts
Editorial
Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.
While we’ve come a long way from being solely reliant on the sun’s rays to chart the day, the core principles of determining time remain largely the same. Nowadays, some people wear a trusty wristwatch, whereas others glance at their phone for a quick update. No matter your preferred method of tracking the hours, here are six timely facts about clocks and other timekeeping devices.
The Oldest Working Mechanical Clock Is Located at an English Cathedral
England’s Salisbury Cathedral dates back to the 13th century and is home to one of four surviving original copies of the 1215 Magna Carta. The cathedral is also the site of the world’s oldest working mechanical clock, a machine dating to 1386, if not earlier.
Composed of hand-wrought iron and intertwined with long ropes that extend halfway up the cathedral walls, the Salisbury Cathedral clock is the brainchild of three clockmakers: Johannes and Williemus Vriemand, as well as Johannes Jietuijt of Delft. The clock operates thanks to a system of falling weights, which are wound back up once each day, and the device is designed solely to denote each passing hour. It once sat in a detached bell tower before falling into disuse around 1884. Thankfully, the mechanism was rediscovered in 1929 and restored in 1956; by the time of that restoration, the clock had ticked an estimated 500 million times over nearly 500 years of service. It continues to operate today.
Pennies Are Used To Maintain the Accuracy of Big Ben’s Clock Tower
London’s Elizabeth Tower, at the north end of the Palace of Westminster, boasts one of the most recognizable clock faces in the world. The tower’s belfry houses “Big Ben” — though many use the name to refer to the tower as a whole, it properly refers to the mechanism’s grandest and most prominent bell. Name-related confusion aside, the clock is notable for another reason, too: Its accuracy is regulated using old pennies and, on occasion, other coins.
Due to external atmospheric conditions such as air pressure and wind, the exact time depicted on the face of Elizabeth Tower can fall ever so slightly out of sync with reality. In order to right these wrongs, old pennies — coins that existed prior to England’s switch to decimal coinage in 1971 — are added to the bell’s pendulum, which in turn alters the daily clock speed by 0.4 seconds per penny. The process is a long-standing one, having been used to regulate the time as far back as 1859. In 2012, three of the 10 coins relied upon for this purpose were, for a brief time, swapped out for a five-pound crown commemorating that year’s London Olympics.
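To see how the arithmetic plays out, here’s a quick back-of-the-envelope sketch in Python using the 0.4-seconds-per-penny figure above; the scenario and the function itself are hypothetical:

```python
SECONDS_GAINED_PER_PENNY_PER_DAY = 0.4  # adding one penny speeds the clock by this much

def pennies_to_add(seconds_slow_per_day: float) -> int:
    """How many pennies to place on the pendulum to cancel out a clock
    that runs slow by the given number of seconds each day.
    (A negative result means the clock is fast: remove pennies instead.)"""
    return round(seconds_slow_per_day / SECONDS_GAINED_PER_PENNY_PER_DAY)

print(pennies_to_add(1.2))   # clock 1.2 seconds slow per day -> add 3 pennies
print(pennies_to_add(-0.8))  # clock 0.8 seconds fast per day -> remove 2 pennies
```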
19th-Century Maritime Signals Inspired Times Square’s New Year’s Ball
The New Year’s Eve ball drop in Times Square, New York, is a beloved annual tradition, though its origins had nothing to do with revelry. In fact, the ball drop was inspired by a 19th-century timekeeping mechanism aimed at maritime crews. “Time balls” — which dropped at set times as a signal to passing ships and navigators to set their on-ship chronometers — first appeared in Portsmouth Harbor in 1829 and later at England’s Royal Observatory at Greenwich in 1833. The giant red time ball in Greenwich continues to operate today.
The balls were the culmination of an idea suggested by Royal Navy officer Robert Wauchope, who promoted the concept of visual cues located ashore to help passing ships tell time. Wauchope originally suggested the use of flags, though orbs that moved up and down were eventually settled upon instead. Though these time balls were initially made to help mariners keep track of time, they soon became an attraction among locals, as people would come to watch the ball fall, a precursor to today’s New Year’s Eve crowds.
Medieval Engineer Ismael al-Jazari Invented an Elephant Clock
Throughout the 12th and early 13th centuries, few inventors pioneered more mechanisms in the world of robotics than Ismael al-Jazari, who lived and worked in what is now Turkey. Al-Jazari was so influential at the time that he’s believed to have even inspired the works of Leonardo da Vinci. Among al-Jazari’s most notable timekeeping inventions was an elephant clock, colorful illustrations of which appeared in his 1206 manuscript, The Book of Knowledge of Ingenious Mechanical Devices.
The clock was an intricate device constructed atop the back of a copper elephant, containing several moving parts as well as a scribe to denote the passing of time. The entire clock relied upon a water-powered timer, which was made up of a bowl that slowly descended into a hidden tank of water. As that bowl sank, the scribe noted the number of minutes. Furthermore, every half hour a ball would be triggered to fall and collide with a fan, which rotated the device’s dial to show how many minutes had passed since sunrise. That same ball ultimately dropped into a vase that in turn triggered a cymbal to begin the cycle anew. The whole mechanism not only incorporated this Indian-inspired timing technology, but also Greek hydraulics as well as design elements from Egyptian, Chinese, and Arabian cultures.
The World’s Most Accurate Clock Is Located in Boulder, Colorado
Located in the basement of a laboratory at the University of Colorado, Boulder, is a clock considered to be the world’s most accurate. Invented by scientist Jun Ye, the clock is so precise that it would take 15 billion years for it to lose a single second of time. That absurd number dwarfs the traditional 100 million years that it takes many modern atomic clocks to lose a second.
The first atomic clock was constructed in 1949 by the National Bureau of Standards, and helped scientists accurately redefine the measurement of a second by 1967. Prior to that point, a second had been defined as 1/86,400 (about 0.0000115741) of a mean solar day, which proved inaccurate due to the irregular rotation of the Earth. Ye’s clock refines the techniques of those early atomic clocks, using strontium atoms arranged in a 3D lattice that tick 1 million billion times per second. While that science-heavy explanation may not be entirely clear to the average person, Ye’s atomic clock can be summed up like this: It’s really, really accurate.
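To put those accuracy claims on a common scale, it helps to convert “loses one second every N years” into a fractional error. Here’s a rough sketch, assuming an average year length of 365.25 days:

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # about 31.6 million seconds

def fractional_error(years_to_lose_one_second: float) -> float:
    """Seconds of error accumulated per second of operation."""
    return 1 / (years_to_lose_one_second * SECONDS_PER_YEAR)

print(f"{fractional_error(100e6):.1e}")  # typical atomic clock: ~3.2e-16
print(f"{fractional_error(15e9):.1e}")   # Ye's strontium clock:  ~2.1e-18
```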
French Revolutionary Time Instituted a System of 10-Hour Days
While societies around the world may not agree on much, one thing that’s generally accepted is that each day is 24 hours long. Back in 1793, however, during the French Revolution, France broke with convention and adopted a new timekeeping system. This decimal time concept included 10-hour days, 100 minutes per hour, and 100 seconds per minute. In essence, its base-10 method of timekeeping was proposed as a simpler way to note how much time had passed on any given day.
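As a concrete illustration, converting an ordinary clock reading into decimal time takes only a few lines of arithmetic. Here’s a sketch assuming the 10/100/100 scheme described above:

```python
def to_decimal_time(hours: int, minutes: int, seconds: int) -> tuple[int, int, int]:
    """Convert a standard 24-hour clock reading to French decimal time."""
    # Fraction of the day elapsed on the ordinary clock...
    day_fraction = (hours * 3600 + minutes * 60 + seconds) / 86_400
    # ...mapped onto a day of 10 * 100 * 100 = 100,000 decimal seconds.
    decimal_seconds = round(day_fraction * 100_000)
    decimal_hours, remainder = divmod(decimal_seconds, 10_000)
    decimal_minutes, decimal_secs = divmod(remainder, 100)
    return decimal_hours, decimal_minutes, decimal_secs

print(to_decimal_time(12, 0, 0))   # noon -> (5, 0, 0)
print(to_decimal_time(18, 30, 0))  # 6:30 p.m. -> (7, 70, 83)
```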
This new timekeeping plan officially started on November 24, 1793, and was immediately met with resistance and confusion from the public. People were unwilling to change their old habits for telling time, even though French clockmakers produced new clocks that displayed both the traditional method and the new decimal-based technique. In the end, decimal clocks lost their official status in 1795, though the concept wasn’t quite dead yet. France tried again in 1897, this time proposing a variant that kept 24-hour days but used 100 minutes per hour; that proposal was scrapped in 1900.
Bennett Kleinman
Staff Writer
Bennett Kleinman is a New York City-based staff writer for Optimism Media, and previously contributed to television programs such as "Late Show With David Letterman" and "Impractical Jokers." Bennett is also a devoted New York Yankees and New Jersey Devils fan, and thinks plain seltzer is the best drink ever invented.
Thanks to the 1975 blockbuster Jaws, a generation of people have grown up with the mistaken belief that sharks are man-eating monsters, intent on attacking anything that moves. Scientists have worked hard to dispel such myths about the ancient creatures, which roam every ocean and vary widely in size, shape, diet, habitat, and attitude. Here are a few facts about these fascinating fish.
The first shark that really looked shark-like appeared around 380 million years ago in the Devonian period. Just a few million years later, a major extinction wiped out many species that competed with sharks, allowing them to evolve rapidly into numerous new shapes, sizes, and ecological niches — some of which are still around. One of the oldest species living today is the bluntnose sixgill shark, which evolved between 200 million and 175 million years ago in the early Jurassic epoch.
As cartilaginous fishes, sharks don’t leave much behind when they die. Known shark fossils consist mainly of teeth and a handful of scales, vertebrae, and impressions left in rock. Even so, paleontologists have been able to identify about 2,000 species of extinct sharks just by examining differences in fossilized teeth. For example, the oldest shark teeth ever found came from an Early Devonian fish dubbed Doliodus problematicus; bits of its fossilized skeleton showed characteristics similar to bony fishes, while its teeth and jaw were more shark-like, confirming a theory that the species was an ancient ancestor of sharks.
There Are More Than 500 Species of Sharks in the World
Sharks are categorized into nine taxonomic orders. To name a few of the most recognizable types: Carcharhiniformes, the order of ground sharks, encompasses over 290 species, including the bull shark, tiger shark, blue shark, hammerhead, and more. The great white shark, basking shark, and makos, as well as the aptly named goblin shark and other species, belong to Lamniformes — also known as mackerel sharks. The carpet shark order, Orectolobiformes, includes the whale shark, nurse shark, wobbegong, and others. In all, there are more than 500 species of sharks swimming the world’s waters.
There’s a Huge Size Difference Between the Largest and Smallest Sharks
With so many shark species swimming Earth’s oceans, there’s incredible variation in their sizes. The largest shark species living today is the whale shark (Rhincodon typus), a gentle, plankton-eating giant that can grow to 45 feet long or more and weigh 20 tons (the biggest accurately measured whale shark reached 61.7 feet!). They can be found in all of the world’s tropical seas. The smallest known shark species, meanwhile, was discovered in 1985 off the coast of Colombia in the Caribbean Sea: The dwarf lantern shark (Etmopterus perryi) averages a length of just under 7 inches. It dwells in the ocean’s twilight zone, about 1,000 feet below the surface, but sometimes feeds in the shallows and uses bioluminescent organs along its belly to camouflage itself against sunlit waters.
Like all fishes, sharks have a sensory organ called the lateral line running down the length of their bodies. The lateral line system involves exterior pores and specialized cells that can detect vibrations in water, which helps sharks locate prey from hundreds of feet away. In addition to sensing water movements, sharks can perceive electric fields surrounding other animals (the fields are caused by the animals’ muscle contractions). This sixth sense, called electroreception, picks up electrical signals that sharks can use to home in on prey. Electroreception can also guide migrating sharks via Earth’s electromagnetic fields.
The slow-growing, Arctic-dwelling Greenland shark (Somniosus microcephalus) is not only the longest-lived shark, but also holds the record for the longest-lived vertebrate on Earth. Unlike many other sharks, Greenland sharks lack the hard, calcified tissue whose growth bands are typically used to estimate a shark’s age, so scientists long had difficulty aging them accurately. In 2016, a study in the journal Science described how a team of biologists carbon-dated eye proteins, which build up continuously during the animals’ lives, in several Greenland sharks. They found the individuals were an average of 272 years old when they died, and the results suggested that the sharks’ maximum life span could be up to 500 years.
You’re More Likely To Be Killed by a Cow Than a Shark
Your risk of suffering a shark attack is practically nil. For its 2022 global summary, the Florida Museum of Natural History’s International Shark Attack File confirmed 57 unprovoked shark bites, meaning they happened when humans were simply in the shark’s natural habitat, and 32 provoked attacks, such as when people were feeding or harassing the fish. Forty-one of the unprovoked attacks occurred in the U.S., and one was fatal. Other animals are far more likely to kill you, including cows (which kill an average of 20 Americans a year, according to CDC data), hornets, bees, and wasps (about 48 people a year), and dogs (around 19 a year).
Most features in an airplane cabin are designed for a very specific purpose. Due to the cabin’s complex design, however, flight attendants don’t usually take the time to explain every detail to their passengers. (They’re more concerned with making sure everyone is safe and comfortable.) But if you’re a curious person who likes to know how things work, we’ve got you covered. Here are six things you never knew about airplane cabins.
Have you noticed that the cabin lights dim during takeoff and landing? It turns out that there are two very good reasons for this. According to Reader's Digest, the first reason is safety. If the lights stayed on and were to suddenly switch from bright to dark in an emergency, it would take precious seconds for passengers' eyes to adjust. With dim lighting during takeoff and landing, our eyes are already adjusted — making it easier to find an exit.
The second reason is mood. Dim lights are more relaxing than bright lights and might calm a passenger who struggles with flight anxiety. Some airlines, such as Virgin Atlantic, take this a step further by adding colored lights. Virgin Atlantic uses different shades of its brand color for various situations, like a rosy pink for boarding and a hot magenta for drinks.
Passengers often complain about the cold temperature in airplane cabins. Flight staff will provide passengers with a blanket, but they don't ever increase the heat. That's because the temperature on an aircraft has been set in a very intentional way — and it's for your safety.
A study by ASTM International found that people were more likely to faint on an aircraft than on the ground due to a condition called hypoxia. The pressurized environment of an airplane cabin can prevent our bodies from getting enough oxygen, which causes fainting. The warmer the temperature onboard the aircraft, the more likely this is to happen. To prevent passengers from passing out, airlines intentionally lower the cabin temperature. While this might be slightly uncomfortable, it’s much safer for your body.
A common myth about air travel is that you're sharing air — along with germs and food particles — with all the other passengers on board. Gross, right? In reality, airlines do a great job of maintaining clean air quality onboard the aircraft. They actually use a HEPA (High Efficiency Particulate Air) filter system. According to the International Air Transportation Association (IATA), this is the same type of filter used to clean the air in hospital operating rooms. The next time you fly, don’t worry: Cabin air is cleaner than you think.
While there is a lock inside cabin bathrooms for passengers to use, flight attendants can also quickly unlock the door from the outside. According to Aerotime Hub, this is for passenger safety: In an emergency, flight attendants need to be able to access the bathroom without picking the lock or taking the door off its hinges. That access is necessary if a passenger has a health scare or needs assistance while in the bathroom, and it can also help children who are unable to unlock the door themselves. Don't worry, though: A flight attendant would never just open the door for no reason. They respect passenger privacy and would only use the unlock option in an emergency.
During takeoff and landing, most flight attendants will ask passengers to lift their window blinds. As with so many other things on an airplane, there's a real reason for this: Open blinds allow the flight staff to spot any issues on the ground or on the airplane itself, and passengers might also report unusual circumstances they observe from their windows. Lifting the blinds also allows our eyes to adjust quickly to the conditions outside in case of an emergency.
Cabin windows also sometimes have triangle stickers on them to mark certain seats. According to Captain Joe, these stickers indicate which windows provide the best view of the wings, letting flight attendants quickly find a wing view when they need one for safety reasons. He also notes that these seats are great for passengers prone to motion sickness, thanks to the extra stability over the wings.
Walking down the aisle of a moving airplane can be a wobbly experience — especially when there's turbulence. Most passengers end up grabbing the seats as they walk, which can disturb the people in those seats, but there's actually a better way.
If you watch the flight attendants, you'll notice that they repeatedly reach up to the ceiling when they walk down the aisle. That's because there's a built-in handle rail along the bottom edge of the storage compartment, which can be used to steady yourself. Next time, copy the flight attendants, avoid aggravating fellow passengers, and use this secret rail instead!
When the film version of West Side Story was released on October 18, 1961, it quickly surpassed its theatrical predecessor to become a smash hit. Audiences were blown away by the love story of Tony (Richard Beymer) and Maria (Natalie Wood) and captivated by the dancing and singing of Anita (Rita Moreno) and Bernardo (George Chakiris).
West Side Story swept the Academy Awards, winning 10 statuettes, including Best Picture and Best Supporting Actress and Actor for Moreno and Chakiris, respectively. Today, it’s still one of the most-watched and beloved films of all time. Here are six surprising facts about the movie musical.
Audrey Hepburn, one of the biggest actresses of her time, was originally asked to play the lead character of Maria. However, Hepburn was pregnant with her son Sean and had previously suffered several miscarriages, so she turned down the role so as not to overexert herself.
Despite saying no to the blockbuster, Hepburn still made a splash on the big screen that same year in Breakfast at Tiffany’s.
One Big Star — and a Few Stars-to-Be — Might Have Portrayed Tony
In one account of West Side Story’s casting, Elvis Presley was in the running to play the lead role of Tony — until his manager, Colonel Tom Parker, reportedly rejected the part. And while Presley’s name may only have been bandied about and never under serious consideration, several actors who hadn’t yet had their big breaks did audition for the film. These included Warren Beatty (who was also considered for the role of Riff in the stage version), Robert Redford, and Burt Reynolds (though the interview sheet listed him as “Bert”).
Beymer eventually won the part of Tony. However, he ended up displeased with his performance. “It’s a thankless role,” he admitted in 1990. “It could have been played more street-wise, with someone other than me.”
In West Side Story, Tony and Maria embody the instantaneous pull of young love at first sight. Away from the cameras, Wood, by far the movie’s biggest star at the time, didn’t connect with her leading man. One theory posited to explain Wood’s distant attitude was that she would have preferred acting opposite her then-husband, Robert Wagner.
According to West Side Story costar Russ Tamblyn (Riff), Wood’s dressing room contained a “hit list” of people who’d gotten on her bad side, and Beymer was one of the names on that list. When Tamblyn asked Wood what Beymer had done, she reportedly answered, “I just don’t like him.”
Wood’s Singing Voice Was Dubbed — To the Surprise of the Actress
After accepting the lead role of Maria, Wood spent the entire production certain her vocals would be heard when the movie headed to theaters. She received intense coaching, and the music department assured Wood that her takes were wonderful. Though singer Marni Nixon also recorded Maria’s songs, Wood believed Nixon’s voice would be used only for a few high notes. (Nixon was also the singing voice for Hepburn in My Fair Lady.)
It wasn’t until the end of production that Wood discovered Nixon would be singing the entire role. Wood was an actress, not a trained singer, so it’s not shocking filmmakers wanted a more skilled vocalist to perform Maria’s challenging songs. But Wood would never forgive co-director Robert Wise for keeping her in the dark for so long.
Making West Side Story wouldn’t have been possible without Jerome Robbins, who conceptualized the stage musical and created its choreography. So when Robbins wanted to direct the movie version, producers agreed, though they did install Robert Wise as co-director.
As the film was shot, Robbins’ choreography was, as always, impressive. But he demanded numerous takes, which slowed production. When most of the big dance numbers were finished, the producers fired Robbins, and his assistants handled the remaining dance scenes. Robbins considered removing his name from the finished project but ultimately decided not to — a wise decision, as he (along with Wise) went on to win the Oscar for Best Director.
When Robbins came up with the show in 1949, it was appropriately called “East Side Story”: The original plot centered on a Catholic boy and a Jewish girl living on the Upper East Side of Manhattan. The project was eventually shelved while Robbins, along with composer Leonard Bernstein and playwright Arthur Laurents, took on other projects.
The show resurfaced in 1955 — but with a plot twist. Latino gang violence in Los Angeles was making headlines, inspiring Laurents to propose moving the story from the swanky Upper East Side to the then-rundown Upper West Side and centering the conflict on Puerto Rican and white gangs.
Plastic is everywhere. Look around and you’re bound to immediately notice something made from the stuff. It houses the milk in our grocery carts, makes up the components in our phones, and is woven into the fibers of our clothes. Here are nine facts you might not know about one of the most common materials we interact with every day.
Plastic may seem like a manufacturing miracle of the 20th century, but its earliest version actually cropped up in the mid-1800s. English chemist and inventor Alexander Parkes created the first known plastic, eponymously named Parkesine, in 1855, and exhibited it in London in 1862. Parkes’ moldable material was formed from cellulose (aka plant fibers), but it wasn’t cheap, and it was brittle and prone to cracking — shortcomings that kept Parkesine from gaining widespread popularity.
By the late 1860s, American inventor John Wesley Hyatt had created celluloid, the first plastic used in everyday goods. Hyatt initially marketed celluloid as a substitute for natural materials, suggesting it was an environmentally friendly swap for the ivory used in billiard balls and the tortoiseshell harvested for jewelry and combs. However, celluloid had a major drawback: It was flammable. Billiard balls made from the substance reportedly ran the risk of spontaneously bursting into flame, and cinema film made from celluloid was notorious for catching fire.
Belgian chemist Leo Baekeland had already made his fortune from inventing specialty photo paper when he began experimenting with polymers, aka the molecules that make up plastics. Baekeland created a new version of plastic, called Bakelite, in 1907; historians consider it the first fully synthetic plastic made with no naturally occurring materials. Bakelite’s popularity skyrocketed, and the heat-resistant plastic was used for everything from irons, kitchen cookware handles, and telephones to smaller items like buttons and chess pieces. You can still find Bakelite used for electrical components today thanks to its superior insulating properties.
Modern consumers began using the term “plastic” as far back as 1909, to refer to Bakelite, though the word is actually centuries old. “Plastic” has roots in Latin — from plasticus, meaning something like “fit for molding” or “moldable” — and before that the Greek plastikos, which had a similar meaning.
Plastics slowly made their way into everyday products during the early part of the 20th century, though World War II had a profound impact on the industry and caused a surge in production. Lower-quality plastics replaced rationed and hard-to-find materials for consumer products, and higher-end versions were used in the war effort. Acrylic and plexiglass made their way inside bombers and fighter planes in place of glass, and nylon was created as a synthetic silk for parachutes, body armor, and ropes. Plastic technology improved during wartime and led to a boom after, with shoppers buying tons of plastic products that were marketed as durable and easy to clean.
Technically, there are hundreds of types of plastic, though most of the kinds we interact with on a daily basis fit into seven categories. Polyethylene terephthalate (PET, or #1) plastics are most common, found in water bottles, food containers, and polyester. Milk and laundry detergent jugs come from high-density polyethylene (HDPE, or #2). Squeezable bottles, shopping bags, and garbage bags are made from low-density polyethylene (LDPE, or #4). Polypropylene (PP, or #5) is often used for straws and takeout containers. Plastic types #3 (PVC, or polyvinyl chloride) and #6 (PS, or polystyrene) are considered more difficult to recycle, and #7 is a catch-all category for combination plastics, like electronics, DVDs, and clear plastic forks.
Is it safe to reheat your lunch if it’s stored in a plastic container? It depends on the type of plastic used. Heating some plastics can cause the materials to release additives, aka chemicals that help them stay durable and flexible (BPA and phthalates are the most common causes of concern). Polypropylene (aka plastic #5) is generally considered the safest to microwave because it’s heat-resistant, though plastics #3, #6, and #7 should never be heated. Researchers recommend checking to see if a container is labeled as microwave-safe, and steering clear of plastics that are damaged or unlabeled.
Nearly all plastic can be melted down and turned into something new, though most never is. Less than 10% of all plastic products ever created have been recycled; most end up burned, in the ocean, or in landfills. That’s because collecting, sorting, and reprocessing used plastic is expensive — often far more expensive than producing new plastic items. And plastic manufacturers say used containers can only be recycled once or twice before the material degrades in quality, which makes creating new containers more reliable — though obviously not great for the environment.
Swedish engineer Sten Gustaf Thulin created the plastic shopping bag in 1965. Thulin’s intention was to reduce the number of trees harvested to make paper bags; the plastic version he invented was sturdier and could be reused over and over. The bags, which were cheap to produce, became so popular that they nearly replaced paper bags by the end of the 1980s. Yet there were unintended consequences: Shoppers treated the bags as disposable, and the resulting litter piled up in waterways and landfills. In 2002, Bangladesh became the first country to ban plastic bags, a movement that has since grown to more than 100 countries in an effort to safeguard the seas and reduce landfill waste.
One thing’s for sure: U.S. presidents are the stuff of legends. However, just because personal tales about the leaders are passed down from generation to generation doesn’t mean the stories are rooted in truth. In fact, many of the stories are so outlandish that it’s amazing people believed them in the first place.
From flammable teeth to ridiculous bathtub debacles, we take a look at eight of the oddest presidential myths out there — and set the record straight.
Cherry tree aside, one of the most chewable myths is that the nation’s first president had a mouth full of wooden teeth. While it seems like an odd story to be linked to the founding father, a deeper dig gets to the root of the issue. Washington did indeed have terrible teeth — so much so that he had multiple sets of dentures made. Those mouthpieces were made out of ivory, gold, lead, and even human teeth, but never any wood. Dentists at the time didn’t use wood: Not only could wooden dentures cause splinters, but wood also expands and contracts with moisture — not ideal for something that lives in your mouth.
It seems incomprehensible that a big-name founding father like Thomas Jefferson missed out on signing the U.S. Constitution, but he never inked the deal. He was absent from the Constitutional Convention in Philadelphia in 1787 because he was across the Atlantic in Paris, serving as the U.S. envoy to France.
Myth: Abraham Lincoln Wrote the Gettysburg Address on an Envelope
There’s no doubt that the 16th president was a brilliant orator. But the idea that he haphazardly scribbled one of the most important speeches in American history on the back of an envelope during a train ride sounds a little far-fetched. In reality, Abraham Lincoln toiled away at multiple drafts of the Gettysburg Address, which he delivered on November 19, 1863. And it was anything but a solo project: He collaborated with several associates on it, and five original copies of the speech survive — not one of them on an envelope.
One of the stranger presidential myths might be chalked up to potty humor. Somehow, the 27th president, William Howard Taft, became associated with an embarrassing tale about getting stuck in a bathtub. While it’s true that he was larger in stature, weighing in at 350 pounds, he never had to be rescued from a tub.
That said, there is a reason he’s associated with baths. During his presidency, a supersized porcelain tub — 7 feet long, 41 inches wide, and a ton in weight — was installed in the White House; it was so massive that four grown men could fit inside. In another bath incident after his presidency, Taft filled a tub at a hotel in Cape May, New Jersey, a little too high; when he stepped in, it overflowed so much that guests in the dining room below got a bit of a shower.
Myth: The Teddy Bear Got Its Name After Theodore Roosevelt Saved a Real Bear
Theodore Roosevelt had long been a hunter, but didn’t exactly show off his best skills on a bear hunt in November 1902. Everyone else in the group had had a fruitful hunt, so to help Roosevelt, the guide tracked a 235-pound bear to a watering hole, clubbed it, and tied it to a tree so the president could claim it. As the story goes, Roosevelt refused to shoot the bear.
The incident made its way to the Washington Post, which published a satirical cartoon about the president sparing the bear. New York City store owners Morris and Rose Michtom saw the cartoon, were inspired by the president’s apparent act of mercy, and created stuffed animals in his honor, appropriately naming them “Teddy’s bear.”
The problem? Roosevelt didn’t shoot the bear, but he didn’t save it either. The animal had already been mauled so savagely by dogs that he asked for it to be put down with a hunting knife. Given the dark nature of the true tale, it makes sense that the details are often left out when talking about this beloved childhood toy.
Myth: John F. Kennedy Won the Election Because of the TV Debates Against Richard Nixon
The televised 1960 presidential debates between John F. Kennedy and Richard Nixon are often said to have clinched the victory for JFK, whom many found more photogenic and charismatic. But a close look at the election numbers suggests the debates didn’t have that big an effect on the results. The candidates were neck and neck throughout the campaign, even appearing tied in the polls both before and after the four debates. Kennedy seemed to get a slight boost after the first debate on September 26, but Nixon hit it out of the park in the others, especially with his foreign policy answers in the final one. In the end, Kennedy won the election by a mere 119,000 votes.
Kennedy and Nixon’s September 1960 debate is often credited as the first televised presidential debate, but that’s also a myth. In 1956, a televised debate aired during the race between Republican President Dwight Eisenhower and Democratic challenger Adlai Stevenson. However, neither candidate attended; both sent surrogates in their place. Eisenhower sent Maine Senator Margaret Chase Smith, while the Democrats went with former First Lady Eleanor Roosevelt. The debate aired on CBS’ Face the Nation.
Just over a year and four months into his term, 12th President Zachary Taylor fell ill and died while in office. For years, many thought he may have been the first president to be assassinated, since it was rumored he’d been poisoned. Despite his death occurring in July 1850, it wasn’t until 1991 that Kentucky scientists examining his remains definitively concluded there was no arsenic involved. Another story — that he died after eating cherries in iced milk — unfortunately may have more truth to it. After attending a July 4 ceremony at the Washington Monument in 1850, he had that combo as a snack and likely came down with severe gastroenteritis, an inflammation of the digestive system, dying five days later.
Throughout Gerald Ford’s presidency, many joked that his vice president, Nelson Rockefeller, was only a banana peel away from the presidency, since the 38th president was so often caught being clumsy. He tumbled down ski slopes, slipped in the rain, and fell coming down the steps of Air Force One — so much so that he was spoofed by Chevy Chase on Saturday Night Live. But in actuality, Ford was quite an athlete in his younger days. He was a football star at the University of Michigan, where he lettered for three years, and he even tackled future Heisman Trophy winner Jay Berwanger in 1934. During his White House years, he swam and skied regularly and played tennis and golf, so perhaps all that falling just added to his relatability.
As America’s first national park and one of its most important biosphere reserves, Yellowstone holds a unique place in our national consciousness — more than 4 million people visit the park each year. But given its rich history, there are likely plenty of facts you’ve never heard, even if you consider yourself a park aficionado. Here are eight fascinating Yellowstone National Park facts that will take your knowledge of America’s favorite national park to the next level.
When most people think of the Grand Canyon, they think of Arizona. But what about the Grand Canyon of the Yellowstone River? This 20-mile-long canyon, more than 1,000 feet deep, is considered an important example of river-driven erosion. On the ridge of the canyon lies Artist Point, which offers one of the most beautiful views in the park. From this spot on the trail, you can see a majestic 300-foot waterfall flowing into the canyon; look down, and you’ll see steep canyon walls in gorgeous hues of pink, orange, yellow, and red.
Yellowstone is home to a whopping 10,000-plus hydrothermal features, including 500 geysers — which scientists estimate is about half of the world’s geysers. The most famous is Old Faithful, which erupts around 17 times a day. Other breathtaking features, like the Beehive Geyser and Grotto Geyser, are somewhat less popular but still provide a thrilling show of geothermal action. So, if you’re worried about Old Faithful being too crowded at peak times of the year, don’t worry — you still have hundreds of other geysers to see.
While overhunting wiped out bison across most of North America’s grasslands, Yellowstone’s herd has remained intact. According to the History Channel, Yellowstone’s bison population is the only herd in the United States that has existed continuously since prehistoric times. In the 19th century, the herd was hunted down to its last 23 members by fur traders exploring the Wild West. Today, however, the park is home to some 5,500 bison, the biggest bison population in the country.
For 30 years, the United States Army kept order at Yellowstone: Until 1916, soldiers patrolled the park to protect its wildlife from unscrupulous poachers. The park spans three states — Montana, Idaho, and Wyoming — all with differing laws on wildlife and preservation. To resolve this decades-old jurisdictional tangle, Yellowstone officially created its own judicial system in 2006. That means if you break the law while visiting the park, you’ll be put in the official Yellowstone jail — and your mugshot may just be the only souvenir you get to take home.
The Park Is One of Only 20 UNESCO World Heritage Sites in the U.S.
Around the world, 878 extraordinary locations have been designated as United Nations Educational, Scientific and Cultural Organization (UNESCO) World Heritage Sites. The United States has only 20 sites across the entire country, and Yellowstone is one of the most important.
UNESCO’s website provides a list of reasons for Yellowstone’s coveted honor, including its distinctive manifestation of geothermal forces and its vast number of rare species. These ecological features are why Yellowstone stands alongside world-renowned sites like the Great Barrier Reef and Machu Picchu.
Hot spots and geysers represent just a fraction of the action beneath the surface at Yellowstone — the whole park is actually a supervolcano, although it isn’t expected to erupt anytime soon. How do we know? Despite the alarming headlines, Yellowstone is quite safe: Its supervolcano is fed by two magma chambers, the first containing no more than 15% molten rock and the second only about 2%. According to Forbes, it’s practically impossible for a supervolcano to erupt unless its magma chambers contain at least 50% molten rock. So rest easy — and don’t forget to enjoy the view.
In the entire history of Yellowstone, only eight people have ever been killed by bears in the park. To put this in perspective, that means only one in 2.7 million visitors will have a fatal bear encounter. Getting injured by a bear is a bit more common, but still happens only about every 20 years. The National Park Service cautions people to look out for falling trees instead, which kill the same number of people (but get a lot less media attention).
An estimated 1,350 different types of flowering plants grow wild at Yellowstone, the vast majority of them native to the region. One remarkable plant that calls the park home is Yellowstone sand verbena, a flower that normally thrives in warm environments but has managed to grow at a 7,700-foot altitude inside the park. Another unique floral trademark of Yellowstone is Ross’s bentgrass, which grows exclusively in hot, vapor-heavy environments — a common sight in the park but rare everywhere else in the world.