In Hollywood, few moments have you on the edge of your seat more than when a character gets trapped in quicksand. The stunt is sometimes used during a peak dramatic moment, like in Lawrence of Arabia, but it can also be used for comedic relief, like on Gilligan’s Island.
Yet quicksand is more than a plot device — it’s very real, and has been trapping people for centuries. The messy stuff exists just about everywhere on the planet, so if you do encounter it, be careful. Though it can’t drag people under as you may have seen on screen, quicksand is still very dangerous. Check out these common questions about the stuff and let the answers sink in.
At first glance, quicksand looks like wet sand you typically see at the beach, which explains why people step in it without thinking twice. However, unlike slightly damp beach sand, quicksand is saturated with water. This saturation occurs when water isn’t able to flow away, usually due to the presence of dense material, such as a bed of clay that stops drainage.
Quicksand often has salt and clay mixed in as well. The presence of salt results in clay particles clumping together instead of intermingling with the grains of sand, and adds to quicksand’s instability.
But some quicksand forms without the presence of salt. In Brazil, quicksand was found around a lagoon where bacteria had created a crust that looked like solid ground but turned into quicksand when stepped on.
Quicksand can form in any location where water and sand meet, such as near rivers, lakes, beaches, marshy areas, and natural springs. Quicksand can also be created when water escapes from an underground reservoir, perhaps due to a natural disaster.
The water saturation of quicksand cuts down the friction between sand particles. As a result, quicksand cannot support any weight. If someone steps on quicksand and adds weight, the pressure disturbs the quicksand’s structure. Quicksand then turns into a viscous liquid, which people and animals sink into.
After someone or something has sunk into the liquified sand, quicksand becomes more viscous and starts to solidify once more. This makes it harder to move, increasing the difficulty of escape.
Quicksand’s inability to support weight can have consequences beyond individuals sinking into it. If quicksand appears around a bridge or building, the structure can collapse.
It’s true that if you ever step into quicksand, you’ll start sinking into its murky depths. Fortunately, despite what’s shown in movies and on television, you don’t need to worry about being swallowed up and disappearing beneath the surface.
A human body is less dense than quicksand (one gram per milliliter for humans vs. two grams per milliliter for quicksand). If you go into quicksand feet first, buoyancy will keep you from sinking much more than waist-deep — though your legs are denser, your lungs offer enough upthrust to keep your head above the surface.
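To see why buoyancy wins out, here's a minimal back-of-the-envelope sketch in Python. It assumes the approximate densities quoted above (real values vary by person and by patch of quicksand) and applies Archimedes' principle: a floating body sinks until it displaces its own weight in fluid, so the fraction submerged is roughly the ratio of the two densities.

```python
# Back-of-the-envelope buoyancy check, using the approximate densities
# quoted above (real values vary from person to person and from one
# patch of quicksand to another).

BODY_DENSITY = 1.0       # g/mL, rough figure for a human body
QUICKSAND_DENSITY = 2.0  # g/mL, rough figure for water-saturated quicksand

# Archimedes' principle: a floating object sinks until the weight of the
# fluid it displaces equals its own weight, so the fraction of its volume
# below the surface is (body density) / (fluid density).
submerged_fraction = BODY_DENSITY / QUICKSAND_DENSITY

print(f"Fraction of body below the surface: {submerged_fraction:.0%}")
# Prints "Fraction of body below the surface: 50%" -- roughly waist-deep,
# which is why you float rather than disappear.
```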
That doesn’t mean quicksand isn’t dangerous, though. For one thing, it takes time for anyone to get themselves out of it. A man hiking in Utah’s Zion National Park in 2019 was sucked into quicksand and remained stuck for hours. He experienced hypothermia, exposure, and other injuries before he could be rescued.
In some situations, such as when water levels rise due to incoming tides, people can drown when they’re trapped in place by quicksand. Such fatalities are rare, but they’ve happened even as recently as 2012 and 2015.
If you do stumble upon quicksand, here are a few tips about what to do next. First, though it may be hard advice to follow, try not to panic. You'll only sink deeper if you struggle.
Second, don't just ask someone who's nearby to pull you out of the quicksand. The force required to lift a foot out of quicksand at a rate of one centimeter per second is about 100,000 newtons. That's enough force to lift a midsize car, and it means a hard, direct pull could backfire and cause serious injury.
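For a sense of scale, the minimal sketch below compares that quoted extraction force with the weight of the person being pulled; the 70-kilogram body mass is an assumed example, not a figure from the article.

```python
# Rough comparison of the quoted extraction force with the weight of the
# person being pulled. The 70 kg body mass is an assumed example value.

EXTRACTION_FORCE_N = 100_000  # newtons, the figure quoted above for freeing
                              # a foot at one centimeter per second
BODY_MASS_KG = 70             # assumed mass of the trapped person
GRAVITY = 9.81                # m/s^2

body_weight_n = BODY_MASS_KG * GRAVITY
ratio = EXTRACTION_FORCE_N / body_weight_n

print(f"Person's weight: {body_weight_n:.0f} N")
print(f"Extraction force is roughly {ratio:.0f}x the person's weight")
# A pull more than a hundred times someone's body weight is far more than
# joints and soft tissue can safely take, which is why a hard yank is risky.
```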
Instead, take time to work your way out of the quicksand. One tactic is to slowly move your legs and feet. This allows water to get to the quicksand that’s gripping you, which will lessen the strength of that hold. Spreading your weight over as much space as possible will also help.
Remember, your body’s density is less than quicksand’s density, so you have buoyancy on your side. With time, you’ll be able to get free.
The iconic American mall emerged in the 1950s as a “third place” — a social environment separate from home and work — for the growing suburbs. Typically anchored by a department store, malls offered shoppers a chance to reach multiple places on foot, at least once they parked their cars in the giant surrounding parking lot.
Over time, malls grew into community hubs and hangout spaces in suburban and urban environments alike, offering a growing list of amenities such as food courts, multiplexes, and fun little playgrounds. At their peak in the 1980s and 1990s, they became essential teen hangouts, cementing their neon place in popular culture. The mall as we know it is changing with the rise of e-commerce, but new malls are still being built with evolved features for 21st-century shoppers.
How did malls become a place to not just shop, but hang out? Which state has the most malls for its size? How big can malls really get? These six facts about malls might just give you a new perspective.
In their heyday, malls became best known as teen hangout spots, and social places as much as shopping centers. While retail sales were always the central goal, having space for the surrounding community to gather was also part of the plan. Plantings, fountains, seating, and other non-retail draws were staples of mall design from the beginning.
“Between each of the buildings, landscaped plazas with fountains, flowers, sculpture, and trees offered seating and shade,” writes architecture critic Alexandra Lange in her book Meet Me by the Fountain, describing one of America’s first malls. The designer, Victor Gruen, was a pioneer in mall design, and called these in-between spaces a vital “town-building element.”
Air-conditioning also became a major draw, and in enclosed malls, environmental controls became more efficient, eventually leading to indoor environments that mimicked bustling town centers. They even got their name from the linear, landscaped promenades referred to as “malls” long before shopping malls existed, like the National Mall in Washington, D.C., or Pall Mall in London. “The malls of the late 1950s and early 1960s were Main Street under glass, their dimensions taken from the street fronts of prewar downtowns but without the clamor,” Lange notes.
While malls have a complicated architectural legacy, they seemed to provide a uniquely pedestrian-level shopping experience when they first emerged. The carefully curated indoor environments of early malls were praised by critics at Architectural Forum, Fortune, Architectural Record, and other publications.
Even Jane Jacobs, an urbanist best known for her efforts to protect people-centric neighborhoods from predatory developers and urban renewal, had enthusiastic things to say about Victor Gruen's first mall project, Northland in Detroit, in the June 1954 issue of Architectural Forum.
“Northland is a planning classic because it is the first modern pedestrian commercial center to use an urban ‘market town’ plan, a compact form physically and psychologically suited to pedestrian shopping,” Jacobs wrote. “Other points about Northland will become yardsticks. For instance, its high standards in public signs; its uninhibited, generous, and lighthearted use of art.”
“Shopping traffic has come full circle. It is right back where it started — with the pedestrian,” she added.
Not everyone was happy, though, even at the beginning; Frank Lloyd Wright notably called the first fully enclosed mall “desolate-looking” and said that the building's garden court “has all the evils of the village street and none of its charm.”
New Jersey Has More Malls Per Square Mile Than Any Other U.S. State
The state of New Jersey packs around 28 malls into its relatively small size (just 7,354 square miles of land area), giving it the most malls per square mile of any state in America. Some of these malls have seen better days, but even in an era of “dead malls,” Jersey is still getting new ones.
The World’s Biggest Mall Is 15 Million Square Feet
While the title of World's Largest Mall was held by the Dubai Mall in the United Arab Emirates for around a decade, the Iran Mall in Tehran, Iran, unseated it in 2018 when it opened its 15 million-square-foot first phase. When completed, the whole project will be close to 21 million square feet.
So far, the mall boasts a rooftop sports complex with a path for hiking and jogging, a musical fountain, a mosque, a traditional bazaar, a giant library with elaborate millwork, and picturesque gardens, including green space and areas designed to showcase Persian architecture. At the beginning of the COVID-19 pandemic, the entire facility was temporarily transformed into a 3,000-bed treatment center in just around five days, but the stores have since reopened.
The New Century Global Center in Chengdu, China, is the world's largest building by floor area at around 19 million square feet, but its shopping mall is only one part of that gargantuan facility.
The Mall of America Has Its Own Counterterrorism Unit
Since the attacks on September 11, 2001, the massive Mall of America, which gets around 40 million visitors a year, has been treated as a potential terrorist target. As a result, it has its own private counterterrorism unit — although the results have been mixed. Having a specialized team certainly came in handy when an actual threat was issued, but ordinary visitors have also been flagged for suspicious behavior and detained in the mall’s basement police station over normal activities like taking videos or forgetting a cellphone on a table.
Even without the specialized unit, Mall of America security is much beefier than that of a typical mall. All security officers get 240 hours of training, and they even have a canine bomb detection unit.
“Dead Malls” Are Finding New Lives as Housing, Health Care Facilities, and More
Although reports of the mall's death are somewhat exaggerated, many are shuttering or coming close to it. But that doesn't mean the buildings, or even still-operational malls, have outlived their usefulness. Some companies, governments, and organizations are finding plenty of uses for all that space, filling gaps in housing, health care, and even community gathering spots.
One former mall in Rochester, New York, has become senior housing. While much of the construction was new, 73 of its 157 units were built within a former Sears department store. The complex as a whole includes three courtyards, two large patios, a gym, and community space. Other parts of the mall have become a rec center and a nursing school.
Alderwood Mall near Seattle still operates as a mall, but it's adapting as it goes (and preparing for incoming light rail) by replacing the former Sears with new construction containing around 300 apartments and ground-floor retail. Another mall in Providence, Rhode Island, converted its existing building into mixed-use residential and retail, turning unused upstairs retail space into itty-bitty microlofts.
Meanwhile, more than 30 malls across the country have converted into medical complexes; the properties tend to be easy to get to and, when the original buildings stick around, they’re designed to be easier to navigate than many medical buildings.
Blending science and spectacle, fireworks have been entertaining crowds and marking important holidays for centuries. Chinese chemists are said to have discovered the formula for explosive powder by accident, while a famous explorer kick-started an Italian craze for bigger and more beautiful displays. Even a future U.S. President opined on the virtue of “illuminations” for public celebrations. Let’s shed some light on fireworks with the six facts below.
The World’s First Fireworks Were Developed in China
The precursors to fireworks were simple bamboo sticks thrown into bonfires, which popped loudly when the air inside the bamboo’s chambers heated up and burst. Chinese villagers were using these rudimentary fireworks to celebrate the new year and ward off nefarious spirits by the second century BCE.
Legend suggests that around 1000 CE, Chinese alchemists stumbled upon the formula for gunpowder and applied the discovery to weapons of war. During the Song dynasty (960 to 1279 CE), people began packing gunpowder into paper tubes and lighting them on fire. They also created bunches of firecrackers tied together so that the initial explosion would start a chain reaction.
When Marco Polo returned to Venice from Asia at the end of the 13th century, he brought back a number of curious items. Among them were fireworks, which he reportedly introduced, or at least helped popularize, in the region that is now Italy. During the Renaissance, metallurgists in Florence created fireworks displays during religious holidays, Romans set them off after the elections of popes, and firework shows were popular theater at court in Ferrara. Italian craftspeople developed colored fireworks in the 1830s, and the tradition lives on. Two of the largest American fireworks companies today, Zambelli and Fireworks by Grucci, were founded by Italian immigrants in the 19th century.
In pyrotechnic parlance, a shell is the bundle of explosive materials that forms the nucleus of a firework. Its components are usually packaged in a paper tube in a specific order: tiny cubes of chemicals called stars; other compounds that create special effects; a bursting charge; and an internal time fuse connected to a packet of black powder called the lifting charge. A fuse leads from the lifting charge to the exterior of the shell.
When the fuse is lit, it eventually ignites the lifting charge, sending the shell into the sky. As it rises, the flame burns through the internal time fuse to the bursting charge, which explodes the whole shell in a shower of colored sparks.
At the heart of every firework is the chemistry that creates black powder, also known as gunpowder. The standard mix for the black powder used in modern fireworks hasn't changed much in the last few centuries: It's a combination of 75% potassium nitrate (saltpeter), 15% charcoal, and 10% sulfur. When ignited, the sulfur melts over the charcoal and saltpeter, which combust and cause the explosion.
There is chemistry in fireworks colors as well. Specific chemical compounds in the stars release different wavelengths of light when ignited. Strontium salts create red fireworks, barium salts make green, and sodium salts create yellow. Blue, made by copper salts, is the hardest color to achieve; too much copper and the explosions are too dark to be seen at night, while too little copper makes the fireworks appear plain white.
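For quick reference, the colorants named above can be gathered into a small lookup table. This is only a restatement of the compounds mentioned in this article, not an exhaustive pyrotechnics chart.

```python
# Lookup of the star colorants named in this article and the flame colors
# they produce. This restates the text only; real pyrotechnic formulations
# use many more compounds and mixtures.

STAR_COLORANTS = {
    "strontium salts": "red",
    "barium salts": "green",
    "sodium salts": "yellow",
    "copper salts": "blue",  # the trickiest: too much copper reads nearly
                             # black at night, too little washes out to white
}

for compound, color in STAR_COLORANTS.items():
    print(f"{compound:>15} -> {color}")
```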
Fourth of July Fireworks Displays Are Almost as Old as the United States
The Second Continental Congress declared the American colonies’ freedom from British control on July 2, 1776, a day that Massachusetts delegate John Adams called “the most memorable Epocha, in the History of America.” In a letter to his wife Abigail, Adams continued: “It ought to be solemnized with Pomp and Parade, with Shews, Games, Sports, Guns, Bells, Bonfires and Illuminations from one End of this Continent to the other from this Time forward forever more.” On July 4, 1777 (the first anniversary of Congress’ adoption of the final Declaration of Independence), that’s exactly what happened, though it wasn’t all Adams’ idea. Fireworks had been a part of public patriotic festivals for centuries by that time.
While fireworks are a visual spectacle, they also produce a lot of noise. Social reformers concerned about the din’s effect on people’s health and safety formed the Society for the Suppression of Unnecessary Noise in New York City in December 1906. Led by physician Julia Barnett Rice, the group convinced lawmakers to regulate steamboat whistles, train noise, and fireworks. In an essay titled “Our Barbarous Fourth,” Rice condemned the cacophony of ceaseless firecrackers and “every noisy device from the tin trumpet to the dangerous pistol.” The society eventually convinced city officials to establish noise-free quiet zones around hospitals and facilities serving poor residents.
From sea to shining sea, a lot of history has taken place on America’s shores. While you may think you’re familiar with the country we call home, some of these facts may surprise you. Did you know that you used to be able to send children through the U.S. mail? Or that the Statue of Liberty was originally brown? We’ve collected some of our top facts about American history from around the site, so be prepared to wonder why you probably never learned any of this in school.
The U.S. Actually Voted for Its Independence on July 2
In June 1776, the Second Continental Congress selected a Committee of Five — John Adams, Roger Sherman, Robert Livingston, Benjamin Franklin, and Thomas Jefferson — to draft a statement of independence that severed the colonies from British rule. When the draft was presented to Congress, only nine of the 13 colonies favored independence. However, the delegates largely fell into line from that point, and on July 2, Congress formally approved the resolution that proclaimed the United States of America as an independent country. Following additional edits, the Declaration of Independence was completed, adopted, and sent for printing on July 4, and on August 2, the rank-and-file delegates began adding their signatures to an engrossed version of the document.
According to historian Pauline Maier, the idea of commemorating the anniversary of independence didn’t gain any traction in 1777 until it was too late to recognize the date of July 2. However, a pair of notable celebrations popped up on July 4 — fireworks in Boston, a military demonstration and more pyrotechnics in Philadelphia — setting forth an annual tradition.
In 1885, French sculptor Frédéric Auguste Bartholdi dismantled his gleaming copper-skinned creation — a gift to the U.S. from the French people — into 350 pieces for its voyage across the Atlantic. The statue was eventually rebuilt atop Bedloe’s Island (now called Liberty Island) in New York Harbor, but over the next two decades or so, the landmark underwent a prominent color change.
The now-familiar minty tint is actually a patina, a common coating that forms on copper as well as its alloys brass and bronze. The patina is a result of the chemical reactions the statue has endured in its environment, an urban center surrounded by water. Over the years, the copper has reacted to oxygen, sulfuric acid, chloride, and other components of the surrounding air and water, changing its mineral composition in a gradual evolution. Today, chemists believe the seafoam-green hue has stabilized. And while there’s occasionally been talk of repainting the statue or polishing off her patina, public sentiment — and input from copper manufacturers — has kept “Liberty Enlightening the World” from being returned to her initial metallic sheen. Fortunately, the patina is protective, which means Liberty’s chameleonlike qualities actually help preserve her.
What’s the most you’ve ever lost in a coin toss? For Asa Lovejoy, it was the opportunity in 1845 to name the city he’d recently established with Francis Pettygrove. The men decided to settle their disagreement as to what their new land claim should be called with a two-out-of-three coin flip that Pettygrove won. Pettygrove chose “Portland” because he hailed from the city of the same name in Maine; Lovejoy had intended to name the place after his hometown of Boston. Now known as the Portland Penny, the one-cent piece used in the fateful toss was minted in 1835 and retrieved by Pettygrove after his victory. It remained with him when he founded Port Townsend, Washington, and was eventually given to the Oregon Historical Society, which now keeps it on display.
Shirley Temple, the beloved child star who was Hollywood’s No. 1 box-office draw from 1935 to 1938, announced her retirement from film at the age of 22 in 1950. It was anyone’s guess what Temple would do next, but it’s unlikely that many predicted her eventual diplomatic career. After she ran (unsuccessfully) for Congress in 1967, President Nixon appointed her as a delegate to the 24th United Nations General Assembly in 1969, and President Ford named her the ambassador to Ghana in 1974.
Temple’s foreign service didn’t end there. In 1989, just before the Velvet Revolution, President George H.W. Bush made her ambassador to the former Czechoslovakia, a post she held until 1992, as the country became a parliamentary democracy. According to Norman Eisen, who held the same role from 2011 to 2014, the “sunny confidence and optimism” that made Temple a movie star also helped her “really infuse the United States’ role — as our representative here, in the Velvet Revolution — with that good cheer and that hope.”
Napoleon’s Grandnephew Created the Forerunner of the FBI
A grandson of Napoleon Bonaparte’s younger brother Jérôme, Charles Bonaparte lacked his famous relative’s ambition for world domination yet displayed a talent for visionary authority that might have impressed the Little Corporal. In the late 19th century, Charles Bonaparte, then a lawyer from Baltimore, came into the orbit of fast-rising New York politician Theodore Roosevelt through their shared interest in civil service reform. Bonaparte later became President Roosevelt’s secretary of the Navy and then attorney general, a position that thrust “Charlie the Crook Chaser” into the spotlight as a face of the administration’s trust-busting efforts.
Behind the scenes, the attorney general fumed at the lack of an established investigative team within the Department of Justice, which often led to the borrowing of spare Secret Service agents from the Treasury Department to investigate cases involving federal law. Congressional leaders also frowned on what they felt was becoming an overreach of the executive branch, and in May 1908, Congress passed a bill that halted the DOJ's ability to commandeer Secret Service personnel. Seizing the opportunity, Bonaparte pulled together a "special agent force" of 31 detectives, and on July 26, 1908, he issued an order that directed DOJ attorneys to refer investigative matters to his chief examiner, Stanley Finch.
Bonaparte’s oversight of this unit was short-lived, as he exited the federal government at the end of the Roosevelt administration in March 1909. Nevertheless, his special agent force remained in place under new Attorney General George Wickersham, who began referring to the group as the Bureau of Investigation. By 1935, the now-renamed Federal Bureau of Investigation was firmly embedded as a U.S. law enforcement institution under Director J. Edgar Hoover.
By Some Accounts, North Dakota Didn’t Technically Become a State Until 2012
North Dakota was admitted to the Union as the 39th state on November 2, 1889, except it kind of sort of wasn't. Its constitution left out a key detail that, according to some, was enough of a technicality that North Dakota didn't actually become a state until 2012. A local historian by the name of John Rolczynski first noticed in 1995 that North Dakota's state constitution failed to mention the executive branch in its section concerning the oath of office, which he felt made it invalid; the United States Constitution requires that officers of all three branches of a state's government be bound by said oath, and North Dakota's only mentioned the legislative and judicial branches.
This led to a campaign that included an unanswered letter to then-President Bill Clinton and ended with the successful 2012 passage of an amendment to Section 4 of Article XI of the state constitution, which fixed the omission.
Before Time Zones Were Established in 1883, North America Had at Least 144 Local Times
Before time zones were established in 1883, North America alone had at least 144 local times. Noon was when the sun reached its zenith, and in many places the only thing making time official was a town clock. This didn’t affect many people’s day-to-day lives, as it often took several days to travel from one place to another, but confusion intensified once the expanding railroad system drastically cut travel times. Because time wasn’t standardized, coordinating schedules across multiple rail lines was nearly impossible, and travelers occasionally found themselves arriving at their final destination earlier than they’d departed. Sometimes, trains even collided.
Those problems more or less evaporated after November 18, 1883, when American railroads adopted the first four time zones (Eastern, Central, Mountain, and Pacific) and all clocks in each zone were synchronized. The number of time zones rose to five with the passage of 1918’s Standard Time Act, which added Alaska. (The act also established the use of daylight saving time in the U.S., to the chagrin of many.) Including its territories, the United States now has four more time zones — Chamorro (which is used in Guam and the Northern Mariana Islands), Samoa, Hawaii-Aleutian, and Atlantic — for a total of nine.
Tenth President John Tyler Still Has a Living Grandson
More than 200 years after the 10th President of the United States was born, one of his grandsons is still alive. As impossible as that may seem, the math — and biology — checks out. John Tyler, who was born in 1790 and became President in 1841 after William Henry Harrison died in office, had a son named Lyon Gardiner Tyler in 1853. This son was born to the then-60-something Tyler and his second, much younger, wife, Julia Gardiner. Lyon then had two sons of his own in his 70s (also with a much younger second wife), one of whom — Harrison Ruffin Tyler, born in 1928 — is still gracing the Earth in his mid-nineties.
Amelia Earhart Once Took Eleanor Roosevelt on a Nighttime Joyride
Although her aviation career lasted just 17 years, Amelia Earhart remains one of the most famous people ever to take to the sky. In addition to being renowned for her many firsts — including being the first woman to fly solo across the Atlantic and the first person to fly alone from Hawaii to the mainland U.S. — she’s known for her 1937 disappearance and the many theories it spawned. Less well known but considerably more fun to imagine is the time she took Eleanor Roosevelt on a nighttime joyride from Washington, D.C., to Baltimore on April 20, 1933. The brief flight took place with both of them in their evening wear following a White House dinner party.
“I’d love to do it myself. I make no bones about it,” the First Lady told the Baltimore Sun after the flight. “It does mark an epoch, doesn’t it, when a girl in an evening dress and slippers can pilot a plane at night.”
In fact, Roosevelt herself had recently received a student pilot license and briefly took over the controls of the twin-engine Curtiss Condor, borrowed from Eastern Air Transport at nearby Hoover Field. Eleanor’s brother Hall also ditched the dinner party in favor of the flight that night, as did Thomas Wardwell Doe, the president of Eastern Air Transport, and Eugene Luther Vidal (head of the Bureau of Air Commerce) and his wife Nina Gore, parents of author Gore Vidal. When the plane returned after the short journey, the Secret Service guided everyone back to the White House table for dessert.
Memorial Day’s Date Was First Chosen Because It Was When Flowers Would Be in Full Bloom
Deciding when to observe holidays isn’t always an exact science. George Washington wasn’t born on the third Monday of February, for example. Memorial Day’s precise date on the calendar also shifts from year to year (though it’s always the final Monday of May), but at least the reasoning behind it is sound: The late spring date was chosen because it was when flowers would be in full bloom. As adorning the graves of fallen soldiers with wreaths was once the most important part of the holiday, it’s difficult to imagine Memorial Day taking place at another time of year — especially considering that it was first celebrated in the 1860s, when floristry wasn’t quite as commercially developed as it is today. Originally celebrated on a state and community level, Memorial Day became an official federal holiday in 1971.
Jimmy Carter Was the First President Born in a Hospital
In his nearly 100 years on Earth, Jimmy Carter has set a number of records and achieved almost as many firsts. In addition to being the longest-living President in U.S. history, he was also the first one born in a hospital — an event that occurred on October 1, 1924, in Plains, Georgia. It was much more common for babies to be born at home in the early 20th century than it is now, but Carter’s mother was a nurse at what was then known as Wise Sanitarium. There happened to be a room available on that fateful October night, and the hospital has since been renamed the Lillian G. Carter Nursing Center.
When it comes to the American flag, it’s not just about 13 stripes and 50 stars — the number 27 also has an important meaning. That’s how many different versions of Old Glory have been officially recognized since the nation began. The inaugural 13-star, 13-stripe flag was approved by the Continental Congress on June 14, 1777, and later underwent an update in May 1795. That redesign — due to Vermont and Kentucky joining the Union — featured 15 stars and 15 stripes. While the number of stripes initially continued to increase as more states were admitted, the government reverted back to 13 stripes in 1818, representing the original 13 colonies, and let the stars represent the number of states instead. The current and 27th official design was adopted on July 4, 1960, after Hawaii’s admission into the United States. It is the only version in U.S. history to remain unchanged for over 50 years.
You Used To Be Able To Send Children Through U.S. Mail
You can send a lot of things in the mail, but you can’t send a person — at least not anymore. There was nothing preventing people from mailing their own children in the early days of the U.S. Postal Service’s parcel post service, and at least seven families took advantage of it. That includes the Beagues, an Ohio couple who in 1913 paid 15 cents in postage to mail their newborn son to his grandmother’s house a mile down the road. Beyond the novelty of it — when the parcel post service began on January 1, 1913, some were eager to see which packages they could get away with sending — it was a surprisingly practical way of getting one’s kiddo from point A to point B.
To start with, many people in rural areas knew their postal carriers fairly well, which meant the children were simply walked or carried on often-short trips. In other instances, children traveled on trains as Railway Mail, but with stamps instead of (usually more expensive) train tickets. The longest known trip of a child through the mail occurred in 1915, when a 6-year-old was sent 720 miles from Florida to Virginia — a lengthy trip that cost just 15 cents. Fortunately, there are no reports of children being injured by being sent through the mail. (Pictures of children in literal mailbags were staged.) The practice ended, as so many do, when certain higher-ups became aware of the loophole and decided to close it, also around 1915.
No U.S. President Has Been an Only Child
In the sibling department, every President has had, at minimum, one half-brother or half-sister. However, a few Presidents are sometimes considered to have been raised as only children — most notably Franklin D. Roosevelt, whose only half-sibling (his father’s oldest son, James) was 28 years FDR’s senior. Bill Clinton’s half-brother, Roger, is about a decade younger than him. Barack Obama also has a 10-year age gap with his younger half-sister Maya, although he learned later in life that he possessed at least five more half-siblings on his father’s side. Meanwhile, Gerald Ford is the only child his mother and father produced, but he was raised with three younger half-brothers after his mother remarried, and as a teen, he learned that he also had three younger half-sisters, via his father.
The Labor Department Was the First U.S. Cabinet Agency Led by a Woman
George Washington held the country’s first full Cabinet meeting on November 26, 1791. That meeting, and every subsequent Cabinet meeting over the next 142 years, consisted exclusively of men. But all that changed on March 4, 1933, when Frances Perkins became secretary of labor under President Franklin D. Roosevelt — and the first woman to hold any position in a presidential Cabinet. The occasion was marked several months later by Time, which put Perkins on the cover of its August 14, 1933, edition. Perkins had previously served under FDR in a similar capacity, having been appointed commissioner of the New York State Department of Labor after Roosevelt was elected governor of New York in 1929.
Perkins' tenure lasted for the entirety of Roosevelt's 12-year administration, making her the longest-serving secretary of labor in U.S. history. Described by historian Arthur Schlesinger Jr. as "brisk and articulate" and "intent on beating sense into the heads of those foolish people who resisted progress," Perkins is best known for her role as chairwoman of the President's Committee on Economic Security, which led to the 1935 act that created Social Security. She was also active in issues around child labor, safety, minimum-wage laws, workers' compensation, and more. She resigned in 1945, after Roosevelt's death, and then served on the United States Civil Service Commission under President Truman until 1952.
Every flag has a story, but few are as endearing as Alaska’s. One of the rare places to have a flag before it was actually a state, the Last Frontier held a contest to design its territorial standard in 1926 and 1927 — and a 13-year-old won. (The contest was only open to Alaskan children in the seventh to 12th grade, but still.) Benny Benson lived in an orphanage known as the Jesse Lee Home in Seward, Alaska, when he came up with the winning design, which included a description he wrote himself: “The blue field is for the Alaska sky and the forget-me-not, an Alaska flower. The North Star is for the future of the state of Alaska, the most northerly in the Union. The dipper is for the Great Bear — symbolizing strength.” His design also featured “1867” in commemoration of the year the United States bought Alaska from Russia, although the numbers didn’t make the final cut.
In addition to being hailed as a local hero, Benson won a watch with his design on it and a $1,000 scholarship. He eventually used that money to attend Hemphill Diesel Engineering School after moving to Seattle in 1936. He was 45 when Alaska became a state in 1959, fulfilling the hopeful description of his design. Alaska kept its flag rather than adopt a new one, and Benson’s work lives on today.
U.S. Elections Used To Be Held Over a 34-Day Window
As implied by its name, Election Day is, well, a single day. That wasn’t always the case, however: States used to hold elections whenever they wanted within a 34-day period leading up to the first Wednesday in December. This ultimately created some issues, as you might imagine — early voting results ended up holding too much sway over late-deciding voters, for one thing. The current date was implemented by the Presidential Election Day Act of 1845, and federal elections now occur every two years on the first Tuesday after the first Monday in November.
That may sound arbitrary at first, but the date was chosen quite deliberately. American society was much more agrarian in the mid-19th century than it is today, and it took a full day of traveling for many to reach their polling place. Church made weekends impractical, and Wednesday was market day for farmers, so Tuesday proved ideal. November, meanwhile, worked because weather was still fairly mild, and the harvest was complete by then.
The Last U.S. President With Facial Hair Was William Howard Taft
On Inauguration Day in 1913, mustachioed President William Howard Taft passed the presidential baton to clean-shaven Woodrow Wilson. What Taft couldn’t have known at the time was that his departure began a long streak of clean-shaven faces occupying the Oval Office.
In fact, out of the 46 Presidents in U.S. history so far, only 13 have had any facial hair whatsoever. Although sixth President John Quincy Adams, eighth President Martin Van Buren, and 12th President Zachary Taylor sported impressive mutton chops, the first serious presidential facial fuzz belonged to 16th President Abraham Lincoln — thanks to an 11-year-old girl whose 1860 letter convinced him to grow out his whiskers. After Lincoln, eight of the next 10 Presidents sported some sort of facial hair.
Some Historians Consider Cracker Jack America’s First Junk Food
Cracker Jack’s early marketing warned prospective customers about the effects of the product. “Do not taste it,” one 1896 article cautioned. “If you do, you will part with your money easy.” Some historians believe that the caramel-coated popcorn and peanut treat jump-started the American snack food industry around the turn of the 20th century. It may even hold the title of the country’s first junk food, though the types of junk food popular today didn’t make their appearances until the 1950s. It all started with Chicago candy and popcorn peddlers Frederick and Louis Rueckheim, German immigrants who crafted a nonsticky caramelized popcorn as a way to stand out from other popcorn vendors. Their version — with a sweet, crunchy coating that was different from the salted popcorn and kettle corn available at the time — became a hit after it was mass-produced in 1896.
It was a song, however, that helped cement Cracker Jack’s snack status. In 1908, songwriter Jack Norworth — entirely unknown to the Rueckheims — composed “Take Me Out to the Ball Game” after seeing an advertisement for an upcoming game. The song, which mentions the snack by name, led to a surge in sales that forever linked Cracker Jack with sports. Four years later, the Rueckheims sweetened their popcorn business with a marketing gimmick that would eventually be replicated by cereal brands, fast-food restaurants, and candymakers for decades to come: a toy in every box. By 1916, Cracker Jack was the bestselling snack worldwide.
Before They Built Airplanes, the Wright Brothers Owned a Bicycle Shop
The Wright brothers are best known for their historic 1903 flight at Kitty Hawk, North Carolina, but years before the siblings made aviation history, they were busy running a bicycle shop in western Ohio. Wilbur Wright and his younger brother Orville had long dreamed of gliding through the wild blue yonder, but it would take years of work to finance their costly first attempts. In the 1880s, the brothers undertook their first joint business, a small printing shop in Dayton that churned out local newspapers, church pamphlets, and bicycle parts catalogs. By 1892, the brothers had moved from printing for bicycle companies to starting their own, inspired by their shared passion for cycling; Wilbur reportedly loved leisurely rides through the countryside, while Orville was known to participate in bike races.
The Wright Cycle Company initially offered repairs and rentals, but as cycling became more popular, the brothers turned to manufacturing their own designs in an effort to compete with the dozens of nearby bike shops. Their first model, the “Wright Special,” was released in May 1896, followed by the “Van Cleve.” Together, Wilbur and Orville hand-built around 300 bikes per year during their peak production years before 1900, using the profits to fund their flight experiments. By 1908, they had abandoned their shop to focus solely on aeronautics. Today, only five antique Van Cleve bikes exist, two of which remain in the brothers’ hometown at the Wright Brothers National Museum in Dayton.
The Earliest U.S. Presidents Didn’t Wear Pants
The very first American Presidents — George Washington included — led the country through the American Revolution and its earliest days without wearing a single pair of pants. That’s because the Founding Fathers actually wore breeches, pairs of tight-fitting men’s bottoms that cut off at the knee. (Their calves were covered with knee-high stockings.) Breeches were a status symbol; full-length pantaloons were generally reserved for working folk who needed more ease to complete manual labor, which was difficult to do in custom-fitted breeches.
Another revolution — in France — eventually led Americans to turn their backside on breeches around the start of the 19th century. French political groups such as the sans-culottes (literally meaning “without knee breeches”) stylized longer trousers as the apparel of the everyday man, disparaging breeches as the clothing of the wealthy elite. For a while, American Presidents continued to stick with cropped breeches, though pants slowly crept into everyday style. Americans wouldn’t see the country’s highest leader don full-length pants until 1825, when John Quincy Adams became the sixth President — and the first to be inaugurated while wearing a pair of trousers. (He also ditched the powdered wig.)
The Library of Congress Has a Piece of Wedding Cake From the 1800s
Celebrity weddings — love them or ignore them, they’ve seemingly always been a topic of fascination for Americans. One famous case: the wedding of Charles Stratton, aka General Tom Thumb, an entertainer known for his particularly small stature. At 40 inches tall, Stratton enjoyed a lucrative career singing, dancing, and acting; part of his success came from employment with famed showman P.T. Barnum, who dubbed him the “smallest man alive.” In February 1863, Stratton married the similarly sized “Queen of Beauty,” Lavinia Warren, in a dazzling New York display that attracted thousands of onlookers trying to get a glimpse of the couple. After the ceremony, a reception — to which Barnum had sold thousands of tickets — allowed guests to meet the pair in a receiving line. Ladies were handed a boxed slice of brandy-soaked wedding fruitcake on their way out.
After the wedding, Stratton and Lavinia were even welcomed at the White House by President Lincoln and his wife, Mary Todd. But Lavinia’s career dimmed after Stratton’s death in 1883, and she used a slice of her wedding cake at least once to help her career. In 1905, she sent the then-42-year-old slice of cake to actress Minnie Maddern Fiske and her husband, an editor at a theater publication, along with a letter that said, “The public are under the impression that I am not living.” Lavinia would eventually continue performing until her 70s, even starring in a silent film in 1915 with her second husband, “Count” Primo Magri. Today, two pieces of Stratton and Lavinia’s wedding cake have outlived the couple — one donated to the Library of Congress in the 1950s as part of the Fiskes’ papers, another at the Barnum Museum in Connecticut.
While President, Ulysses S. Grant Was Arrested for Speeding
Nearly 25 years after Ulysses S. Grant’s death, a peculiar story hit the pages of the Washington Evening Star. Within the paper’s Sunday edition one day in 1908, retired police officer William H. West recounted how he had caught the 18th President speeding through the streets of Washington, D.C. — and decided the only appropriate course of action was to proceed with an arrest.
West's tale harkened back to 1872, during a particularly bad bout of traffic issues, when complaints of speeding carriages were on the rise. West had been out investigating a collision when he witnessed Grant, then the sitting President, racing his horse-drawn carriage down the road. The officer flagged down the carriage, issued a warning, and sent Grant on his way. But Grant, who had a reputation for driving his horses hard, couldn't resist the need to speed. West caught him the very next day once again tearing through the city. Feeling he had no other option, the officer placed the President under arrest. At the police department, Grant was required to put $20 (about $490 in today's money) toward his bond before being released.
Historian John F. Marszalek, who oversaw Grant’s presidential collection at Mississippi State University, says the situation blew over pretty quickly. Grant’s arrest wasn’t the first time he had been cited for speeding. It also wasn’t a political quagmire for either party. At the time, West — a formerly enslaved Civil War veteran who became one of just two Black police officers in Washington, D.C., immediately after the war — was commended for his actions in trying to make the city streets safer. And Grant owned up to his mistake — though he did choose to skip his court appearance scheduled for the following day, which meant he forfeited his $20. He didn’t face any further consequences, however.
The Last American To Collect a Civil War Pension Died in 2020
Irene Triplett, a 90-year-old North Carolina woman, was the last person to receive a Civil War pension, thanks to her father's service in the Union Army. Mose Triplett was originally a Confederate soldier who deserted in 1863 and later joined a Union regiment, a move that kept him out of the fight at Gettysburg, where his former regiment suffered roughly 90% casualties. Switching sides also guaranteed Mose a pension for the remainder of his life, which would later play a role in him remarrying after the death of his first wife. At age 78, Mose married 27-year-old Elida Hall, a move historians say was common during the Great Depression, when aging veterans needing care could provide financial security to younger women. The couple had two children, including Irene, who was diagnosed with cognitive impairments that allowed her to qualify for her father's pension after both parents' deaths. By the time of Irene's own passing in 2020, the U.S. government had upheld its obligation, paying out Mose Triplett's pension for more than 100 years.
Idaho became the 43rd state on July 3, 1890, formed from a territory that once included land in present-day Montana and Wyoming. Upon statehood, Idaho legislators looked to commission the state seal’s design by way of a competition, with a generous $100 prize (about $3,300 today) for the winning artist. Emma Edwards Green, an art teacher who had relocated to Boise after attending school in New York, was in part inspired by the fact that it seemed Idaho would soon give women the right to vote. In March 1891, Green’s work was selected as the winner, beating out submissions from around the country.
The final design, which is also featured on Idaho’s flag, is packed with symbolism. Worked into the design are cornucopias and wheat to represent Idaho’s agriculture, a tree meant to be reminiscent of the state’s vast timberlands, and a pick and shovel held by a miner. Green’s most forward-thinking detail, however, is a man and woman standing at equal heights in the seal’s center, a symbol of gender equality that would eventually come with voting rights for all. True to their word, Idaho legislators passed women’s suffrage in 1896 — five years after Green’s seal became the state’s official symbol — making Idaho the fourth state to enfranchise women, more than 20 years before the 19th Amendment gave the same right to women nationwide.
Sometimes, a President’s words continue to echo long after they were first uttered, and some become a permanent part of our cultural fabric. Here’s a look at eight noteworthy presidential quotations — and the oft-forgotten story behind each one.
“Few men have virtue to withstand the highest bidder.” — George Washington
At the height of the Revolutionary War, General George Washington was worried about spies. He was especially suspicious of double agents enlisted among his own ranks — particularly one named Elijah Hunter, a prominent farmer who had first been recruited as a British spy by a Loyalist governor and then convinced to play both sides by patriot leaders. In August 1779, Washington wrote a letter to Major General Robert Howe, explaining why he didn’t trust the young spy. The British, Washington explained, possessed significantly more money than the Americans and could corrupt the agents to favor their side. He advised Howe that he thought it “necessary to be very guarded, with those who are professedly acting as double characters.”
“Facts are stubborn things; and whatever may be our wishes, our inclinations, or the dictates of our passion, they cannot alter the state of facts and evidence.” — John Adams
In March 1770, a group of British soldiers fired into a rebellious crowd of Boston colonists, killing five civilians. John Adams, a lawyer who steadfastly believed in the right to counsel, was asked to defend the redcoats when everyone else refused. In the trial, Adams argued that the soldiers were victims of a mob ("if an assault was made to endanger their lives, the law is clear, they had a right to kill in their own defence [sic]") and had fired their muskets in self-defense. He uttered the quote "facts are stubborn things …" while making his case to the jury, and the strategy worked: The captain and six of his soldiers were acquitted, with only two convicted of manslaughter.
“If I were two-faced, would I be showing you this one?” — Abraham Lincoln
Today, lacking a sense of humor in politics can be a liability. But back in the 19th century, the opposite was true: humor could be considered a sign that the office holder was not taking the gravity of elected office seriously. That problem rang true for Abraham Lincoln, who loved getting a laugh. A journalist once said of Lincoln: “I could not take a real personal liking to the man, owing to an inborn weakness for … jokes, anecdotes, and stories.” Lincoln didn’t care. The quip above — or some variant of it — reportedly came to his mind when Stephen A. Douglas called him a “two-faced man.”
“Speak softly, and carry a big stick.” — Theodore Roosevelt
Roosevelt's "Big Stick Diplomacy" policy was a mainstay during his career. When he was New York's governor, he attributed the phrase to a West African proverb, according to a 1900 article in the Brooklyn Eagle. The following year, as Vice President, he gave a speech at the 1901 Minnesota State Fair that touted his approach to American power, metaphorically explaining how the soft power of diplomacy was best bolstered by the lingering presence of military might. (Roosevelt became President just two weeks later, when President William McKinley was assassinated.) The rest of Teddy's quotation, however, is worth hearing: "If a man continually blusters, if he lacks civility, a big stick will not save him from trouble, and neither will speaking softly avail, if back of the softness there does not lie strength, power. In private life there are few beings more obnoxious than the man who is always loudly boasting, and if the boaster is not prepared to back up his words, his position becomes absolutely contemptible."
“The only thing we have to fear is fear itself.” — Franklin D. Roosevelt
Franklin D. Roosevelt took office in 1933, when the Great Depression was arguably at its worst. In the months between his election and inauguration, unemployment had exploded and anxieties were high. Roosevelt tried to soothe matters during a solemn inaugural address, saying, "This Nation will endure as it has endured, will revive and will prosper. So, first of all, let me assert my firm belief that the only thing we have to fear is fear itself." Within days, his administration shut down the country's banking system in the hopes of resetting it, a move widely credited with helping to put the nation's banks back on solid footing after years of losses.
“Every gun that is made, every warship launched, every rocket fired signifies, in the final sense, a theft from those who hunger and are not fed, those who are cold and are not clothed.” — Dwight D. Eisenhower
It was March 1953, and Josef Stalin had just died. Seeing an opportunity to end Cold War hostilities, President Eisenhower wanted to give a speech that would do more than just indict the Soviet regime; he wanted to call an end to the growing arms race. Eisenhower, formerly a five-star general and Supreme Commander of NATO, sincerely believed that war-fighting nations were bound to be derelict in their duties at home and needed to shift priorities. He presented the speech, called "The Chance for Peace," to the American Society of Newspaper Editors. He considered these words so important that they'd later be engraved over his tomb.
“Ask not what your country can do for you, but what you can do for your country.” — John F. Kennedy
This brilliant chiasmus appeared during Kennedy’s inaugural address on a cold January day in 1961. At the time, the Cold War was still roaring and many Americans were worried. Kennedy decided to transform his speech into a challenge, a plea for Americans to honor a duty to help the greater good. Kennedy’s speechwriting team employed material from an abundance of sources, with his most famous line echoing the words of President Warren G. Harding, who once said, “We must have a citizenship less concerned about what the government can do for it and more anxious about what it can do for the nation.” (But Harding didn’t originate it, either. Variations existed as early as 1884.)
“We did get something — a gift — a gift after the election. A man down in Texas heard Pat on the radio mention the fact that our two youngsters would like to have a dog.” — Richard Nixon
These might not be the most famous words uttered by President Richard Nixon (that title probably belongs to "I am not a crook"), but they're arguably the most consequential. In 1952, Nixon was running for Vice President when allegations appeared that he had improperly taken $18,000 from a secret campaign fund. To counter the charges, Nixon took his case directly to the nation in one of the earliest nationally televised political addresses, openly discussing his family's financial history before an audience of 60 million people. But what resonated most with viewers was Nixon's story about a supporter who mailed his family a dog named Checkers. The so-called "Checkers Speech" would not only save Nixon's career, it also demonstrated to politicians how, in the words of Nixon's speechwriter, "television was a way to do an end-run around the press." Politics hasn't been the same since.
In the Northern Hemisphere, July’s arrival signals the full swing of summer. With school out and vacation on the mind, the days of our seventh month become jam-packed with barbecues, adventure, and holiday celebrations — all enjoyed while enduring some of the hottest weather of the year. But there’s more to the month than just its scorching temps, so read on for six interesting facts about the dog days of July.
The Gregorian calendar divides our year into 12 months, with July sitting in the seventh spot. But it wasn’t always this way; at one time, July had an entirely different name. Under the Roman calendar, July was called Quintilis, from the Latin word for “fifth,” marking its place as the fifth month of the 10-month year. Roman statesman Julius Caesar influenced the name change: Quintilis was renamed in Caesar’s honor following his assassination in 44 BCE, as a tribute to his birth month. By then, July had slid out of fifth place and into its current seventh spot.
If you’ve RSVP’d to a seemingly endless stream of birthday parties in July, it’s no surprise: July is one of the most popular birth months of the year. While August reigns as the most common birth month, July comes in second, with the 12-week popularity wave ending in September (the third-most popular month).
When it comes down to the specific date, July 7 is circled as the sixth-most popular birthday, with an average of 12,108 babies born on that day each year. However, not every day in the summer month is popular for new arrivals; July 4 is the fifth-least common birthday among Americans, with an average of just 8,796 babies born. The summer holiday is beaten out only by a handful of even less popular days, all of which fall in winter, including Christmas, New Year’s Day, and New Year’s Eve.
In the U.S., July’s grandest holiday is Independence Day, marked with a day of fireworks, fanfare, and food. But the culinary celebrations don’t have to end after the Fourth of July is through. The summer month hosts a handful of unofficial food-related holidays that are perfectly timed to summer cravings. Dessert lovers can celebrate the season with National Apple Turnover Day on July 5, along with National Sugar Cookie Day on July 9 and National Hot Fudge Sundae Day on July 25. National Piña Colada Day arrives on July 10, followed by Mojito Day on July 11, and both chicken wings and lasagna are honored on July 29. However, one star of the seasonal backyard barbecue gets more than just a day; sausages are honored for a full four weeks thanks to July’s designation as National Hot Dog Month. But if hot dogs aren’t your thing, we have good news: It’s also National Ice Cream Month.
The First Bikini Debuted in July
Pools, beaches, and aquatic parks are practically midway through their operating season come the dog days of July, the same month that commemorates a popular piece of attire often worn in water: the bikini. French designer Louis Réard unveiled his tiny two-piece swimsuit on July 5, 1946, at a Paris swimming pool. Réard’s goal was to create the smallest two-piece swimsuit possible, and it’s likely he was inspired by postwar fabric shortages; his original design used just 30 square inches of fabric.
The first bikini was scandalous, with Réard initially unable to find a model willing to debut his creation in public. But the small two-piece suits soon became popular in Europe, commonly seen on beaches throughout the 1950s. Within a decade, the bikini trend gained momentum and jumped across the pond to American swimmers. As for the unusual name, Réard named his swimsuit for Bikini Atoll, a coral island in the Marshall Islands used by the U.S. as a nuclear test site — a moniker meant to suggest how monumental his clothing invention would be.
Most people look to the summer night sky in anticipation of fireworks or an astronomical spectacle (like a glimpse of the planet Venus, which appears to glow its brightest in early July). However, July also offers the best odds of catching sight of an unidentified flying object.
The phenomenon dates as far back as July 1947, when New Mexico rancher W.W. Brazel sparked generations of speculation with his report of a downed spacecraft, an event we now call the Roswell Incident. Despite a flurry of conspiracy theories, Brazel’s account of finding debris from a fallen UFO was explained away by U.S. military officials as a crashed weather balloon. But in the decades that followed, reports of UFO sightings only grew; according to the National UFO Reporting Center, which has collected data on UFO sightings since 1974, more reports are made in July than in any other month. While it’s unclear just why summer lends itself to more UFO sightings, one theory points to the best parts of the season: Spending more time outside gives us more chances to see something unusual, while spooky summer blockbusters prime our brains to see the supernatural.
As Earth travels its constant path around the sun, there comes a time when our planet is at its farthest point from our home star — which happens in July. On an average day, Earth sits a snug 93 million miles from the sun, but because the planet's orbit is an ellipse — in which the sun isn’t perfectly centered — our distance from the star waxes and wanes throughout the year. In early July, Earth experiences its aphelion, aka a planet’s farthest distance from the sun, winding up a mind-bending 94.5 million miles away. (Come January, Earth will reach its perihelion, aka the closest position to the sun, measuring 91.4 million miles away.) Aphelion is predictable, normally occurring around two weeks after the summer solstice. In 2023, Earth’s aphelion occurs on July 6 at 4:07 p.m. EDT.
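For the arithmetically curious, the aphelion and perihelion figures above are enough to recover both Earth’s roughly 93-million-mile average distance and the small eccentricity of its orbit. Here’s a minimal Python sketch (ours, not part of the original article) that runs the numbers using the standard ellipse relationship:

```python
# A quick check of the distances quoted above (all values in miles).
aphelion = 94.5e6    # farthest distance from the sun, reached in early July
perihelion = 91.4e6  # closest distance, reached in January

# For an ellipse, eccentricity = (farthest - closest) / (farthest + closest).
eccentricity = (aphelion - perihelion) / (aphelion + perihelion)
mean_distance = (aphelion + perihelion) / 2

print(f"eccentricity ~ {eccentricity:.4f}")                        # ~0.0167, a nearly circular orbit
print(f"mean distance ~ {mean_distance / 1e6:.1f} million miles")  # about 93 million miles
```

The result, roughly 0.017, shows just how close to circular Earth’s orbit is: the July-to-January swing amounts to only about 3% of the average distance.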
Nicole Garner Meeker
Writer
Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.
When you sing along to an old standard, do you ever really think about the story behind the music? Songs like “Happy Birthday to You,” “Twinkle Twinkle Little Star,” and (at least in the U.S.) “The Star-Spangled Banner” feel like they’ve been with us forever, but all of them started somewhere. Here are a few fun facts about these beloved songs, from the popular (but false) story about “Twinkle Twinkle Little Star” to what “Hokey Pokey” and “Old MacDonald” sound like in other parts of the world.
“Happy Birthday to You” Was Originally “Good Morning to All”
One of the best-known songs ever written in English, “Happy Birthday to You” famously went through a huge copyright battle. Sisters Patty and Mildred Hill initially published the song in 1893, and it remained under copyright (most recently held by Warner Music Group) until a judge deemed the copyright invalid in 2015. But the original song written by the Hill sisters wasn’t “Happy Birthday to You” — the melody was for a greeting song called “Good Morning to All.” “Happy Birthday to You” was simply a variation that popped up in the early 20th century, although it eventually became the main lyric associated with the tune.
Amateur poet Francis Scott Key wrote the U.S. national anthem, “The Star-Spangled Banner,” after witnessing the British bombardment of Fort McHenry in Maryland during the War of 1812. In practice, the anthem, originally called “Defence of Fort M’Henry,” usually starts with “O say can you see, by the dawn’s early light,” and ends with “O say does that star-spangled banner yet wave/o’er the land of the free and the home of the brave.” However, this passage is just one of four total verses written by Key. Each one ends with a refrain similar to the last two lines.
“Twinkle Twinkle Little Star” — also the tune to “Baa Baa Black Sheep” and the alphabet song — is popularly attributed to 18th-century composer and child prodigy Wolfgang Amadeus Mozart, but that’s not accurate. Mozart did, however, write some variations on the tune, possibly as an exercise for his music students. The ditty wasn’t known as “Twinkle Twinkle Little Star” at the time, but as a French folk song about candy called “Ah, vous dirai-je, Maman” (“Ah, shall I tell you, Mother”). The words to “Twinkle Twinkle Little Star” were written by poet Jane Taylor and published in 1806.
It’s “Hokey Pokey” in the U.S. and “Hokey Cokey” in the U.K.
The origins of this popular dance are murky and difficult to untangle, so it’s hard to say with certainty how it ended up this way. But the fact remains: When you put your right hand in, take your right hand out, put your right hand in, and shake it all about, you’re doing the “Hokey Pokey” in the United States and the “Hokey Cokey” in the United Kingdom.
Despite not being written for New Year’s Day, the tune “Auld Lang Syne” has become the traditional ballad of the holiday in many English-speaking countries, including the United States, where it’s sung by thousands who aren’t exactly sure what it means. It’s written in Scots, which sounds similar to English in some ways but is a distinct language. Scots is descended from Northern English, which replaced Scottish Gaelic in some portions of Scotland between the 11th and 14th centuries. The literal translation of auld lang syne is “old long since,” but it effectively means “for old times’ sake.”
“Alouette” Is About Plucking a Bird’s Feathers … and Eyes
If you don’t know the real French lyrics to “Alouette,” it sounds like a sweet French nursery rhyme with a bouncy beat. If you’re a francophone, however, you know that it gets a little dark. The chorus translates to “lark, nice lark, I’ll pluck you,” and the verses alternate different body parts — so it’s useful for teaching children about them. Examples include “I’ll pluck your beak,” “I’ll pluck your head,” and, in some versions, “I’ll pluck your eyes.”
Old MacDonald Goes by Different Names Around the World
Like many traditional songs, “Old MacDonald Had a Farm” lived a few different lives before it became standardized. One version from 1917 is about Old Macdougal with a farm in “Ohio-i-o.” Another version from the Ozarks is about Old Missouri with a mule, he-hi-he-hi-ho. The English-speaking world has pretty much settled on Old MacDonald, but in other countries, the name gets localized: In Sweden, the farmer is named Per Olsson; in Germany, he’s Onkel (Uncle) Jörg.
Sarah Anne Lloyd
Writer
Sarah Anne Lloyd is a freelance writer whose work covers a bit of everything, including politics, design, the environment, and yoga. Her work has appeared in Curbed, the Seattle Times, the Stranger, the Verge, and others.
While the things we see and use daily may sometimes be considered mundane, there’s more to them than you might imagine. Did you know that the stick (like the ones you may have just raked up in your yard) was inducted into the National Toy Hall of Fame? When you ate a bagel for breakfast, did you think back to its days as a gift for new mothers? Learn about these stories and more with some mind-expanding facts about everyday items collected from around our website.
Love Seats Were Originally Designed To Fit Women’s Dresses, Not Couples
The two-seater upholstered benches we associate with cozy couples were initially crafted with another duo in mind: a woman and her dress. Fashionable attire in 18th-century Europe had reached voluminous proportions — panniers (a type of hooped undergarment) were all the rage, creating a wide-hipped silhouette that occasionally required wearers to pass through doors sideways. Upper-class women with funds to spare adopted billowing skirts that often caused an exhausting situation: the inability to sit down comfortably (or at all). Furniture makers of the period caught on to the need for upsized seats that would allow women with large gowns a moment of respite during social calls.
As the 1800s rolled around, so did new dress trends. Women began shedding heavy layers of hoops and skirts for a slimmed-down silhouette that suddenly made small settees spacious. The midsize seats could now fit a conversation companion. When sweethearts began sitting side by side, the bench seats were renamed “love seats,” indicative of how courting couples could sit together for a (relatively) private conversation in public. The seat’s new use rocketed it to popularity, with some featuring frames that physically divided young paramours. While the small sofas no longer act as upholstered chaperones, love seats are still popular — but mostly because they fit well in small homes and apartments.
Bagels Were Once Given as Gifts to Women After Childbirth
After a woman has had a bun in the oven for nine months, presenting her with a bagel might seem like a strange choice. But some of the earliest writings on bagels relate to the idea of giving them as gifts to women after labor. Many historians believe that bagels were invented in the Jewish community of Krakow, Poland, during the early 17th century. Their circular shape echoes the round challah bread eaten on the Jewish new year, Rosh Hashanah. Enjoying round challah is meant to bring good luck, expressing the hope that endless blessings — goodness without end — will arrive in the coming year. Likewise, in Krakow centuries ago, a bagel signified the circle of life and longevity for the child.
Community records in Krakow advised that bagels could be bestowed on both expectant and new moms. They were also regarded as a thoughtful gift for midwives. In addition to the symbolism of the round shape, the bread was believed to bring a pregnant woman or midwife good fortune in a delivery by casting aside evil spirits. Some pregnant women even wore bagels on necklaces as protection, or ensured bagels were present in the room where they gave birth.
The Tiny Pocket in Your Jeans Was Created To Store Pocket Watches
Ever notice the tiny pocket-within-a-pocket in your jeans? As a kid you may have put small change in there, whereas most adults tend to forget it even exists. Despite all the names it’s had throughout time — “frontier pocket,” “coin pocket,” and “ticket pocket” being just a few — it originally had a specific purpose that didn’t pertain to any of those objects: It was a place to put your watch.
Originally called waist overalls when Levi Strauss & Co. first began making them in 1879, the company’s jeans have always had this dedicated spot for pocket watches — especially those worn by miners, carpenters, and the like. They only had three other pockets (one on the back and two on the front) at the time, making the watch pocket especially prominent. As for why it’s stuck around, the answer seems to be a familiar one: People were used to it and no one felt inclined to phase it out.
If you’ve ever gotten bored enough to study the cap of your ballpoint pen, you may have noticed that it has a hole in it. The hole wasn’t created to save on plastic or to regulate air pressure. Rather, the design is meant to prevent people — namely small children — from choking should they ever swallow a cap. This was first done by BIC, whose popular Cristal pen had a cap that proved more desirable among undiscerning children than safety-conscious parents would have liked. So while the conspiracy-minded among us tend to think that the holes are there to dry out the ink and ensure that consumers will have to continue buying pens in mass quantities, this particular design choice was actually made with public health in mind.
The World’s First Vending Machine Dispensed Holy Water
Democracy, theater, olive oil, and other bedrocks of Western civilization all got their start with the Greeks. Even some things that might seem like squarely modern inventions have Hellenistic roots, including the humble vending machine. In the first century CE, Greek engineer and mathematician Heron of Alexandria published a two-volume treatise on mechanics called Pneumatica. Within its pages was an assortment of mechanical devices capable of all types of wonders: a never-ending wine cup, rudimentary automatic doors, singing mechanical birds, various automata, the world’s first steam engine, and a coin-operated vending machine.
Heron’s invention wasn’t made with Funyuns and Coca-Cola in mind, however: It dispensed holy water. In Heron’s time, Alexandria was part of Roman Egypt but remained a center of Greek learning, home to a cornucopia of religions with Roman, Greek, and Egyptian influences. To stand out, many temples hired Heron to supply mechanical miracles meant to encourage faith in believers. Some of these temples also offered holy water, and experts believe Heron’s vending machine was invented to keep worshippers from taking more than their share. The mechanism was simple enough: When a coin was inserted into the machine, it weighed down a balancing arm, which in turn pulled a string that opened a plug on a container of liquid. Once the coin dropped off the arm, the liquid stopped flowing. It would be another 1,800 years before modern vending machines began to take shape — many of them using the same principles as Heron’s miraculous holy water dispenser.
The Ancient Romans Thought Eating Butter Was Barbaric
Our friends in ancient Rome indulged in a lot of activities that we would find unseemly today — like gladiators fighting to the death — but they drew the line at eating butter. To do so was considered barbaric, with Pliny the Elder going so far as to call butter “the choicest food among barbarian tribes.” In addition to a general disdain for drinking too much milk, Romans took issue with butter specifically because they used it for treating burns and thus thought of it as a medicinal salve, not a food.
They weren’t alone in their contempt. The Greeks also considered the dairy product uncivilized, and “butter eater” was among the most cutting insults of the day. In both cases, this can be partly explained by climate — butter didn’t keep as well in warm southern climates as it did in northern Europe, where groups such as the Celts gloried in their butter. Instead, the Greeks and Romans relied on olive oil, which served a similar purpose. To be fair, though, Romans considered anyone who lived beyond the empire’s borders (read: most of the world) to be barbarians, so butter eaters were in good company.
As long as it’s stored properly, honey will never expire. Honey has an endless shelf life, as proven by the archaeologists who unsealed King Tut’s tomb in 1923 and found containers of honey within it. After performing a not-so-scientific taste test, researchers reported the 3,000-year-old honey still tasted sweet.
Honey’s preservative properties have a lot to do with how little water it contains. Some 80% of honey is made up of sugar, with only 18% being water. Having so little moisture makes it difficult for bacteria and microorganisms to survive. Honey is also so thick, little oxygen can penetrate — another barrier to bacteria’s growth. Plus, the substance is extremely acidic, thanks to a special enzyme in bee stomachs called glucose oxidase. When mixed with nectar to make honey, the enzyme produces gluconic acid and hydrogen peroxide, byproducts that lower the sweetener’s pH level and kill off bacteria.
Despite these built-in natural preservatives, it is possible for honey to spoil if it’s improperly stored. In a sealed container, honey is safe from humidity, but when left open it can absorb moisture that makes it possible for bacteria to survive. In most cases, honey can be safely stored for years on end, though the USDA suggests consuming it within 12 months for the best flavor.
The Name for a Single Spaghetti Noodle Is “Spaghetto”
If you go into an Italian restaurant and order spaghetto, chances are you’ll leave hungry. That’s because “spaghetto” refers to just a lone pasta strand; it’s the singular form of the plural “spaghetti.” Other beloved Italian foods share this same grammatical distinction — one cannoli is actually a “cannolo,” and it’s a single cheese-filled “raviolo” or “panino” sandwich. Though this may seem strange given that these plural terms are so ingrained in the English lexicon, Italian language rules state that a word ending in -i means it’s plural, whereas an -o or -a suffix (depending on whether it’s a masculine or feminine term) denotes singularity. (Similarly, “paparazzo” is the singular form of the plural “paparazzi.”) As for the term for the beloved pasta dish itself, “spaghetti” was inspired by the Italian word spago, which means “twine” or “string.”
Though usually used interchangeably, these are technically two different pieces of furniture — and the distinction lies in the words themselves. “Couch” comes to us from French, namely coucher — “to lie down” — whereas we have the Arabic word suffah to thank for “sofa.” In the most traditional sense, a sofa would be a wooden bench that comes complete with blankets and cushions and is intended for sitting. eBay’s selling guide used to distinguish between the two by defining a couch as “a piece of furniture with no arms used for lying.” Though it may be a distinction without a difference these days, purists tend to think of sofas as a bit more formal and couches as something you’d take a nap on and let your pets hang out on.
U.S. Pools Were Originally Designed to Keep the Masses Clean
Boston’s Cabot Street Bath was the nation’s first indoor municipal pool. Founded in 1868, the pool was on the bleeding edge of what would become a boom in baths designed to help the working classes clean up. The short-lived facility (it was open for only eight years) was soon followed by municipal baths and pools all over the nation, especially in cities with growing immigrant populations whose tenement apartments didn’t contain adequate bathing facilities.
In New York, starting in 1870, river water filled floating, pool-like public baths that, according to one onlooker, were as filthy as “floating sewers.” Eventually, by about the mid-20th century, the city’s river baths morphed into the indoor pools we know today — though the city does still have some seasonal outdoor pools.
On February 6, 1971, Alan Shepard took one small shot for golf and one giant swing for golfkind. An astronaut on the Apollo 14 landing, Shepard was also a golf enthusiast who decided to bring his hobby all the way to the moon — along with a makeshift club fashioned partly from a sample-collection device. He took two shots, claiming that the second went “miles and miles.” The United States Golf Association (USGA) later put the actual distance of his two strokes at about 24 yards and 40 yards, respectively.
While not enough to land him a spot on the PGA Tour, those numbers are fairly impressive when you remember that the stiff spacesuit Shepard was wearing (in low gravity, no less) forced him to swing with one arm. And while those two golf balls remain on the moon, Shepard brought his club back, later donating it to the USGA Museum in Liberty Corner, New Jersey. Other objects now residing on the moon include photographs, a small gold olive branch, and a plaque that reads: “Here men from the planet Earth first set foot upon the Moon July 1969, A.D. We came in peace for all mankind.”
The Inventor of the Stop Sign Never Learned How To Drive
Few people have had a larger or more positive impact on the way we drive than William Phelps Eno, sometimes called the “father of traffic safety.” The New York City-born Eno — who invented the stop sign around the dawn of the 20th century — once traced the inspiration for his career to a horse-drawn-carriage traffic jam he experienced as a child in Manhattan in 1867: “There were only about a dozen horses and carriages involved, and all that was needed was a little order to keep the traffic moving,” he later wrote. “Yet nobody knew exactly what to do; neither the drivers nor the police knew anything about the control of traffic.”
After his father’s death in 1898 left him with a multimillion-dollar inheritance, Eno devoted himself to creating a field that didn’t otherwise exist: traffic management. He developed the first traffic plans for New York, Paris, and London. In 1921, he founded the Washington, D.C.-based Eno Center for Transportation, a research foundation on multimodal transportation issues that still exists. One thing Eno didn’t do, however, is learn how to drive. Perhaps because he had such extensive knowledge of them, Eno distrusted automobiles and preferred riding horses. He died in Connecticut at the age of 86 in 1945, having never driven a car.
The Stick Has Been Inducted Into the National Toy Hall of Fame
From teddy bears to train sets, classic playthings of youth often conjure memories of a gleaming toy store, holidays, or birthdays. So curators at the Strong National Museum of Play branched out when they added the stick to their collection of all-time beloved toys. Among the most versatile amusements, sticks have inspired central equipment in several sports, including baseball, hockey, lacrosse, fencing, cricket, fishing, and pool. Humble twigs are also ready-made for fetch, slingshots, toasting marshmallows, and boundless make-believe.
Located in Rochester, New York — about 70 miles northeast of Fisher-Price’s headquarters — the Strong acquired the fledgling National Toy Hall of Fame in 2002. (It was previously located in the Gilbert House Children’s Museum in Salem, Oregon.) To date, 74 toys have been inducted, including Crayola Crayons, Duncan Yo-Yos, and bicycles. The stick was added in 2008, three years after another quintessential source of cheap childhood delight: the cardboard box.
Eggo Waffles Were Originally Called Froffles
The brothers behind your favorite frozen waffles took a while to iron out the details of their signature product. Working in their parents’ basement in San Jose, California, in the early 1930s, Frank, Anthony, and Sam Dorsa first whipped up their own brand of mayonnaise. Since the base ingredient of mayonnaise is egg yolks — and the brothers took pride in using “100% fresh ranch eggs” — they christened their fledgling company “Eggo.” Despite launching the business during the Great Depression, Eggo mayonnaise sold like hotcakes, motivating the Dorsas to extend their product line. Soon, they were selling waffle batter — another egg-based product. To simplify shipping, they also whipped up a powdered mix that required only the addition of milk.
When the frozen food industry took off in the 1950s, the brothers wanted to take advantage of the rush to the freezer aisle. Frank Dorsa (a trained machinist) repurposed a carousel engine into a rotating device that could anchor a series of waffle irons, each cooking a breakfast treat that was flipped by a factory employee. The machine allowed Eggo to prepare thousands of freezer-bound waffles per hour. These debuted in grocery stores in 1953 under the name “Froffles,” a portmanteau of “frozen” and “waffles.” Customers referred to them simply as “Eggos,” and the Froffles moniker was dropped within two years. Now a Kellogg’s-owned brand, Eggo serves up waffles as well as other frozen breakfast treats, with mayonnaise — and the name Froffles — but a distant memory.
On January 5, 1858, Ezra J. Warner of Connecticut invented the can opener. The device was a long time coming: Frenchman Nicolas Appert had developed the canning process in the early 1800s in response to a 12,000-franc prize the French government offered to anyone who could come up with a practical method of preserving food for Napoleon’s army. Appert devised a process for sterilizing food by half-cooking it, storing it in glass bottles, and immersing the bottles in boiling water, and he claimed the award in 1810. Later the same year, Englishman Peter Durand received the first patent for preserving food in actual tin cans — which is to say, canned food predates the can opener by nearly half a century.
Though he didn’t initially know why his method of storing food in glass jars and heating them worked, years of experimentation led Appert to rightly conclude that “the absolute deprivation from contact with the exterior air” and “application of the heat in the water-bath” were key. He later switched to working with cans himself. Before Warner’s invention, cans were opened with a hammer and chisel — a far more time-consuming approach than the gadgets we’re used to. Warner’s tool (employed by soldiers during the Civil War) wasn’t a perfect replacement, however: It used a series of blades to puncture and then saw off the top of a can, leaving a dangerously jagged edge. As for the hand-crank can opener most commonly used today, that wasn’t invented until 1925.
Before Erasers, People Used Bread To Rub Out Pencil Marks
The very first pencils arrived around the dawn of the 17th century, after graphite (the real name for the mineral that forms a pencil’s “lead”) was discovered in England’s Lake District. But the eraser didn’t show up until the 1770s, at the tail end of the Enlightenment. So what filled the roughly 170-year-long gap? Look no further than the bread on your table. Back in the day, artists, scientists, government officials, and anyone else prone to making mistakes would wad up a small piece of bread and moisten it ever so slightly. The resulting ball of dough erased pencil marks on paper almost as well as those pink creations found on the end of No. 2 pencils today.
But in 1770, English chemist Joseph Priestley (best known for discovering oxygen) wrote about “a substance excellently adapted to the purpose of wiping from paper the marks of a black lead pencil.” This substance, then known as caoutchouc, was so perfect for “rubbing” out pencil marks that it soon became known simply as “rubber.” Even today, people in the U.K. still refer to erasers as “rubbers.” (The name “lead-eater” never quite caught on.)
On January 9, 2007, Apple CEO Steve Jobs revealed the iPhone to the world. Since then, Apple’s pricey slab of glass stuffed with technology has become more or less synonymous with the word “smartphone” (sorry, Android fans). But smartphones predate the iPhone by more than a decade. To pinpoint the smartphone’s true birthdate, look back to November 23, 1992, and the introduction of IBM’s Simon at a trade show in Las Vegas. Today, IBM is best known for supercomputers, IT solutions, and enterprise software, but in the ’80s and early ’90s the company was a leader in consumer electronics — a position it hoped to solidify with Simon.
Simon was a smartphone in every sense of the word. It was completely wireless and had a digital assistant, touchscreen, built-in programs (calculator, to-do list, calendar, sketch pad, and more), and third-party apps, something even the original iPhone didn’t have. The idea was so ahead of its time, there wasn’t even a word for it yet — “smartphone” wasn’t coined for another three years. Instead, its full name when it debuted to the larger public in 1993 was the Simon Personal Communicator, or IBM Simon for short. But there’s a reason there isn’t a Simon in everyone’s pocket today. For one thing, the phone had only one hour of battery life. Once it died, it was just a $900 brick (technology had a long way to go before smartphones became pocket-sized; Simon was 8 inches long by 2.5 inches wide). Cell networks were still in their infancy, so reception was spotty at best, which is why the Simon came with a port for plugging into standard phone jacks. In the mid-aughts, increases in carrier capacity and the shrinking of electronic components created the perfect conditions for the smartphones of today. Unfortunately for Simon, it was too late.
Governments worldwide have levied taxes for thousands of years; the oldest recorded tax comes from Egypt around 3000 BCE. But England — which relied heavily on taxes to fund its military conquests — is known for a slate of fees that modern taxpayers might consider unusual. Take, for instance, the so-called “window tax,” initially levied in 1696 by King William III, which annually charged citizens a certain amount based on the number of windows in their homes. Some 30 years before, the British crown had attempted to tax personal property based on chimneys, but clever homeowners could avoid the bill by temporarily bricking up or dismantling their hearths and chimneys before inspections. With windows, assessors could quickly determine a building’s value from the street. The tax was progressive, charging nothing for homes with few or no windows and increasing the bill for dwellings that had more than 10 (that number would eventually shrink to seven).
Not surprisingly, homeowners and landlords throughout the U.K. resented the tax. It didn’t take long for windows to be entirely bricked or painted over (much like fireplaces had been), and new homes were built with fewer windows altogether. Opponents called it a tax on “light and air” that hurt public health, citing reduced ventilation that in turn encouraged disease. Even famed author Charles Dickens joined the fight to dismantle the tax, publishing scathing pieces aimed at Parliament on behalf of poor citizens who were most impacted by the lack of fresh air. Britain repealed its window tax in July 1851, but the architectural impact is still evident — many older homes and buildings throughout the U.K. still bear their telltale bricked-up windows.
Philadelphia Cream Cheese Isn’t Actually From Philadelphia
The City of Brotherly Love has clear-cut claims on many food origins — cheesesteaks, stromboli, and even root beer. But one thing’s for sure: Despite the name, Philadelphia Cream Cheese is definitely not from Philly. The iconic dairy brand secured its misleading name (and gold-standard status) thanks to a marketing ploy that’s been working for more than 150 years … and it’s all because of Pennsylvania’s reputation for impeccable dairy. Small Pennsylvania dairies of the 18th and early 19th centuries were known for using full-fat milk and cream to make rich cheeses — in contrast to New York dairies, which mostly used skim milk — and because the perishables couldn’t be easily transported, they gained a reputation as expensive luxury foods.
So when upstate New York entrepreneur William Lawrence began making his skim milk and (for richness) lard-based cream cheese in the 1870s, he needed a name that would entice customers and convey quality, despite the product being made in Chester, New York, rather than Philadelphia. Working with cheese broker and marketing mastermind Alvah Reynolds, Lawrence branded his cheese under the Philadelphia name in 1880, which boosted sales and cemented its popularity with home cooks well into the early 1900s.
The Color of Your Bread Tag Has an Important Meaning
Ever wonder why the tags used to seal loaves of bread come in different colors? Far from arbitrary, the color-coded system indicates which day of the week the bread was baked. The color system is even alphabetical: Monday is blue, Tuesday is green, Thursday is red, Friday is white, and Saturday is yellow. (Traditionally, bread wasn’t delivered on Wednesday or Sunday.)
Because bread rarely remains on the shelf for more than a few days, this system is more for internal use among employees than it is for customers looking to get the freshest sourdough possible. But if you favor a local bakery and get to know their system, you could snag either the best deals or the fluffiest dinner rolls in town.
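For the curious, the color-to-day convention described above boils down to a simple lookup table. Here’s a minimal Python sketch (ours, and purely illustrative, since the exact scheme can vary from bakery to bakery):

```python
# Traditional bread-tag colors and the baking day each one signals.
# Note: individual bakeries may use their own schemes; this mirrors the
# common convention described above (no Wednesday or Sunday deliveries).
TAG_COLOR_TO_DAY = {
    "blue": "Monday",
    "green": "Tuesday",
    "red": "Thursday",
    "white": "Friday",
    "yellow": "Saturday",
}

def baking_day(tag_color: str) -> str:
    """Return the traditional baking day for a bread-tag color."""
    return TAG_COLOR_TO_DAY.get(tag_color.lower(), "unknown color")

print(baking_day("red"))    # Thursday
print(baking_day("White"))  # Friday
```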
The Snickers Candy Bar Was Named After One of the Mars Family’s Favorite Horses
While names like Hershey’s and 3 Musketeers (which originally included three bars) are fairly straightforward, some candy bar monikers are more elusive. Case in point: What, exactly, is a Snickers? Actually it’s a “who” — and not a human “who” at that. The candy bar was named after one of the Mars family’s favorite horses. Franklin Mars founded Mars, Incorporated (originally known as Mar-O-Bar Co.) in 1911, introducing Snickers in 1930; when it came time to name his product, he immortalized his equine friend as only a candy magnate could.
As Mars has grown into America’s fourth-largest private company, it has retained a dual focus on both candy and pets. M&M’s, Twix, and Milky Way are all Mars products, as are Iams, Pedigree, and Royal Canin. If you’ve ever wondered how M&M’s got their name, the story is slightly less interesting — it’s simply the last initials of Forrest Mars (Frank’s son) and partner-in-candy Bruce Murrie. The company is known for secrecy, with the family itself having been described as a “reclusive dynasty,” which means it’s a minor miracle that the identity of Snickers the horse was ever revealed in the first place.
The First Product Scanned With a Barcode Was Juicy Fruit Gum
When Marsh Supermarket cashier Sharon Buchanan rang up a 10-pack of Juicy Fruit on June 26, 1974, and heard a telltale beep, her face must have registered relief. Buchanan’s co-workers at the grocery store in Troy, Ohio, had placed barcodes on hundreds of items the night before, as the National Cash Register Company installed the shop’s new computers and scanners. Buchanan’s “customer” for that first purchase was Clyde Dawson, the head of research and development at Marsh Supermarkets, Inc. For that fateful checkout, Dawson chose the gum, made by the Wrigley Company, because some had wondered if the machine would have trouble reading the item’s very small barcode. It didn’t. Today, one of Marsh’s earliest scanners is part of the collection at the Smithsonian’s National Museum of American History.
The Microwave Was Invented by Accident, Thanks to a Melted Chocolate Bar
The development of radar helped the Allies win World War II — and oddly enough, the technological advances of the war would eventually change kitchens forever. In 1945, American inventor Percy Spencer was fooling around with a British cavity magnetron, a device built to make radar equipment more accurate and powerful. To his surprise, microwaves produced from the radar melted a chocolate bar (or by some accounts, a peanut cluster bar) in his pocket. Spencer quickly realized that magnetrons might be able to do something else: cook food.
With the help of a bag of popcorn and, some say, a raw egg, Spencer proved that magnetrons could heat and even cook food. First marketed as the Radarange, the microwave oven launched for home use in the 1960s. Today, microwave ovens are as ubiquitous as the kitchen sink — all thanks to the Allied push to win the war.
Libraries Predate Books
While books are a fixture of today’s libraries, humans long constructed great centers of learning without them. That includes one of the oldest known significant libraries in history: the Library of Ashurbanipal. This library, established in modern-day Mosul, Iraq, by the Assyrian King Ashurbanipal in the seventh century BCE, contained nothing we would recognize today as a book. Instead, it was a repository of 30,000 clay tablets and writing boards covered in cuneiform — the oldest writing system in the world. Much like your local public library, this royal collection covered a variety of subjects, including legislation, financial statements, divination, hymns, medicine, literature, and astronomy.
Umbrellas Were Once Used Only by Women
Umbrellas have been around for a long time — at least 3,000 years, according to T.S. Crawford’s A History of the Umbrella — but they were used by only select segments of the population for much of that history. Ancient Egyptians used them to shade their pharaohs, setting the tone for an association with royalty and nobility that would also surface in China, Assyria, India, and other older civilizations. Meanwhile, they were deemed effeminate by ancient Greeks and the Romans who assumed many of their cultural habits. It should be noted that these early umbrellas protected against the sun, not rain, and were generally used by women to shield their complexions. The association between women and umbrellas persisted through much of Europe for centuries, and stubbornly remained into the 18th century, even after the first waterproof umbrellas had been created (around the 17th century in France).
In England, at least, the man credited with ushering in a new age of gender-neutral weather protection was merchant and philanthropist Jonas Hanway. Having spotted the umbrella put to good use during his many travels, Hanway took to carrying one through rainy London in the 1750s, a sight met with open jeering by surprised onlookers. The greatest abuse apparently came from coach drivers, who counted on inclement weather to drive up demand for a dry, comfy ride. But Hanway took the derision in stride. Shortly after his death in 1786, an umbrella advertisement surfaced in the London Gazette, a harbinger of sunnier days to come for the accessory’s reputation as a rain repellant for all.
While we’ve come a long way from being solely reliant on the sun’s rays to chart the day, the core principles of determining time remain largely the same. Nowadays, some people wear a trusty wristwatch, whereas others glance at their phone for a quick update. No matter your preferred method of tracking the hours, here are six timely facts about clocks and other timekeeping devices.
The Oldest Working Mechanical Clock Is Located at an English Cathedral
England’s Salisbury Cathedral dates back to the 13th century, and is home to one of four surviving original copies of the 1215 Magna Carta. The cathedral is also the site of the world’s oldest working mechanical clock, a machine dating back to 1386, if not earlier.
Composed of hand-wrought iron and intertwined with long ropes that extend halfway up the cathedral walls, the Salisbury Cathedral clock is the brainchild of three clockmakers: Johannes and Williemus Vriemand, as well as Johannes Jietuijt of Delft. The clock operates thanks to a system of falling weights, which are wound back up once each day, and the device is designed solely to strike each passing hour. It once sat in a detached bell tower before falling into disuse around 1884. Thankfully, the mechanism was rediscovered in 1929 and restored in 1956; prior to that restoration, the clock had chimed on more than 500 million occasions over nearly 500 years of service. It continues to operate today.
Pennies Are Used To Maintain the Accuracy of Big Ben’s Clock Tower
London’s Elizabeth Tower, at the north end of the Palace of Westminster, boasts one of the most recognizable clock faces in the world. Inside the tower’s belfry hangs “Big Ben” — though many use the name to refer to the tower as a whole, it actually refers to the clock’s largest and most prominent bell. Name-related confusion aside, the clock is notable for another reason, too: Its accuracy is regulated using old pennies and, on occasion, other coins.
Due to external atmospheric conditions such as air pressure and wind, the exact time depicted on the face of Elizabeth Tower can fall ever so slightly out of sync with reality. In order to right these wrongs, old pennies — coins that predate Britain’s switch to decimal coinage in 1971 — are added to the clock’s pendulum, which in turn alters the daily clock speed by 0.4 seconds per penny. The process is a long-standing one, having been used to regulate the time as far back as 1859. In 2012, three of the 10 coins relied upon for this purpose were, for a brief time, swapped out for a five-pound crown commemorating that year’s London Olympics.
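To see how small that correction is, here’s a minimal Python sketch (ours, not an official procedure; the drift figures in the example are hypothetical) built around the 0.4-seconds-per-day-per-penny figure quoted above:

```python
# Each old penny added to the pendulum speeds the clock up by roughly
# 0.4 seconds per day (the figure quoted above); removing one does the opposite.
SECONDS_PER_PENNY_PER_DAY = 0.4

def pennies_to_add(daily_drift_seconds: float) -> int:
    """How many pennies to add (positive) or remove (negative) to cancel a drift.

    daily_drift_seconds: seconds per day the clock is running slow
    (use a negative number if it is running fast). Hypothetical inputs only.
    """
    return round(daily_drift_seconds / SECONDS_PER_PENNY_PER_DAY)

print(pennies_to_add(1.2))   # 3  -> a clock losing 1.2 s/day needs about three pennies
print(pennies_to_add(-0.8))  # -2 -> a clock gaining 0.8 s/day sheds two pennies
```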
19th-Century Maritime Signals Inspired Times Square’s New Year’s Ball
The New Year’s Ball drop in Times Square, New York, is a beloved annual tradition, though its origins had nothing to do with revelry. In fact, the ball drop was inspired by a 19th-century timekeeping mechanism targeted at maritime crews. “Time balls” — which dropped at certain times as a signal to passing ships and navigators to set their on-ship chronometers — first appeared in Portsmouth Harbor in 1829 and later at England’s Royal Observatory at Greenwich in 1833. Indeed, the giant red time ball located in Greenwich continues to operate today.
The balls were the culmination of an idea from Robert Wauchope, a Royal Navy officer who promoted the concept of visual signals located ashore to help passing ships tell time. Wauchope originally suggested the use of flags, though orbs that moved up and down were ultimately settled upon instead. Though these time balls were initially made to help mariners keep track of time, they soon became an attraction among locals, as people would come to watch the ball fall, in a precursor to today’s New Year’s Eve crowds.
Medieval Engineer Ismael al-Jazari Invented an Elephant Clock
Throughout the 12th and early 13th centuries, few inventors pioneered more mechanisms in the world of robotics than Ismael al-Jazari, who lived and worked in what is now Turkey. Al-Jazari was so influential at the time that he’s believed to have even inspired the works of Leonardo da Vinci. Among al-Jazari’s most notable timekeeping inventions was an elephant clock, colorful illustrations of which appeared in his 1206 manuscript, The Book of Knowledge of Ingenious Mechanical Devices.
The clock was an intricate device constructed atop the back of a copper elephant, containing several moving parts as well as a figure of a scribe to denote the passing of time. The entire clock relied upon a water-powered timer, made up of a bowl that slowly descended into a hidden tank of water. As that bowl sank, the scribe noted the number of minutes. Every half hour, a ball would be triggered to fall and collide with a fan, which rotated the device’s dial to show how many minutes had passed since sunrise. That same ball ultimately dropped into a vase that in turn triggered a cymbal to begin the cycle anew. The whole mechanism incorporated not only this Indian-inspired timing technology, but also Greek hydraulics and design elements from Egyptian, Chinese, and Arabian cultures.
The World’s Most Accurate Clock Is Located in Boulder, Colorado
Located in the basement of a laboratory at the University of Colorado, Boulder, is a clock considered to be the world’s most accurate. Invented by scientist Jun Ye, the clock is so precise that it would take 15 billion years for it to lose a single second of time. That absurd number dwarfs the traditional 100 million years that it takes many modern atomic clocks to lose a second.
The first atomic clock was constructed in 1949 by the National Bureau of Standards, and helped scientists accurately redefine the second by the year 1967. Prior to that point, a second had been defined as 1/86,400 (roughly 0.000011574) of a mean solar day, which proved to be inaccurate due to the irregular rotation of the Earth. Ye’s new clock optimizes the techniques of those early atomic clocks, using strontium atoms arranged in a 3D lattice that tick 1 million billion times per second. While that science-heavy explanation may not be entirely clear to the average person, Ye’s atomic clock can be summed up like this: It’s really, really accurate.
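Those figures are easy to sanity-check. The short Python sketch below (ours, for illustration only, using a 365.25-day year for simplicity) recovers the old solar-day definition of the second and translates “loses one second every N years” into a fractional error:

```python
# Checking two numbers quoted above.

# The pre-1967 second: one mean solar day split into 24 * 60 * 60 equal parts.
seconds_per_day = 24 * 60 * 60           # 86,400
print(1 / seconds_per_day)               # ~0.000011574, the old definition of a second

# "Loses one second every N years" means a fractional error of 1 / (N years in seconds).
year_in_seconds = 365.25 * seconds_per_day
for label, years in [("typical modern atomic clock", 100e6),
                     ("strontium lattice clock", 15e9)]:
    total_seconds = years * year_in_seconds
    print(f"{label}: about 1 part in {total_seconds:.1e}")
```

Run as written, the comparison shows the Boulder clock is roughly 150 times more stable than a clock that loses a second every 100 million years.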
French Revolutionary Time Instituted a System of 10-Hour Days
While societies around the world may not agree on much, one thing that’s generally accepted is that each day is 24 hours long. Back in 1793, however, during the French Revolution, France broke with convention and adopted a new timekeeping system. This decimal time concept included 10-hour days, 100 minutes every hour, and 100 seconds per minute. In essence, its base-10 method of timekeeping was proposed as a simpler way to note how much time had passed on any given day.
This new timekeeping plan officially started on November 24, 1793, and was immediately met with resistance and confusion by the public. People were unwilling to change their old habits for telling time, despite French clockmakers producing new mechanisms that featured both traditional timekeeping methods and the new decimal-based technique. In the end, decimal clocks lost their official status in 1795, though the concept wasn’t quite dead yet. France tried yet again in 1897, this time proposing a variant that incorporated 24-hour-long days with 100 minutes per hour, but that proposal was scrapped in 1900.
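The conversion behind the system is straightforward: Work out what fraction of the day has elapsed, then express it in a day of 10 × 100 × 100 = 100,000 decimal seconds. Here’s a minimal Python sketch (ours, purely illustrative):

```python
# Convert a standard time of day into French Revolutionary (decimal) time,
# where a day has 10 hours of 100 minutes, each of 100 seconds.
def to_decimal_time(hours: int, minutes: int, seconds: int) -> tuple[int, int, int]:
    fraction_of_day = (hours * 3600 + minutes * 60 + seconds) / 86_400
    decimal_seconds = round(fraction_of_day * 100_000)   # 100,000 decimal seconds per day
    h, remainder = divmod(decimal_seconds, 10_000)
    m, s = divmod(remainder, 100)
    return h, m, s

print(to_decimal_time(12, 0, 0))   # (5, 0, 0)  -> noon is halfway through the day
print(to_decimal_time(18, 0, 0))   # (7, 50, 0) -> 6 p.m. is three-quarters through
```

One consequence, which helps explain the public’s confusion: a decimal second lasts 0.864 standard seconds, and a decimal hour stretches to 2.4 standard hours.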
Bennett Kleinman
Staff Writer
Bennett Kleinman is a New York City-based staff writer for Optimism Media, and previously contributed to television programs such as "Late Show With David Letterman" and "Impractical Jokers." Bennett is also a devoted New York Yankees and New Jersey Devils fan, and thinks plain seltzer is the best drink ever invented.
Thanks to the 1975 blockbuster Jaws, a generation of people have grown up with the mistaken belief that sharks are man-eating monsters, intent on attacking anything that moves. Scientists have worked hard to dispel such myths about the ancient creatures, which roam every ocean and vary widely in size, shape, diet, habitat, and attitude. Here are a few facts about these fascinating fish.
The first shark that really looked shark-like appeared around 380 million years ago in the Devonian period. Just a few million years later, a major extinction wiped out many species that competed with sharks, allowing them to evolve rapidly into numerous new shapes, sizes, and ecological niches — some of which are still around. One of the oldest species living today is the bluntnose sixgill shark, which evolved between 200 million and 175 million years ago in the early Jurassic epoch.
As cartilaginous fishes, sharks don’t leave much behind when they die. Known shark fossils consist mainly of teeth and a handful of scales, vertebrae, and impressions left in rock. Even so, paleontologists have been able to identify about 2,000 species of extinct sharks just by examining differences in fossilized teeth. For example, the oldest shark teeth ever found came from an Early Devonian fish dubbed Doliodus problematicus; bits of its fossilized skeleton showed characteristics similar to bony fishes, while its teeth and jaw were more shark-like, confirming a theory that the species was an ancient ancestor of sharks.
There Are More Than 500 Species of Sharks in the World
Sharks are categorized into nine taxonomic orders. To name a few of the most recognizable types, Carcharhiniformes, the order of ground sharks, encompasses over 290 species, including the bull shark, tiger shark, blue shark, hammerhead, and more. The great white shark, basking shark, and makos, as well as the aptly named goblin shark and other species, belong to Lamniformes — also known as mackerel sharks. The carpet shark order, Orectolobiformes, includes the whale shark, nurse shark, wobbegong, and others. In all, there are more than 500 species of sharks swimming the world’s waters.
There’s a Huge Size Difference Between the Largest and Smallest Sharks
With so many shark species swimming Earth’s oceans, there’s incredible variation in their sizes. The largest shark species living today is the whale shark (Rhincodon typus), a gentle, plankton-eating giant that can grow to 45 feet long or more and weigh 20 tons (the biggest accurately measured whale shark reached 61.7 feet!). They can be found in all of the world’s tropical seas. The smallest known shark species, meanwhile, was discovered in 1985 off the coast of Colombia in the Caribbean Sea: The dwarf lantern shark (Etmopterus perryi) averages a length of just under 7 inches. It dwells in the ocean’s twilight zone, about 1,000 feet below the surface, but sometimes feeds in the shallows and uses bioluminescent organs along its belly to camouflage itself against sunlit waters.
Like all fishes, sharks have a sensory organ called the lateral line running down the length of their bodies. The lateral line system involves exterior pores and specialized cells that can detect vibrations in water, which helps sharks locate prey from hundreds of feet away. In addition to sensing water movements, sharks can perceive electric fields surrounding other animals (the fields are caused by the animals’ muscle contractions). This sixth sense, called electroreception, picks up electrical signals that sharks can use to home in on prey. Electroreception can also guide migrating sharks via Earth’s electromagnetic fields.
The slow-growing, Arctic-dwelling Greenland shark (Somniosus microcephalus) is not only the longest-lived shark, but also holds the record for the longest-lived vertebrate on Earth. Unlike many other sharks, Greenland sharks lack the hard, calcified tissue whose growth bands can be counted to estimate age, so scientists have had difficulty determining how old they are. In 2016, a study in the journal Science described how a team of biologists carbon-dated eye proteins, which build up continuously during the animals’ lives, in several Greenland sharks. They found the individuals were an average of 272 years old when they died, and the results suggested that the sharks’ maximum life span could be up to 500 years.
You’re More Likely To Be Killed by a Cow Than a Shark
Your risk of suffering a shark attack is practically nil. For its 2022 global summary, the Florida Museum of Natural History’s International Shark Attack File confirmed 57 unprovoked shark bites in 2022, meaning they happened when humans were simply in the shark’s natural habitat, and 32 provoked attacks, such as when people were feeding or harassing the fish. Forty-one of the unprovoked attacks occurred in the U.S., and one was fatal. Other animals are way more likely to kill you, including cows (which kill an average of 20 Americans a year, according to CDC data), hornets, bees, and wasps (about 48 people a year), and dogs (around 19 a year).