Original photo by georgeclerk/ iStock

We often rely on the medicine cabinet when headaches and stomach pains creep up, but how often have you paused in a moment of discomfort to think about the origins of the medications that aid your maladies? Chances are, not very often. Yet many of the over-the-counter drugs kept in a home first-aid kit have their own interesting stories — like these eight common items.

Close-up of a pack of Benadryl pills.
Credit: Smith Collection/Gado/ Archive Photos via Getty Images

Benadryl

The tiny pink tablets that can clear up allergic reactions may never have been invented if chemist George Rieveschl had succeeded in his first career. The Ohio-born scientist initially planned to become a commercial artist, but couldn’t line up much work thanks to the Great Depression. Instead of pursuing his art dream, Rieveschl studied chemistry at the University of Cincinnati, where, years later, his experiments on muscle-relaxing drugs uncovered a histamine-blocking medication. “It seemed like bad luck at the time,” Rieveschl once told the Cincinnati Post about his unexpected career shift, “but it ended up working pretty well.”

Tums in a store aisle.
Credit: Jeff Greenberg/ Universal Images Group via Getty Images

Tums

Creating a new medication is sometimes a labor of love — at least, that was the case for Tums. In 1928, Missouri pharmacist Jim Howe developed the chalky tablets to treat his wife’s indigestion. However, it wasn’t until Nellie Howe gave out samples of her husband’s concoction to seasick travelers on a cruise ship that Jim was inspired to sell his acid relievers. Tums hit pharmacy shelves two years later at 10 cents per roll, with a name chosen through a radio contest in St. Louis, the same city where 99% of Tums have been made for nearly 100 years.

Eye drop entering an eye.
Credit: Marina Demeshko/ Shutterstock

Eye Drops

Soothing irritated eyes isn’t just a modern problem — researchers believe humans have been using some form of eye drops for at least 3,500 years. Medicinal recipes from ancient Egyptians included heavy metals like copper and manganese. Modern eye drops are far removed from these early origins, but most contain saline, which was first used for medical treatment in 1832.

Pepto Bismol in the grocery store.
Credit: Justin Sullivan/ Getty Images Entertainment via Getty Images

Pepto-Bismol

No one knows why the original maker of bismuth subsalicylate, aka Pepto-Bismol, chose to dye it a bright pink, given that the solution is actually beige before coloring is added. However, the unnamed physician from the early 1900s who first mixed up the stuff is credited with trying to cure cholera, a deadly foodborne and waterborne illness that causes severe stomach distress. Initially called “Mixture Cholera Infantum” and meant for small children, the stomach-soothing blend contained zinc salts, oil of wintergreen, and a now-iconic pink coloring, among other ingredients. While the early version of Pepto couldn’t cure cholera (which requires rehydration and sometimes antibiotics), it did help treat symptoms, which is why it became popular with doctors. In the early 20th century, New York’s Norwich Pharmacal Company sold its version, called Bismosal, in 20-gallon tubs; today, name-brand Pepto-Bismol is manufactured by Procter & Gamble, which continues to dye it the recognizably rosy hue.

Aspirin pills in a hand to be taken orally.
Credit: dszc/ iStock

Aspirin

The headache-melting ingredient in aspirin, acetylsalicylic acid, is a human-made substance, though it’s a cousin of salicylic acid, a naturally occurring compound found in the bark of willow and myrtle trees. Humans have gathered those ingredients for medicinal remedies for millennia; ancient Egyptians and Greeks used them to tamp down fevers and pain. Synthetic versions were first made in 1874, and by the turn of the century, German chemist Felix Hoffmann had created the first aspirin — initially sold in powder form — as a remedy for his father’s rheumatism.

Acetaminophen drug In prescription medication pills bottle.
Credit: luchschenF/ Shutterstock

Acetaminophen

American chemist Harmon N. Morse first developed acetaminophen, which would eventually become the world’s most widely used pain reliever, in 1878. However, it would take decades for the medication — called paracetamol outside the U.S. — to become an over-the-counter staple, thanks to fears that it could cause methemoglobinemia, a blue skin discoloration that signals issues with how blood cells carry oxygen throughout the body. (The fears turned out to be unfounded.) In 1955, McNeil Laboratories began manufacturing its version of the drug, called Tylenol; the company was later purchased by Johnson & Johnson, which marketed Tylenol as safer and more effective than aspirin. While many scientists didn’t agree with this claim, Tylenol became an over-the-counter medication within five years and soared in popularity in the following decades.

Woman hold a box of 400mg ibuprofen tablets in her hand and a glass of water.
Credit: Cristian Storto/ Shutterstock

Ibuprofen

The inventors of ibuprofen created the inflammation-relieving drug with one health condition in mind: rheumatoid arthritis. Pharmacologist Stewart Adams and chemist John Nicholson began their hunt for an aspirin alternative in the 1950s, with the goal of creating a safe, long-term option for patients with the autoimmune condition. Testing one experimental compound on his own headache following a night of drinking, Adams found the prizewinning formula, which was patented in 1962 and rolled out to U.K. pharmacies under the name Brufen. By 1984, ibuprofen had become an over-the-counter medication in the U.K. and U.S., and by the time its original patent expired in 1985, more than 100 million people in 120 countries had taken the medication.

opening capsule filled with fruits and nutrients.
Credit: Eoneren/ iStock

Multivitamins

Vitamins are more preventative than reactive; they can’t cure indigestion or discomfort. But in their earliest forms, vitamins were meant to ward off health conditions caused by nutritional deficiencies. The first commercial vitamin tablets emerged in 1920 following decades of research into illnesses such as beriberi (a vitamin B1 deficiency), and gained traction during World War II due to fears of nutritional constraints caused by rationing. Vitamin consumption held on after the war ended, and today nearly 60% of Americans take a daily vitamin or dietary supplement.

Nicole Garner Meeker
Writer

Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.

Original photo by Philip Reeve/ Shutterstock

At only about 10% of the global population, left-handed people are definitely in the minority. But even though left-handers are few and far between, some of society’s most notable figures have written, thrown a ball, or played an instrument with their left hand. There are even some famous fictional characters who are avowed southpaws, including Ned Flanders from The Simpsons, who runs a store, the Leftorium, catering to left-handed people. As we celebrate International Left Handers Day — a holiday that falls each August 13 — let’s get to know a little more about some of the most prominent lefties throughout history.

Leonardo Da Vinci statue in Firenze, Italia.
Credit: IPGG/ Shutterstock

Leonardo da Vinci

Though there’s some argument over whether Leonardo da Vinci was exclusively a lefty or actually ambidextrous, his peers referred to him by the term “mancino,” which is Italian slang for a left-hander. Leonardo was known for a unique style of taking notes, referred to as “mirror writing,” in which he wrote from right to left. (One theory is that the method was meant to avoid ink smudges with his left hand.) His left-handedness also now plays a key role in authenticating his drawings, as experts often look for signs of left-handed strokes and slants in order to confirm whether a piece is a genuine Leonardo da Vinci work. While the Renaissance polymath embraced being a lefty, one of his contemporaries defied it — Michelangelo actually retrained himself to write and draw with his right hand instead of accepting his natural left-handedness.

Babe Ruth smacks out a few homers in practice before a game.
Credit: Bettmann via Getty Images

Babe Ruth

Known for being arguably the greatest baseball slugger of all time, the left-handed-hitting Babe Ruth began his career as one of the most dominant southpaw pitchers of the 1910s. Ruth switched to the outfield after being sold from the Boston Red Sox to the New York Yankees, where his lefty power stroke earned him nicknames like the “Great Bambino” and “Sultan of Swat.” All told, Ruth socked 714 homers during his illustrious career, good enough for third place behind the scandal-plagued Barry Bonds and legendary Hank Aaron. On rare occasions, Ruth would experiment with batting right-handed, though his success from that side was limited.

Jimi Hendrix (1942 - 1970) caught mid guitar-break during his performance.
Credit: Evening Standard/ Hulton Archive via Getty Images

Jimi Hendrix

Revered as one of the greatest guitar virtuosos in rock history, Hendrix made the unique choice to play a right-handed guitar upside down in order to accommodate his left-handed proclivities (although he also performed some tasks with his right hand). His father, Al, forced Jimi to play guitar right-handed, because he believed that left-handedness had sinister connotations (a belief that was once common — the word “sinister” comes from Latin meaning “on the left side”). While Jimi did his best to oblige his father when Al was present, he would flip the guitar as soon as his dad left the room, and he also had it restrung to more easily be played left-handed. Hendrix isn’t the only legendary lefty rocker: Paul McCartney of the Beatles and Nirvana’s Kurt Cobain also strummed their guitars with their left hands.

Marie Curie (1867-1934), Polish-French physicist who won two Nobel Prizes.
Credit: Everett Collection/ Shutterstock

Marie Curie

Given the fact that men are more likely to be left-handed than women, this list has been sorely lacking thus far in terms of famous females. One of history’s greatest left-handers, however, was none other than the groundbreaking scientist Marie Curie. A Nobel Prize winner, Curie helped to discover the principles of radioactivity and was the matriarch of a family full of lefty scientists; her husband Pierre and daughter Irene also possessed the trait. Left-handedness is surprisingly common among well-known scientists even outside of the Curie family — Sir Isaac Newton and computer scientist Alan Turing were southpaws too.

Astronaut Neil Armstrong smiles inside the Lunar Module.
Credit: NASA/ Hulton Archive via Getty Images

Neil Armstrong

According to NASA, more than 20% of Apollo astronauts were lefties, which makes them more than twice as likely to be left-handed compared to the average person. Neil Armstrong was no exception to this statistical oddity — the first man to walk on the moon was indeed left-handed. Needless to say, Armstrong’s left-handedness was truly out of this world.

U.S. President Barack Obama waves after he spoke to the American people.
Credit: Alex Wong/ Getty Images News via Getty Images

Barack Obama

The 44th President of the United States was the eighth left-handed individual to hold said office, though prior to the 20th century, only President James Garfield is known to have been a lefty. Lefties were elected to the presidency more frequently beginning in 1929 with Herbert Hoover: Six Presidents since have been left-handed, including a run of three straight with Ronald Reagan, George H. W. Bush, and Bill Clinton. While signing his first executive order in 2009, Obama quipped: “That’s right. I’m a lefty. Get used to it.” Presidential left-handedness may not be a coincidence — some experts believe that lefties have a stronger penchant for language skills, which could help their rhetoric on the campaign trail.

Statue of Queen Victoria.
Credit: Philip Reeve/ Shutterstock

Queen Victoria (And Other Members of the Royal Family)

England’s great monarch Queen Victoria (who ruled 1837-1901) was known for her left-handedness. Though she was trained to write with her right hand, she would often paint with her natural left. She’s just one of a few members of the royal family with the trait. Victoria’s great-grandson, King George VI — as well as George’s wife, Elizabeth — were also regal lefties, and George’s left-handedness was often prominently on display while playing tennis, one of his favorite hobbies. Two current heirs to the throne and presumed future kings are also proud lefties: Prince William has joked that “left-handers have better brains,” and his young son George has shown a penchant for using his left hand while doing everything from clutching toys to waving at adoring fans.

Bennett Kleinman
Staff Writer

Bennett Kleinman is a New York City-based staff writer for Optimism Media, and previously contributed to television programs such as "Late Show With David Letterman" and "Impractical Jokers." Bennett is also a devoted New York Yankees and New Jersey Devils fan, and thinks plain seltzer is the best drink ever invented.

Original photo by Jeremy Bezanger/ Unsplash

For some, cooking is an art. For others, a hobby. But for a considerable number of people, cooking is a daily obstacle that must be overcome in order to satiate hunger. But whether you’re new to the kitchen or know a stovetop like the back of your hand, there are some food questions that just have to be asked. From understanding chemical reactions to testing food myths, here are a few things that every cook in the kitchen should know.

Close-up of a woman picking up a dropped cupcake from the floor.
Credit: New Africa/ Shutterstock

Is the “Five-Second Rule” Real?

Most people know the “five-second rule”: the idea that if food that’s fallen on the floor has been there less than five seconds, it’s still acceptable to eat. No one knows the origins of this questionable rule — and plenty of people think it’s kind of gross — but that hasn’t stopped anyone from picking up a dropped Oreo and shouting “five-second rule!” before eating it.

Though its origins may be murky, actual scientists have devoted time and resources to testing the five-second rule. And surprisingly, it’s not an entirely bogus theory — depending on the cleanliness of the floor.

To be clear, no scientist has gone on record recommending that you eat dropped food. However, a study conducted at the University of Illinois Urbana-Champaign found that as long as the food was picked up within the five-second time limit, the presence of microorganisms on the dropped food was minimal. That said, the experiment was conducted after first sanitizing the flooring, and it only applied to hard flooring like tile and wood, which are less likely to serve as incubators for pathogens. No testing was conducted on carpeting and other soft surfaces, which can hold moisture and become breeding grounds for bacteria.

Let’s cut to the chase: It’s definitely not recommended to blindly follow the five-second rule. You have no way of knowing which pathogens are on your floor, so unless you regularly disinfect, it’s best to play it safe. According to the experts, dry foods are slightly safer than wet ones because moisture makes it easier for pathogens to attach themselves to food. So a potato chip or cracker might pick up only a minimal amount of pathogens, whereas an apple or slice of banana might test positive for a higher pathogen count. But we recommend a new rule: When in doubt, throw it out.

Aerial view of a person cutting a raw onion.
Credit: Alina Kholopova/ Shutterstock

Why Do You Cry When You Cut Onions?

There’s no need to cry over spilled milk, but what about chopped onions? You can thank a chemical combination of enzymes and sulfur for the tears that well up while you make dinner.

Onions use sulfur to make a mixture of amino acids and enzymes during the growing process. The acids and enzymes are separated and stored in different regions of the onion’s cells, which are called vacuoles. While the onion remains whole, the amino acids and enzymes in the onion’s cells remain separated. Once you cut into the onion, however, everything mixes together. When the two substances are combined, they form a chemical known as syn-Propanethial-S-oxide, or lachrymatory factor (LF). LF is an irritant that’s easily vaporized when it reacts with the air.

LF isn’t strong enough to affect tougher parts of your body such as your skin, but it can irritate more sensitive regions. As the vapors waft up toward your face, your eyes will begin to sting. Your body — sensing the irritant — will release a torrent of tears in an attempt to wash the chemicals from your eyes. Luckily, LF can’t do any serious damage, even in high quantities.

Producing LF is the onion’s way of defending against anything that may want to eat it. As soon as an animal bites into the bulb, its eyes start to burn and it’s reminded to stay away from onions.

Unfortunately for onions, humans are persistent.

A little boy licking the snow from outside.
Credit: Nicole Elliott/ Unsplash

How Do Taste Buds Work?

Taste buds are the reason we pucker our lips when we suck on a lemon wedge or smile when we savor a piece of chocolate. They’re how we can identify our favorite foods. In fact, without taste buds we wouldn’t be able to sense the five basic tastes: salty, sweet, sour, bitter, and umami. But what exactly are taste buds and how do they work?

Every tongue is covered in visible bumps known as papillae that fall into four categories: filiform, fungiform, circumvallate, and foliate. Each type of papillae except filiform carries a number of taste buds, which are continuously being replaced. In total, every tongue has an average of 10,000 taste buds, replaced about every two weeks.

Despite what some may believe, there are no specific areas of the tongue responsible for a particular taste. Instead, it’s the taste receptors scattered across your tongue that pinpoint the proper flavor.

The taste buds in your different papillae are simply a combination of basal cells, columnar (structural) cells, and receptor cells. Different types of receptor cells are coated with proteins intended to attract specific chemicals that are linked to one of the five basic tastes. When the receptor cell identifies the chemical it binds with, it will send a signal through a neural network to the brain via microvilli, or microscopic hairs on every taste bud.

There is more to taste than just the tongue, however. Lining the uppermost part of the human nose are olfactory receptors that are responsible for smell, and they send messages that further home in on specific tastes. When you chew food, a chemical is released that travels to the upper part of your nose and activates the olfactory receptors. These receptors work in conjunction with the receptors on taste buds to help the brain recognize the taste. This helps explain why a cold or allergies can hinder one’s sense of taste, making everything taste bland.

Young woman goes to sleep under a white comforter.
Credit: Troyan/ Shutterstock

Does Tryptophan Really Make You Tired?

Anyone who’s passed out after indulging in a Thanksgiving feast knows the theory: tryptophan, an amino acid found in turkey, makes you sleepy. But is this conventional wisdom actually true?

The short answer is … not exactly. L-tryptophan, as it’s officially known, can also be found in everything from chicken and yogurt to fish and cheese, none of which are typically associated with sleepiness. Once ingested, some tryptophan is converted into the neurotransmitter serotonin, which in turn helps regulate melatonin levels and sleep itself, hence the apparent causal link between turkey and fatigue.

Plenty of other amino acids are present in turkey, however, and most of them are found in greater abundance — meaning that, when all those chemicals are rushing to your brain after your second helping, tryptophan rarely wins the race.

If, however, the tryptophan gets a little assistance in the form of carbohydrates, it gets a better shot at dominating your system. Eating carbs — which abound in Thanksgiving dishes like mashed potatoes and stuffing — produces insulin, which flushes every amino acid except tryptophan from your bloodstream. Thus, your post-Thanksgiving sleepiness is actually the result of a perfect storm composed of tryptophan, carbs, and the large portions typically associated with the holiday.

Raspberries and blueberries organized on a pink background.
Credit: Jeremy Bezanger/ Unsplash

What Are Superfoods?

While many nutritionists and physicians recommend healthy eating over fad diets, some foods offer more nutritional benefits than others. That’s why, in recent years, you might have heard about “superfoods” and why you should incorporate them into your diet.

The term “superfood” doesn’t come from medical science. Instead, it was coined by marketers at food companies to help boost sales. But in general, the term applies to particular foods that are nutrient-rich and provide significant health benefits when consumed regularly.

One example of a superfood is the egg, which features two powerful antioxidants: lutein and zeaxanthin. Eggs are also low in calories, averaging 77 calories per egg. And most importantly, they’re full of nutrients such as iron, phosphorus, selenium, and a myriad of vitamins including A, B2, B5, and B12.

There are also a variety of fruits and vegetables that qualify as superfoods, including berries. Berries are rich in antioxidants, high in fiber, and contain a wide array of vitamins and minerals, particularly vitamins C and K1, manganese, copper, and folate. But nutrient levels can vary widely between berries. For example, strawberries have the highest vitamin C levels of the superfood berries. This heart-healthy food can also help lower inflammation and improve blood sugar and insulin response.

While the phrase “superfoods” might not have a hard definition, there’s plenty of evidence to show that certain foods can improve your health and reduce your risk of serious conditions such as cancer, high blood pressure, and heart disease.

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.

Original photo by Subbotina Anna/ Shutterstock

Oysters contain multitudes: They’re protein-rich treats, highly efficient water filters, reef builders, and pretty rock-makers. They exist in coastal regions all over the world, from the Aleutian Islands of Alaska to the warm waters around New Zealand, and we’ve been eating them for thousands of years. Yet as Jonathan Swift wrote in his book Polite Conversation, “He was a bold man that first ate an oyster.” After all, the shellfish can be a little treacherous to open, and what’s inside isn’t everybody’s cup of tea. But how much do you really know about oysters? Do they all make pearls? Have they always been a delicacy? What can you do with the shells? Pry open these six interesting facts about some of the world’s most divisive shellfish.

A man opening oysters and extracting pearls.
Credit: ebonyeg/ Shutterstock

Not All Pearls Are Shiny

Pearls, the semiprecious gems popular for jewelry and other adornment, are created when some kind of unwelcome object, such as a grain of sand, enters an oyster’s shell. The oyster shields itself by wrapping the irritating object in a substance called nacre, a tough material that develops inside the shells of oysters from the Aviculidae family, also known as pearl oysters. The nacre builds up into a pearl, which can be of several different colors.

Oysters cultivated for food are from the Ostreidae family, or true oysters. They create pearls when things sneak into their shells, too, but their pearls don’t have the same lustrous coating that those of pearl oysters do — so they end up just small and bland.

European flat oyster (Ostrea edulis) underwater.
Credit: aquapix/ Shutterstock

Oysters Can Change Sex

Many of the oysters commonly used for food, including European flat oysters, Pacific oysters, and Atlantic oysters, change sex during their lifetimes — sometimes a few times. European flat oysters alternate based on seasons and water temperature. In other species, most oysters are born male and eventually the population evens out. Most older oysters are female, but some change back at some point. The exact mechanism that makes this happen is still something of a mystery.

Living oyster under the sea water.
Credit: Pix Box/ Shutterstock

One Oyster Can Filter Up to 50 Gallons of Water a Day

Oysters are a critical part of marine ecosystems because they eat by filtering water, removing sediment and nitrogen in the process. One adult oyster can filter up to 50 gallons of water a day, although the exact rate depends on water conditions. A typical ocean environment has stressors, like high or low temperatures, predators, and especially dirty water, that can slow down their feeding process. In more typical conditions, an oyster filters 3 to 12.5 gallons of water a day, which is still extraordinarily helpful.

All this water filtration does have a couple of drawbacks: Too many oysters can reduce the nutrients in the water for other animals, and because they take in a lot of junk, they can pass toxins onto us when we eat them.

Copious Oysters shells.
Credit: Sun_Shine/ Shutterstock

Oyster Shells Are Recyclable

Don’t throw away your oyster shells when you’re done shucking — they’re the best material for rebuilding oyster beds, which sometimes create giant reefs that can be home to all kinds of marine life. When oysters reproduce, they release larvae into the ocean, which float around looking for somewhere to attach themselves. With the loss of reef habitats, those spots can be harder to find. Those larvae love to cling to old oyster shells, which makes discarded shells one of the best tools for sustainable oyster farming and rebuilding marine ecosystems — something they certainly can’t do in a landfill. Many municipalities and conservation groups in oyster-rich areas offer some kind of recycling program.

Man placing metal bag with oysters on oyster farm.
Credit: Bartosz Luczak/ Shutterstock

Humans Have Been Cultivating Oysters for Thousands of Years

Oyster farms, particularly sustainable oyster farms, are nothing new. A 2022 archaeological study in the United States and Australia found that Indigenous groups cultivated oyster reefs as far back as 6,000 years ago, and managed to maintain healthy oyster populations for as long as 5,000 years, even with intense harvesting. The oldest oyster middens — hills of oyster shells — were in California and Massachusetts. One midden in Florida contained more than 18 billion oyster shells.

Overharvesting has damaged modern-day oyster populations; the study also found that 85% of oyster habitats from the 19th century were gone by the 21st century.

Fresh oysters platter with sauce and lemon.
Credit: Artur Begel/ Shutterstock

Oysters Were Once Cheap, Everyday Fare

Oysters certainly have their fans in the 21st century, but not like they did in the 19th century. Back then, they were a staple protein because they were both abundant and extremely cheap. They were treasured equally in fine dining establishments and on city streets. Oyster houses were incredibly common, and inspired the kind of camaraderie and revelry that bars do today — some of them sold beer with oysters as a free snack.

Their popularity wasn’t limited to coastlines; middle America couldn’t get enough of them, either. Oysters were shipped via rail even before beef was. Households would buy them by the barrel and put them in soups, sauces, and even stuffing.

So what happened to the oyster craze of yesteryear? Several things. With overharvesting, the supply wasn’t as great as it once was. Growing cities started dumping sewage into the water, and oyster beds became disease vectors. New food safety regulations — and an end to child labor — meant businessmen couldn’t get away with shady practices that made oysters cheap.

The final nail in the oyster coffin was Prohibition. Oyster bars had already mostly disappeared by then, but those that remained lost their drinking clientele to speakeasies, and their nondrinking regulars thought the establishments still felt too much like saloons.

Sarah Anne Lloyd
Writer

Sarah Anne Lloyd is a freelance writer whose work covers a bit of everything, including politics, design, the environment, and yoga. Her work has appeared in Curbed, the Seattle Times, the Stranger, the Verge, and others.

Original photo by wertinio/ Shutterstock

Although they’re called public servants, presidents don’t take on the job for free — they’re compensated with a substantial salary while in office. From the time of George Washington until today, the presidential salary has been established by Congress, with the total amount raised on several occasions. Here are six facts about presidential salaries and how they’ve changed since the earliest days of the country.

The Title Page of the Federalist: a Collection of Essays circa 1787.
Credit: Fotosearch/ Archive Photos via Getty Images

Presidents Were Initially Paid To Help Discourage Outside Influence

America’s first president, George Washington, apparently had no desire to be paid as commander in chief. Even prior to serving as POTUS, Washington never accepted money during his time as a military officer. However, the framers of the Constitution decided that the president should be properly compensated in order to prevent whoever held the role from falling victim to financial influence. As Alexander Hamilton explained in the Federalist Papers, establishing an official salary for the president would make the individual less susceptible to bribery.
To further limit such influence, the Domestic Emoluments Clause was put in place to prevent the legislature from altering the president’s salary during their term; this kept Congress from using that power as a way to affect the president’s policy decisions. In the end, Congress decided that the president was to receive an annual salary of $25,000 (around $4.5 million today), with the Vice President receiving $5,000, the chief justice earning $4,000, and members of the president’s Cabinet receiving $3,500.

Portrait of American military commander (and US President) Ulysses S. Grant.
Credit: Stock Montage/ Archive Photos via Getty Images

Ulysses S. Grant Was the First POTUS To Receive a Raise

Ulysses S. Grant served as president from 1869 until 1877. On March 3, 1873, Congress passed a law that was referred to by its deriders as the “Salary Grab Act.” The law awarded retroactive pay raises to departing members of Congress, raised the salaries of incoming members of Congress, and doubled the president’s salary, to $50,000 per year. It was the first presidential salary increase in American history. The act was signed by President Grant just hours before he was set to be sworn in for a second term. The signing was deeply controversial, but the pay raise stuck, and presidents continued to earn $50,000 annually until the position’s salary was raised yet again in 1909.

 Babe Ruth during the Yankees' season opener at Yankee Stadium.
Credit: Bettmann via Getty Images

Babe Ruth Was the First Athlete To Earn More Than the President

From 1909 through 1948, the U.S. president made $75,000 annually, beginning with the administration of William Howard Taft. At the time Taft took office, baseball’s top stars, including Nap Lajoie and Ty Cobb, earned only around $9,000 per year, considerably less than POTUS. In 1930, while Herbert Hoover was in office, that all changed. New York Yankees slugger Babe Ruth was given a whopping raise and signed a contract that earned him $80,000 a year, becoming the first athlete to rake in a higher salary than the president. Ruth was famously asked if he felt he should make more money than the president, to which he replied, “Why not? I had a better year than he did.” In 1949, the U.S. president’s salary was increased yet again, to $100,000, but by that time, Hall of Fame baseballers Hank Greenberg and Joe DiMaggio were already earning that much per year.

President Truman putting his signature to a bill that rushed through Congress.
Credit: Bettmann via Getty Images

Harry Truman Was the First President To Receive an Expense Allowance

On January 19, 1949, one day prior to President Truman’s second inauguration, Congress passed a law not only raising the president’s salary from $75,000 to $100,000, but also granting POTUS an annual $50,000 tax-free expense account. That allowance was later made taxable on October 20, 1951, though it’s since fluctuated between nontaxable and taxable status (it’s currently nontaxable). Over the course of subsequent presidential pay raises, this expense account has remained in effect all the way to the modern day.

President John F. Kennedy waves goodbye as he leaves Berlin for Ireland.
Credit: Bettmann via Getty Images

John F. Kennedy Donated His Entire Presidential Salary to Charity

When JFK took office in 1961, he was the richest individual ever to hold the role. Kennedy was born into a wealthy family, and while he accepted the president’s annual $100,000 salary, he opted to donate those funds to charity rather than pocket any for himself. Kennedy’s decision was reminiscent of a predecessor from several decades prior; Herbert Hoover was independently wealthy too, and decided to donate his presidential salary as well. After Kennedy took office on January 20, 1961, his prorated salary of $94,583.32 for the remainder of the year was distributed among several charitable causes. Throughout Kennedy’s entire political career — a period that included six years as a congressman, eight years in the Senate, and an abbreviated term as president — he donated nearly $500,000 of his various government salaries to charity. The charities he contributed to included the Boy Scouts and Girl Scouts of America, the United Negro College Fund, and the Cuban Families Committee.

Kennedy is not the only president to donate his salary. Herbert Hoover and Donald Trump also gave their presidential earnings to various charities and government agencies.

The White House in Washington DC with a beautiful blue sky.
Credit: Vacclav/ Shutterstock

Presidential Salaries Are Supplemented by Additional Financial Perks

In 1969, Congress raised the presidential salary to $200,000, and in 2001 it was doubled to $400,000. In addition to these substantial amounts, each president’s annual compensation is supplemented by a number of financial perks. For instance, presidents are given a $100,000 annual travel budget, in addition to $19,000 each year set aside for entertaining foreign dignitaries and other notable figures. Furthermore, presidents and their families are given an optional $100,000 for White House redecoration, though some presidents, such as Barack Obama, have opted instead to pay for those redesigns out of pocket.

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.

Original photo by BooFamily/ Shutterstock

Guinness World Records collects a hodgepodge of some of the globe’s most impressive and unlikeliest accomplishments. Among the many record-holders are those who set out to establish dominance in some of the wackiest categories imaginable, from “most spoons balanced on the body” to “farthest distance squirting milk from the eye.” Some of these categories may even leave you wondering who came up with the idea for the record in the first place. Here are six of the weirdest Guinness World Records we could find.

Boy balancing spoon on his nose.
Credit: Paul Bradbury/ iStock

Most Spoons Balanced on the Body

Spoons are a commonplace household item, but for one man, they were also tools for setting a seemingly unbreakable world record. Meet Abolfazl Saber Mokhtari, an Iranian man who holds the Guinness World Record for most spoons balanced on the human body. Mokhtari earned himself a spot in the record books by balancing 85 spoons on his person on December 24, 2021. He claimed that he “noticed this talent of [his] when [he] was a kid,” though admitted it took many more years of practice for him to transform into the world-record-holder that he is today.

Mokhtari’s efforts shattered the previous record of 64 spoons held by Marcos Ruiz Ceballos of Spain. However, Mokhtari’s path to victory was far from easy: Due to the humidity in the air, spoons began to slip off his body around number 80. But in the end, he managed to successfully balance 85 utensils on his bare body. Mokhtari’s talent isn’t limited to just spoons, either — he claims these balancing skills extend to plastic, glass, fruit, wood, and even other humans.

Toilet seat taped down and broken.
Credit: courtneyk/ iStock

Most Toilet Seats Broken by the Head in One Minute

Kevin Shelley may not look superhuman, but the American possesses a powerful forehead unlike any other. In fact, Shelley’s cranium is so strong that he used it to set the record for most wooden toilet seats broken with the head in one minute. On September 1, 2007, Shelley shattered a staggering 46 toilet seats with his forehead over the course of just 60 seconds, toppling the previous record of 42.

Prior to this feat, Shelley was known as an acclaimed martial artist who held several other records for breaking things with a forehead strike. He was invited by Guinness to attempt the toilet-seat challenge based on his reputation, and did so on a German television show. Though only 46 toilet seats were shattered within the allotted time frame, Shelley broke an additional two after the buzzer went off. According to Shelley, it wasn’t the forehead smashing that exhausted him most, but rather running down the line of toilet seats to accomplish the record as quickly as possible.

Close-up of a driving bathtub on three wheels.
Credit: ullstein bild Dtl via Getty Images

Fastest Bathtub

Toilets aren’t the only bathroom fixtures to make their way into Guinness World Records. In 2015, a Swiss man by the name of Hannes Roth piloted a lightning-quick motorized bathtub that set the world record for fastest bathtub. Roth spent over 300 hours creating the speedy appliance, placing a tub atop a go-kart chassis and outfitting it with a 120-horsepower engine. For added realism, Roth even affixed a long metal shower head dangling above the bathtub. After his construction work was done, Roth took the tub out for a spin at a test track in Vauffelin, Switzerland. He averaged speeds of 116.08 miles per hour throughout his two fastest runs, reaching maximum speeds of 118 in his seventh run. Those totals were later verified by Guinness on May 6, 2016, and then enshrined in the record books.

Close-up of milk splashing out of a glass.
Credit: lisegagne/ iStock

Farthest Distance Squirting Milk From the Eye

Ilker Yilmaz may not be a household name, but in the dairy world he’s a legend. That’s because Yilmaz holds the record for farthest distance milk has been squirted from the eye. Yilmaz accomplished the feat at Istanbul, Turkey’s Armada Hotel on September 1, 2004, successfully launching a stream of milk out of his eye socket a total distance of 9 feet, 2 inches. Yilmaz’s accomplishment surpassed the previous record of 8.745 feet, set by Mike “The Milkman” Moraal, which had stood since 2001. As for Yilmaz’s technique, it involved pouring the milk into his hand, snorting it up his nose, and then launching it at what proved to be a record distance.

Close-up of a blue-headed parrot.
Credit: Banu R/ iStock

Most Canned Drinks Opened by a Parrot in One Minute

It’s not just humans who can be Guinness World Record holders — animals can too. Meet Zac the Macaw, a parrot in San Jose, California, who opened a record 35 cans of soda in just one minute. Zac accomplished the feat on January 12, 2012, but that’s not all Zac can do. This parrot can bike, skateboard, and roll over, and he even holds an entirely different Guinness World Record in another category! That’s right, Zac the Macaw famously set the bar for most basketballs slam-dunked by a parrot in one minute, showing that this parrot can do it all.

Close up view of red onion halves on wooden table.
Credit: Yuriy Gluzhetsky/ iStock

Heaviest Onion

In some villages throughout the United Kingdom, onion-growing competitions are a local tradition. But according to Guinness, no man has ever grown a more sizable onion than Tony Glover. On September 12, 2014, Glover’s mighty bulb weighed in at 18 pounds, 11.84 ounces at North Yorkshire’s Harrogate Autumn Flower Show. This impressive weight was good enough for the vegetable to be enshrined in Guinness as the heaviest onion ever.

While only one onion stands atop the list in its category, other farmers have entered the record books with other gargantuan veggies. Christopher Qualley’s 22.44-pound carrot broke records in 2017, a record-shattering head of broccoli weighed in at 35 pounds in 1993, and in 1990, the world’s heaviest zucchini tipped the scales at a staggering 64 pounds, 8 ounces.

Bennett Kleinman
Staff Writer

Bennett Kleinman is a New York City-based staff writer for Optimism Media, and previously contributed to television programs such as "Late Show With David Letterman" and "Impractical Jokers." Bennett is also a devoted New York Yankees and New Jersey Devils fan, and thinks plain seltzer is the best drink ever invented.

Original photo by Denys Kurbatov/ Shutterstock

None of our senses is as frequently maligned as our sense of smell. Our noses, while convenient, often are portrayed as second-tier sniffers, behind the gifted olfactory senses of dogs, sharks, and other animals. It’s said our noses can’t even smell that many distinct odors. And after all, smell doesn’t help us navigate our world as much as our sight or touch does — supposedly. However, all of these purported “facts” are actually fictions. The seven (real) facts below explain just how amazing our sense of smell really is.

Graphic showing the process of the sense of smell.
Credit: Axel_Kock/ Shutterstock

Olfaction Is the Oldest Sense

When complex life established a foothold on Earth 1.5 billion years ago, smell was the first sense that evolution developed. Chemoreception — detecting chemicals in the environment by scent — is a common trait among all animals, and even single-celled organisms. Smell is vital not only for finding food but also for finding a mate, a pretty key ingredient for furthering your particular limb on the tree of life. But although this sense is ubiquitous throughout the various phyla, kingdoms, and domains, not all organisms smell the same way. Single-celled organisms use a protein in the cell wall to detect chemicals, while plants use mechanisms baked into their genes to detect volatile organic compounds. Snakes, meanwhile, use their forked tongues to “grab” scents and then quickly deliver them to a scent-detecting organ (the Jacobson’s organ) located on the roof of their mouths. This allows snakes to discern the direction of smells, which scientists describe as “smelling in stereo.” When it comes to life’s oldest sense, evolution has had more than enough time to create a variety of techniques.

A woman sniffing laundry.
Credit: yamasan0708/ Shutterstock

Our Nose Can Sense 1 Trillion Odors

Humans are actually better than dogs when it comes to detecting certain smells, and our noses can sniff out a staggering 1 trillion (yes, with a “t”) odors. The impressive nature of our noses is a relatively recent discovery, however, which may explain why the human sense of smell long got a bad rap. Nearly a century ago, scientists pegged the human nose’s olfactory abilities at about 10,000 distinct smells — not bad, but far less impressive than our eyes’ ability to perceive 1 million colors (or more, for those lucky tetrachromats). In 2014, researchers from Rockefeller University in New York City decided to take a closer look at the nose’s true powers, and found that the human nose was much more capable than we imagined, giving the phrase “the nose knows” a whole new level of credibility.

Young woman holding a baking tray.
Credit: Paparacy/ Shutterstock

Smell Is Closely Entwined With Memory

Sometimes smells like fresh-cut grass, a delicious baked pie, or a particular deodorant will bring back a long-forgotten recollection that sends you strolling wistfully down memory lane. Well, that’s by biological design, and it has to do with the way the human brain is wired. Unlike our four other best-known senses — which are first routed through the thalamus before reaching the hippocampus, the area of the brain associated with memory — smells are sent directly to the olfactory bulb located above the nasal cavity. While this bulb is directly tied into the hippocampus, it’s also connected with the emotion-processing amygdala, which is why smells can elicit such potent memories. Because smells can deliver these powerful whiffs of nostalgia, companies including Nike, Verizon, and many others have developed — and sometimes even trademarked — certain smells associated with their retail stores and products.

Close-up of a women's nose.
Credit: kei907/ Shutterstock

Some Women Don’t Have Olfactory Bulbs (But Smell Fine)

The olfactory bulb, as the name suggests, is central to our sense of smell. With millions of olfactory receptor cells, the bulb helps translate smells into signals the brain can interpret, so no olfactory bulb means no sense of smell — or so we thought. In 2019, scientists at the Weizmann Institute of Science in Israel were studying why some people have such a particularly strong sense of smell when they discovered in an MRI scan that one 29-year-old participant was completely missing her olfactory bulb. The woman was also left-handed — a factor that when combined with a missing olfactory bulb has a known effect on the organization of the brain. After poring over more data, the scientists estimated that 0.6% of women (and 4.25% of left-handed women) don’t have an olfactory bulb, but nevertheless can smell as well as — and in some cases even better than — those with one. Strangely, missing bulbs were not found in men.

How is this possible? Well, we don’t really know (yet). It’s possible that functions associated with the bulb were disorganized in these particular southpaws, meaning the necessary receptors are still there, just arranged in ways imperceptible to MRI scans. For now, the mystery remains.

Relaxed woman smelling aromatic candles at night.
Credit: Antonio Guillem/ Shutterstock

Our Sense of Smell Is Strongest in the Evening

Although it’s imperceptible to the average person, smell actually fluctuates throughout the day. Research conducted by Brown University in 2017 studied 37 teenagers for a week and measured their sense of smell in relation to levels of melatonin, a hormone that helps induce sleep. The study found that our sense of smell is intimately entwined with our circadian rhythm — the natural cycle our bodies experience every day. When participants were approaching “biological night” around 9 p.m., their sense of smell was heightened, but strangely the opposite was true between 3 a.m. and 9 a.m. Although scientists don’t know the reason for this sniffing discrepancy, one theory harkens back to our evolution. In our hunter-gatherer days, the body might’ve ramped up our sense of smell right before sleep in an effort to hunt (or forage) for that last meal or to detect any nearby threats before bedding down for the night. An increased sense of smell might’ve also encouraged some pheromone-induced mating. Whatever the reason, humanity’s sense of smell appears to be a bit of a night owl.

A woman sitting next to a window with rain drops against.
Credit: fizkes/ Shutterstock

Humans Are Wired To Smell Petrichor

You know that earthy smell that always accompanies rain after a long dry spell? That specific aroma has a name: “petrichor.” Coined in 1964 by Australian scientists (who would know a thing or two about dry spells), “petrichor” is a portmanteau of “petros” (stone) and “ichor,” the name for a bloodlike “ethereal fluid” of the gods in Greek mythology. The smell comes from actinobacteria that release an organic compound known as geosmin into the air when it rains. What’s strange is that humans are incredibly sensitive to the stuff — our ability to smell petrichor is far more sensitive than the ability of sharks to smell blood in the water. Many of humanity’s modern biological oddities can be explained by the hundreds of thousands of years spent living in hunter-gatherer tribes, and our keen sense for petrichor is another one to add to the list. Some scientists theorize our noses are so fine-tuned to sniff out this smell because finding water and rainy weather was often a matter of survival — and where there’s petrichor, there’s water. So the next time that telltale smell tickles your nostrils, sit back and marvel at the meticulous machinations of evolution that made such a moment possible.

Varieties of moods and emotions on sticky notes.
Credit: Black Salmon/ Shutterstock

You Can Smell Emotions

Most people are familiar with the power of pheromones, but research is hazy at best regarding what role they play in human behavior and sexual attraction. Many animals, such as mice and snakes, possess a vomeronasal organ that detects pheromones and sends signals to the brain. While some humans possess the remnants of this organ, it’s vestigial and nonfunctional. However, scientists suspect that other parts of the olfactory system might’ve picked up the slack, making chemoreception between humans possible. A study conducted in 2012 collected sweat from male participants as they watched fear- or disgust-inducing movies. When women participants were asked to do a visual task while exposed to the sweat samples, scientists monitored their facial expressions and discovered that the women’s expressions matched the emotion that originally elicited the sample — a sign that something in the sweat activated some form of chemoreception locked away in the human mind.

Darren Orf
Writer

Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.

Original photo by Andrii Spy_k/ Shutterstock

Is there anything more commonplace than water? Every day, we drink it and bathe in it, and in certain climates, walk right through it. But the reason water is everywhere is the same reason it’s interesting: It’s in almost everything, including us. Humans, and all other life on Earth, literally couldn’t exist without it.

So let’s take a few moments to pause and appreciate water — water inside our bodies, water on the surface of the Earth, and even water in space. In what unexpected places can we find water? How does water behave in different places? Grab a glass of water and sip along to these seven interesting facts about H2O.

Futuristic human body in bright volumetric light.
Credit: voxelvoxel/ Shutterstock

Our Bodies Are More Than Half Water

We don’t just need water to survive — water makes up a large part of our bodies. Babies are born at about 78% water, and adults are up to 60% water, though adult women are slightly less watery (55%) than adult men. Similarly, some body parts are more watery than others. Your bones are around 31% water, but your brain and heart are around 73%. The lungs are one of the wateriest parts of the body, at 83%.

So what does this bodily water do? It helps regulate your temperature, produce hormones and neurotransmitters, digest your food, deliver oxygen throughout your body, protect your brain and spine, flush out waste, and more — you know, basic survival stuff.

hands covered in a big water splash.
Credit: Volodymyr Burdiak/ Shutterstock

The Earth Contains 332.5 Million Cubic Miles of Water

There’s a fixed amount of water on Earth, so it’s a good thing that we have a lot of it. Altogether, the Earth’s water adds up to 332.5 million cubic miles (or 326 million trillion gallons). This includes liquid water, ice, groundwater, water in the atmosphere, and the water that’s in our bodies.

The vast majority of the Earth’s water — more than 96% — is in oceans, with ice caps, glaciers, and permanent snow at a very distant second (1.74%) and groundwater a close third (1.69%).

You might be wondering: If the amount of water on Earth doesn’t change, why are the sea levels rising? There are a couple of reasons. For one, the oceans are warming, and water expands when it gets hotter. The oceans are also taking in some extra water: The Earth’s water supply includes glaciers, and those are warming up, too. When they melt, they flow into oceans.

Lake Baikal frozen water in the Winter.
Credit: Katvic/ Shutterstock

Most of the World’s Fresh Water Is Ice

Oceans are salty, and since they account for so much of the world’s water, very, very little of our water supply is fresh — only about 3%. Out of that tiny fraction of fresh water, nearly 70% is frozen. Only about 1% of all water can meet the hydration, agricultural, and manufacturing needs of humans. Most drinking water comes from rivers, which make up only 0.006% of the world’s fresh water.

You can convert salt water to fresh water using a process called desalination, but it’s both expensive and hard on the environment, and it’s not just salt that has to come out of ocean water to make it potable (it often contains other contaminants). Still, some desalination plants do exist, especially in the Middle East and Africa, and technology is improving.

Boiling water in a pan on an electric stove in the kitchen.
Credit: Africa Studio/ Shutterstock

Water Doesn’t Always Boil at the Same Temperature

You may have been taught that water boils at 212 degrees Fahrenheit, or a tidy 100 degrees Celsius, but that’s not strictly accurate. That boiling point applies to water at sea level, but not at higher altitudes.

Water boils when the water vapor’s pressure exceeds the atmospheric pressure around it, and atmospheric pressure drops at higher elevations — so the higher the elevation, the lower the boiling point. In fact, water boils about 10 degrees Fahrenheit cooler in Denver than in Death Valley. At the peak of Mount Everest, it only takes 162 degrees Fahrenheit to boil water. Low atmospheric pressure is why some recipes have separate instructions for high elevations, too.
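If you’re curious how those numbers shake out, here’s a minimal back-of-the-envelope sketch in Python (not from the original article). It assumes a standard-atmosphere barometric formula for air pressure at a fixed 15 degrees Celsius and the Antoine equation for water’s vapor pressure, so treat the results as rough estimates rather than measured values:

import math

def pressure_mmhg(altitude_m):
    # Approximate air pressure at a given altitude using the barometric formula,
    # assuming a constant 15 degrees C (288.15 K) atmosphere.
    molar_mass, gravity, gas_constant, temp_k = 0.0289644, 9.80665, 8.314, 288.15
    return 760.0 * math.exp(-molar_mass * gravity * altitude_m / (gas_constant * temp_k))

def boiling_point_f(altitude_m):
    # Approximate boiling point of water in Fahrenheit via the Antoine equation
    # (constants for water, valid roughly between 1 and 100 degrees C).
    a, b, c = 8.07131, 1730.63, 233.426
    celsius = b / (a - math.log10(pressure_mmhg(altitude_m))) - c
    return celsius * 9 / 5 + 32

for place, meters in [("Sea level", 0), ("Denver", 1609), ("Mount Everest", 8849)]:
    print(place, round(boiling_point_f(meters)))
# Prints roughly 212, 203, and 164, in the same ballpark as the figures above.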

female enjoying eating her food.
Credit: Leeyakorn06/ Shutterstock

Food Counts Toward Your Water Intake

“Drink eight cups of water a day” is a common piece of hydration advice, but it isn’t appropriate for everybody. Some people need more or less depending on all kinds of factors, like their age, activity level, and size. But regardless of your hydration needs, it’s not just glasses of pure water that count toward your fluid intake. We get around 20% of the water we consume from moisture-rich food, like many fruits and vegetables.

Snacks that can help you stay hydrated include cucumbers, iceberg or romaine lettuce, celery, radishes, bell peppers, and tomatoes — all more than 90% water.

Hands of African men are shown holding a bucket of dirty water to be used as drinking water.
Credit: Oxford Media Library/ Shutterstock

2 Billion People Have Limited Access to Water

Most places in the United States have ready access to clean drinking water, with the occasional notable exception. Worldwide, access to water for drinking or even hygiene can be a little more difficult. For more than 2 billion people, clean water is either unavailable or at least far away.

Around 1.2 billion people in that group have clean water available within a 30-minute round trip. Another 282 million people have to travel more than 30 minutes to collect water. But around 490 million people are left with unprotected or potentially contaminated water — 368 million people get it from unprotected wells and springs, and 122 million from untreated surface water such as lakes and rivers. Access to clean water means more than hydration, of course. Less time spent ill or fetching water means more opportunities to do other things, like work and attend school.

Water drop splash in a glass.
Credit: Peter Bocklandt/ Shutterstock

In Space, Water Forms a Perfect Sphere

You may not think of water as sticky, at least not in the way that glue or chewing gum is sticky, but it does have a unique ability to stick to things. This has to do with the hydrogen bonds that form between water molecules — H2O means that each molecule of water has two hydrogen atoms and one oxygen atom bonded together, and the slightly positive hydrogen side of one molecule is attracted to the slightly negative oxygen side of its neighbors. These hydrogen bonds form easily and pull the molecules strongly toward one another, which causes surface tension in water: At the surface, with nothing above them to cling to, the molecules form stronger bonds with their neighbors below the surface.

The most common place you’ll see water’s stickiness in action is a drop of water hitting a larger body of water, but it’s both much cooler and much more illustrative to watch how water behaves in microgravity. In space, with no gravity flattening it out, surface tension pulls a floating blob of water into a near-perfect sphere, the shape that holds its volume with the least possible surface area.
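A bit of standard geometry (not spelled out in the article) shows why the sphere wins: for a fixed volume $V$, a sphere has the smallest possible surface area of any shape, so it is the shape that stores the least energy in surface tension. Comparing a sphere with a cube of the same volume:

$$A_{\text{sphere}} = \sqrt[3]{36\pi}\,V^{2/3} \approx 4.84\,V^{2/3}, \qquad A_{\text{cube}} = 6\,V^{2/3}$$

The sphere’s surface is about 19% smaller than the cube’s, and smaller still than any irregular blob’s, so with nothing else pulling on the water, surface tension settles it into a sphere.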

Sarah Anne Lloyd
Writer

Sarah Anne Lloyd is a freelance writer whose work covers a bit of everything, including politics, design, the environment, and yoga. Her work has appeared in Curbed, the Seattle Times, the Stranger, the Verge, and others.

Original photo by Allstar Picture Library Limited/ Alamy Stock Photo

As the star of The Little Foxes (1941), Now, Voyager (1942), and All About Eve (1950), Bette Davis was one of the biggest names of the classic Hollywood era. Yet she was not your typical leading lady, often playing outspoken and even unsympathetic characters. This beloved icon, with her instantly recognizable heavy-lidded eyes, had a career that spanned almost 60 years. Here are six facts about the memorable Bette Davis.

Bette Davis turning her head towards the camera.
Credit: John Kobal Foundation/ Moviepix via Getty Images

Bette Davis Started Her Career on the Stage

Bette Davis got her start on the stage after developing an interest in acting at a finishing school, Cushing Academy in Massachusetts. However, when she applied in 1928 to take classes in New York with renowned director and actress Eva Le Gallienne, the latter rejected her as “a frivolous little girl.” She then became a star pupil at John Murray Anderson’s dramatic school and joined a stock theater company operated by George Cukor, who would later direct dozens of Hollywood hits. In 1929, she earned rave reviews for the Broadway hit Broken Dishes. The following year, a scout for Universal Studios saw her in Solid South and invited her to screen test.

Film actress Bette Davis with a black background.
Credit: Hulton Archive via Getty Images

Davis Failed Multiple Hollywood Screen Tests

Her first Hollywood screen tests did not go very well. Davis arrived in Hollywood with her mother, but the studio representative sent to meet her at the train left because he claimed not to see anyone who looked like an actress. A movie executive watched one screen test and announced she had no sex appeal. In others, she was rejected because of crooked teeth. She even once recalled fleeing the room, screaming, after seeing herself on-screen. Universal eventually offered her a contract, but she was given small, forgettable roles. Davis was preparing to return to New York when Warner Bros. offered her a contract — and then she was on her way to stardom.

Davis's Oscar for Best Actress in a Leading Role, standing next to a Warner Bros partner.
Credit: Hulton Archive/ Moviepix via Getty Images

She Fled the U.S. and Was Sued by Warner Bros. Studio

Davis soon developed a reputation for being strong-willed. Although she got better parts with Warner Bros. and was the studio’s first Best Actress Academy Award winner, she became frustrated with male stars getting better opportunities. She also hated the studio deciding when and where she could work. In 1936, she went to the U.K. to make two films, a move that caused Warner Bros. to sue her for breach of contract. She ultimately lost the case and returned to Hollywood. However, the case did lead to her getting more respect, with a new contract from Warner Bros. and better roles.

Portrait of Bette Davis and Joan Crawford.
Credit: Bettmann via Getty Images

Davis Had a Famous Feud with Joan Crawford

One of Bette Davis’ most memorable later roles was in What Ever Happened to Baby Jane? (1962), in which her character’s on-screen loathing of her sister, played by Joan Crawford, mirrored the two stars’ real-life feud. Crawford had been married to a Pepsi executive and sat on the company’s board, so Davis had a Coca-Cola machine installed in her dressing room during production, one of many needling episodes. By some accounts, Davis hit Crawford so hard in one scene that Crawford required stitches. Davis was nominated for an Academy Award for the role, but Crawford actively campaigned against her, and on awards night it was Crawford who accepted the Oscar, on behalf of winner Anne Bancroft, who could not be there in person. In later years, both actresses spoke of their respect for each other, although Davis still criticized Crawford’s vanity.

Cedric Hardwicke presents the Academy Award to a beaming Bette Davis.
Credit: Bettmann via Getty Images

Davis Shares the Record for Most Consecutive Academy Award Nominations

Davis is one of only two actresses with five consecutive Academy Award nominations; she shares the distinction with Greer Garson, who beat Davis to win for Mrs. Miniver (1942). Her nomination for What Ever Happened to Baby Jane? made her the first performer to reach 10 Oscar nominations, a total since surpassed only by Meryl Streep, Katharine Hepburn, and Jack Nicholson. A write-in campaign for her performance in Of Human Bondage (1934) is often counted as an additional, unofficial nomination. She won the Best Actress award for Dangerous (1935) and Jezebel (1938); a win for Baby Jane would have made her the first three-time winner in a non-supporting category.

Bette Davis holding her Lifetime Achievement Award.
Credit: Ron Galella via Getty Images

She Was the First Woman to Receive the American Film Institute Life Achievement Award

In 1977, Bette Davis became the fifth (and first female) recipient of the Life Achievement Award from the American Film Institute. The award is given to a performer “whose work has stood the test of time.” In announcing her as the winner, the AFI said, “She is that rarest of creatures — the consummate professional.” Davis was still appearing in films in the 1970s, including Death on the Nile (1978), and taking on more television roles as well. She continued to act through the 1980s, despite a stroke and breast cancer. Her final role was opposite fellow Hollywood veteran Lillian Gish in The Whales of August (1987). Davis died in France two years later.

Fiona Young-Brown
Writer

Fiona Young-Brown is a Kentucky-based writer and author. Originally from the U.K., she has written for the BBC, Fodor’s, Atlas Obscura, This England, Culture, and other outlets.

Original photo by james benjamin/ Shutterstock

Although radio is sandwiched between two revolutionary communication technologies, the telegraph and the television, the medium has remained remarkably resilient. First demonstrated at the end of the 19th century, radio continues to provide the soundtrack to countless commutes. But its importance goes far beyond local shock jocks and the Top 40: radio technology still underpins much of the modern world. Here are six amazing facts about radio, from its remarkable discovery to its transformation into a world-changing communication system.

Close-up view of antenna towers in front of the blue sky.
Credit: onurdongel/ iStock

Radio Waves Were Theorized Before They Were Discovered

The scientific community knew about radio waves before anyone discovered actual evidence of them. In 1865, Scottish mathematician and physicist James Clerk Maxwell predicted the existence of radio waves in a paper titled “A Dynamical Theory of the Electromagnetic Field,” based on a presentation he gave before the Royal Society in December 1864. He also developed the set of electromagnetism equations known to history as “Maxwell’s equations.”
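For reference, the four relations now known as Maxwell’s equations are usually written today in the following differential form (modern notation, not the way Maxwell originally set them down):

$$\nabla \cdot \mathbf{E} = \frac{\rho}{\varepsilon_0}, \qquad \nabla \cdot \mathbf{B} = 0, \qquad \nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}, \qquad \nabla \times \mathbf{B} = \mu_0 \mathbf{J} + \mu_0 \varepsilon_0 \frac{\partial \mathbf{E}}{\partial t}$$

In empty space, these equations combine into a wave equation whose solutions travel at $c = 1/\sqrt{\mu_0 \varepsilon_0}$, the speed of light, which is how Maxwell could predict electromagnetic waves, radio waves among them, before anyone had detected one.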

Although Maxwell gave due deference to his predecessor Michael Faraday, who had discovered electromagnetic induction among other principles of electromagnetism, many consider Maxwell’s work — which predicted various waves along the electromagnetic spectrum — a pivotal moment in the history of science and technology. These waves remained theoretical for more than 20 years, until German physicist Heinrich Hertz demonstrated radio waves for the first time in his laboratory in 1888 — forever transforming the history of communication.

Physicist Guglielmo Marconi working in radio transmission.
Credit: Hulton Deutsch/ Corbis Historical via Getty Images

The First Reported Transatlantic Radio Transmission Might Never Have Happened

Although Hertz got his own unit of frequency for his trouble, the undisputed giant of early radio is Italian inventor Guglielmo Marconi. After studying the work of Maxwell, Hertz, and other influential physicists while growing up, Marconi sent a radio signal more than a mile at his estate in Pontecchio, Italy, in 1895. Sensing both opportunity and celebrity, Marconi took out several patents and demonstrated his system throughout Europe. Then, on December 12, 1901, Marconi set out to prove that radio signals could cross the Atlantic despite the curvature of the Earth. With a transmitter set up in Cornwall, England, and a receiving station in Newfoundland, Canada, some 2,100 miles away, Marconi listened for three clicks (the letter “S” in Morse code) sent from Cornwall, which would show that his invention, and the radio waves it produced, could work across vast distances. Marconi believed he heard the expected three clicks, and his assistant George Kemp agreed that he heard them too.

Today, many experts are skeptical that the pair actually heard the clicks, since Marconi had plenty of motive to act as if they did (and Kemp may have gone along). There were no independent witnesses, and it’s doubtful that the equipment of the day was capable of picking up a transatlantic signal under those conditions. It’s likely we’ll never know for sure what really happened that day.

Two women and a boy listening to a large valve radio.
Credit: FPG/ Archive Photos via Getty Images

The Most Powerful Radio Station Ever Was in Cincinnati, Ohio

In May 1934, President Franklin D. Roosevelt pushed a button in the White House, and the world’s first “super station,” WLW, came to life near Cincinnati, Ohio. The station paired an 831-foot tower with a 500-kilowatt transmitter capable of sending a signal halfway around the world. The project was designed as a temporary experiment in spreading radio waves far and wide, but its immense power and operating costs proved to be too much. Stations far from Cincinnati but broadcasting on or near WLW’s 700-kilohertz frequency complained of constant interference. People living close to the tower also reported hearing the broadcast coming from metal kitchen pans, barbed-wire fencing, and even bedsprings. After five years, Congress decided 500 kilowatts was simply too powerful and limited broadcasts to 50 kilowatts, which remains the limit for AM clear-channel stations today.

Radio station microphone in front of a mixing board.
Credit: Gesundheit/ iStock

The First Radio Commercial Was for a Real Estate Developer in NYC

In August 1922, New York radio station WEAF broadcast something that would change the radio industry forever: the very first radio commercial. The ad spot promoted an apartment complex in Jackson Heights, Queens. Although radio ads seem like an obvious innovation now, one early worry among stations was how to make money from the service, since listeners weren’t charged for the endless stream of programming. At first, ads may not have seemed like a profitable strategy given the limited audience, but the economics changed as more radios found their way into American homes. Between 1923 and 1930, the share of American households with at least one radio jumped to 60%, which meant there were finally enough listeners for the radio ad business to boom.

Radio telescopes and the Milky Way at night.
Credit: zhengzaishuru/ Shutterstock

Radio Is an Extremely Important Tool for Astronomers

Because radio waves are part of the electromagnetic spectrum (the radiation produced by electromagnetism, one of the fundamental forces of nature), they are emitted by stars, quasars, planets, galaxies, and clouds of gas and dust. Some of the earliest attempts to use radio to investigate the heavens came at the turn of the 20th century, when astronomers tried to pick up radio emissions from our sun. Today, radio astronomy is an entire field of dedicated scientists pointing massive dishes and antenna arrays at the sky in an effort to glimpse things invisible to the naked eye. One of the most impressive radio telescopes in the U.S. is the National Radio Astronomy Observatory’s Robert C. Byrd Green Bank Telescope in West Virginia. It is the largest fully steerable radio telescope in the world, and it is so sensitive to stray signals that Wi-Fi and other wireless transmitters are heavily restricted within the 13,000-square-mile “National Radio Quiet Zone” surrounding the telescope.

The Eiffel Tower in Paris, France.
Credit: Paolo Gianti/ Shutterstock

The Eiffel Tower Avoided Destruction Because of Radio

It’s hard to imagine Paris without the Eiffel Tower, but the iconic tower wasn’t meant to stick around forever. Gustave Eiffel originally built his eponymous tower for the Exposition Universelle of 1889, and the city only leased the land to Eiffel for 20 years. After that, the land was to be returned to Paris and the tower demolished. Knowing the destruction in store for his precious monument, Eiffel set about finding some way to make the tower both useful and symbolic. On November 5, 1898, the Eiffel Tower participated in an early demonstration of radio when a signal was sent from the tower’s tip to the Pantheon some 2.5 miles away.

In the early 20th century, Eiffel doubled down on transforming his monument to progress into a full-fledged radio tower. By 1908, radio waves emanating from the Eiffel Tower could reach distances of more than 3,500 miles, and its creator had successfully proved its strategic worth. The tower then played a vital role during World War I, intercepting radio messages sent by the Central Powers. Today, the tip of the Eiffel Tower is still home to various radio antennas.

Darren Orf
Writer

Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.