Original photo by DISENY Cinematic Collection/ Alamy

It’s not easy being green, whether you’re an amphibian or a reptile — just ask Kermit, the lovable Muppet who originally debuted as a lizard-like creature. The first Kermit puppet had a slender body and rounded feet, and lacked the pointed collar of today’s Kermit — features that gave it a nondescript, vaguely reptilian appearance.

It was designed in 1955 by creator Jim Henson using materials taken from his mother’s old coat, a pair of his blue jeans, and ping-pong balls for eyeballs. The resulting puppet was not assigned a specific species — Henson preferred somewhat abstract characters — but it looked more like a lizard than a frog.

Kermit the Frog received an honorary doctorate.

It's a fact

In 1996, Kermit was awarded an honorary Doctorate of Amphibious Letters from Southampton College of Long Island University for his contributions to environmental awareness. Kermit also delivered the commencement address to that year’s graduating class.

Kermit — in his original lizardy form — made his TV debut in 1955 on the comedy show Sam and Friends. Afterward, the puppet underwent alterations that gave it a more frog-like appearance, such as flippers for feet. People began informally referring to Kermit as a frog, including late-night host Johnny Carson in 1965. Henson himself began describing Kermit as a frog-type Muppet by the late ’60s, though these were still unofficial designations.

According to Henson, Kermit didn’t officially transform into a frog until the 1971 TV special The Frog Prince, by which point he was formally credited as “Kermit the Frog.” According to Disney (the current parent company of the Muppets), Henson once said that Kermit’s evolution wasn’t a carefully orchestrated decision: “He just slowly became a frog.”

Numbers Don't Lie

Episodes in the original run of “The Muppet Show”
120
Oscar-nominated songs performed by Kermit the Frog
2
Year the Muppets were acquired by Disney
2004
Peak position for “Rainbow Connection” on the Billboard Hot 100
25

Miss Piggy’s original name was ______.

Miss Piggy’s original name was Miss Piggy Lee.

“The Muppet Show” was produced in England.

Despite its status as an American cultural institution, The Muppet Show is a product of the United Kingdom. In the early 1970s, Jim Henson pitched the concept to many major U.S. TV networks, all of which passed on the idea. But he got a lucky break in 1975 when he was approached by British media mogul Lew Grade. Grade had seen Muppets make cameos on other TV programs and decided the characters deserved a show of their own.

The original Muppet Show was filmed at a studio in the English village of Elstree, debuting on the U.K.’s ATV on September 5, 1976, before making its U.S. debut in syndication later that month.

Bennett Kleinman
Staff Writer

Bennett Kleinman is a New York City-based staff writer for Inbox Studio, and previously contributed to television programs such as "Late Show With David Letterman" and "Impractical Jokers." Bennett is also a devoted New York Yankees and New Jersey Devils fan, and thinks plain seltzer is the best drink ever invented.

Original photo by Element5 Digital/ Unsplash

Gold is present in low levels throughout the Earth. It’s been found on every continent except Antarctica, as well as in the planet’s core, the oceans, plants, and in humans, too. The average human body of about 150 pounds is said to contain about 0.2 milligrams of gold, which we excrete through our skin and hair. Babies less than 3 months old tend to have more gold in their manes than older people, thanks to the precious metal being passed along in human breast milk. And while no one’s suggesting we should mine the gold in hair or breast milk (as far as we know), researchers are studying whether gold — and other metals — might be recovered from human waste.

Hair and nails are made of the same protein.

It's a fact

Both your hair and your nails are made of a protein called keratin, which the human body produces naturally. Keratin also forms the outer layer of your skin, the epidermis.

Gold is far from the only metal found in our bodies, however. Researchers estimate that 2.5% of the human body’s mass is made up of metals; think iron, cobalt, copper, zinc, calcium, and more. Many of these metals have important health functions — gold helps transmit electrical signals throughout the body, and plays a role in maintaining our joints. As for how gold and other precious metals got to Earth in the first place, some astrophysicists believe it’s all thanks to two neutron stars that crashed into each other about 4.6 billion years ago, leading to residual deposits of gold, silver, platinum, and more that eventually settled on our planet. Because these elements eventually found their way into our bodies, we can say that we truly are made of star stuff.

Numbers Don't Lie

Diameter (in inches) of the world’s largest gold ring, a 181.2-pound jewel crafted in China in 2016
31.2
Estimated hair donations made each year to Locks of Love, a charity that makes hair prosthetics for needy kids
104,000
Weight (in metric tons) of all the gold that has already been discovered globally
244,000
Approximate amount of hair follicles on a human body
5 million

The chemical symbol for gold, Au, comes from the Latin “______.”

The chemical symbol for gold, Au, comes from the Latin “aurum.”

Olympic gold medals are made mostly from silver.

According to the International Olympic Committee, athletes’ gold medals must be composed of at least 92.5% silver and plated with about 6 grams of pure gold. (Silver medals are, as advertised, solid silver, but bronze medals are actually 95% copper and 5% zinc.) However, solid gold medals were briefly part of the Summer Olympics. At the 1904 Games in St. Louis — the first Olympiad to use the modern medal configuration — top finishers received medals made entirely of gold. The practice ended after the 1912 Olympics in Stockholm, as World War I soon made gold scarce. Cold-weather winners never had the chance to take home solid gold hardware, as the Olympic Winter Games didn’t launch until 1924.

Jenna Marotta
Writer

Jenna is a writer whose work has appeared in The New York Times, The Hollywood Reporter, and New York Magazine.

Original photo by ES3N/ iStock

Within 12 hours of their birth, oysters begin pulling calcium out of the water to create their signature shells. For the first few weeks of their lives, these newborn bivalves zoom around in a current until they eventually settle on some hard substrate, whether it’s a rock, pier, or another oyster. This place of protection is where the oysters will spend the rest of their lives (which can be as long as 20 years). Eventually, usually a year after birth, it’ll be time for the oysters to breed, and that’s where things get interesting. 

Only oysters can make pearls.

It's a fib

When debris gets caught inside an oyster and can’t be flushed out, some species encase the unwanted irritant in a material called nacre, also known as mother-of-pearl. Although oysters make pearls, most commercial pearls come from saltwater clams or freshwater mussels.

Although born male, oysters have the impressive ability to switch their sex, seemingly at will. Every season, females can release up to 100 million eggs, and the amount of sperm released is so high it’s essentially incalculable. Once released, egg and sperm meet in the open water, leaving fertilization to pure chance. Because any resulting larvae are extremely vulnerable to predators (especially filter feeders), oysters have compensated by becoming one of the most fertile and sexually flexible species in the world — their ability to change sex likely evolved as a matter of survival. This impressive fecundity means that natural oyster reefs can grow to tremendous size; as little as 10 square feet of reef can house up to 500 oysters. Scientists theorize that water temperature could help trigger an oyster’s sex change, but many aspects of the process remain a mystery.

Numbers Don't Lie

Length (in inches) of the world’s largest oysters
13.97
Amount of energy an oyster bed can absorb from a wave, protecting coastlines from storms
76%-93%
Gallons of water a 1-acre reef of oysters filters every day
24 million
Number of species of oysters found in U.S. waters
5

Juvenile oysters are known as ______.

Juvenile oysters are known as spat.

New York City was once the oyster capital of the world.

Before the 17th century, the island of Manahatta (as the Indigenous Lenape called it) was absolutely inundated with oysters. With their impressive filtering abilities, these oysters kept the surrounding estuary clean, and they also became a staple of the Lenape diet. When Henry Hudson’s ships sailed the river that would one day bear his name in 1609, the New York estuary was estimated to be home to 350 square miles of oyster reef — roughly half the world’s entire oyster population. The original names for Ellis and Liberty islands were “Little Oyster Island” and “Great Oyster Island,” respectively, and one of the oldest streets in Manhattan — Pearl Street — is named after an Indigenous oyster shell midden located along the shore (it was later paved, fittingly, with oyster shells).

New Yorkers also began eating lots of oysters, upwards of 1 million a day at the industry’s height, while shipping millions abroad. Sadly, overharvesting and environmental degradation caused oysters to severely decline in New York’s waters, and by 1927 they were deemed too contaminated to eat. Today, groups are reintroducing oysters to New York Harbor, and wild populations are beginning to return. Although these oysters are already hard at work cleaning the estuary while providing important aquatic habitats, it’ll likely be a century until New York oysters are once again safe for human consumption.

Darren Orf
Writer

Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.

Original photo by Tjasa Janovljak/ Alamy Stock Photo

Music is powerful. Research shows it can help us sleep better, jog our memories, and even reduce anxiety and blood pressure. It also serves as a social tool and has helped humans mark special occasions such as religious festivals, weddings, and funerals for thousands of years — or maybe even longer. In 1995, archaeologists unearthed what may be the world’s oldest musical instrument: a flute, thought to date back around 60,000 years, found in the Divje Babe cave in western Slovenia. The flute, which some researchers think was made by Neanderthals, was discovered near the remains of a prehistoric fire pit. Fashioned from a bear femur, it has a carved mouthpiece and three spaced holes that may have been used to create different tones. Some scientists believe the instrument was ergonomically designed for a right-handed musician.

First Lady Dolley Madison may have once saved a flute from British troops.

It's a fact

Saving George Washington’s portrait when the White House was attacked in the War of 1812 is one of Madison’s best-known achievements, but that wasn’t all she saved. She is also said to have rescued a crystal flute given to her husband by inventor Claude Laurent.

However, not everyone agrees that the Divje Babe cave flute is actually an instrument. A study published in 2015 suggests that the object wasn’t a musical creation or even the work of Neanderthals, but instead was made by scavenging predators. According to an examination by paleontologist Cajus Diedrich, the bones didn’t show any evidence of drilling from stone tools, but instead had teeth markings likely caused by ice age hyenas. However, Diedrich’s theory is heavily contested by other biologists who have performed bone-cracking tests and created playable replicas. They argue that it would be unlikely for an animal to accidentally create such an artifact — though we may never know exactly who did.

Numbers Don't Lie

Length (in inches) of a standard concert flute
26
Length (in inches) of a piccolo, a smaller flute that reaches higher octaves
13
Holes in a Western concert flute, covered by keys or fingers to create musical notes
16
Year “Shooting the Pistol,” the earliest known recording of a jazz flute, was recorded
1927

“The Magic Flute,” written in 1791, was the last opera composed by ______.

“The Magic Flute,” written in 1791, was the last opera composed by Mozart.

The earliest known recorders date back to the 1300s.

Modern recorders are nearly identical to their popular musical ancestors, which emerged in the 14th century. The earliest known recorders were made from wood or ivory and came in a variety of sizes, with the largest used to produce bass tones. Like today’s versions, they generally had seven finger holes and one thumbhole. Recorders were featured in performing ensembles throughout Europe from the Middle Ages onward and were fundamental to Renaissance and Baroque music. However, they fell out of favor during the 18th century as the modern flute rose in popularity. Nevertheless, simple plastic recorders eventually became an important part of musical education as a tool for the youngest performers — albeit one that is sometimes bemoaned by parents.

Nicole Garner Meeker
Writer

Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.

Original photo by ChrisDoDutch/ iStock

Watching an old Western might leave you with the perception that tumbleweeds have always been a part of America’s western landscape. However, many of the spiky bushes are actually an invasive species from Russia. Salsola tragus goes by a variety of names — including “Russian thistle” and “wind witch” — but its best-known title comes from the way the plant breaks free from the ground at the end of its growing season, blowing around and spreading hundreds of thousands of seeds. While some native tumbleweeds do exist — like Amaranthus albus, aka common tumbleweed — Russian thistle is highly invasive, a term scientists use to describe species that choke out native plants and cause ecological harm by altering habitats. Today, Russian thistle is the most common type of tumbleweed in California.

Tumbleweed plants can survive radioactive hot spots.

It's a fact

They even thrive. In the 1950s, the U.S. government performed nuclear tests in Nevada’s deserts, noting that Russian thistle bushes were among the first plants to return to test sites. Despite being technically radioactive, tumbleweeds grew vigorously around the test sites.

Botanists believe Russian thistle first put down roots in South Dakota around 1873, after being accidentally mixed into containers of flaxseed brought by European immigrants. Because it requires minimal water, the plant grew unchecked in arid, desolate regions. Russian thistle gained such a foothold in Western states that it alarmed government botanists, who reported in the 1890s that the plant had claimed as much as 35,000 square miles of land in just two decades of growth. While wind helped disperse the seeds, the early days of the railroad system also spread them inside batches of contaminated agricultural material, both throughout the U.S. and as far north as Canada.

Tumbleweeds may seem relatively benign to humans, but they are known to gather en masse during windstorms, causing highways to shut down and even trapping people in their homes and cars. Newer species are capable of reaching 6 feet tall, prompting naturalists to remove them wherever they crop up with the help of shovels and herbicides.

Numbers Don't Lie

Seeds produced by one Russian thistle
250,000
U.S. states with invasive tumbleweed (Alaska and Florida are excluded)
48
Average height (in feet) of a Russian thistle bush
3
Year Western film star Gene Autry starred in “Tumbling Tumbleweeds”
1935

Scientists have (unsuccessfully) attempted to control tumbleweeds with plant-eating ______.

Scientists have (unsuccessfully) attempted to control tumbleweeds with plant-eating moths.

American farmers grew tumbleweed plants during the Dust Bowl.

In their mature state, tumbleweeds don’t look particularly nourishing, but the green leaves of young Russian thistle plants are actually quality feed for livestock — a fact Dust Bowl ranchers put to use. Drought during the 1930s, combined with farming practices that failed to conserve soil health, dried and cracked the earth; high winds then stripped away the topsoil, creating dust storms. Altogether, these conditions made it impossible for farmers to plant their fields. Farmers faced a livestock feed shortage, while cattle ranchers faced the very real possibility of losing their herds to starvation. In some regions, farm workers turned to planting Russian thistle, which was known for withstanding unforgiving environments. One county in Oklahoma reportedly held a “Russian Thistle Week” to encourage residents to collect the greens (some reports suggest people even brined and ate the plant themselves). Russian thistle was such a useful stand-in for traditional livestock feed that Kansas farms produced more than 350,000 tons of hay from the plant in 1934 alone, crediting it as a lifesaver for cattle farmers.

Nicole Garner Meeker
Writer

Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.

Original photo by Andrew Haysom/ iStock

A baby red kangaroo (Macropus rufus) is about the size of a jelly bean. Born after about 34 days of gestation, it’s less than an inch long and weighs under a gram — roughly 100,000 times lighter than a full-grown adult, which stands about 4 feet tall. This newborn kangaroo, called a joey, isn’t quite ready for prime time, however. Unlike most mammals, joeys are born while they’re still essentially embryos, which means they lack sight, hearing, and hair. They spend the next six months in their mother’s pouch, or marsupium, where they suckle from a teat and continue to develop before finally taking their first steps into the world. If the word “marsupium” sounds familiar, it’s probably because that’s where the term “marsupial” comes from. Marsupials are a group of mammals that includes kangaroos, wombats, koalas, possums, and more — about 330 species altogether.

Baby rabbits are called kittens.

It's a fact

Cats don’t have a monopoly on the “kitten” term. Baby rabbits are also called kittens, while the animal’s short birthing process is called kindling. Rabbits can have multiple litters a year, with up to 12 kittens in each (though the average is five).

Kangaroos are some of the supermoms of the animal kingdom. Not only do they have a special pouch for their babies, but they can create two distinct types of milk to care for both the developing embryo and the more mature joey. They can even suspend their ability to conceive during times of drought, and then regain that ability when conditions are more favorable. With their remarkable adaptability, it’s no wonder kangaroos outnumber Australians nearly two to one.

Numbers Don't Lie

Year King Edward VII of Great Britain authorized Australia’s coat of arms, featuring a kangaroo and emu
1908
Approximate number of kangaroo species worldwide
60
Estimated number of kangaroos living in Australia as of 2019
42,756,617
Years ago that Australia separated from Antarctica
30 million

A group of kangaroos is known as a ______.

A group of kangaroos is known as a mob.

Most kangaroos are left-handed.

Turns out, a kangaroo paw is also a southpaw. A 2015 study of wild eastern gray kangaroos, red kangaroos, and red-necked wallabies found that they preferred their left hand for grooming, eating, and performing other tasks about 95% of the time. This stunning discovery goes against the long-standing theory that only humans (and some apes) have a strong preference for one hand over the other; 90% of humans are right-handed. Scientists think this is likely a case of “parallel evolution,” in which animals in different branches of the evolutionary tree develop similar traits through separate processes.

Darren Orf
Writer

Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.

Original photo by Natalia Blauth/ Unsplash+

There are at least 30,000 edible plant species in the world, the vast majority of which aren’t commonly eaten. Agricultural biodiversity is in decline, with 75% of the world’s food coming from just 12 plants and five animal species. Of that 75%, the majority comes from widespread staple crops (including wheat, rice, sugarcane, corn, and soy), while a much smaller portion comes from cattle, chicken, sheep, pigs, and goats.

Strawberries are berries.

It's a fib

The botanical definition of a berry is actually quite complex, and strawberries don’t fit the criteria — but bananas do. Strawberries are technically classified as “aggregate fruits.”

Those are striking statistics, but they’re also a bit of a warning. The more we rely on a smaller and smaller number of plant and animal species, the more susceptible those food sources are to disease — essentially, we’re putting all our eggs in too few baskets. Plant breeders are combating that risk via gene-editing tools such as CRISPR, which allow them to select for desirable genes that make crops more resilient to climate change and disease.

Numbers Don't Lie

Plant species cultivated at a significant scale
170
Kernels on an average ear of corn
800
Farms in America
1.88 million
Years it takes one pineapple to grow
2

______ produces more wheat than any other country.

China produces more wheat than any other country.

The world’s largest plant is an Australian seagrass.

The world’s largest plant by area wasn’t discovered until 2022, but it was hiding in plain sight all along. A specimen of Posidonia australis seagrass, also known as Poseidon’s ribbon weed, covers 77 square miles of Australia’s Shark Bay — enough space for 28,000 soccer fields. 

It’s also quite old (about 4,500 years, researchers from the University of Western Australia and Flinders University estimate), and no one knows how it’s lasted as long as it has, especially since it could be sterile. Species that can’t reproduce tend to have reduced genetic diversity, which reduces their ability to cope with environmental change. One theory relates to Shark Bay itself, a World Heritage Site that has remained largely untouched by the outside world, making it an ideal environment for seagrass to continue growing for thousands of years.

Michael Nordine
Staff Writer

Michael Nordine is a writer and editor living in Denver. A native Angeleno, he has two cats and wishes he had more.

Original photo by Delcea nicolae cosmin/ Alamy Stock Photo

The North Atlantic is filled with lobsters, and it’s been that way for millennia. In fact, the first European settlers who arrived in North America in the 17th century reported that heaps of lobsters — some in 2-foot piles — simply washed up along the shore, making the crustaceans a vital source of protein during those harsh New England winters. Fast-forward 400 years, and lobsters remain plentiful; by one estimate, the lobster industry catches some 200 million lobsters in the North Atlantic every year. Among those millions of lobsters are some truly eye-catching crustaceans — including the blue lobster, which is so rare that scientists estimate it’s a 1-in-2 million catch. Although such a rare find fetches a high price at the market, no evidence suggests that the blue lobsters (whose sapphire hue is caused by a genetic defect) taste any different than their normal-colored brethren. 

Most lobsters are red.

It's a fib

Lobsters are actually many colors (though most look brown) and only turn red when cooked. A lobster’s various natural hues come from the chemical astaxanthin, which binds with the protein crustacyanin. When boiled, astaxanthin is released, and the creature turns a reddish-orange.

Although blue lobsters are a rarity in the North Atlantic, they are far from the rarest crustaceans living along the seabed. The Lobster Institute at the University of Maine says that finding a yellow lobster, for example, is a 1-in-30 million catch. But one of the most astounding finds of all came in 2011, when a British fisherman caught an albino lobster — estimated to be a 1-in-100 million catch. The 30-year-old lobster, which somehow avoided predators despite being easier to spot in the sea, didn’t end up on a dinner table. Instead, it was donated to the Weymouth Sea Life aquarium in England.

Numbers Don't Lie

Weight (in pounds) of “Big George,” the world’s largest recorded lobster
37.4
Number of post-larval lobsters, out of roughly 50,000, that’ll grow big enough to harvest
2
Maximum depth (in feet) where the American lobster is found, from Maine to North Carolina
2,300
Year the first Red Lobster restaurant opened in Lakeland, Florida
1968

Nineteenth-century ships designed to transport live lobsters were called ______.

Nineteenth-century ships designed to transport live lobsters were called smacks.

Evolution keeps turning animals into crabs.

Evolution doesn’t generally play favorites, but it does seem to have a predilection for crabs. Studies have found that evolution has formed animals with a crablike shape and features on five separate occasions in the past 250 million years. Decapods, an order of crustaceans (which also includes lobsters and shrimp), include two groups of crablike creatures: true crabs (brachyurans) and false crabs (anomurans). In both groups, many animals began with an elongated body like a lobster but eventually morphed into the shape of a crab. King crabs, porcelain crabs, and coconut crabs are not true crabs, but have all experienced a process known as convergent evolution by independently adopting the crablike body form.

In fact, this has happened so many times in the fossil record that in 1916 English zoologist Lancelot Alexander Borradaile coined the term “carcinization” to describe the process of an animal independently evolving crablike features. While scientists aren’t sure why everything keeps coming up crab, there are a few theories. For one, the long tail of a lobster, called the pleon, shrinks over time, likely due to predatory pressures, whereas the lobster’s upper body, the carapace, grows wider for better mobility and speed. These consistent pressures may explain why animals time and time again seem to adopt the physical characteristics of crabs.

Darren Orf
Writer

Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.

Original photo by Andrea Izzotti/ Shutterstock

You’d be forgiven for thinking this distinction belongs to the members of the Bush or Kennedy clans, but it’s actually claimed by the lesser-known Dingell family, which has served southeast Michigan for 90 years and counting.

The political dynasty began with the election of Democrat John Dingell Sr. from Michigan’s 15th District in 1932. Along with co-authoring legislation that led to the Social Security Act of 1935, the paterfamilias was best known for introducing a national health insurance bill before his death in 1955. John Dingell Jr. picked up the fight after winning a special election to fill his father’s seat, notching a victory with the passage of the Medicare and Medicaid Act in 1965. He went on to craft a legacy that dwarfed that of John Sr. and nearly all of his colleagues, by way of his longtime chairmanship of the powerful House Energy and Commerce Committee. He retired in 2015 after a record 59 years in the House.

A woman was elected to serve in the U.S. House of Representatives before women were allowed to vote.

It's a fact

Jeannette Rankin of Montana was the first woman elected to Congress, in November 1916, nearly four years before the August 1920 ratification of the 19th Amendment. Reelected in 1940, she became the only member of Congress to vote against U.S. entry into both World War I and World War II.

The seat was then won by his wife, Debbie, who set about making her own mark as a sponsor of environmental and health care legislation. Debbie represented the 12th District from 2015 to 2023 and, following redistricting, has served the 6th District since 2023. She could keep the lineage going, though she’ll likely need help from a yet-to-be-determined successor if the Dingells hope to push past the century mark as representatives of the Great Lakes State.

Numbers Don't Lie

Number of voting representatives in the House
435
Minimum age required to be a member of Congress
25
Age in 2015 of Rep. Ralph Hall, the oldest ever to serve in the House
91
Days spent as speaker of the House by Rep. Theodore M. Pomeroy in 1869
1

The only U.S. president elected to the House of Representatives after leaving office is ______.

The only U.S. president elected to the House of Representatives after leaving office is John Quincy Adams.

Just one mother-son pair has served concurrently in Congress.

That would be Frances and Oliver Bolton, Ohio Republicans who shared the chamber over three terms between 1953 and 1965. Frances, who began her congressional career in 1940 by replacing her deceased husband, Chester, went on to earn reelection 14 times, along the way authoring the Bolton Act to establish the U.S. Cadet Nurse Corps. Oliver had the less distinguished career of the two, though both mother and son insisted that he was his own person. When Frances asked if there was anything she could do to help his congressional campaign in 1952, he reportedly replied, “Sure there is — stay the hell out of my district.”

Tim Ott
Writer

Tim Ott has written for sites including Biography.com, History.com, and MLB.com, and is known to delude himself into thinking he can craft a marketable screenplay.

Original photo by jijomathai/ Adobe Stock

Even though you experience life as a continuous, unchanging flow of time, most of the cells in your body are constantly being renewed. Through a process called cell turnover, old cells die and are replaced by new ones, meaning much of your biological makeup is far younger than your chronological age.

Aging still occurs because some cells don’t regenerate, renewal slows in certain tissues, and even new cells can experience wear and tear over time. Nonetheless, on average, the cells in an adult human body are estimated to be only 7 to 10 years old — so even in middle age, much of your body is biologically closer to that of a child than to that of an elderly adult.

Inner-ear hair cells that let humans hear are never replaced.

It's a fact

The delicate sensory hair cells in the inner ear don’t regenerate once they’re damaged or lost — which is why hearing loss from aging or loud noise is often permanent.

Scientists have been able to estimate cellular ages thanks to carbon-14, a naturally occurring radioactive form of carbon that entered the atmosphere in large quantities during above-ground nuclear weapons testing in the mid-20th century. When cells divide, carbon-14 from the environment becomes permanently embedded in their DNA, effectively “dating” the moment each cell was born. By measuring the carbon-14 levels of different tissues, researchers can determine how often various parts of the body renew themselves.

The pace of renewal isn’t uniform across all tissues. For example, skin cells regenerate roughly every few weeks, the gut lining every few days, red blood cells about every four months, and liver cells approximately every year. Then there are cells — including most neurons in the brain’s cerebral cortex and the eyes’ inner lens cells — that can last an entire lifetime. Because some critical cells don’t regenerate and other new cells gradually accumulate damage, the body experiences functional decline and aging, even as most cells continue to turn over.

Numbers Don't Lie

Red blood cells in 1 ounce of blood
150 billion
Year the Partial Nuclear Test Ban Treaty banned above-ground nuclear weapons testing
1963
Years it takes for an entire human skeleton to be remodeled from renewing cells
10
Year cells were discovered by Robert Hooke
1665

Long after pregnancy, a mother’s body can still contain cells from the fetus, a phenomenon known as ______.

Long after pregnancy, a mother’s body can still contain cells from the fetus, a phenomenon known as microchimerism.

Your sense of taste depends on some of the fastest-renewing cells in your body.

Like other cells in your body, the specialized cells in taste buds — the taste receptor cells that detect sweet, bitter, salty, sour, and umami flavors — are continually replaced throughout your lifetime. Those cells live only about eight to 12 days before being shed and replaced by new cells produced from progenitor cells in the tongue epithelium, the thin layer of tissue covering the surface of the tongue.

That rapid turnover helps explain why illnesses, injuries, or aging can temporarily alter taste perception. Because each new taste cell must form connections with nerves to transmit flavor information, anything that affects cell production or differentiation — including infections, inflammation, or age-related changes — can cause shifts in your experience of how foods taste.

Kristina Wright
Writer

Kristina is a coffee-fueled writer living happily ever after with her family in the suburbs of Richmond, Virginia.