The concept of time has been described as many things: an arrow, a river, a march, anything that moves inexorably forward at a constant, unalterable rate. However, aging doesn’t flow at such a uniform speed. Instead, humans age in fits and starts. A 2024 Stanford University study shows that our bodies age faster around our mid-40s and early 60s than during other stages of life.
The study analyzed data from 108 people who donated blood and other biological samples over several years. By tracking 135,000 different molecules, which yielded some 250 billion data points, scientists discovered that roughly 81% of the studied molecules showed age-related fluctuations, and that those moments of rapid aging tended to cluster around age 44 and again in the early 60s. According to the scientists, the most surprising data point was rapid aging in the mid-40s. At first, they theorized that menopause or perimenopause could be playing a role in these changes, but they found the molecular changes impacted men just as much as women.
The giant tortoise is the longest-lived vertebrate on Earth.
Although the Seychelles giant tortoise can live up to nearly 200 years, scientists in 2016 discovered a Greenland shark that was at least 270 years old (and possibly much older), making that species the longest-lived vertebrate on Earth.
The affected molecules also differed between those two aging periods. Both age groups showed changes in molecules related to cardiovascular disease, caffeine metabolism, and skin and muscle growth, but the mid-40s cohort also showed increased alterations in alcohol metabolism, while people in their early 60s underwent changes to immune regulation and kidney function. Of course, a lifetime of healthy eating, exercise, and plentiful sleep can curtail some of the effects of these periods of aging, so it may be worth paying extra close attention to your health when those milestones arrive.
Genomic regions at the end of chromosomes, known as telomeres, shorten as we age.
It’s a myth that ancient humans didn’t live to old age.
A well-known (and much appreciated) side effect of modern medicine is its ability to increase a human’s life expectancy, especially in developed countries. In the U.S., for example, life expectancy hovered around 47 years at the turn of the 20th century but skyrocketed to nearly 77 years a century later. Delve even further back into history, and it may seem like humans lived rather short, brutish lives compared to today. However, old age isn’t a modern phenomenon.
Ancient Greeks and Romans, for example, had a life expectancy of just 30 to 35 years, yet in the early Roman Republic, you couldn’t even be a senator until the age of 60. In fact, it wasn’t uncommon for people all around the world to live to at least their 50s or 60s in ancient and medieval times. A 2013 study highlighted that between 900 and 1531 CE, most people who reached adulthood in the region of what is now Cholula, Mexico, lived until at least the age of 50.
The low historical averages we often see reported are largely due to the high infant mortality rates at the time, a once-widespread occurrence that modern medicine has greatly alleviated. While technology has helped more humans reach an older age than ever before, we may be surprised by how many of our ancestors led lengthy lives.
Darren Orf
Writer
Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.
top picks from the Inbox Studio network
Interesting Facts is part of Inbox Studio, an email-first media company. *Indicates a third-party property.
Unless you’re a child of a certain age, you’ve likely long outgrown the silly, whimsical notion that Santa Claus lives at the North Pole. Obviously, he actually resides in Santa Claus Village, a real place you can visit. Located in Rovaniemi, Finland, the festive locale sits within the Arctic Circle and remains quite cold year-round. Speaking of year-round, St. Nick is even available for daily visits despite his busy schedule. He isn’t the only person honored there, either: Roosevelt Cottage, built in 1950, decades before the rest of the village, was constructed in honor of former First Lady Eleanor Roosevelt’s visit to the Arctic Circle.
Santa Claus holds an official U.S. pilot’s license. He received it in 1927 from William P. MacCracken, the U.S. assistant secretary of commerce for aeronautics.
But why has Rovaniemi been designated as Santa’s hometown, you may ask? Finnish folklore and tradition have long associated Santa with Finland’s Lapland region. Though Finns believe Santa’s real home lies in Korvatunturi, a fell in Lapland where Santa and his elves are said to listen to children’s wishes, they also believe its true location must be kept secret, so they chose nearby Rovaniemi as the “official” location for practical reasons. Santa Claus Village — which also has reindeer, more than 100 Siberian huskies, and a post office — opened in 1985 and has been delighting Christmas enthusiasts ever since.
Instead of Santa Claus, Iceland has 13 “Yule Lads.”
Before you start feeling too sad for all the Icelandic children who don’t have a Santa Claus despite living fairly close to him, know this: They have 13 “Yule Lads” instead. Mischievous yet merry, the Jólasveinar (as they’re known in Iceland) begin descending from the mountains to visit children’s homes on December 12. One Yule Lad makes the excursion each night, leaving gifts for good kids and rotting potatoes for those on the naughty list.
The 13 Yule Lads are, in order, Stekkjarstaur (Sheep-Cote Clod), Giljagaur (Gully Gawk), Stúfur (Stubby), Þvörusleikir (Spoon Licker), Pottaskefill (Pot Scraper), Askasleikir (Bowl Licker), Hurðaskellir (Door Slammer), Skyrgámur (Skyr Gobbler), Bjúgnakrækir (Sausage Swiper), Gluggagægir (Window Peeper), Gáttaþefur (Doorway Sniffer), Ketkrókur (Meathook), and Kertasníkir (Candle Stealer). As you may have guessed, their names provide hints as to the particular brand of trouble they cause. This odd family’s matriarch is the cruel troll Grýla, who lives in the mountains and has a penchant for turning misbehaving kiddos into stew.
Michael Nordine
Staff Writer
Michael Nordine is a writer and editor living in Denver. A native Angeleno, he has two cats and wishes he had more.
Whether at home on the couch or among the crowds in Times Square, watching the New Year’s Eve ball drop symbolizes a fresh start. But as the ball descends to mark another year gone by, it also harkens back to an era when knowing the exact time was much more difficult. Before the 20th century, timekeeping was significantly less precise; most people noted the time thanks to church bells that rang on the hour, though the system was often inaccurate. For sailors and ship captains, knowing the exact time was key for charting navigational courses, and they used a device called a chronometer to keep track of time onboard ships. That’s why Robert Wauchope, a captain in the British navy, created the time ball in 1829. The raised balls were visible to ships along the British coastline, and they were manually dropped at the same time each day, allowing ships to set their chronometers to the time at their port of departure. At sea, navigators would calculate longitude based on local time, which they could determine from the angle of the sun, and the time on their chronometer.
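The arithmetic behind that last navigational step is simple: Earth rotates 360 degrees in 24 hours, so each hour of difference between the chronometer’s home-port time and local solar time corresponds to 15 degrees of longitude. A minimal sketch (the function name and example times are illustrative, not drawn from any historical source):

```python
def longitude_from_times(local_solar_hours: float, chronometer_hours: float) -> float:
    """Degrees of longitude west of the reference port.

    Earth rotates 15 degrees per hour, so a local solar time that lags
    the chronometer's home-port time places the ship west of that port.
    """
    return (chronometer_hours - local_solar_hours) * 15.0

# Local noon while the chronometer reads 4 p.m. at the home port:
# the ship is 4 hours, i.e., 60 degrees, west of its port of departure.
print(longitude_from_times(12.0, 16.0))  # → 60.0
```

A negative result would simply mean the ship lies east of the reference port.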
The Times Square New Year’s Eve Ball has dropped every year since 1907.
Nearly, but not quite: The New York ball drop has a stunning record only dimmed by World War II. Revelers gathered in Times Square in 1942 and 1943, but no ball drop took place, thanks to wartime blackouts. Instead, the new year was marked by a minute of silence followed by chimes.
Time balls emerged as a timekeeping feature throughout the world, though evidence of them is hard to find today. The U.S. Naval Observatory in Washington, D.C., installed one in 1845, which would later help history record the precise time of Lincoln’s assassination; it dropped daily through 1936. But the time ball’s reign was short-lived. The devices fell out of fashion by the 1880s, thanks to the availability of self-winding clocks. The concept would eventually be co-opted by The New York Times in 1907, when the newspaper’s formerly explosive New Year’s Eve celebrations were barred from using fireworks. Organizers took a chance by looking back at the time ball’s influence, and decided a lighted midnight drop was the perfect way to honor the occasion.
In the 1980s, the Times Square Ball was reconfigured into an apple.
Times Square’s New Year’s Eve confetti is all tossed by hand.
Dropping a deluge of confetti into Times Square on New Year’s Eve is no small feat; preparations for the confetti avalanche take about a year, beginning while the previous holiday’s tissue paper is still being swept up. A large part of organizing the confetti shower is recruiting crews to release the 3,000 pounds that descend on Times Square, since there are no cannons involved — instead, every piece is hand-tossed. Workers are trained in the proper way to fluff and throw the biodegradable paper scraps for maximum impact, which is timed to begin 20 seconds before midnight so that the confetti descends into the crowds below right on cue.
Nicole Garner Meeker
Writer
Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.
Flight attendants make our journeys through the sky safer and more comfortable. Yet they do more than just serve peanuts and soda; they’re trained to respond to safety and medical emergencies, necessary skills for cruising at 35,000 feet. However, modern flight attendants aren’t required to have the in-depth medical training that the first American in-air staff did: The earliest commercial airlines equipped with flight attendants required their staff to be registered nurses.
While doctors often make the diagnoses, it’s nurses who do much of the hands-on work of caring for patients — which is why it’s a good thing there are so many of them. The U.S. has three times as many registered nurses as doctors.
The first flight attendants to board U.S. commercial flights were led by Ellen Church, a nurse who was also a licensed aviator. Unable to find work as a pilot due to gender discrimination, Church found another way into the sky by pitching airlines the concept of the “flight stewardess,” who could use her nursing skills to aid sick or injured passengers while also easing nerves at a time when flying was still somewhat dangerous and often uncomfortable for passengers. Boeing Air Transport tested Church’s idea in May 1930, hiring Church and seven other nurses for flights between San Francisco and Chicago (with 13 stops in between). In the air, the attendants were tasked with serving meals, cleaning the plane’s interior, securing the seats to the floor, and even keeping passengers from accidentally opening the emergency exit door. After a successful three-month stint, other airlines adopted Church’s idea, putting out calls for nurses in their early 20s to join the first flight crews — standard requirements until World War II, when nurses overwhelmingly joined the war effort, leaving room for more women of all backgrounds to enter the aviation field.
Most commercial airplanes are painted white to reflect sunlight and keep the plane cool.
Florence Nightingale’s parents opposed her dream of becoming a nurse.
Florence Nightingale is often recognized as the mother of modern nursing, though if her parents had their way, she never would have jump-started the profession as we know it today. At 16 years old, Nightingale became determined to care for the ill and injured, believing it was her calling. Her parents, however, opposed the idea, arguing it was a job inappropriate for a woman of their upper-class standing. Despite being forbidden from pursuing a medical career, Nightingale enrolled in a German training school for teachers and nurses, eventually returning to London three years later as a hospital nurse. When the Crimean War erupted in 1853, Nightingale followed, and her innovative nursing techniques and quest to improve hospital cleanliness were eventually seen as a game changer in medical treatment — one that would even be recognized by Queen Victoria.
Nicole Garner Meeker
Writer
Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.
The scientific name Nessiteras rhombopteryx may look more or less like any other. As with many Linnaean labels, the species name rhombopteryx references the creature’s overall appearance — in this case, its diamond-shaped fins. But there’s one key difference here: The creature it describes doesn’t exist (probably). Nessiteras rhombopteryx, or “Ness monster with diamond-shaped fins,” is the proposed taxonomic moniker of the Loch Ness monster, also known as Nessie. As a brief cryptozoology refresher, Nessie is a fabled reptilian monster believed to reside in a lake called Loch Ness in the Scottish Highlands. For nearly a century, people have scoured the lake with binoculars, sonar, and other equipment, hoping to glimpse this anachronistic plesiosaur. Although “confirmed sightings” number more than a thousand, no specimen has ever been captured and cataloged.
George R.R. Martin has more species named after his books than any other author.
Although the “Game of Thrones” creator has wasps, beetles, and even a pterosaur named after his characters, no author comes close to J.R.R. Tolkien. In fact, there’s an entire genus of New Zealand wasp named Shireplitis, with species S. bilboi, S. frodoi, and S. samwisei.
And that last part is important. Usually, for a species to receive a scientific name, scientists must have a “voucher specimen” in hand for future reference. However, in a non-peer-reviewed article in the December 1975 issue of Nature, U.S. researcher Robert Rines and British naturalist Sir Peter Scott put forward the name Nessiteras rhombopteryx based only on photographs and sonar data. In the article, the authors argued that “recent British legislation makes provision for protection to be given to endangered species; to be granted protection, however, an animal should first be given a proper scientific name.” In other words, the scientists had to give Nessie a name to save it (if “it” exists at all).
Although the legend of Nessie is beloved throughout Scotland (bringing in tourist dollars never hurts), not everyone was sold on giving the elusive mythical plesiosaur an air of scientific credibility. About a week after the name’s announcement in December 1975, a Scottish MP rebuffed the pseudo-scientific endeavor, saying there just might be a reason why “Nessiteras rhombopteryx” is an anagram for “Monster Hoax by Sir Peter S.”
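The MP’s quip is easy to verify: stripped of spaces, punctuation, and capitalization, the two phrases contain exactly the same letters. A quick sketch (the function name is ours, not from any source):

```python
from collections import Counter

def is_anagram(a: str, b: str) -> bool:
    """True if a and b use the same letters, ignoring case and non-letters."""
    letters = lambda s: Counter(ch for ch in s.lower() if ch.isalpha())
    return letters(a) == letters(b)

print(is_anagram("Nessiteras rhombopteryx", "Monster Hoax by Sir Peter S."))  # → True
```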
Unconfirmed creatures such as the yeti, sasquatch, and Nessie are called cryptids.
The mythological history of the Loch Ness monster dates back to at least 564 CE.
The modern fascination with Nessie dates back to the 1930s, but the legend of a mythical creature lurking in Loch Ness is much older. Some point to first-century CE Pictish carvings of a creature resembling a swimming elephant as the first real evidence of Nessie, but the first written account of some kind of sighting didn’t occur until centuries later. In the seventh century CE, a hagiographer wrote about the exploits of St. Columba, a Catholic missionary credited with spreading Christianity throughout Scotland. According to this hagiography, in 564 CE St. Columba had a confrontation with some kind of “water beast,” and with the power of prayer, he convinced this unknown monster to leave his disciples alone (converting scores of Scots in the process). Filled with supernatural phenomena, the tale is as hard to believe as an ancient family of plesiosaurs lurking somewhere in Great Britain’s largest freshwater lake by volume. But the story does establish a 1,500-year-old relationship between some unknown mythical “water beast” and the Scottish people — a relationship that remains to this day.
Darren Orf
Writer
Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.
Not all who speak for the trees are storybook characters … or even alive. That’s true in the case of Colonel William H. Jackson, a college professor and resident of Athens, Georgia, who sought to protect his favorite tree long after he was able to enjoy its shade. A portion of Jackson’s will made its way into newspapers around 1890, thanks to an unusual request — that his favorite childhood tree, and 8 feet of land surrounding it, be given to the tree itself.
Squirrels and deer eat acorns, and humans can, too. The tannins (naturally occurring bitter compounds) found in acorns can be toxic if consumed in large amounts. However, tannins are removed by soaking or boiling the nuts, rendering acorns safe for human consumption.
While the city of Athens has respected Jackson’s wishes and cared for the tree (with the help of gardening groups), it’s unclear whether the white oak has any legal roots to stand on. No modern person has ever seen the deed Jackson supposedly drew up to give the tree ownership of itself, and Georgia law doesn’t permit nonhuman entities to possess property. Yet no one has ever contested the tree’s ability to own itself, and Jackson’s oak has become a beloved local landmark. When it fell in 1942 during a windstorm, its acorns were collected and sprouted so that a descendant sapling could be replanted in the same spot.
Amazingly, Georgia isn’t the only place with a self-owning tree. Eufaula, Alabama — a town of 12,600 people some 200 miles from Athens — is home to another independent oak. In 1935, the area garden club advocated to protect a 65-foot-wide post oak (called the Walker Oak) in the middle of town, hoping to preserve a popular spot where children played. Mayor E.H. Graves recorded a “deed of sentiment” stating in part that the tree was “a creation and gift of the Almighty, standing in our midst — to itself — to have and to hold itself,” and an iron fence with a plaque was installed around the tree. Despite its safeguarding, a windstorm toppled the original 200-year-old hardwood nearly three decades later in 1961. But just like with its counterpart in Athens, townsfolk worked to replace it with a successor that still stands today.
Oak wood is often used to build wine and whiskey barrels because of its durability.
Oak trees can drop up to 10,000 acorns in one year.
Oak trees are known to shower yards, cars, and even people with a deluge of acorns — some autumns more than others. The number of acorns a single tree drops depends on the year, since oaks follow a pattern of lean and heavy acorn-producing seasons. In “mast years,” aka years when trees produce a heavier-than-normal supply of the nuts, oaks can drop up to 10,000 acorns. Scientists aren’t entirely sure what causes mast years, but the cycle occurs every two to five years, regardless of weather or rainfall. One working theory is that the mast year cycle outsmarts predators such as squirrels and chipmunks, allowing oak trees to saturate their environment with more acorns than can be eaten and giving future saplings a shot at sprouting.
Nicole Garner Meeker
Writer
Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.
Only two places in the world are famed for the exclamation marks in their official names, and coincidentally, both came into existence at roughly the same time, although their reasons for adopting the controversial punctuation differ as dramatically as their settings. The first, a village in southwestern England called Westward Ho!, sought to capitalize on the popularity of the identically named 1855 book by Charles Kingsley, who wrote lovingly of nearby Bideford. Founded as a vacation resort in the 1860s, the hamlet sprung up around the Westward Ho! Hotel, and remains a notable tourist destination thanks to its scenic coastline and famed Pebble Ridge.
German correspondence includes a salutation line that ends with an exclamation mark.
Although the comma has become more commonplace, an old-fashioned greeting such as “Lieber Friedrich!” (Dear Friedrich!) can still be seen atop the similarly old-fashioned written letter in Germany.
The second place, a town in southern Quebec called Saint-Louis-du-Ha! Ha!, isn’t exactly a bustling tourist destination, although early explorers may have been happy to refresh themselves at nearby Lake Temiscouata. According to the Commission de Toponymie du Québec, the archaic French term “le haha” indicates an unexpected obstacle or a dead end, likely referring to the lake’s sharp change of direction. That doesn’t explain the distinct punctuation in the name — no one’s quite sure how or why that started. But no matter; this unassuming community, established in 1860 as a Catholic mission, has garnered an extra boost of attention since being honored for its double exclamation marks by Guinness World Records in 2018.
Honorable mention goes to the southwestern Ohio city of Hamilton, which became known as Hamilton! following a city council vote in May 1986. While the announcement drew plenty of pre-internet buzz, the United States Board on Geographic Names and mapmaker Rand McNally & Company refused to play along. Hamilton! officials nevertheless pressed forward with duly punctuated city seals, letterhead, signs, and the like for some time, although the federally unrecognized notation had disappeared by the time a city clerk undertook a short-lived attempt to revive it in 2020.
A punctuation mark that combines an exclamation point and a question mark is called an interrobang.
A celebrated comic book writer became known for his exclamation mark-punctuated middle initial.
Were you to leaf through an old X-Men or Spider-Man comic, it wouldn’t take long to notice the proliferation of exclamation marks in the dialogue bubbles. That had as much to do with the exaggerated scenarios portrayed in the storylines as it did with the reality of printing on cheap pulp paper, which left a tiny period impossible to see at times. In the early 1970s, new DC Comics writer Elliot S. Maggin quickly adjusted to placing an exclamation mark where a period usually went, to the point where he unwittingly typed Elliot S! Maggin on a Superman script. Intrigued, editor Julie Schwartz subsequently issued an order to the rest of the company that any mention of Maggin’s name should be “punctuated with an exclamation mark rather than a period from now on until eternity.” Maggin went on to earn industry acclaim for his work on Superman over the next decade-plus, and he continues to sign off with the S! well after leaving the hyperbole of comics behind to pursue other careers in writing, teaching, and politics.
Tim Ott
Writer
Tim Ott has written for sites including Biography.com, History.com, and MLB.com, and is known to delude himself into thinking he can craft a marketable screenplay.
Given that the United States was born amid an anti-monarchical fervor, it’s fitting that the sole royal palace within its confines is located more than 4,700 miles from the nation’s capital. There, amid the high-rises and palm trees of downtown Honolulu, stands Iolani Palace, the home of Hawaii’s 19th-century royal dynasty.
A patron of the arts, the Merrie Monarch teamed with royal bandmaster Henry Berger in 1876 to compose “Hawai'i Pono'i,” a tribute to the kingdom’s founder, Kamehameha I.
After King David Kalākaua rose to power in 1874, he elected to tear down the deteriorating coral block building that housed his predecessors and erect an ostentatious new home in a style that reflected the grand palaces he had visited while touring Europe some years prior. The “Merrie Monarch” went through three architects to get the residence he craved, winding up with a concrete-faced brick structure marked by six towers and open-air verandas stretching around all sides. The interior featured the lavish Throne Room, State Dining Room, and Blue Room to entertain dignitaries, along with a massive koa wood staircase to the private chambers of the second floor. Additional luxuries like indoor plumbing and a telephone pushed the final bill into the neighborhood of $350,000 before the palace opened in 1879, and that was before electricity was installed in the late 1880s.
Unfortunately, this display of extravagance served Hawaii’s rulers for just over a decade. Kalākaua’s sister and successor, Lili’uokalani, was deposed in an 1893 coup orchestrated by American businessmen, and the palace became the offices of the provisional, territorial, and then state governments until 1969. Reopened to the public as a museum in 1978, Iolani Palace serves as a reminder of Hawaii’s days as a sovereign nation, as well as America’s complicated history with monarchies.
Iolani Palace was constructed in an architectural style known as American Florentine.
Other “palaces” remain in use in the U.S. as museums and historical sites.
Although they never served as the residence of a monarch, a few other American structures retain the title of “palace” as the former home of a colonial authority. The best known is the Governor’s Palace in Colonial Williamsburg, which housed seven British-appointed governors in Virginia and another two American-elected ones before the original building burned to the ground in 1781. Tryon Palace in New Bern, North Carolina, opened its doors to just two royal governors and, coincidentally, was also destroyed in a fire, before being rebuilt after World War II. Farther west, the 400-plus-year-old Palace of the Governors in Santa Fe, New Mexico, is the oldest European settler-built public building still in use in the United States. And finally there’s the Spanish Governor’s Palace in Texas, the only surviving building of an 18th-century presidio that guarded the settlement of San Antonio, and likely the only government building that also variously functioned as a pawn shop, tire shop, and saloon until it was restored by the city in 1930.
Interesting Facts
Editorial
Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.
Although poinsettias are a multimillion-dollar business in the U.S. today, these fiery plants have ancient roots — they were first cultivated by the Aztecs hundreds of years ago. Native to Mexico and Guatemala, the poinsettia, known to the Aztecs as cuetlaxóchitl (kwet-la-sho-she), was used for medicinal purposes: The milky white sap was thought to increase milk production, dyes derived from the leaves (or bracts) were used in textiles, and some war rituals involved the plant. Poinsettias were also believed to hold magical properties, with one Native legend saying just the smell of a poinsettia could cause infection of the reproductive organs.
Egypt is known for its tombs, but Mexico has the largest pyramid in the world. Located in the Mexican state of Puebla, the pyramid has a volume of 4.45 million cubic meters (nearly twice that of the Great Pyramid of Giza). Its name, Tlachihualtepetl, means “artificial mountain” in Nahuatl.
So how exactly did these ancient Aztec plants become so closely associated with the winter holidays? Well, the first reason is biology. Poinsettias are typically (but not always) red and green — colors that have long been associated with Christmas. The plant also often reaches full bloom in December. The second part of the equation arrived in the 17th century, when Spanish Franciscan friars used the plant to decorate altars and Nativity scenes. When the Vatican eventually used the plant for decoration, other Catholic churches throughout the world weren’t far behind. In the early 20th century, farmers in California began mass-producing the plant in the U.S., and the venerable poinsettia has been a modern holiday must-have ever since.
The beautiful red plant that adorns mantles and dining tables during the holiday season is known by many names. The Aztecs called the plant cuetlaxóchitl, meaning “a flower that withers,” while the Maya used the phrase k’alul wits (“ember flower”). The Spanish friars of the 17th century called it flor de Nochebuena, or “Christmas Eve flower,” while other parts of Latin America used flor de Pascuas, or “Easter flower.” But in the U.S., Euphorbia pulcherrima goes by another name — poinsettia. The name is an homage to the U.S.’s first ambassador to Mexico, Joel Roberts Poinsett. An amateur botanist, Poinsett became enamored with the plant when he came across it while staying in Taxco, Mexico. Poinsett brought specimens back to his greenhouses in the U.S. around 1825 and sent clippings to a specialist in Philadelphia, who eventually christened the plant Euphorbia poinsettia. Unfortunately, Poinsett’s legacy outside horticultural circles is a troubling one, as he was an enslaver and expansionist, and interfered so much in Mexican politics that he was removed from his post by a request from the Mexican president in 1829. Because the name is both controversial and divorced from its Mesoamerican roots, some people now call this holiday favorite by its original name — cuetlaxóchitl.
Darren Orf
Writer
Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.
Few food products are more quintessentially American than yellow processed cheese. But despite the name “American cheese,” the method for making this shelf-stable dairy treat actually has its roots in Switzerland. In 1911, food scientists Walter Gerber and Fritz Stettler pioneered a new process to keep cheese from rapidly spoiling so it could be more easily sold in warmer environments. They shredded and melted down a Swiss cheese called Emmentaler, added sodium citrate as a preservative, and left the mixture to cool, resulting in the first processed cheese and a much longer shelf life.
Queen Victoria was given a half-ton wheel of cheese as a wedding gift.
When Queen Victoria married Prince Albert in 1840, she was given a 1,250-pound wheel of cheddar produced by cheesemakers from two local villages. After the wedding, the wheel was sent on a nationwide tour, though upon its return, Victoria refused to accept it back.
Around the same time in the U.S., Canadian American businessman James L. Kraft — founder of Kraft Foods — was working to solve that same food spoilage problem. Kraft created his own similar method, though it’s unclear how much he knew about the work of his Swiss contemporaries. In place of Emmentaler, he used cheddar cheese, which he heated to 175 degrees Fahrenheit while whisking continuously for 15 minutes, before adding emulsifying compounds and leaving the cheese to cool.
In 1916, Kraft successfully obtained the first U.S. patent for making processed cheese. But it would be another 34 years before American cheese singles appeared in supermarkets. This was thanks to Kraft’s brother Norman, who headed the company’s research department and hoped to repurpose these large hunks of cheese as conveniently packaged slices. Testing began in 1935, and in 1950, Kraft De Luxe Slices debuted. They were an immediate hit, with Progressive Grocer reporting increases in cheese sales of up to 150%.
The world’s most expensive cheese is made from 60% donkey milk and 40% goat milk.
Andrew Jackson displayed an enormous block of cheese in the White House for more than a year.
In 1835, President Andrew Jackson was given a 1,400-pound wheel of cheese measuring 4 feet in diameter and 2 feet tall as a gift from supporter and dairy farmer Thomas Meacham, who also gifted a 750-pound wheel to Vice President Martin Van Buren. In the months that followed, small portions of the cheese were consumed or given to friends, though Jackson was still left with an enormous hunk of cheddar.
So on February 22, 1837, toward the end of his presidency, Jackson held an open event at the White House, inviting people to enjoy the block of cheese, which had sat in the Entrance Hall of the White House for more than a year to age. Around 10,000 people attended and consumed the remnants in just two hours, though the odor persisted in the White House for months. In 1838, Eliza Davis, wife of Senator John Davis, wrote that Jackson’s successor Martin Van Buren “had a hard task to get rid of the smell of cheese … he had to air the carpet for many days; to take away the curtains and to paint and white-wash before he could get the victory over it.”
Bennett Kleinman
Staff Writer
Bennett Kleinman is a New York City-based staff writer for Inbox Studio, and previously contributed to television programs such as "Late Show With David Letterman" and "Impractical Jokers." Bennett is also a devoted New York Yankees and New Jersey Devils fan, and thinks plain seltzer is the best drink ever invented.