Whether at home on the couch or among the crowds in Times Square, watching the New Year’s Eve ball drop symbolizes a fresh start. But as the ball descends to mark another year gone by, it also harkens back to an era when knowing the exact time was much more difficult. Before the 20th century, timekeeping was significantly less precise; most people noted the time thanks to church bells that rang on the hour, though the system was often inaccurate. For sailors and ship captains, knowing the exact time was key for charting navigational courses, and they used a device called a chronometer to keep track of time onboard ships. That’s why Robert Wauchope, a captain in the British navy, created the time ball in 1829. The raised balls were visible to ships along the British coastline, and they were manually dropped at the same time each day, allowing ships to set their chronometers to the time at their port of departure. At sea, navigators would calculate longitude based on local time, which they could determine from the angle of the sun, and the time on their chronometer.
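To make that last bit of arithmetic concrete, here is a minimal illustrative sketch (not from the article itself, and simplified to ignore refinements navigators actually used, such as the equation of time): because the Earth rotates 15 degrees every hour, the gap between local solar time and the port time kept on the chronometer translates directly into degrees of longitude.

```python
# Illustrative sketch of the navigator's arithmetic (simplified):
# the Earth turns 360 degrees in 24 hours, i.e. 15 degrees per hour,
# so each hour of difference between local solar time and the time
# kept by the chronometer equals 15 degrees of longitude from the
# reference port.
def longitude_from_times(local_solar_hour: float, chronometer_hour: float) -> float:
    """Return longitude relative to the chronometer's reference port.

    Negative values mean west of the port, positive values mean east.
    """
    return (local_solar_hour - chronometer_hour) * 15.0

# Example: the sun says local noon (12.0) while the chronometer reads
# 3 p.m. at the port (15.0), so the ship is 45 degrees west of that port.
print(longitude_from_times(12.0, 15.0))  # -45.0
```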
The Times Square New Year’s Eve Ball has dropped every year since 1907.
Nearly, but not quite: The New York ball drop has a stunning record only dimmed by World War II. Revelers gathered in Times Square in 1942 and 1943, but no ball drop took place, thanks to wartime blackouts. Instead, the new year was marked by a minute of silence followed by chimes.
Time balls emerged as a timekeeping feature throughout the world, though evidence of them is hard to find today. The U.S. Naval Observatory in Washington, D.C., installed one in 1845, which would later help record the precise time of Lincoln’s assassination; it dropped daily through 1936. But the time ball’s reign was short-lived. The devices fell out of fashion by the 1880s, thanks to the availability of self-winding clocks. The concept would eventually be co-opted by The New York Times in 1907, when the newspaper’s formerly explosive New Year’s Eve celebrations were barred from using fireworks. Organizers took a chance by looking back at the time ball’s influence, and decided a lighted midnight drop was the perfect way to honor the occasion.
In the 1980s, the Times Square Ball was reconfigured into an apple.
Times Square’s New Year’s Eve confetti is all tossed by hand.
Dropping a deluge of confetti into Times Square on New Year’s Eve is no small feat; preparations for the confetti avalanche take about a year, beginning while the previous holiday’s tissue paper is still being swept up. A large part of organizing the confetti shower is recruiting crews to release the 3,000 pounds that descend on Times Square, since there are no cannons involved — instead, every piece is hand-tossed. Workers are trained in the proper way to fluff and throw the biodegradable paper scraps for maximum impact, which is timed to begin 20 seconds before midnight so that the confetti descends into the crowds below right on cue.
Nicole Garner Meeker
Writer
Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.
Flight attendants make our journeys through the sky safer and more comfortable. Yet they do more than just serve peanuts and soda; they’re trained to respond to safety and medical emergencies, necessary skills for cruising at 35,000 feet. However, modern flight attendants don’t have to have in-depth medical training the way the first American in-air staff did — the earliest commercial airlines equipped with flight attendants required their staff to be registered nurses.
While doctors often make the diagnoses, it’s nurses who do much of the hands-on work of caring for patients — which is why it’s a good thing there are so many of them. The U.S. has three times as many registered nurses as doctors.
The first flight attendants to board U.S. commercial flights were led by Ellen Church, a nurse who was also a licensed aviator. Unable to find work as a pilot due to gender discrimination, Church found another way into the sky by pitching airlines the concept of the “flight stewardess,” who could use her nursing skills to aid sick or injured passengers while also easing nerves at a time when flying was still somewhat dangerous and often uncomfortable. Boeing Air Transport tested Church’s idea in May 1930, hiring Church and seven other nurses for flights between San Francisco and Chicago (with 13 stops in between). In the air, the attendants were tasked with serving meals, cleaning the plane’s interior, securing the seats to the floor, and even keeping passengers from accidentally opening the emergency exit door. After a successful three-month stint, other airlines picked up Church’s idea, putting out calls for nurses in their early 20s to join the first flight crews — standard requirements until World War II, when nurses overwhelmingly joined the war effort, leaving room for more women of all backgrounds to enter the aviation field.
Most commercial airplanes are painted white to reflect sunlight and keep the plane cool.
Florence Nightingale’s parents opposed her dream of becoming a nurse.
Florence Nightingale is often recognized as the mother of modern nursing, though if her parents had their way, she never would have jump-started the profession as we know it today. At 16 years old, Nightingale became determined to care for the ill and injured, believing it was her calling. Her parents, however, opposed the idea, arguing it was a job inappropriate for a woman of their upper-class standing. Despite being forbidden from pursuing a medical career, Nightingale enrolled in a German training school for teachers and nurses, eventually returning to London three years later as a hospital nurse. When the Crimean War erupted in 1853, she found her path into history: Her innovative nursing techniques and quest to improve hospital cleanliness came to be seen as a game changer in medical treatment — one that would even be recognized by Queen Victoria.
Nicole Garner Meeker
Writer
Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.
The scientific name Nessiteras rhombopteryx may look more or less like any other. As with many Linnaean labels, the species name rhombopteryx references the creature’s overall appearance — in this case, its diamond-shaped fins. But there’s one key difference here: The creature it describes doesn’t exist (probably). Nessiteras rhombopteryx, or “Ness monster with diamond-shaped fins,” is the proposed taxonomic moniker of the Loch Ness monster, also known as Nessie. As a brief cryptozoology refresher, Nessie is a fabled reptilian monster believed to reside in a lake called Loch Ness in the Scottish Highlands. For nearly a century, people have scoured the lake with binoculars, sonar, and other equipment, hoping to glimpse this anachronistic plesiosaur. Although “confirmed sightings” number more than a thousand, no specimen has ever been captured and cataloged.
George R.R. Martin has more species named after his books than any other author.
Although the “Game of Thrones” creator has wasps, beetles, and even a pterosaur named after his characters, no author comes close to J.R.R. Tolkien. In fact, there’s an entire genus of New Zealand wasp named Shireplitis, with species S. bilboi, S. frodoi, and S. samwisei.
And that last part is important. Usually, for a species to receive a scientific name, scientists must have a “voucher specimen” in hand for future reference. However, in a non-peer-reviewed article in the December 1975 issue of Nature, U.S. researcher Robert Rines and British naturalist Sir Peter Scott put forward the name Nessiteras rhombopteryx based only on photographs and sonar data. In the article, the authors argued that “recent British legislation makes provision for protection to be given to endangered species; to be granted protection, however, an animal should first be given a proper scientific name.” In other words, the scientists had to give Nessie a name to save it (if “it” exists at all).
Although the legend of Nessie is beloved throughout Scotland (bringing in tourist dollars never hurts), not everyone was sold on giving the elusive mythical plesiosaur an air of scientific credibility. About a week after the name’s announcement in December 1975, a Scottish MP dismissed the pseudo-scientific endeavor, noting there just might be a reason why “Nessiteras rhombopteryx” is an anagram of “Monster Hoax by Sir Peter S.”
Unconfirmed creatures such as yetis, sasquatches, and Nessie are called cryptids.
The mythological history of the Loch Ness monster dates back to at least 564 CE.
The modern fascination with Nessie dates back to the 1930s, but the legend of a mythical creature lurking in Loch Ness is much older. Some point to first-century CE Pictish carvings of a creature resembling a swimming elephant as the first real evidence of Nessie, but the first written account of some kind of sighting didn’t occur until centuries later. In the seventh century CE, a hagiographer wrote about the exploits of St. Columba, a Catholic missionary credited with spreading Christianity throughout Scotland. According to this hagiography, in 564 CE St. Columba had a confrontation with some kind of “water beast,” and with the power of prayer, he convinced this unknown monster to leave his disciples alone (converting scores of Scots in the process). Filled with supernatural phenomena, the tale is as hard to believe as an ancient family of plesiosaurs lurking somewhere in Great Britain’s largest freshwater lake by volume. But the story does establish a 1,500-year-old relationship between some unknown mythical “water beast” and the Scottish people — a relationship that remains to this day.
Darren Orf
Writer
Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.
Not all who speak for the trees are storybook characters … or even alive. That’s true in the case of Colonel William H. Jackson, a college professor and resident of Athens, Georgia, who sought to protect his favorite tree long after he was able to enjoy its shade. A portion of a deed Jackson reportedly drew up made its way into newspapers around 1890, thanks to an unusual provision — that his favorite childhood tree, and the 8 feet of land surrounding it, be given to the tree itself.
Squirrels and deer eat acorns, and humans can, too. The tannins (naturally occurring bitter compounds) found in acorns can be toxic if consumed in large amounts. However, tannins are removed by soaking or boiling the nuts, rendering acorns safe for human consumption.
While the city of Athens has respected Jackson’s wishes and cared for the tree (with the help of gardening groups), it’s unclear whether the white oak has any legal roots to stand on. No modern person has ever seen the deed Jackson supposedly drew up to give the tree ownership of itself, and Georgia law doesn’t permit nonhuman entities to possess property. Yet no one has ever contested the tree’s ability to own itself, and Jackson’s oak has become a beloved local landmark. When it fell in 1942 during a windstorm, its acorns were collected and sprouted so that a descendant sapling could be replanted in the same spot.
Amazingly, Georgia isn’t the only place with a self-owning tree. Eufaula, Alabama — a town of 12,600 people some 200 miles from Athens — is home to another independent oak. In 1935, the area garden club lobbied to protect a 65-foot-wide post oak (called the Walker Oak) in the middle of town, hoping to preserve a popular spot where children played. Mayor E.H. Graves recorded a “deed of sentiment” stating in part that the tree was “a creation and gift of the Almighty, standing in our midst — to itself — to have and to hold itself,” and an iron fence with a plaque was installed around the tree. Despite its safeguarding, a windstorm toppled the original 200-year-old hardwood nearly three decades later in 1961. But just as with its counterpart in Athens, townsfolk worked to replace it with a successor that still stands today.
Oak wood is often used to build wine and whiskey barrels because of its durability.
Oak trees can drop up to 10,000 acorns in one year.
Oak trees are known to shower yards, cars, and even people with a deluge of acorns — some autumns more than others. The number of acorns a single tree drops depends on the year, since oaks follow a pattern of lean and heavy acorn-producing seasons. In “mast years,” aka years when trees produce a heavier-than-normal supply of the nuts, oaks can drop up to 10,000 acorns. Scientists aren’t entirely sure what causes mast years, but the cycle occurs every two to five years, regardless of weather or rainfall. One working theory is that the mast year cycle outsmarts predators such as squirrels and chipmunks, allowing oak trees to saturate their environment with more acorns than can be eaten and giving future saplings a shot at sprouting.
Nicole Garner Meeker
Writer
Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.
Coincidentally, the two places with exclamation marks officially written into their names came into existence at roughly the same time, although their reasons for adopting the controversial punctuation differ as dramatically as their settings. The first, a village in southwestern England called Westward Ho!, sought to capitalize on the popularity of the identically named 1855 book by Charles Kingsley, who wrote lovingly of nearby Bideford. Founded as a vacation resort in the 1860s, the hamlet sprang up around the Westward Ho! Hotel, and remains a notable tourist destination thanks to its scenic coastline and famed Pebble Ridge.
German correspondence includes a salutation line that ends with an exclamation mark.
Although the comma has become more commonplace, an old-fashioned greeting such as "Lieber Friedrich!" (Dear Friedrich!) can still be seen atop a similarly old-fashioned handwritten letter in Germany.
The second place, a town in southern Quebec called Saint-Louis-du-Ha! Ha!, isn't exactly a bustling tourist destination, although early explorers may have been happy to refresh themselves at nearby Lake Temiscouata. According to the Commission de Toponymie du Québec, the archaic French term "le haha" indicates an unexpected obstacle or a dead-end, likely referring to the lake's sharp change of direction. That doesn't explain the distinct punctuation in the name — no one's quite sure how or why that started. But no matter; this unassuming community, established in 1860 as a Catholic mission, has garnered an extra boost of attention since being honored for its double exclamation marks by Guinness World Records in 2018.
Honorable mention goes to the southwestern Ohio city of Hamilton, which became known as Hamilton! following a city council vote in May 1986. While the announcement drew plenty of pre-internet buzz, the United States Board on Geographic Names and mapmaker Rand McNally & Company refused to play along. Hamilton! officials nevertheless pressed forward with duly punctuated city seals, letterhead, signs, and the like for some time, although the federally unrecognized notation had faded from use by the time a city clerk made a short-lived attempt to revive it in 2020.
A punctuation mark that combines an exclamation point and a question mark is called an interrobang.
A celebrated comic book writer became known for his exclamation mark-punctuated middle initial.
Were you to leaf through an old X-Men or Spider-Man comic, it wouldn’t take long to notice the proliferation of exclamation marks in the dialogue bubbles. That had as much to do with the exaggerated scenarios portrayed in the storylines as it did with the reality of printing on cheap pulp paper, which sometimes left a tiny period impossible to see. In the early 1970s, new DC Comics writer Elliot S. Maggin quickly adjusted to placing an exclamation mark where a period usually went, to the point where he unwittingly typed Elliot S! Maggin on a Superman script. Intrigued, editor Julie Schwartz subsequently issued an order to the rest of the company that any mention of Maggin’s name should thereafter be “punctuated with an exclamation mark rather than a period from now on until eternity.” Maggin went on to earn industry acclaim for his work on Superman over the next decade-plus, and he continues to sign off with the S! well after leaving the hyperbole of comics behind to pursue other careers in writing, teaching, and politics.
Tim Ott
Writer
Tim Ott has written for sites including Biography.com, History.com, and MLB.com, and is known to delude himself into thinking he can craft a marketable screenplay.
Given that the United States was born amid an anti-monarchical fervor, it’s fitting that the sole royal palace within its confines is located more than 4,700 miles from the nation’s capital. There, amid the high rises and palm trees of downtown Honolulu, stands Iolani Palace, the home of Hawaii’s 19th-century royal dynasty.
A patron of the arts, the Merrie Monarch teamed with royal bandmaster Henry Berger in 1876 to compose “Hawai'i Pono'i,” a tribute to the kingdom’s founder, Kamehameha I.
After King David Kalākaua rose to power in 1874, he elected to tear down the deteriorating coral block building that housed his predecessors and erect an ostentatious new home in a style that reflected the grand palaces he had visited while touring Europe some years prior. The “Merrie Monarch” went through three architects to get the residence he craved, winding up with a brick structure faced with concrete, marked by six towers and open-air verandas stretching around all sides. The interior featured the lavish Throne Room, State Dining Room, and Blue Room to entertain dignitaries, along with a massive koa wood staircase to the private chambers of the second floor. Additional luxuries like indoor plumbing and a telephone pushed the final bill into the neighborhood of $350,000 before the palace opened in 1879, and that was before electricity was installed in the late 1880s.
Unfortunately, this display of extravagance served Hawaii’s rulers for just over a decade. Kalākaua’s sister and successor, Lili’uokalani, was deposed in an 1893 coup orchestrated by American businessmen, and the palace became the offices of the provisional, territorial, and then state governments until 1969. Reopened to the public as a museum in 1978, Iolani Palace serves as a reminder of Hawaii’s days as a sovereign nation, as well as America’s complicated history with monarchies.
Iolani Palace was constructed in an architectural style known as American Florentine.
Other “palaces” remain in use in the U.S. as museums and historical sites.
Although they never served as the residence of a monarch, a few other American structures retain the title of “palace” as the former home of a colonial authority. The best known is the Governor’s Palace at Colonial Williamsburg, which housed seven British-appointed governors of Virginia and another two American-elected ones before the original building burned to the ground in 1781. Tryon Palace in New Bern, North Carolina, opened its doors to just two royal governors and, coincidentally, was also destroyed in a fire, before being rebuilt after World War II. Farther west, the 400-plus-year-old Palace of the Governors in Santa Fe, New Mexico, is the oldest European settler-built public building still in use in the United States. And finally there’s the Spanish Governor’s Palace in Texas, the only surviving building of an 18th-century presidio that guarded the settlement of San Antonio, and likely the only government building that also variously functioned as a pawn shop, tire shop, and saloon until it was restored by the city in 1930.
Interesting Facts
Editorial
Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.
Although poinsettias are a multimillion-dollar business in the U.S. today, these fiery plants have ancient roots — they were first cultivated by the Aztecs hundreds of years ago. Native to Mexico and Guatemala, the poinsettia, known to the Aztecs as cuetlaxóchitl (kwet-la-sho-she), was used for medicinal purposes: The milky white sap was thought to increase milk production, dyes derived from the leaves (or bracts) were used in textiles, and some war rituals involved the plant. Poinsettias were also believed to hold magical properties, with one Native legend saying just the smell of a poinsettia could cause infection of the reproductive organs.
Egypt is known for its tombs, but Mexico has the largest pyramid in the world. Located in the Mexican state of Puebla, the pyramid has a volume of roughly 4.45 million cubic meters (nearly twice that of the Great Pyramid of Giza). Its name, Tlachihualtepetl, means “artificial mountain” in Nahuatl.
So how exactly did these ancient Aztec plants become so closely associated with the winter holidays? Well, the first reason is biology. Poinsettias are typically (but not always) red and green — colors that have been associated with Christmas for millennia. The plant also often reaches full bloom in December. The second part of the equation arrived in the 17th century, when Spanish Franciscan friars used the plant to decorate altars and nativities. When the Vatican eventually used the plant for decoration, other Catholic churches throughout the world weren’t far behind. In the early 20th century, farmers in California began mass-producing the plant in the U.S., and the venerable poinsettia has been a modern holiday must-have ever since.
The beautiful red plant that adorns mantels and dining tables during the holiday season is known by many names. The Aztecs called the plant cuetlaxóchitl, meaning “a flower that withers,” while the Maya used the phrase k’alul wits (“ember flower”). The Spanish friars of the 17th century called it flor de Nochebuena, or “Christmas Eve flower,” while other parts of Latin America used flor de Pascuas, or “Easter flower.” But in the U.S., Euphorbia pulcherrima goes by another name — poinsettia. The name is an homage to the U.S.’s first ambassador to Mexico, Joel Roberts Poinsett. An amateur botanist, Poinsett became enamored with the plant when he came across it while staying in Taxco, Mexico. Poinsett brought specimens back to his greenhouses in the U.S. around 1825 and sent clippings to a specialist in Philadelphia, who eventually christened the plant Euphorbia poinsettia. Unfortunately, Poinsett’s legacy outside horticultural circles is a troubling one, as he was an enslaver and expansionist, and interfered so much in Mexican politics that he was removed from his post at the request of the Mexican president in 1829. Because the name is both controversial and divorced from its Mesoamerican roots, some people now call this holiday favorite by its original name — cuetlaxóchitl.
Darren Orf
Writer
Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.
Few food products are more quintessentially American than yellow processed cheese. But despite the name “American cheese,” the method for making this shelf-stable dairy treat actually has its roots in Switzerland. In 1911, food scientists Walter Gerber and Fritz Stettler pioneered a new process to keep cheese from rapidly spoiling so it could be more easily sold in warmer environments. They shredded and melted down a Swiss cheese called Emmentaler, added sodium citrate as a preservative, and left the mixture to cool, resulting in the first processed cheese and a much longer shelf life.
Queen Victoria was given a half-ton wheel of cheese as a wedding gift.
When Queen Victoria married Prince Albert in 1840, she was given a 1,250-pound wheel of cheddar produced by cheesemakers from two local villages. After the wedding, the wheel was sent on a nationwide tour, though upon its return, Victoria refused to accept it back.
Around the same time in the U.S., Canadian American businessman James L. Kraft — founder of Kraft Foods — was working to solve that same food spoilage problem. Kraft created his own similar method, though it’s unclear how much he knew about the work of his Swiss contemporaries. In place of Emmentaler, he used cheddar cheese, which he heated to 175 degrees Fahrenheit while whisking continuously for 15 minutes, before adding emulsifying compounds and leaving the cheese to cool.
In 1916, Kraft obtained the first U.S. patent for making processed cheese. But it would be another 34 years before American cheese singles appeared in supermarkets, thanks to Kraft’s brother Norman, who headed the company’s research department and hoped to repurpose these large hunks of cheese as conveniently packaged slices. Testing began in 1935, and in 1950, Kraft De Luxe Slices debuted. They were an immediate hit, with Progressive Grocer reporting increases in cheese sales of up to 150%.
The world’s most expensive cheese is made from 60% donkey milk and 40% goat milk.
Andrew Jackson displayed an enormous block of cheese in the White House for more than a year.
In 1835, President Andrew Jackson was given a 1,400-pound wheel of cheese measuring 4 feet in diameter and 2 feet tall as a gift from supporter and dairy farmer Thomas Meacham, who also gifted a 750-pound wheel to Vice President Martin Van Buren. In the months that followed, small portions of the cheese were consumed or given to friends, though Jackson was still left with an enormous hunk of cheddar.
So on February 22, 1837, toward the end of his presidency, Jackson held an open event at the White House, inviting people to enjoy the block of cheese, which had sat aging in the building’s Entrance Hall for more than a year. Around 10,000 people attended and consumed the remnants in just two hours, though the odor persisted for months. In 1838, Senator John Davis’ wife, Eliza Davis, wrote that Jackson’s successor, Martin Van Buren, “had a hard task to get rid of the smell of cheese … he had to air the carpet for many days; to take away the curtains and to paint and white-wash before he could get the victory over it.”
Bennett Kleinman
Staff Writer
Bennett Kleinman is a New York City-based staff writer for Inbox Studio, and previously contributed to television programs such as "Late Show With David Letterman" and "Impractical Jokers." Bennett is also a devoted New York Yankees and New Jersey Devils fan, and thinks plain seltzer is the best drink ever invented.
Rudolph the Red-Nosed Reindeer is famous for his eponymous Christmas tune and for using his luminous nose to heroically guide Santa Claus through the dense snow and fog on Christmas Eve. But originally, Rudolph was created as part of an ad campaign to guide Chicago area customers into department stores. Montgomery Ward was a retailer known for releasing Christmas-themed promotional coloring books in the 1930s to attract shoppers. After years of buying and distributing books made elsewhere, it opted to cut costs by designing a book of its own in 1939. The retailer enlisted the help of copywriter Robert L. May to conceive a new story, and thus, Rudolph was born.
In winter, a reindeer’s eyes change from gold to more of a deep blue. It’s believed that pressure builds in the eyes until fluid squeezes out from a layer behind the retina, which causes the color change. This release of fluid makes the animals’ eyes more sensitive to light during the winter.
According to the fact-checking site Snopes, May was inspired by the story of the “Ugly Duckling” and decided to create a character that was similarly ostracized for his physical appearance. He was also influenced by the fact that reindeer had been associated with Christmas as far back as the early 19th century. May settled on a reindeer with a glowing red nose, and at first considered names such as Rollo (which he later said in a 1963 interview was “too happy”) and Reginald (“too sophisticated”); Rudolph, however, “rolled off the tongue nicely.”
May’s story was a hit with both his young daughter and his employer, which distributed 2.4 million copies of the book in 1939 and another 3.6 million in 1946. Rudolph became a national sensation in 1949, when May’s brother-in-law, Johnny Marks, composed a song about the character. That tune was recorded by Gene Autry and went on to sell 1.75 million copies in its first year, becoming the first No. 1 song of the 1950s.
Frosty the Snowman’s “official” hometown is Armonk, New York.
Eating KFC is a Japanese Christmastime tradition.
Christmas has always been celebrated as a secular holiday in Japan, where only 1% of the country’s approximately 125 million residents identify as Christian. Instead of attending mass or singing carols, many Japanese people celebrate by eating KFC each year around Christmas. The very first Japanese KFC opened in Nagoya in 1970, and the chain quickly expanded across the nation.
In 1974, KFC launched a “Kentucky for Christmas” ad campaign to target expats overseas. But the campaign inadvertently became popular among Japanese folks, who lacked any sort of long-standing Christmas traditions of their own. Today, many Japanese people reserve their buckets of chicken far in advance, and those who don’t plan ahead end up waiting in line for hours. KFCs in Japan say their busiest day is December 24, when they sell five to 10 times as much chicken compared to a normal day.
Bennett Kleinman
Staff Writer
Bennett Kleinman is a New York City-based staff writer for Inbox Studio, and previously contributed to television programs such as "Late Show With David Letterman" and "Impractical Jokers." Bennett is also a devoted New York Yankees and New Jersey Devils fan, and thinks plain seltzer is the best drink ever invented.
The next time you take a sip of eggnog, you should know you’re indulging in a centuries-old tradition that traces back to medieval Britain. This sweet concoction — made from milk, cream, sugar, spices, and eggs — is the modern descendant of posset, a fixture of festive gatherings in the Middle Ages. Posset recipes varied, but most combined wine or beer with cream, sugar, and eggs, topped with a thick gruel made from bread, biscuits, oatmeal, or almond paste. To separate the drink from its rich topping, posset was served in specialized “posset pots,” teapot-like vessels with two handles and a spout. These unique pots were passed around at English celebrations, particularly weddings, to toast prosperity and good health.
George Washington banned eggnog from his Mount Vernon estate.
On the contrary, Washington seemed to embrace this tradition. Several eggnog recipes have been discovered at the estate, including one believed to be from Washington himself, which he reportedly served to guests. It includes eggs, sugar, salt, whipping cream, nutmeg, and bourbon.
Several centuries later, the drink made its way to the American colonies, where it became a hallmark of holiday festivities. Colonists added rum, making it more potent, which paved the way for the modern recipe as we know it. By 1775, the term “eggnog” was part of the American English vernacular. Etymologists pose two theories about its origin: The first suggests “nog” comes from “noggin,” meaning a wooden cup, while the second traces it to “grog,” a strong beer. The origin of the word “posset” is more mysterious, possibly from the Latin word posca for a drink made of vinegar and water. The term endures to this day in the world of British baking, although it now refers to a cold cream-based dessert.
Often called “Puerto Rican eggnog,” coquito is a festive coconut milk-based drink.
Spiked eggnog caused a “grog mutiny” at West Point.
The infamous “grog mutiny” at the United States Military Academy in West Point, New York, is an uncharacteristically unruly chapter in the highly esteemed institution’s history — and it all started with spiked eggnog. In 1826, West Point’s annual Christmas party erupted into chaos after Colonel Sylvanus Thayer, the superintendent, banned alcohol — including eggnog — from campus.
A group of defiant cadets boated up the Hudson River to gather whiskey from a nearby town, smuggling a few gallons onto campus by bribing a guard 35 cents for reentry. Mayhem ensued as eggnog-fueled cadets sought retribution by assaulting Captain Ethan Allen Hitchcock, the officer on duty during the party. As the revelers smashed windows, broke furniture, and even drew swords, Hitchcock barricaded himself in his room, calling upon the commandant for reinforcements. The mutineers eventually dispersed, but 19 cadets and one soldier were court-martialed for their involvement in the “eggnog riot” — a holiday rebellion that’s since been cemented into West Point lore.
Rachel Gresh
Writer
Rachel is a writer and period drama devotee who's probably hanging out at a local coffee shop somewhere in Washington, D.C.