Humans love to think we’re the brainiest species around, but leeches have an impressive 32 brains (making them absolute shoo-ins if Mensa ever expands its ranks to include nonhuman animals). These bloodsucking invertebrates are divided into roughly 32 separate segments, each of which features its own brain fragment (technically, a ganglion). In addition to housing a leech’s thought centers, these segments serve other functions: The first few contain a leech’s eyes and front sucker, the middle sections hold the bulk of its nerves and reproductive system, and the rear portion is home to yet another sucker at the tail end.
Leeches have been used as medicinal tools for centuries.
From ancient Greece through the 19th century, leeches were used for bloodletting, which was believed to treat disease. Leeches are still sometimes used in a medical context, particularly to clean wounds and improve circulation after surgery.
Yet leeches are far from the only living things with more organs than you might expect. Cuttlefish, squid, and octopuses all have three hearts — a systemic heart to pump blood throughout the body, and two branchial hearts used for pumping blood through the gills. Cows famously have a stomach with four separate compartments, but that’s nothing compared to the Baird’s beaked whale, which can have as many as 13 stomachs. Perhaps no animal is more unusual, however, than Ramisyllis multicaudata, a sea worm with hundreds of butts, each with its own set of eyes and brain.
The giant Amazon leech can grow up to 18 inches long.
Leeches were once used to predict the weather.
In 1851, an English doctor by the name of George Merryweather attempted to repurpose medicinal leeches for weather forecasting. Merryweather noted that leeches would rise in water as rain neared, perhaps reacting to the change in air pressure, and decided to use that phenomenon to create his Tempest Prognosticator. The machine was essentially a leech barometer, and featured a circle of glass bottles, each containing a leech in rainwater. As atmospheric conditions changed, the leeches would crawl to the top of the bottles, dislodging a pin that would ring a bell. Merryweather hypothesized that the more bells that rang, the more likely it was that a storm was approaching. The Tempest Prognosticator debuted at the Great Exhibition of 1851 and was later recreated for the Festival of Britain in 1951, though it never achieved widespread success due to the dubious science behind the concept.
Bennett Kleinman
Staff Writer
Bennett Kleinman is a New York City-based staff writer for Optimism Media, and previously contributed to television programs such as "Late Show With David Letterman" and "Impractical Jokers." Bennett is also a devoted New York Yankees and New Jersey Devils fan, and thinks plain seltzer is the best drink ever invented.
top picks from the optimism network
Interesting Facts is part of Optimism, which publishes content that uplifts, informs, and inspires.
The languages we speak don’t just shape the way we communicate; they also influence how we perceive and understand the world — including something as fundamental as time. The direction in which a language is written, for example, can affect how we think about and refer to the passage of time. Because English speakers write from left to right, we tend to visualize the timeline of life to death from left to right, and describe the past and future as being “behind” us and “in front” of us, respectively — we say “looking forward” to the future and “looking back” at the past.
English was the most spoken language in the world as of 2023, based on the total number of speakers. The language with the highest number of native speakers, however, is Mandarin, the most widely spoken form of Chinese.
However, speakers of Aymara, an Indigenous language of the Andes with no traditional writing system, perceive the past as lying ahead of them because it’s known and visible, while the unseen, unpredictable future remains behind them. This element of their communication was discovered through gestures. Meanwhile, Mandarin has adopted a vertical view of time. Speakers of this language often refer to past events as "up" and future events as "down" — next week, then, becomes “down week."
Researchers have long been interested in metaphorical expressions — "spending time" or "feeling down,” for example — and whether the way we talk about abstract concepts does indeed shape how we think about them. These figures of speech fall under the umbrella of linguistic relativity: the thought that the language we speak influences our reality. But some critics argue that this theory — also known as the Sapir-Whorf hypothesis — overstates the influence of language on thought. They argue that while language can indeed shape our perceptions, it does not rigidly determine how we think, or how we understand the world.
Someone who can speak several languages fluently is known as a polyglot.
A watch made for Marie Antoinette took 44 years to make — and she never saw it.
In 1783, a spectacular watch was commissioned for Queen Marie Antoinette of France. It was to be made by pioneering Swiss watchmaker Abraham-Louis Breguet, with no expense spared or time limit put on its creation. The finished product was indeed something to behold: a gratuitous luxury befitting the famously lavish monarch. “The Queen,” as the watch was dubbed, used gold instead of brass, and included a perpetual calendar, metallic thermometer, and sapphire mechanisms.
It took 44 years to complete — a time frame that exceeded the life not only of Antoinette, who was executed in 1793, but also of Breguet himself, who died in 1823. Breguet’s son finished the masterpiece in 1827. The watch later ended up in Jerusalem’s L.A. Mayer Museum for Islamic Art, but it was stolen in 1983 and vanished for decades. It was finally recovered in 2006, returned by the thief’s widow.
Nicole Villeneuve
Writer
Nicole is a writer, thrift store lover, and group-chat meme spammer based in Ontario, Canada.
The next time you thoroughly shuffle a deck of cards, you’ll almost certainly have landed on a combination that’s never been created before — and may never be created again. This may sound unlikely or even impossible, given that each deck contains just 52 cards, but there are actually more ways to shuffle a deck of cards than there are atoms on Earth. The number of possible card arrangements is about 8 x 10 to the 67th power, which is an 8 followed by 67 zeroes — an almost unfathomably large number. If you were to go back to the beginning of the universe and rearrange a deck of cards into a new permutation every second, the universe itself would come to an end before you had worked through even a tiny fraction of the possible arrangements.
Richard Nixon funded his first campaign with poker winnings.
Nixon was a skilled player during his time in the U.S. Navy and did indeed use his winnings to fund his successful 1946 congressional race.
As for how many atoms there are on the planet, most estimates put the number at 1.3 x 10 to the 50th power, or 130,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000. This is obviously a vast figure in its own right, but it’s still dwarfed by the potential groupings of a deck of cards. The good news for the math-averse among us is that most of us will never have to deal with such impossibly immense figures in our day-to-day lives — or in our next poker game.
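Both figures are easy to sanity-check. Here’s a quick Python sketch (the 1.3 x 10^50 atoms-on-Earth value is the estimate cited above, not a measured constant) showing that 52! really does dwarf the atom count:

```python
import math

# Number of distinct orderings of a standard 52-card deck: 52!
deck_orderings = math.factorial(52)
print(f"{deck_orderings:.2e}")  # about 8.07e+67

# Estimate of atoms on Earth cited above: 1.3 x 10^50
atoms_on_earth = 1.3e50

# Deck orderings outnumber Earth's atoms by a factor of roughly 6 x 10^17
print(f"{deck_orderings / atoms_on_earth:.1e}")
```

Even shuffling a billion decks per second, every second since the Big Bang, would barely scratch the space of possible orderings.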
Poker player Andrei Karpov once lost his wife in a poker game.
Atoms are almost entirely empty.
If you were to expand an atom to the size of a sports arena, its nucleus — by far the densest part of an atom, where most of its mass is concentrated — would be roughly the size of a pea. The rest of the atom, about 99.9% of it, would be empty space. The electrons orbiting the nucleus are quite small, even compared to protons and neutrons; a single proton is about 1,836 times more massive than an electron.
Michael Nordine
Staff Writer
Michael Nordine is a writer and editor living in Denver. A native Angeleno, he has two cats and wishes he had more.
Jiggly, wiggly, and inexpensive — Jell-O has a reputation for being the ultimate affordable and fun dessert. But the moldable treat we’re familiar with today wasn’t always affordable fare for the masses. At one time, its key ingredient — gelatin — was difficult to come by, making any gelatin-rich dish a symbol of wealth and social standing. What’s more, the earliest gelatin dishes weren’t post-dinner treats; in medieval Europe, cooks used gelatin to preserve meats in aspics, making savory jellies similar to modern head cheeses. Extracting gelatin back then was time-intensive: Cooks spent days boiling animal bones and byproducts, then straining the liquid before letting it set into its gelatinous state. This lengthy, involved process meant that gelatin dishes were rarely served at the dinner tables of everyday folks who didn’t employ kitchen staff.
If you’ve been vaccinated against measles, chickenpox, or rabies, chances are your shot contained gelatin. The jiggly preservative is used in some vaccines to stabilize ingredients, protecting them from extreme temperatures during storage and transport.
Gelatin’s status as a high-class delicacy would only last a few centuries. Peter Cooper, an inventor who also designed the first American steam locomotive, created a “portable gelatin” in 1845 that was easily reconstituted with hot water. But Cooper was uninterested in marketing his invention, and his gelatin was largely ignored despite its potential appeal to cooks who yearned for an easier method of making gelatin — such as suffragette and cookbook author Mary Foote Henderson, whose 1876 Practical Cooking and Dinner Giving footnoted her gelatin recipe by saying she’d never again undertake the arduous task of making the stuff. Cooper’s creation was eventually sold to a New York cough syrup manufacturer, who added fruit flavors and branded it with its Jell-O name in 1897. By the early 20th century, Jell-O ads promoted the dessert as a low-cost, high-society wonder, and the Great Depression and World War II solidified Jell-O’s reputation as a budget- and ration-stretcher — a reputation that has carried on for more than 100 years.
Photographers once used gelatin to print photographs.
Gelatin isn’t just for eating — it was once an important ingredient used in 19th- and 20th-century photo developing. Gelatin silver prints emerged around the 1870s, using a specialized photo paper that included a layer of silver salt particles infused in gelatin. The developing process was a photography breakthrough, because it created detailed images with refined clarity that were more stable and durable than other early photographs. Despite their detail, gelatin silver prints didn’t take off in popularity for nearly four decades — until World War I, when war-related shortages of other popular photo papers (specifically those made with platinum) led photographers to experiment with a variety of paper options. Finally recognized for their ability to provide crisp black-and-white images, gelatin silver prints remained popular through the 1970s. While not commercially used today, many of those early photographs live on, giving us crystal-clear glimpses of historical events.
Nicole Garner Meeker
Writer
Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.
Today, cultures around the world have specific rules and phrases for the common toast. In South Korea, one accepts a drink with two hands, and in Italy, locking eyes is absolutely essential. But how exactly does the word “toast,” as in dry bread, figure into all of this? Well, it turns out dunking literal pieces of toast into a drink during celebrations in someone’s honor was commonplace centuries ago. Historians believe the practice came from the idea that the bread soaked up unwanted bitter or acidic sediments found in wine, thus making the drink more enjoyable. By the 18th century, the term “toast” had become more closely associated with the person receiving the honor than with the bread itself — which is also where the phrase “toast of the town” originates.
Jack Daniel's is the top-selling liquor brand in the world.
While less than 5% of its sales come from the U.S., the top-selling liquor brand in the world is Jinro Soju. The national drink of South Korea, soju is a liquor typically made from rice or sweet potatoes. It’s popular in other parts of Asia as well, especially China and Japan.
Although dipping crusty bread into your beverage isn’t a common custom today, you don’t have to look hard to find remnants of the practice in literature. In William Shakespeare’s The Merry Wives of Windsor, the hard-drinking Falstaff quips, “Go, fetch me a quart of sack [wine]; put a toast in ’t,” a reference to the bread-dipping ritual. Lodowick Lloyd’s The Pilgrimage of Princes, written in 1573, also contains the passage “Alphonsus … tooke a toaste out of his cuppe, and cast it to the Dogge,” confirming that the alcohol-infused bread didn’t always go to waste after being dunked. Because general toasting in 16th- and 17th-century Europe was often an excuse to drink heavily, many temperance movements, including one in Puritan Massachusetts, banned the practice in the name of health. Of course, these bans didn’t stick, and today toasts — sans actual bread — are central to some of the biggest celebrations in our lives.
In Victorian England, a sandwich made of toast in between slices of bread was prescribed for invalids.
Libation is an ancient drink-pouring ritual found in many cultures.
Today the word “libation” is mostly used as a stand-in for “alcoholic beverage,” but such a definition omits the complex history of the religious and secular ritual known as libation — the act of pouring out a drink to honor the deceased or a deity. Libation is one of the most widespread yet least understood rituals in human history. The act of pouring out liquid (whether on the ground or on an elaborate altar) can be found in cultures throughout the world dating back to the Bronze Age. The Papyrus of Ani, dated 1250 BCE, reads, “Pour libation for your father and mother who rest in the valley of the dead,” and religions with seemingly little connection, such as Greek paganism, Judaism, Christianity, and traditional African religions, all feature some sort of libation ceremony. Even tribes in pre-Columbian South America, separated by an entire ocean from these other examples, performed similar liquid sacrifices. Today, forms of libation rituals still occur in Kwanzaa celebrations, weddings, the hit comedy show Key & Peele, and in bars around the world, where patrons (usually metaphorically) “pour one out” for the dearly departed.
Darren Orf
Writer
Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.
Considering most fruits have a spherical or ovate shape, the average banana’s long, curved appearance is something of an anomaly. This unique curvature is due to a scientific concept called negative geotropism, in which the fruit flexes upward as the plant grows rather than being pulled straight down by gravity. While most fruits simply hang and grow downward toward the earth, bananas begin to curve as they strive to find sufficient sunlight to fuel their growth. This behavior is driven by plant growth hormones called auxins, whose distribution within the fruit shifts in response to light.
Bananas contain naturally occurring radionuclides, particularly the potassium-40 isotope. But eating a single banana provides an infinitesimal dose of about 0.01 millirems of radiation. In other words, you’d need to eat 274 bananas every day for seven years to develop radiation poisoning.
Some bananas grow in lush rainforests with dense canopies, which can obscure the fruit from getting enough light. In these cases, bananas will grow toward the sky to break through the light-blocking canopy. But negative geotropism still occurs even in other environments where there’s plenty of direct sunlight. The auxins are distributed unevenly along the side of the banana facing the sun, triggering accelerated growth on that side and causing the fruit to curve away from the earth’s gravitational pull.
In the very early stages of development, bananas actually grow at a straight downward angle, only developing their signature shape later on. As the fruit matures, it begins to flex upward in search of more light. But even as this happens, gravity continues to pull the banana down toward the ground and away from the sun. This combination is what ultimately gives bananas their distinct curve.
A visual artist once sold two bananas for $120,000 each.
In 2019, visual artist Maurizio Cattelan unveiled a conceptual piece titled “Comedian” at the Art Basel exhibition in Miami Beach. This unusual artistic work consisted of a banana that had been duct-taped to the wall. For years, Cattelan had dreamed of creating a sculpture in the shape of a banana; he often brought a banana with him on his travels and hung it on the wall for inspiration. But eventually, he gave up on the idea of creating a new sculpture and instead decided to exhibit the banana itself. He brought three editions of “Comedian” with him to Miami, two of which immediately sold for $120,000. Given the high level of interest, Cattelan raised the price of the third one to $150,000, which also promptly found a buyer. A week later, performance artist David Datuna ate one of the pricey fruits right off the wall, criticizing the artwork for embodying wealth inequality and food insecurity.
Bennett Kleinman
Staff Writer
Bennett Kleinman is a New York City-based staff writer for Optimism Media, and previously contributed to television programs such as "Late Show With David Letterman" and "Impractical Jokers." Bennett is also a devoted New York Yankees and New Jersey Devils fan, and thinks plain seltzer is the best drink ever invented.
For those in the military, camouflage can be the difference between life and death. For those who simply think it looks cool, it’s considerably less essential. With that in mind, it makes a certain amount of sense that 20 countries have banned civilians from wearing it. Those nations are Antigua and Barbuda, Azerbaijan, the Bahamas, Barbados, Dominica, Ghana, Grenada, Guyana, Jamaica, Nigeria, Oman, the Philippines, St. Lucia, St. Vincent and the Grenadines, Saudi Arabia, South Africa, Trinidad and Tobago, Uganda, Zambia, and Zimbabwe.
Camouflage is the main reason chameleons change color.
A chameleon usually changes color to reflect its emotional state, as when trying to attract a mate or ward off a foe, or to regulate its temperature. Camouflage is an added benefit of this ability, but not the main reason for it.
In some cases this is because civil unrest has given rise to paramilitary organizations, and any civilian wearing camo could be mistaken for a member of such groups or even of the actual armed forces — a potentially dangerous scenario in which to find oneself. The regulations for wearing camo differ greatly between countries, however. Some, such as Trinidad and Tobago, ban all forms of military-style camouflage (even styles clearly worn for fashion), while others, such as South Africa, ban only the specific patterns used by their military.
In nature, the opposite of camouflage is just as important.
While many animals use camouflage to blend in with their environment — chameleons, arctic foxes, leopards, and countless others — some are bold enough to do just the opposite. It’s called aposematism, also known as warning coloration, and it’s meant to do exactly what the latter name suggests: ward off potential predators by letting them know it’s a bad idea to attack. Often this is because the creature in question is venomous, toxic, or equipped with stingers; sometimes it’s simply because the animal smells or tastes bad. Any bird that eats a monarch butterfly, for instance, will soon regret its decision after falling ill from the toxins monarchs derive from the milkweed plants they eat in their caterpillar stage. Aposematism most often comes in the form of displaying bright colors such as red, orange, and yellow, but highly contrasting colors such as black and white — as in the case of skunks, well known for their malodorous defense mechanism — are common as well.
Michael Nordine
Staff Writer
Michael Nordine is a writer and editor living in Denver. A native Angeleno, he has two cats and wishes he had more.
From the automatic tasks of regulating breathing and blood pressure to the voluntary efforts needed for muscle movement, the central nervous system puts in a lot of work to maintain the complex mechanisms of the human body. Pace is crucial to keep this system running smoothly — which is why some signals from our body’s command centers can reach a speed of 268 miles per hour.
Despite the popular “fact” thrown out in movies and TV shows that humans use only 10% of their brains, people tend to use the entirety of their brains over the course of a day.
In a nutshell, nerve cells in the brain and spinal cord send information along branching nerve fibers known as axons, which release chemicals across microscopic gaps to be picked up by other cells and processed by the appropriate areas of the body. The speed of this process varies according to the size and properties of the nerve fiber; bulky A-alpha axons, which can be 20 micrometers in diameter, generate the fastest impulses. Speed also depends on myelin, a sheath of fats and protein that insulates stretches of the axon; at the small gaps between myelinated segments, concentrated sodium ion channels recharge the signal, keeping it strong for rapid transmission.
So which bodily act necessitates the thickest channels to conduct information at speeds approaching those of the world’s fastest cars? That would be the delicate balance required for proprioception, our ability to sense the movement and positioning of body parts without looking. At the other end of the spectrum are the unmyelinated fibers that relay pain signals at a near-crawl of 1 mile per hour — evidence that our central nervous systems at least attempt to cushion the blow when serving as the bearer of bad news.
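For a sense of scale, the 268 mph figure corresponds to the roughly 120 meters per second often cited for large myelinated fibers. A short Python conversion makes the comparison concrete (the 0.5 m/s pain-fiber speed below is an assumed round value consistent with the article’s “about 1 mile per hour,” not a figure from the text):

```python
METERS_PER_MILE = 1609.344
SECONDS_PER_HOUR = 3600

def ms_to_mph(meters_per_second: float) -> float:
    """Convert a speed in meters per second to miles per hour."""
    return meters_per_second * SECONDS_PER_HOUR / METERS_PER_MILE

# Fast myelinated A-alpha fibers: ~120 m/s
print(round(ms_to_mph(120)))     # 268 mph

# Slow unmyelinated pain fibers: ~0.5 m/s
print(round(ms_to_mph(0.5), 1))  # about 1.1 mph
```

In other words, a proprioceptive signal travels a couple hundred times faster than a pain signal along the same nervous system.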
The chemical messengers that carry signals from nerve cells to other cells are called neurotransmitters.
Multiple sclerosis destroys the fiber coating that aids brain-signal transmission.
Myelin does more than facilitate the high-speed transport of brain signals, as it also provides a layer of insulation for the fibers that execute this crucial process. When myelin is damaged, the central nervous system stimulates the production of cells called oligodendrocytes to repair the harm. However, neurological diseases such as multiple sclerosis attack both the myelin and the oligodendrocyte-producing cells, resulting in disrupted signals and telltale MS signs such as impaired vision and speech. To this point, the discovery of a clear method to undo nerve cell damage has largely eluded medical researchers, although some recent trials have pointed to a potential breakthrough. Among them, treatments involving the over-the-counter antihistamine clemastine have produced evidence of myelin regeneration, providing hope that a cure can be found in the lifetime of at least some of the 2.8 million people now living with MS.
Tim Ott
Writer
Tim Ott has written for sites including Biography.com, History.com, and MLB.com, and is known to delude himself into thinking he can craft a marketable screenplay.
Cooler weather, shorter days, and changing leaves are small harbingers of one undeniable truth: Oktoberfest is at hand. Most years (except 2020 and 2021) since 1810, the German city of Munich has erected massive beer tents (some capable of seating 6,000 people), tapped kegs filled with liquid masterpieces such as helles, Pilsner, and hefeweizen, and held the world-renowned beer celebration called Oktoberfest — the largest beer festival in the world. Although Germany will likely never relinquish its beer-guzzling crown, a few towns around the world hold similar Bavarian bashes that rival the original. One of the biggest is the Kitchener-Waterloo Oktoberfest, held about 75 miles west of Toronto. Established with only $200 back in 1969, the festival has exploded in popularity in the ensuing decades, and regularly attracts more than 700,000 people — including Canada’s Prime Minister Justin Trudeau, who opened the 2016 festival by tapping its first keg.
Although it’s impossible to know for certain where the first beer was brewed, the boozy beverage likely emerged alongside grain agriculture some 12,000 years ago. The first concrete evidence of barley beer comes from the ancient Sumerians of Mesopotamia (modern Iraq and Syria).
While the event in Kitchener-Waterloo is a leading candidate for the world’s largest beer festival outside Germany, it does have some competitors. Its biggest rival comes from a country intimately familiar with throwing big parties: Brazil. Today, the town of Blumenau in southern Brazil is known as “Little Germany” because it was founded by German pharmacist Hermann Bruno Otto Blumenau in 1850 alongside 17 other German immigrants. Around 30% of the town’s residents are now of German descent, so it makes sense that Blumenau holds a 19-day-long Oktoberfest against the backdrop of the town’s German-style architecture.
To begin Oktoberfest, Munich’s mayor taps the first keg and says “O’zapft is!” (“It’s tapped!”)
The first Oktoberfest was actually a wedding celebration.
On October 12, 1810, Prince Regent Ludwig of Bavaria married Princess Therese of Saxony-Hildburghausen. Five days later, all the locals were invited to take part in the royal couple’s marital bliss by celebrating at a party complete with a horse race on a large open field outside the city. The gathering was such a success that the town decided to have another party (and horse race) the next year, and then a third one in 1812. By 1818, drink stands began supplying the beer, and by 1896, those stands had transformed into tents. While this Bavarian couple isn’t a household name today, their wedding reception, now known as Oktoberfest, is technically the longest wedding celebration in human history. Missing only a handful of years due to wars or pandemics, Oktoberfest remains the largest beer festival in the world. Although at first glance the original intent of the celebration appears lost amid untold gallons of lagers and ales, its legacy lives on in at least one small way. Every year since its inception, Munich’s Oktoberfest has taken place on the same stretch of ground that celebrated the royal couple’s union all those years ago. It’s known as Theresienwiese, or “Therese’s meadow.”
Darren Orf
Writer
Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.
Candy corn may be the most maligned confection — it’s rare that the tricolored treat ever tops the list of beloved Halloween candies. Yet somehow it returns with flair each fall, an unyielding symbol of the season. While the waxy, triangular kernels have essentially remained the same since their creation in the 1880s, one thing has changed about candy corn: its name. In its earliest years, candy corn was called “Chicken Feed,” a catchy name appealing to rural Americans during a time when nearly half the country’s population worked on farms.
Most corn grown in the U.S. isn’t meant to be eaten as a vegetable.
Farmers in the U.S. plant around 90 million acres of corn each year, though most isn’t for human consumption. Just 1% of corn grown in the country is meant for table fare, with the remaining 99% used for livestock feed, ethanol production, plastics, and more.
Not much is known about candy corn’s origin, though credit for its creation often goes to the Wunderle Candy Company, a Philadelphia venture that first produced the candy during the 1880s. However, another manufacturer — the Goelitz Confectionery Company, which would grow into the modern Jelly Belly Candy Company — further popularized the treat around 1898, designing packaging featuring a rooster and the tagline “Something worth crowing for.” By then, the treat was called “candy corn.” At a time when most real corn was planted for animal feed, candy corn was a novelty play on the idea that corn could actually be enjoyable for humans.
Making the miniature kernels was a time- and labor-intensive process done entirely by hand. Workers called “runners” walked backward along a conveyor belt packed with cornstarch molds, lugging buckets filled with a hot, sugary slurry that slowly dripped out through a hole. Each pass contributed one of the candy’s iconic yellow, orange, and white layers, which cooled into shape. Today, the process is nearly entirely mechanized and much faster, allowing candy corn factories to produce about 9 billion of the kernels each fall, just in time for seasonal snacking.
Shoppers in California buy more candy corn than shoppers in any other state.
Candy corn was once a Christmas treat.
It’s usually hard to find a bag of candy corn on store shelves before September rolls around, and the treat typically disappears right after Halloween, but it wasn’t always that way. For decades after its invention in the 1880s, candy corn was an everyday snack, available year-round as “penny candy” sold cheaply and in bulk. However, making the treat was laborious, so manufacturers often crafted it in large batches between March and November, creating a stockpile that flooded candy shops for the fall and winter holidays. During the 1920s, advertisers marketed the treat as a top candy for Christmas and New Year’s celebrations. That changed during the 1950s, when Halloween and trick-or-treating became more widespread. As the holiday became linked with candy, confectioners began advertising candy corn as the perfect October 31 treat, tying the kernels to autumn and eventually changing the time of year we nibble on the mellowcreme triangles.
Interesting Facts
Editorial
Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.