Original photo by RapidEye/ iStock

From fine art to classic cars, many incredibly valuable items have been offered up at auction. But while some bidders have their sights set on buying a prized gem, others are more keen on bidding for quirky and unusual items that are once-in-a-lifetime finds. Certain people will pay any price for the chance to add a weird relic to their collection, whether it’s a decades-old pastry or even the surgically removed body part of a famous musician. Here are six of the strangest items ever sold at auction.

A boxed slice of wedding cake, from the British Royal wedding of Prince William.
Credit: BEN STANSALL/ AFP via Getty Images

A Slice of Cake From Queen Elizabeth II’s Wedding

On November 20, 1947, Queen Elizabeth II and Prince Philip were married in a lavish ceremony. Following the service, a reception was held at Buckingham Palace, where the 2,000 guests in attendance were each given a slice of the 9-foot-tall, four-tier wedding cake designed by confectioner Fredrick Schur. The indulgent cake included ingredients from across the British Empire: dried fruit from Australia, butter from New Zealand, flour from Canada, brandy from South Africa, and Jamaican rum. But while some guests chowed down on the delicious dessert, others held on to their portions for decades to come.

In 2013, a slice of said cake went up for sale at Christie’s auction house, with an eventual hammer price of £1,750 (more than $2,000 today). The slice was wrapped and placed in a box inscribed with the words “EP Buckingham Palace 20th November 1947.” The package also included a card reading, “With the Best Wishes of Their Royal Highnesses The Princess Elizabeth and The Duke of Edinburgh.” Despite some evident decay, the dessert — which had been given to a man who formed part of the Guard of Honour at the royal wedding — attracted many bids. Two years later, in 2015, yet another slice of the cake went up for auction, this one selling for £500 (around $610 now). In 2022, a portion of Queen Elizabeth II’s wedding cake went up for sale yet again, and this time the auction house warned potential buyers that the item was no longer edible.

Banksy's newly completed artwork 'Love Is in the Bin'.
Credit: Jack Taylor/ Getty Images News via Getty Images

The Self-Destructing Banksy Painting

Banksy is an anonymous graffitist who is heralded as one of modern art’s most prolific figures. Among his most notable works is 2006’s “Girl with Balloon,” a canvas version of which was put up for auction in 2018. Moments after the work sold for $1.37 million, a motor within the painting’s frame initiated a self-destruct sequence. The canvas began slowly descending through the frame, which shredded part of the spray-painted work into dangling strips while a shocked auction gallery looked on.

Banksy — who later posted an anonymous video taking credit for the self-destructive act — claimed that he had installed the shredder to destroy the painting should it ever be auctioned. While Banksy’s intent may have been to render the painting worthless, it did quite the opposite. In 2021, the partially shredded work, now renamed “Love Is in the Bin,” went up for auction yet again, this time selling for $25.4 million.

Close-up of John Lennon being interviewed.
Credit: Michael Putland/ Hulton Archive via Getty Images

John Lennon’s Tooth

Sometime between the years 1964 and 1968, the Beatles’ John Lennon gave his housekeeper, Dot Jarlett, a tooth of his to dispose of. Lennon had had the tooth removed at the dentist earlier that day, though later changed course and said that Jarlett should give the tooth to her daughter, who was a huge Beatles fanatic. The family held on to Lennon’s stained and partially rotted tooth for decades before the molar ultimately hit the auction block in 2011, when it sold for $31,200. The tooth was purchased by a Canadian dentist named Michael Zuk, who even wrote a book about celebrity teeth. He claimed that when he heard about the auction, he “had to have it.” Oddly enough, the tooth isn’t the only body part of a famous musician to sell for thousands. In 2009, a lock of Elvis Presley’s hair from the year 1958 sold for $18,300.

Aerial view of old fashioned french toast, with butter and syrup.
Credit: fdastudillo/ iStock

Justin Timberlake’s Leftover French Toast

Around the early 2000s, eBay was all the rage, as the online auction site had debuted just a few years prior. At the same time, few bands were more popular than ’N Sync, whose members included heartthrob Justin Timberlake. On March 9, 2000, Timberlake participated in an interview at New York’s Z-100 radio station, during which he partially consumed some French toast. Rather than throw the two partially eaten slices of French toast in the trash, the station DJ took the food and listed it on eBay. The half-eaten breakfast sold for $1,025 to 19-year-old ’N Sync superfan Kathy Summers, who claimed that she planned to “probably freeze-dry it, then seal it… then put it on my dresser.”

Painting by Carl Kahler titled My Wife's Lovers.
Credit: Album/ Alamy Stock Photo

The World’s Largest Cat Painting

In 2015, Sotheby’s put the purr-fect painting up for auction. Considered the world’s largest cat painting, “My Wife’s Lovers” was created around 1893 by Austrian artist Carl Kahler, who spent three years on it. The painting measures 75 inches by 102 inches and weighs a staggering 227 pounds — so humongous that Sotheby’s had to construct a special reinforced wall to ensure it could be safely displayed.

Kahler was commissioned to create the painting by San Francisco philanthropist Kate Birdsall Johnson, a devoted cat lover who cared for around 350 cats. Of those 350, 42 made it into the piece, most prominently her cat Sultan, who had been purchased for $3,000 on a trip to Paris. Cat lovers came out in droves to view the painting in person while it was on display prior to the auction, with the work ultimately selling for a whopping $826,000.

Didius Julianus, the 20th Roman emperor.
Credit: Pictures from History/ Universal Images Group via Getty Images

The Roman Empire

Didius Julianus is far from the most notable Roman emperor, but he was certainly one of the wealthiest. On March 28, 193 CE, the then-reigning emperor, Pertinax, was assassinated by Rome’s Praetorian Guard, leaving no apparent successor. The soldiers — who served as protectors of the throne — vowed that no successor would be allowed without their approval, which in turn led to an auction to determine who would ascend to the throne.

Didius Julianus, who boasted vast wealth, outbid Pertinax’s father-in-law, Titus Flavius Sulpicianus, to purchase the position of Roman emperor for himself. Julianus’ bid is believed by some historians to have been in the range of 25,000 sesterces per Praetorian soldier, equating to a total payment of over 200 million sesterces. After handing over the winning bid, Julianus was declared emperor by the Roman Senate, despite the fact that he was both feared and abhorred by that body. Julianus’ reign was short-lived, however — he was killed on June 1, 193 CE, after Septimius Severus marched on Rome at the head of the Danube legions.

Bennett Kleinman
Staff Writer

Bennett Kleinman is a New York City-based staff writer for Optimism Media, and previously contributed to television programs such as "Late Show With David Letterman" and "Impractical Jokers." Bennett is also a devoted New York Yankees and New Jersey Devils fan, and thinks plain seltzer is the best drink ever invented.

Original photo by Lorelinka/ Shutterstock

Ahoy, mateys! Everyone knows that in addition to making an excellent costume for Halloween, pirates are pretty fun, at least in their Disney-fied state with parrots, rum, jewels, and gold. However, a lot of the modern stereotypes about pirates just don’t hold water. Here are a few facts about pirates that won’t get you cast out to sea.

Robert Newton as Long John Silver.
Credit: Screen Archives/ Moviepix via Getty Images

Pirates Didn’t Talk Like You Think They Did

Shiver me timbers: Pirates didn’t actually go around saying “arrrrrr” (or “ahoy, mateys,” for that matter). In fact, they probably spoke more or less just like other sailors of the time. We can blame the “pirate accent” on Hollywood. Actor Robert Newton gave an influential performance as Long John Silver in the 1950 Disney adaptation of Robert Louis Stevenson’s Treasure Island, which itself was the source of much (often inaccurate) modern-day pirate lore. Newton based his pirate-speak on the dialect of the West Country in southwestern England, where he hailed from (and where Long John Silver comes from, in the book). But don’t let the facts get in the way on September 19, which is International Talk Like a Pirate Day.

Captain William Kidd with a map and world globe.
Credit: Print Collector/ Hulton Archive via Getty Images

Pirates Didn’t Make Treasure Maps

This one is another myth from Treasure Island — not that we wouldn’t all enjoy finding a chest filled with gold doubloons. Although Captain Kidd did bury some booty on New York’s Gardiner’s Island, most pirates spent or sold the fruits of their “labor” pretty quickly. To date, there has never been a single case of an “authentic” pirate map resulting in a treasure find. However, in the golden age of piracy, accurate maps were rare and valuable, and were considered an actual treasure all on their own.

'Madame Ching', born Shih Yang, 1775–1844.
Credit: IanDagnall Computing/ Alamy Stock Photo

Women Ruled the High Seas, Too

Move over, Blackbeard: The “fairer sex” had its fair share of pirates. In addition to Anne Bonny (a prominent character in the piracy TV drama Black Sails), a number of women are known to have been successful swashbucklers. Cantonese commoner Cheng I Sao married into piracy, and upon her husband’s death expanded the family business beyond his wildest dreams. She commanded up to 600 ships and as many as 40,000 men, and defeated European and Chinese fleets in fierce battles. Upon her retirement, Cheng I Sao sailed 260 junks into Canton Harbor and demanded a pardon — which the terrified government was happy to grant.

Close-up of a pirate peg leg.
Credit: duncan1890/ iStock

Most Pirates Didn’t Have Peg Legs

Piracy is a pretty rough business and swords are sharp and pointy, so the occasional eye patch isn’t improbable on a pirate. But while amputations certainly wouldn’t be unheard of on an unsanitary ship, the trope of pirates sporting wooden legs is almost entirely a literary convention courtesy of Treasure Island. That being said, Robert Louis Stevenson’s character Long John Silver is said to have been inspired in part by Welsh pirate John Lloyd and perhaps French privateer François Le Clerc — two wooden-legged exceptions to the rule.

A large green parrot in a tropical forest.
Credit: Ivan_off/ iStock

Some Pirates Probably Did Have Parrots

Although it’s yet another trope from Treasure Island, parrots as pirate companions certainly make sense. As pets go, the colorful birds would have consumed little of a ship’s resources, considering they eat like, well, birds. And parrots’ abilities to mimic human speech would have been an amusing diversion during long and often boring journeys at sea. Finally, the regions roamed by pirates during the “golden age” included many places (such as the Caribbean and Mexico) where the otherwise exotic birds were plentiful.

Jolly Roger, the Pirate's Flag.
Credit: Bettmann via Getty Images

The Skull and Crossbones Flag Was Real

Pirate ships did actually fly the Jolly Roger, a black flag featuring a skull and crossbones, which was intended to strike fear into the hearts of all who saw it. Pirates would often approach their victims under a friendly flag at first; switching to the black banner (or sometimes a red one) announced their plundering intentions. One theory links the name “Jolly Roger” to Satan, as “Old Roger” was a common 18th-century nickname for the Devil. Not all pirate flags looked quite like this, though — some also incorporated bleeding hearts, an hourglass, or other fearsome insignia.

Cynthia Barnes
Writer

Cynthia Barnes has written for the Boston Globe, National Geographic, the Toronto Star and the Discoverer. After loving life in Bangkok, she happily calls Colorado home.

Original photo by tuulimaa/ iStock

Bones are the unsung heroes of biology. Always working beneath the surface, they’re the ossified architecture that makes our bipedal existence possible — not to mention the existence of thousands of other species. And their amazing durability gives archaeologists and paleontologists an unparalleled glimpse into early human history and beyond. Here are seven amazing facts about bones to shine some much-needed light on these building blocks of our bodies.

Spectacular eagle flying with wings outspread.
Credit: LeslieLauren/ iStock

Less Than 5% of Animal Species on Earth Are Vertebrates

All animals on Earth fall into one of two categories: vertebrates or invertebrates. This distinction is based on whether an animal has a spinal column. (Those that do, including humans, are the former; those that don’t are the latter.) Although all mammals, reptiles, amphibians, birds, and fish are vertebrates, they’re vastly outnumbered by invertebrates, which include worms, sea sponges, arthropods, and jellyfish (not really fish, despite their name). In fact, of the estimated 1.37 million surviving species on Earth, according to the International Union for Conservation of Nature, only 66,800 have a spine. It turns out that a backbone is a bit of a biological rarity.

The largest invertebrate group by far is that of the class Insecta, which includes around 900,000 known living species (about 80% of the species on Earth) and millions more that have not yet been described by science. (Some estimates put the number of unnamed insect species as high as 30 million, though it’s likely less than that.) Insects don’t have spinal columns; instead, they have exoskeletons, which, while lacking a backbone, do have some spine-like features. And speaking of spines…

Close-up of a Reticulated Python (Python reticulatus).
Credit: Mark Kostich/ iStock

Pythons Have Nearly 20 Times More Vertebrae Than Humans

The average human is born with 33 distinct vertebrae, which are connected to one another through flexible joints called facets. Birds, meanwhile, have anywhere from 39 to 63 vertebrae. But even they can’t compete with snakes, especially large species of snakes like pythons. The Australian Oenpelli python (Morelia oenpelliensis), for example, may have as many as 600 vertebrae. That’s nearly three times as many bones as an adult human has in their entire body — though only two times as many as that same human has at birth. Which brings us to the next fact…

Portrait of new born child boy one week old sleeping peacefully.
Credit: NataliaDeriabina/ iStock

Human Babies Have More Bones Than Adults

Babies pack a lot of bones in their tiny bodies — around 300, in fact, which is nearly 100 more than adult humans have. The reason for this is biologically genius: These extra bones, many of which are made entirely or partly of cartilage, help babies remain flexible in the womb and (most crucially) at birth, making it easier for them to pass through the birth canal. As a baby grows into childhood and eventually early adulthood, the cartilage ossifies while other bones fuse together. This explains the “soft spots” in a baby’s skull, where the bones have yet to fuse completely.

It also explains why kids may be more susceptible to injury — fracture rates are high around the ages of 11 to 15, when many young people experience growth spurts due to puberty. This is because children’s bones have growth plates, which are particularly sensitive to trauma. Those growth plates eventually close as we age, and a child’s bone count decreases until it settles at 206. Our bones continue to change in a process called “bone remodeling” throughout our lives, but the number typically remains stable once we reach adulthood.

Woman suffering from wrist pain, numbness, or Carpal tunnel syndrome.
Credit: Doucefleur/ iStock

Half the Bones in Our Bodies Are in Our Hands and Feet

Perhaps surprisingly, the lion’s share of those 206 bones in the human body are in our hands and feet. Each foot contains 26 bones, and each hand contains 27, for a grand total of 106 bones in just those four extremities. Interestingly (but perhaps not surprisingly), the hand and foot are similar in terms of bone structure. On our hands, for example, each finger has three bones — the distal, middle, and proximal phalanges — except for the thumb, which has two (just the distal and proximal). Our feet are the same, with three phalanges in each of the smaller toes, and two in the big toe. The five metacarpals — that is, the bones that make up the palm of your hand — are also arranged similarly to the five metatarsals in your foot. The hand, however, has an extra bone called the pisiform, which is located on the outside edge of the wrist and attaches to various tendons and ligaments.

Anatomy of the human femur.
Credit: Cinefootage Visuals/ iStock

The Femur Is the Longest and Strongest Human Bone

The bones in our hands and feet are relatively small, though not as small as the stapes, the smallest bone in the human body, found in the middle ear. On the other end of the spectrum is the femur, notable for being the longest and strongest bone. The average adult femur — named for the Latin for “thigh,” where it’s located — stretches to about 18 inches in length and can support as much as 30 times the weight of your body. Because of this, it plays a crucial role in our ability to stand and move. It also connects to many muscles, tendons, and ligaments in our hips and knees.

Xray of hyoid bone 3D rendering.
Credit: libre de droit/ iStock

Only One Bone Isn’t Connected to Another Bone in the Human Body

Bones provide necessary skeletal scaffolding, and that means they’re usually connected to other bones via joints woven together with ligaments and tendons. However, there is one notable exception in the human body — one bone that is not connected to any other bone nearby. That exception is the hyoid, a small U-shaped bone in the neck, at the root of the tongue. Instead of connecting to other bones, the hyoid is linked only to muscles, ligaments, and cartilage, making it something of a “free-floating” loner. That’s not to say it’s superfluous, though. The hyoid aids in the very vital human activities of talking, chewing, and swallowing, so it’s actually pretty important.

Girl touches painful elbow, suffers ulnar joint injury during training.
Credit: Khosrow Rajab Kordi/ Alamy Stock Photo

Your “Funny Bone” Is Not a Bone

Your “funny bone,” named as such for its location near the humerus bone — “humorous,” get it? — is not really a bone at all. Rather, it’s part of the ulnar nerve, which runs from your neck all the way to your hand. Nerves are usually protected by bone, muscle, and fat, so they can perform their bioelectrical functions undisturbed, but a small part of the ulnar nerve in the back of the elbow is a little more exposed. There, the nerve is protected only by a tunnel of tissue, known as the cubital tunnel, so when you hit your “funny bone,” the ulnar nerve presses against the medial epicondyle (one of the knobby ends of the humerus bone), which in turn sends a painful sensation throughout your lower arm and hand. And because the nerve gives feeling to the pinky and ring fingers, those two digits may feel particularly sensitive compared to your other three fingers.

Darren Orf
Writer

Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.

Original photo by shironosov/ iStock

When it comes to teeth, there’s always something new to learn. Innovations like fillings and toothbrushes had a long and rich history before they reached our mouths, and cultural norms can vary wildly — or be oddly similar — throughout place and time.

Which famous author became a tooth-removal evangelist? What animals have far more teeth than you’d expect? What kinds of small creatures gather baby teeth in the night?

Smile big and read on for eight facts that just might change the way you think about your pearly whites.

A view of a tooth enamel.
Credit: edwardolive/ Shutterstock

Tooth Enamel Is the Hardest Substance in the Human Body

Move over, bones! The outer layer protecting our teeth is the hardest thing in our bodies. The next layer down, dentin, is also stronger than bone. The trade-off is that teeth have a very limited ability to heal themselves, unlike bones. You can’t put a cast on a cavity, after all.

Close-up of a snail on a lime.
Credit: Krzysztof Niewolny/ Unsplash

Snails Have Thousands of Teeth

Each unassuming snail hides a microscopic secret: between 1,000 and 12,000 tiny teeth protruding from its tongue. Snails use these teeth to break down parts of their food, and since the teeth are not especially durable, they need to be replaced pretty frequently. This tooth-covered tongue is called a radula, and it’s not exclusive to snails. Slugs have one, too, as do some squids.

Not all radulae are the same, though. Some predatory snails have venomous radulae, and the terrifying-looking Welsh ghost slug has razor-sharp (and teeny-tiny!) teeth for eating worms.

Two wooden bamboo eco friendly toothbrushes and a green leaf.
Credit: Margarita Serenko/ iStock

The Earliest Toothbrushes Came From China

Tooth-cleaning goes back thousands of years, with methods including abrasive powder, cloth, and frayed sticks. Bristle toothbrushes emerged in China during the Tang dynasty (618–907 CE); the handles were made from ivory or bamboo. These brushes didn’t catch on in Europe until the 17th century, first in France and later in England.

While toothbrushes evolved in design throughout the 18th and 19th centuries, the materials stayed largely the same. Plastic handles came along in the early 1900s, and nylon bristles followed in 1938.

Toothpaste being put on a toothbrush over the bathroom sink.
Credit: Subbotina Anna/ Shutterstock

It Took a While to Get Americans to Brush Their Teeth

It sounds gross, but it’s true: Toothbrushing didn’t become a standard, everyday part of American life until the 1940s. That doesn’t mean no one brushed their teeth — it just wasn’t the standard practice it is today.

The tide started to change in the decades prior, though. In the 1910s, schools started implementing dental hygiene programs like toothbrush drills, in which children practiced brushing their teeth with their teachers. Similar programs visited factories to care for workers’ teeth. This wasn’t just benevolence: Employers hoped their workers would miss fewer days of work due to tooth infections.

With dental hygiene already becoming normalized, one thing set it over the edge: American soldiers during WWII were required to brush their teeth every day, and brought the habit back home with them.

Close-up of female with open mouth during oral checkup at the dentist.
Credit: shironosov/ iStock

The Oldest Known Dental Filling Is Made of Beeswax

In 2012, scientists used the jaw of a Neolithic man to test out some new X-ray equipment — and in the process, made an exciting discovery. The man, who lived 6,500 years ago in modern-day Slovenia, had a filling made out of beeswax.

Drilling goes back even further than filling, though; archaeologists have found drill holes in teeth dating back 7,500 to 9,000 years in a graveyard in Pakistan.

A man looked scared as his tooth is about to be pulled.
Credit: Everett Collection/ Shutterstock

Tooth Pulling Used to Be a Public Spectacle

Before modern dentistry existed, the task of tooth extraction in Britain fell to a strange assortment of professions, including blacksmiths, wigmakers, and a very specific kind of sideshow entertainer. Like snake-oil salesmen, charlatan tooth-drawers traveled to fairs and marketplaces wearing silly hats and sometimes even strings of teeth, eager to rip out teeth for curious spectators. They typically made a grand entrance, sometimes on horseback or with a team of assistants. Loud noises were a key part of the act, both to draw a crowd and to drown out the screams of their “patients.” This continued into the 1800s.

The alternatives, for what it’s worth, weren’t great either. In the 18th and 19th centuries, you could see a “barber-surgeon” (or later, just a surgeon) to get your painful tooth removed with a tooth key, a clawed device that looks a little like a broken corkscrew. All in all, it was not a great time to have bad teeth.

A single tooth underneath a pillow for the tooth fairy.
Credit: billyfoto/ iStock

Tooth Fairy? More Like Tooth Mousie

Today, the most common American version of the tooth fairy is a small, whimsical figure, typically female, who checks under our pillows at night for lost baby teeth. But the tooth fairy is an early-20th-century invention, and that particular image rose to prominence right as Disney was releasing animated films featuring kind, gentle, feminine fairies.

The fairy is likely layered on top of a much longer tradition of offering baby teeth to rats and mice — the hope being that the child’s permanent teeth would grow in as strong as a rodent’s. While this practice appears throughout the world, it’s perhaps most common today in Spanish-speaking households. In fact, a specific tooth mouse named Ratoncito Perez emerged in Spanish lore in the 1800s, and spread throughout Latin America in children’s stories. A similar tooth mouse, La Petite Souris, goes back to 1600s France. In some countries, children make it more convenient for the rodent by placing their teeth in or near mouseholes.

The core concept — giving children money in exchange for teeth — dates back to at least the 12th or 13th century, and appears in Norse and Northern European tradition, while other lost-tooth rituals are common throughout the world’s history.

Portrait of the British writer, Roald Dahl.
Credit: Ronald Dumont/ Hulton Archive

Roald Dahl Had All His Teeth Removed — Voluntarily — at 21

Famed author Roald Dahl was strange in many ways, including his strong opinions about teeth. When he was 21 years old and working at Shell Oil, he decided having teeth was just too much trouble, so he visited a highly regarded dentist in London to have them all taken out and an artificial set created. Five years later, he treated himself to extra-fancy new teeth with the proceeds from “Shot Down in Libya,” his first piece of paid writing.

This wasn’t especially unusual for British people at the time, but it gets weirder: Dahl became a teeth-removal evangelist. He convinced his mother to have all of hers removed. Then he turned to his four living siblings, none of whom actually went for it; this made him impatient and “foul-mouthed,” according to biographer Donald Sturrock. Finally, he convinced his brother-in-law to go — but to Dahl’s surprise, the brother-in-law never got false teeth to replace them.

Sarah Anne Lloyd
Writer

Sarah Anne Lloyd is a freelance writer whose work covers a bit of everything, including politics, design, the environment, and yoga. Her work has appeared in Curbed, the Seattle Times, the Stranger, the Verge, and others.

Original photo by oldbunyip/ Shutterstock

A sandwich may be one of the most humble foods known to humankind, often slapped together with whatever’s found in the fridge and eaten on the go. These easily customizable eats go by many names depending on their variety — sub, hoagie, roll, gyro, po’ boy, and more — and they’re one of a few foods that span cultures and time. Here are eight facts you may not know about the concoctions we create with two slices of bread.

John Montagu, 4th Earl of Sandwich.
Credit: UniversalImagesGroup via Getty Images

The Word “Sandwich” Likely Gets Its Name From a Real-Life Royal

John Montagu (1718-1792), the British noble who served as the fourth Earl of Sandwich, was a politician and postmaster. He’s also credited as the inventor of the sandwich. Humans have arguably been combining bread with savory fillings for thousands of years, but Montagu is said to have inspired the dish’s official term. (His family name, meanwhile, comes from a place name that means “sandy harbor.”) One 18th-century account claimed Montagu popularized sandwiches by requesting sliced meat and bread as a meal so that he could continue gambling, though other accounts say the earl likely also consumed sandwiches while working at his desk. With his title used as a description, sandwiches exploded in popularity throughout Europe, soon served to nobility and civilians alike.

Sandwich, Illinois on map.
Credit: SevenMaps/ Shutterstock

Three U.S. Cities Are Named Sandwich

Massachusetts is home to the oldest American city called Sandwich, founded in 1637. The oldest town on Cape Cod — Massachusetts’ history-heavy, hook-shaped peninsula — was named after Sandwich, England. Nearby in New Hampshire, the town of Sandwich got its name in 1763 to honor John Montagu, the fourth Earl of Sandwich mentioned above. And the Midwestern town of Sandwich, Illinois, also bears the name. Originally called Almon, the Illinois town’s name was scrapped in the 1850s and eventually switched to Sandwich in honor of John Wentworth — a politician born in Sandwich, New Hampshire — who was responsible for getting the Illinois town a railroad stop.

Fluffernutter sandwich on a white plate.
Credit: Jmcanally/ Shutterstock

A Descendant of Paul Revere Invented the Fluffernutter Sandwich

Paul Revere is best known for his patriotic ride during the Revolutionary War, though he’s also the great-great-great-grandfather of Emma Curtis, the Massachusetts woman who invented the fluffernutter sandwich. While Curtis isn’t the original creator of marshmallow creme — the spreadable sweet that contributes the sandwich’s “fluff” — she helped popularize the product, which was manufactured by her family, through inventive recipes. Curtis first released the recipe for her peanut butter and marshmallow creme delight in 1918, amid World War I, initially calling it a “Liberty sandwich” and pitching it as a patriotic way to consume less meat during wartime.

Little kid girl giving mom sandwich to bite.
Credit: fizkes/ Shutterstock

Americans Eat Millions of Sandwiches Each Year

The most popular food served in the U.S. just may be the sandwich. On any given day, 47% of American adults will eat a sandwich, with about half of those served for lunch and a third consumed for dinner. Each day, Americans chow down on an estimated 300 million sandwiches, with cold cut varieties among the most popular. Peanut butter and jelly sandwiches, meanwhile, account for just 6% of sandwiches consumed by adults.

A peanut butter and jelly sandwich with oranges and grapes.
Credit: MSPhotographic/ Shutterstock

PB&J Sandwiches Were Once Upper-Class Cuisine

The beloved peanut butter and jelly sandwich is now considered kid-approved dining, though it wasn’t always that way. Around 1901, PB&J sandwiches were served in tea rooms frequented by wealthier American patrons. More savory varieties of the sandwich nixed the jelly and paired peanut butter with cheese, lettuce, or Worcestershire sauce. By the 1920s, the invention of commercially sliced bread helped PB&Js become a lunchtime staple for all Americans, particularly schoolchildren.

Grilled cheese sandwich with gourmet four cheese in a basket cut in half.
Credit: N K/ Shutterstock

The World’s Most Expensive Sandwich Includes Edible Gold

Would you pay $214 for a taste of the world’s priciest sandwich? That’s the cost of the record-breaking grilled cheese sandwich at Serendipity 3 in New York City, made with French bread (itself made with Champagne and gold flakes), and thick slices of caciocavallo podolico cheese (a rare type of Italian dairy). Each sandwich is seared using white truffle oil containing gold flakes, plus butter infused with white truffles, with more edible gold applied after cooking. The entire dish is served with a side of lobster tomato bisque.

A beef salami sandwich.
Credit: Red Stock/ Shutterstock

A Sandwich Helped Detectives Catch a Jewel Thief

In February 2003, burglars robbed the seemingly impenetrable Antwerp Diamond Centre, pilfering $100 million in diamonds, gold, and more from an underground vault. Detectives were left with few leads; the thieves absconded with the security camera footage and left behind few clues, though one — a half-eaten salami sandwich — was later recovered from a nearby site where the crooks attempted to destroy evidence. Detectives were able to make arrests with the help of cellphone records, DNA evidence, and a grocery store receipt found in a suspect’s home that matched the sandwich’s ingredients. However, most of the diamonds stolen in the heist have never been recovered, and police believe accomplices in the crime are still at large.

Astronaut John W. Young.
Credit: Bettmann via Getty Images

A Corned Beef Sandwich Was Smuggled Into Space

Astronaut food is the stuff of science — often heavily processed so it can hold up in space, and sometimes looking less than appetizing. That’s why one astronaut packed his own sandwich and jetted off into space with it. In 1965, NASA pilot John Young hid a corned beef sandwich in his spacesuit pocket prior to the launch of Gemini III. Nearly halfway through the five-hour flight, Young pulled out the sandwich and offered a bite to mission commander Gus Grissom, though the unfinished sandwich was stowed away after breadcrumbs began to make a mess. Young was reprimanded once back on Earth, and a congressional inquiry took place. A resin-preserved replica of the sandwich now sits at the Grissom Memorial Museum in Mitchell, Indiana.

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.

Original photo by Lia Koltyrina/ Shutterstock

While there’s a lot happening on Earth, the sun is the real star of our show — pun intended. Hanging out at an average 93 million miles away from Earth, the sun is a perfect mixture of hydrogen and helium that spit-roasts our planet just right as we travel around its bright, glowing body. But although the sun is central to our survival, there’s still a lot we don’t know about it. For decades, space agencies have been sending missions to explore the sun and find answers; in 2021, NASA’s Parker Solar Probe became the first spacecraft to “touch” the sun by entering its upper atmosphere (still some 4 million miles away from its surface). Based on research from these missions and more, here are some of the most interesting things we’ve learned about the sun — and some of our best guesses at what its future might look like.

Solar system and Sun.
Credit: Vadim Sadovski/ Shutterstock

The Sun Is Middle-Aged

The sun seems eternal — an ever-present, life-giving fireball in the sky — but not even it can escape the wear and tear of time. Some 4.6 billion years ago, the sun formed from a solar nebula, a spinning cloud of gas and dust that collapsed under its own gravity. During its stellar birth, nearly all of the nebula’s mass became the sun, leaving the rest to form the planets, moons, and other objects in our solar system. Even today, the sun makes up 99.8% of all mass in the solar system.

Currently in its yellow dwarf stage, the sun has about another 5 billion years to go before it uses up all its hydrogen, expands into a red giant, and eventually collapses into a white dwarf. So at 4.6 billion years old, the sun could be best described as “middle-aged” — but we don’t think it looks a day over 3 billion.

Planet earth in different angles, chaotically on a black background.
Credit: Andrei Riakin/ Shutterstock

1.3 Million Earths Could Fit Inside the Sun

The Earth is big, but the sun is bigger — way bigger. Measuring 338,102,469,632,763,000 cubic miles in volume, the sun is by far the largest thing in our solar system, and some 1.3 million Earths could fit within it. Even if you placed Earth in the sun and maintained its spherical shape (instead of squishing it together to fit), the sun could still hold 960,000 Earths. Yet when it comes to stars, our sun is far from the biggest. For instance, Betelgeuse, a red giant some 642.5 light-years away, is nearly 700 times larger and 14,000 times brighter than our sun.
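
For readers who want to check the 1.3 million figure, it follows from a simple volume ratio. Here is a rough back-of-the-envelope sketch that uses the sun’s volume quoted above and assumes Earth’s volume is roughly 260 billion cubic miles (a standard figure not given in the article):

\[
\frac{V_{\text{sun}}}{V_{\text{Earth}}} \approx \frac{3.38 \times 10^{17}\ \text{mi}^3}{2.6 \times 10^{11}\ \text{mi}^3} \approx 1.3 \times 10^{6}
\]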

Sun cross-section science illustration.
Credit: Mopic/ Shutterstock

It Takes a Long Time for Light to Escape the Sun

The sunlight that reaches your eyes is older than you might think. It takes a little over eight minutes for photons from the surface of the sun to reach Earth, meaning every time you glimpse the sun (hopefully with sunglasses!), it actually looks as it appeared eight minutes ago. However, a photon blazing toward Earth at the speed of light is at the end of a very long journey. Inside the sun’s “radiative zone,” the region between the core and the convective zone (the final layer, which stretches to the surface), a photon’s energy is absorbed by another atom after traveling only a very short distance, then re-emitted in yet another direction. The overall effect is what scientists call a “random walk,” and the result is that it can take a single photon thousands of years — up to 100,000 years — to escape the sun. As our knowledge of the sun grows, scientists will likely refine this number, but for now it’s safe to say that it takes “a long time.”

Solar prominence, solar flare, and magnetic storms.
Credit: Lia Koltyrina/ Shutterstock

The Sun’s Atmosphere Is Much Hotter Than Its Surface

As you travel farther from the surface of Earth, things usually get colder and colder. Planes traveling at 35,000 feet, for example, travel through the stratosphere and experience temperatures around -60 degrees Fahrenheit. However, the sun’s atmosphere works in exactly the opposite way. While the surface of the sun hovers around 10,000 degrees Fahrenheit, the atmosphere (or corona) of the sun is hundreds of times hotter, with temperatures reaching up to 3.6 million degrees Fahrenheit.

Scientists aren’t exactly sure why the sun’s atmosphere is so much hotter than the surface. One leading theory is that a series of explosions called “nanoflares” release heat upwards of 18 million degrees Fahrenheit throughout the atmosphere. Although small when compared to the sun, these nanoflares are the equivalent of a 10 megaton hydrogen bomb, and approximately a million of them “go off” across the sun every second. Another theory is that the sun’s magnetic field is somehow transferring heat from its core, which rests at a blazing 27 million degrees Fahrenheit, to its corona.

Wind turbines against backdrop of sunset sky with clouds in field.
Credit: Prilutskiy/ Shutterstock

Different Parts of the Sun Rotate at Different Speeds

The sun doesn’t rotate like your typical planet. While the Earth’s core does rotate ever so slightly faster than the planet’s surface, it mostly moves as one solid mass. The sun? Not so much. First of all, it’s a giant ball of gas rather than a rigid sphere like Earth. The gases at the sun’s core spin about four times faster than at its surface. The sun’s gases also spin at different speeds depending on their latitude. For example, the gases at the sun’s equator rotate much faster than those at higher latitudes, closer to the poles. A full rotation takes about 25 Earth days at the sun’s equator but roughly 35 days near the poles.

3D rendering of our home planet with moon and sun.
Credit: aryos/ iStock

The Sun Completes Its Own Galactic Orbit Roughly Every 230 Million Years

Picture a grade-school model of the solar system, and it’s easy to forget that the sun is on its own galactic journey. While the Earth orbits the sun, the sun is orbiting the center of the Milky Way galaxy. On its orbiting journey, it travels roughly 125 miles per second, or about 450,000 miles per hour (by comparison, the Earth travels around the sun at only 67,000 miles per hour). Although blazing fast by Earth standards, it still takes our star roughly 230 million years to complete a full revolution.
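
As a rough cross-check on that orbital period, assume the sun sits about 26,000 light-years from the galactic center (a commonly cited distance that the article does not state) and follows an approximately circular path at 450,000 miles per hour:

\[
T \approx \frac{2\pi \times 26{,}000\ \text{ly} \times 5.88 \times 10^{12}\ \text{mi/ly}}{450{,}000\ \text{mi/hr}} \approx 2.1 \times 10^{12}\ \text{hours} \approx 240\ \text{million years}
\]

That estimate lands in the same ballpark as the roughly 230 million years quoted above.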

Sunset landscape at Paria Rimrocks, Utah.
Credit: Rezus/ iStock

In About 1 Billion Years, the Sun Will Kill All Life on Earth …

In 5 billion years, the sun will enter its red giant phase and engulf many of the inner solar system planets, including Earth. However, Earth will lose its ability to sustain life much earlier than that, because the sun is steadily getting hotter as it ages. Scientists estimate that anywhere between 600 million and 1.5 billion years from now, the Earth will experience a runaway greenhouse effect induced by our warming sun that will evaporate all water on Earth and make life on our blue marble impossible (except for maybe some tiny microorganisms buried deep underground). Eventually, Earth will resemble Venus, a hellish planet warmed beyond habitability due to its thick atmosphere and proximity to the sun. Luckily, humanity has at least several hundred million years to figure out a plan B.  

Young plant in the morning light on nature background.
Credit: amenic181/ iStock

… But Life Only Exists Because of the Sun in the First Place

You can’t get too mad at the sun for its warming ways, because life couldn’t exist without it. Earth is perfectly placed in what astronomers call a star’s “goldilocks zone,” where the sun isn’t too hot or too cold but just right. This advantageous distance has allowed life to flourish on Earth, with the sun bathing our planet in life-giving warmth. The sun also gives plants the light they need to grow and produce oxygen, which in turn forms the bedrock of the web of life — and it’s all thanks to the middle-aged, hydrogen-burning, massively huge star at the center of our solar system.

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.

Original photo by stocksnapper/ iStock

While many inventions are the outcome of tireless effort and incredible insight, a little luck never hurts. Some inventions — from Post-it Notes to penicillin — are amazing examples of good fortune as well as curiosity and tenacity. After all, it’s not enough for an accident to reveal some incredible new advancement; it also needs to be witnessed by an inquisitive person prepared to understand its significance. Here are seven world-changing inventions that were discovered by accident, by people who made sure that these serendipitous moments didn’t go unnoticed.

Wall covered with blank post-its.
Credit: gerenme/ iStock

Post-Its

In 1968, scientist Spencer Silver was working at the Minnesota Mining and Manufacturing Company, also known as 3M. Founded in 1902, 3M quickly evolved beyond mining, and by the mid-20th century, the company had expanded into the adhesives game. At the time, Silver was trying to make “bigger, tougher, stronger” adhesives, and thus considered one of his creations, known as acrylate copolymer microspheres, a failure. These microspheres could retain their stickiness but could also be removed easily — not exactly big, tough, or strong.

While Silver believed this light-hold adhesive could have some sort of use (he patented it just to be safe), he couldn’t put a finger on what that use was, exactly, until one day when fellow 3M scientist Art Fry was in search of a bookmark that could stick to pages without damaging the paper. Fry immediately thought of Silver’s microspheres, and the two scientists soon found themselves writing each other messages around the office on the world’s first Post-it Notes. “What we have here isn’t just a bookmark,” Fry once said. “It’s a whole new way to communicate.”

Woman using a microwave oven for heating food at home.
Credit: Andrey_Popov/ Shutterstock

Microwave Ovens

Today 90% of American households have a microwave oven — and it’s all thanks to magnetron expert Percy Spencer. In the mid-1940s, Spencer was working at the aerospace and defense company Raytheon when he took a step in front of an active radar set. To his surprise, the candy bar in his pocket melted. Spencer conducted a few more experiments, using popcorn kernels and eggs, and realized that microwaves could vibrate water molecules, causing them to produce heat and cook food. Raytheon patented the invention in 1945, and released the first microwave oven, called the “Radarange,” the next year. It weighed 750 pounds and cost $5,000 (about $52,000 today). It wasn’t until the 1970s that both the technology and price reached that consumer sweet spot, and microwave ovens became a must-have appliance in every U.S. home.

Colonies of Penicillium mold growing on agar plate.
Credit: Kallayanee Naloka/ Shutterstock

Penicillin

If you ever need to stress to your boss the importance of vacation, share the tale of penicillin. On September 3, 1928, Scottish physician Alexander Fleming returned to his laboratory at St. Mary’s Hospital in London after a vacation of more than a month. Sitting next to a window was a Petri dish filled with the infectious bacteria known as staphylococcus — but it’s what Fleming found in the dish alongside the bacteria that astounded him.

Inside the Petri dish was a fungus of the genus Penicillium, or what Fleming at the time called “mould juice.” Whatever the name, this particular fungus appeared to stop staphylococcus from spreading, and Fleming pondered whether its bacteria-phobic superpowers could be harnessed into a new kind of medicine. Spoiler: They could, and in the years that followed, penicillin was developed into the world’s first antibiotic, with Fleming sharing the Nobel Prize in Physiology or Medicine in 1945 for his accidental yet world-changing discovery. “I did not invent penicillin. Nature did that,” Fleming once said. “I only discovered it by accident.”

Collection of an x-ray from a normal knee.
Credit: jaimaa85/ iStock

X-Rays

In November 1895, German scientist Wilhelm Conrad Röntgen was hard at work studying cathode radiation in his Würzburg laboratory when a chemically coated screen 9 feet away began to glow. What followed was seven weeks of what Röntgen’s wife, Bertha, later described as a “dreadful time.” Röntgen worked tirelessly, obsessed with discovering the secrets of the phenomenon he called “X-rays” (named because the rays were unknown, as in “solving for x”) — often coming home in a bad mood, and eating silently before immediately retreating back to his lab. Eventually, he even moved his bed to his lab so he could work around the clock. As Röntgen would later put it, “I didn’t think; I investigated.”

The result of this investigation was a paper published in late December that same year, titled “On a New Kind of Rays.” The work detailed how these X-rays could penetrate objects, and the medical applications for such an invention were immediately apparent. Within a month or two, the first clinical uses of X-rays occurred in Hanover, New Hampshire, and Röntgen became the recipient of the first Nobel Prize in physics in 1901.

Close-up of a pile of rubber sealing strips.
Credit: ZHMURCHAK/ Shutterstock

Vulcanized Rubber

On its own, natural rubber isn’t immensely useful — it melts in warm weather, cracks in the cold, and adheres to basically everything. But once rubber undergoes a process known as “vulcanization,” in which natural rubber is mixed with sulfur (or some other curative) and heated to between 140 and 180 degrees Celsius, it gains immense tensile strength and becomes resistant to swelling and abrasion.

Although creating this kind of tough rubber is a relatively complicated process, evidence suggests that an ancient Mexican people known as the Olmecs (whose name means “rubber people”) used some type of vulcanization. But modern vulcanization didn’t arrive until 1839, when American inventor Charles Goodyear accidentally dropped India rubber mixed with sulfur on a hot stove. Recognizing that the rubber held its shape and also gained strength and rigidity, Goodyear soon patented his discovery. Alas, protecting those patents from infringement proved impossible, and Goodyear died in 1860 some $200,000 in debt.

However, Goodyear still saw his life as a success, once writing: “I am not disposed to complain that I have planted and others have gathered the fruits. A man has cause for regret only when he sows and no one reaps.” Thirty-eight years later, American entrepreneur Frank Seiberling started a company to supply tires for the nascent automobile industry. Because creating tires capable of handling the rough terrain of dirt roads relied entirely on the process of vulcanization, Seiberling named his enterprise after the man who made it all possible — calling it the Goodyear Tire & Rubber Company.

Velcro in a close up view.
Credit: stocksnapper/ iStock

Velcro

Amazing inventions come to curious minds, and that’s certainly the case for Swiss engineer George de Mestral. While on a walk in the woods with his dog, de Mestral noticed how burrs from a burdock plant stuck to his pants as well as his dog’s fur. Examining the burrs under a microscope, de Mestral discovered that the tips of the burr weren’t straight (as they appeared to the naked eye), but instead contained tiny hooks at the ends that could grab hold of the fibers in his clothing. It took nearly 15 years for de Mestral to recreate what he witnessed under that microscope, but he eventually created a product that both stuck together securely and could be easily pulled apart. In 1954, he patented a creation he dubbed “Velcro,” a portmanteau of the French words velours (“velvet”) and crochet (“hook”).

Multi-coloured fabric dyes close-up.
Credit: holgs/ iStock

Synthetic Dye

For most of human history, dyes and pigments were sourced from natural resources such as metals, minerals, and even bat guano. It was an expensive process, and one of the most costly colors to create was purple, which had to be extracted from a particular mollusk found along the coast of Tyre, a city in modern Lebanon. In fact, the dye was so expensive that the color was reserved for royalty, with monarchs like Queen Elizabeth I even passing laws to ensure as much.

Then came 18-year-old British chemist William Henry Perkin. In 1856, Perkin was working in a lab, where he was trying (and failing) to produce a synthetic form of quinine, a compound found in the bark of cinchona trees and used to treat malaria. When he washed out the brown sludge of one failed experiment with alcohol, the mixture turned a brilliant purple. Calling his creation “mauveine,” Perkin soon realized that not only was this dye cheap to produce, but it also lasted longer than dyes derived from natural sources, which tended to fade quickly.

Perkin’s discovery kick-started a chain reaction of chemical advances that brought cheap, colorful dyes to the fashion industry. Within six years of Perkin’s happy accident, even Queen Victoria herself began wearing colorful garments of bright mauveine.

Darren Orf
Writer

Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.

Original photo by Pressmaster/ Shutterstock

They’ve been described as the windows to the soul by William Shakespeare and the jewel of the body by Henry David Thoreau, and featured in song titles by musicians ranging from Van Morrison and The Who to Billy Idol and Billie Eilish. Needless to say, eyes hold a prominent place in our lives, both for our dependence on their functionality as well as the aesthetic qualities that have inspired so many artists. Here are six eye-opening facts about these amazing organs.

A fossilized trilobite found in the Green River Formation in Wyoming.
Credit: Layne Kennedy/ Corbis Documentary via Getty Images

The First Eyes Appeared at Least 540 Million Years Ago

The first known organism to demonstrate the leap from light-sensitive receptors to complex eyes was the trilobite, which left records of its evolutionary impact from approximately 540 million years ago. The orbs of these early arthropods more closely resembled the compound eyes of modern insects, with multiple lenses, as opposed to the single lens-to-retina camera-style eye built into humans. Because they offered trilobites a clear advantage in hunting prey (and thus encouraged their predators to evolve in response), the emergence of working eyes in these and subsequent life forms may have helped drive the so-called “Cambrian Explosion,” which gave rise to most of the creatures that now populate the animal kingdom.

Blue eyed woman's face.
Credit: IoanaB/ Shutterstock

The Human Eye Can See Objects Millions of Miles Away

While the majority of us wouldn’t consider our vision to be extraordinary, the human eye can see much farther than most of us realize. That’s because our ability to perceive an object is based not only on its size and proximity, but also on the brightness of the source. Practically speaking, our sight is hindered by factors such as the Earth’s curvature, which creates the dropoff point of the horizon just 3 miles away, and atmospheric conditions. However, a trip outside on a clear night reveals the true power of our vision, as most of us are able to make out the faint haze of the Andromeda Galaxy some 2.6 million light-years into space.

Eyeglasses lying on colored papers, with different colors visible through the lenses.
Credit: kmatija/ iStock

Some People Can Distinguish Between 100 Million Colors

Most people are trichromatic, meaning they possess three types of cone cells in their retinas to detect variations of red, green, and blue light. Dichromatic or colorblind people are those with missing or defective cone cells; normally this means they have trouble differentiating between two colors, with red and green being the most common combination. On the extreme ends of the spectrum, those suffering from achromatopsia lack the ability to see any colors, while those born with an extra set of cone cells, tetrachromats, are said to be extraordinarily sensitive to light wavelengths and capable of distinguishing between 100 million colors.

Woman with her eyes shut.
Credit: PhotoTalk/ iStock

Humans Blink Millions of Times Per Year

There are a few established reasons why we blink: This rapid closure triggers secretions that flush away foreign particles, while also providing lubrication that keeps our precious eyes functioning smoothly. However, this action, which can be voluntary or involuntary, is also affected by a raft of psychological factors. We blink less when concentrating, for example, and more when we’re nervous. Recent studies also indicate that blinking may be a way of providing the brain a brief moment of rest. Regardless of the reasons, we all blink a lot. Most people average at least 15 blinks per minute, which translates to 14,400 blinks in each 16-hour waking day and, by some estimates, as many as 7.8 million blinks per year.
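
The daily figure comes from straightforward arithmetic. Here is a quick sketch, assuming the minimum rate of 15 blinks per minute over a 16-hour waking day (an annual total of 7.8 million implies a somewhat higher per-minute rate):

\[
15 \times 60 \times 16 = 14{,}400\ \text{blinks per day}, \qquad 14{,}400 \times 365 \approx 5.3\ \text{million blinks per year}
\]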

Huge squid with an eye popping out
Credit: sunara/ iStock

The Colossal Squid Boasts the Largest Animal Eyes

The human eye measures about two-thirds of an inch across at birth, before growing to its full size of 1 inch by adulthood. By comparison, the eye of the 45-foot-long colossal squid has been measured at 11 inches in diameter, making it the largest such organ in the animal kingdom and possibly the largest in the history of recorded life. Among land-dwelling creatures, the ostrich tops the pack with an eye that measures around 2 inches from the cornea to the retina — dimensions that also happen to be bigger than its walnut-sized brain.

Close-up of an eye showing the details in the iris and sclera.
Credit: Peter Finch/ Stone via Getty Images

All Humans Had Brown Eyes at One Point

Eye color (along with skin and hair color) is determined by the amount of melanin our bodies produce; those with blue or green eyes simply possess a lower density of this pigmentation in the iris than those with dark brown peepers. According to research published by a University of Copenhagen team in 2012, all humans had brown eyes until sometime between 6,000 and 10,000 years ago, when a genetic mutation created the first blue-eyed individual. Nowadays, 70% to 79% of the world’s population has brown eyes, with 8% to 10% sporting baby blues, approximately 5% featuring hazel or amber, and just 2% showing green. Less than 1% of people possess two completely different colored eyes — a condition known as heterochromia.

Original photo by lucky-photographer/ iStock

Presidents are much more than the policies they back or the speeches they give — sometimes they fought fires at the Library of Congress, popularized ice cream Stateside, or were champion wrestlers. Below are 20 strange and fascinating facts about America’s first 20 Presidents.

George Washington, an American politician, soldier and the first President of the United States.
Credit: Universal History Archive/ Universal Images Group via Getty Images

George Washington

While every sitting President serves as the commander-in-chief of the U.S. military, nobody will ever outrank George Washington. He was posthumously given the rank of General of the Armies of the United States, sometimes compared to being a six-star general. General John J. Pershing also held the title for his service during WWI, but when President Ford approved Washington’s posthumous promotion in 1976 — 200 years after Washington’s heyday — the appointment specified that our first President would always rank first among all Army officers, past and present.

John Adams

The Boston Massacre, in which British soldiers shot and killed five citizens, was a major flashpoint on the road to the Revolutionary War. When murder charges were brought against the soldiers, John Adams acted as their defense lawyer. He later called the trial one of the most “gallant” and “disinterested actions of my whole life.”

Portrait of Thomas Jefferson, a past American president.
Credit: Ann Ronan Pictures/ Print Collector via Getty Images

Thomas Jefferson

When Jefferson sent Meriwether Lewis and William Clark out west to explore the Louisiana Territory in 1804, the President told the explorers to watch out for mammoths. Jefferson was apparently obsessed with mammoths and was convinced they were still alive, gallivanting in America’s wild west.

James Madison

James Madison and his wife, Dolley, helped popularize ice cream in America. Tastes in the treat, however, would be considered questionable today: chestnut, asparagus, and parmesan were all on the menu. Dolley’s favorite flavor was … oyster.

James Monroe, the fifth president of the United States 1817-25.
Credit: Universal History Archive via Getty Images

James Monroe

While George Washington gets all the credit for bravely crossing the icy Delaware River in 1776, Monroe deserves attention too. Then-Lieutenant Monroe was part of an advance unit and crossed the river before Washington’s crew. On the other side, Monroe was wounded at the Battle of Trenton, as Hessian soldiers shot him in the shoulder.

John Quincy Adams

The election of 1824 saw four viable candidates, none of whom won an outright majority of electoral votes. Andrew Jackson nabbed 99, John Quincy Adams won 84, William H. Crawford earned 41, and Henry Clay claimed 37. Despite winning neither the most electoral votes nor the most popular votes, Adams was chosen as President by the U.S. House of Representatives.

Andrew Jackson, the 7th President of the United States.
Credit: Universal History Archive via Getty Images

Andrew Jackson

In 1835, Jackson received a present at the White House: a 1,400-pound wheel of cheese. It sat idle for two years until Jackson, sick of the cheese, invited 10,000 visitors into the White House to get rid of it. As one resident recalled: “The air was redolent with cheese, the carpet was slippery with cheese, and nothing else was talked about at Washington that day.” Another called the event “an evil-smelling horror.”

Martin Van Buren

Having grown up in a Dutch-speaking community in New York, Van Buren was the only President whose first language was not English. Although Van Buren worked hard to mask his original tongue, observers claimed his accent would surface whenever he became visibly excited.

William Henry Harrison, the 9th President of the United States.
Credit: Universal History Archive via Getty Images

William Henry Harrison

When a pesky newspaper claimed that Harrison sat in a log cabin and swilled hard cider all day, Harrison didn’t fight the libelous claim. He embraced it. Supporters started calling him the “Log Cabin and Hard Cider candidate” and handed out cider and specialty bottles of booze shaped like log cabins. (Supporters even composed a song for him called “Good Hard Cider.”)

John Tyler

According to Smithsonian, Tyler was “the only president to commit a public act of treason against the U.S. government.” Sixteen years after leaving office, Tyler embraced Virginia’s decision to secede from the Union and was elected to the Confederate Congress. He died before taking his seat, but the damage was done: President Lincoln reportedly refused to lower the flag to half-staff.

James K Polk, the 11th President of the United States.
Credit: MPI via Getty Images

James K. Polk

For two days a week, any Joe Schmoe could walk off the street and visit President Polk in the White House for an evening reception. These “office hours” weren’t very productive, though: To Polk’s annoyance, many visitors merely came in asking for a job.

Zachary Taylor

The second President to die in office did not smoke or drink, but he did chew tobacco — lots of it. In fact, he was an impressive spittle-shooter. According to the book Presidents: A Biographical Dictionary, Taylor had “the reputation of never missing the sand-filled box that served as a spittoon in his office.”

Millard Fillmore, the 13th president of the United States.
Credit: Universal History Archive via Getty Images

Millard Fillmore

Millard Fillmore was an avid reader who would do nearly anything for a book. On Christmas Eve in 1851, when the Library of Congress caught on fire, Fillmore ran to the scene with a group of Congressmen and “rendered all the service in their power” to stop the fire. Fillmore led the bucket brigade early into Christmas morning.

Franklin Pierce

Franklin Pierce was one of many Presidents who came from a law background, but he was also known for being especially personable and having an excellent memory. He reportedly could give half-hour-long speeches without the help of notes and had “a knack for recalling the names of every juror who ever sat on one of his cases.”

James Buchanan, the 15th President of the United States.
Credit: Universal History Archive via Getty Images

James Buchanan

Buchanan is considered one of America’s worst Presidents; his fence-sitting on the topic of slavery helped set the stage for the Civil War. It was just one of many controversial positions: On the campaign trail, Buchanan argued that a wage of 10 cents was fair for a day’s labor. (At the time, most Americans supposedly needed around $6 a week — or 86 cents a day — to live.) The gaffe earned Buchanan the nickname “10 Cent Jimmy.”

Abraham Lincoln

Honest Abe was an accomplished wrestler. It’s said that, as a young man in Illinois, Lincoln competed in about 300 wrestling contests and lost just one match. In 1830, after he was crowned his county’s wrestling champion, Lincoln wasn’t afraid to trash-talk his opponents: “I’m the big buck of this lick,” he reportedly said. “If any of you want to try it, come on and whet your horns.”

Andrew Johnson, the 17th president of the United States.
Credit: Universal History Archive via Getty Images

Andrew Johnson

Lovers of Alaska’s mountains and streams have Andrew Johnson and his Secretary of State, William H. Seward, to thank. During Johnson’s presidency, Russia sold Alaska to the United States for the price of $7.2 million … in gold. Although most Americans supported the purchase, critics called the land “Walrussia,” “Icebergia,” and Johnson’s “polar bear garden.”

Ulysses S. Grant

The “S” in Ulysses S. Grant’s name doesn’t mean anything. It’s a clerical error. Grant received the erroneous middle initial when a friend of his father, Thomas Hamer, nominated Ulysses for enrollment at West Point. The initial stuck. Writing to his future wife, Grant said: “I have an ‘S’ in my name and don’t know what it stands for.”

Portrait of Major General Rutherford B. Hayes.
Credit: Library of Congress/ Corbis/ VCG via Getty Images

Rutherford B. Hayes

Several Presidents served during the Civil War, but only Hayes was wounded in combat. He had four horses shot from under him. He suffered a wounded knee, a gunshot wound to the left arm, a hit from a spent musket ball to the head, an ankle injury, and a final gunshot wound at the Battle of Cedar Creek.

James A. Garfield

In 1880, Garfield attended the Republican National Convention with no intention of running for president. But when the convention stalled, a delegate nominated Garfield as a compromise, and a stream of unexpected votes flooded in. “This honor comes to me unsought,” Garfield said. “I have never had the presidential fever, not even for a day … I have no feeling of elation in view of the position I am called upon to fill.”

Original photo by Cottonbro/ Pexels

Desserts have long reigned as the pièce de résistance of suppertime, encouraging picky eaters to clear their plates in anticipation of a bubbling pie, warm cookies, or decadently sweet puddings. While the dessert course seems like an ages-old staple, finishing off a meal with dazzlingly decorated cakes and sugary sweets is a relatively new concept attributed to French royalty. Turns out, the word “dessert” — derived from a French term meaning “to clear the table” after a meal — didn’t even hit the modern lexicon until the 17th century, a time when French nobles feasted on a post-dinner selection of preserves, jams, and cookies, calling the course “le fruit.”

Renaming the serving of sweets gained popularity as chefs crafted painstakingly designed sugar sculptures, cakes, and treats — though the term wouldn’t take off until the French Revolution, when commoners more easily gained access to sugar. By the late 1700s, the idea of dessert had spread through Europe and to young America, where home cooks were dishing up cakes, cookies, and tarts once only seen in aristocratic dining rooms. In the centuries since, some sugary dishes have come and gone (like tomato soup spice cake and mock apple pie), while others have maintained their place in dessert history. Here’s the sweet story behind several classic and beloved dessert dishes.

Homemade cheesecake with fresh berries and mint for dessert.
Credit: Mizina/ iStock

Cheesecake

New York City may be the world’s cheesecake capital, but the classic delight was actually created and shared thousands of years before the city’s founding. Originating with the ancient Greeks, early cheesecake was constructed from cheese that had been beaten until smooth and then blended with honey and flour. The crustless delight was served at special events, weddings, and even the first Olympic Games in 776 BCE.

But like any food that has transcended the centuries, the cheesecake morphed only gradually into the dessert we know today. By 160 BCE a newer version emerged with a separately baked crust, and an English recipe from 1390 blended sugar and dried elderflowers with cheese curds before baking the entire dish in a pie shell. When mass-produced cream cheese emerged in the 1870s, cheesecake recipes underwent another change. By the 1930s, New York bakeries had fully adopted the use of a cream cheese and sour cream base paired with a graham cracker crust — creating the cheesecake we enjoy today.

Variety of ice cream scoops in cones with chocolate, vanilla and strawberry.
Credit: VeselovaElena/ iStock

Ice Cream

Ice cream may seem like a modern culinary invention considering its need for refrigeration, but food historians believe it appeared sometime within China’s Tang Dynasty, between 618 and 907 CE. Early ice cream blended goat, buffalo, or cow milk with flour and camphor harvested from evergreen trees, and it reached its creamy state after being plunged into ice pools. Cultures throughout Asia and Europe made similar frozen treats; Arab chefs created fruit sherbets during the Middle Ages, and sugary, milk-infused ice drinks became known as the first Italian sorbettos during the mid-1600s.

By the time ice cream reached North America in the 1700s, similar treats such as gelato had been popular in Italy and France for decades. But upgrades to icehouse technology in the 1800s allowed the American masses to finally enjoy ice cream more regularly. That same century saw a deluge of related innovations. Paper companies produced foldable “ice cream satchels” complete with wire handles for easily transporting the cold confection, and in 1897, Black inventor Alfred L. Cralle patented the Ice Cream Mold and Disher — the original ice cream scoop (from which he never made any money). And while waffle-style ice cream cones likely existed before the 20th century, the 1904 World’s Fair in St. Louis was so successful at popularizing the edible containers that it’s often credited as the site of their invention.

Half a dozen frosted sprinkle doughnuts in an opened box.
Credit: Cottonbro/ Pexels

Doughnuts

Whether you enjoy them for breakfast or as a workday snack, doughnuts have become an iconic treat. World War I catapulted doughnuts to the front of our taste buds, but the circular sweets are actually much older. Archaeologists have unearthed fossilized remnants of prehistoric doughnuts from Indigenous communities in North America, while ancient Romans and Greeks paired fried dough with sweet and savory sauces. A plethora of fritters and fried pastries spread throughout Europe in the Middle Ages before early Dutch settlers brought them to the American colonies under the name “olykoeks,” describing how they were cooked in oil.

By 1750, the first American doughnut recipe was published, allowing cooks to create the desserts at home. But it wasn’t until World War I that doughnut popularity skyrocketed; the Salvation Army’s volunteer “doughnut girls” cooked and distributed the fried rings along trenches and frontlines to homesick soldiers. Shortly after the war’s end, the first doughnut machine popped up in New York City, cementing the dessert as a culinary mainstay.

A plate of red fancy red jello.
Credit: Bildagentur Zoonar GmbH/ Shutterstock

Jell-O

Gelatin-based foods have a long, unsavory history. First emerging in medieval Europe, wiggly foods were hard to come by unless you were nobility. That’s because early gelatin dishes relied on collagen extracted from livestock bones, and boiling the bones to draw it out took days of labor-intensive work. All that boiling, straining, and mixing with other ingredients required a large kitchen staff, making gelatin dishes a status symbol for upper-class diners.

Fast forward hundreds of years to 1845, when inventor Peter Cooper (who also created America’s first steam locomotive) crafted the first gelatin powder, which required little time to set after being mixed with just hot water. Cooper’s “portable gelatin” wasn’t a big hit, and the patent was sold to Pearle and May Wait, owners of a cough syrup and laxative company. The Waits added fruit syrups to the powdered gelatin, launching Jell-O as a jiggly dessert product before selling off the brand in 1899. With the help of magazine ads and radio jingles, the confection’s popularity soared in the 1920s, and it remained a household name for decades to follow.

A high angle view looking down on a freshly baked pumpkin pie.
Credit: RyanJLane/ iStock

Pumpkin Pie

While apple pie’s origins are often misattributed (the first recipe appeared in England around 1381, not the U.S.), pumpkin pie deserves more credit as a purely American dessert. The spiced autumnal pie that now inspires countless fall desserts and drinks was concocted by early English colonists who encountered native pumpkins for the first time. Accounts from the mid-1600s suggest that newcomers to young America were reliant on pumpkins, brewing them in ale and baking them into pies. Because of their easy-to-grow nature, pumpkins also became popular throughout Europe, where countless recipes for the baked squash pies directed chefs to boil pumpkin flesh in milk or mix pumpkin puree with baked apples.

Modern pumpkin pie construction became significantly less laborious around the 1920s, when Libby’s brand launched its first canned pumpkin puree. Most cooks today continue to reach for the store-bought ingredient, though pie purists may prefer to roast their own pumpkins, considering commercial purees actually consist of a sweeter, butternut-like squash. Pumpkin pie may be seasonal, but the tradition of adding your own flair is what keeps it around from year to year — just like every other popular dessert.
