Some problems are eternal — like finding the best gifts for all the people on your holiday list. You have to make a list, set a budget… and potentially fight for the last item at the department store (or rush to beat the millions of other buyers online). While shoppers of yore weren’t brawling in the aisles over Beanie Babies, the act of picking out the right item has long been stressful. Let these popular gifts from the last few decades — and their modern upgrades — give your holiday shopping some inspiration this season.
The years immediately after World War II created a Christmas craze far closer to the holiday celebrations we know today than the reserved observances of decades prior. Aluminum Christmas trees, holiday paper crafts, and outdoor lights sprang to life during Christmases of the 1950s, and with them came the idea of spreading cheer to everyone you know through an extensive holiday shopping list. While toy producers were rolling blockbusters off their assembly lines for kids — think Mr. Potato Head, Frisbees, Hula Hoops, Barbies, and other toys that defined the decade — one adult-oriented gift became so popular that Santa Claus himself endorsed it: cigarettes. Cartons of smokes were cheaply priced and came in festive boxes (with a convenient gift tag attached), making it easy to swing by a drugstore for a prewrapped holiday gift. So long as shoppers knew what brand their secretary, father-in-law, or friend smoked, they had what was then considered a stress-free, solid gift. Cigarette companies went all in, marketing rosy-cheeked Santas with sleighs full of cigarettes while celebrities such as Joan Crawford and Ronald Reagan attested to their affinity for the last-minute gift idea.
Modern Update: Dishing out cigarettes is decidedly outdated, but needing to pick up an affordable gift for your coworker or brother’s new girlfriend is a Christmas conundrum that will likely never be resolved. Virtually anything you can toss in your cart for $5 that doesn’t cause cancer is probably a winner, though a solid, popular choice is themed holiday socks. One YouGov poll reports 56% of Americans appreciate gifted socks, with the holidays being the main source of new socks for one in eight people. Sockmaker Bombas offers higher-end gift boxes, while sites like Goodly feature inexpensive holiday motifs that will warm the recipient’s heart (and feet).
The space race didn’t play out only at the most classified levels — some of the scientific breakthroughs that propelled humans to the moon were transformed into toys, too. Take Wham-O’s SuperBall, which debuted in 1964. Its creator, Norman Stingley, was a chemical engineer who developed a form of artificial rubber and transformed the material into an ever-bouncing ball that became an instant hit. At the height of production, Wham-O rolled 170,000 SuperBalls off assembly lines per day, with more than 20 million sold before the end of the decade. SuperBalls reached peak fad when they influenced the name of an even bigger cultural icon: the Super Bowl. Lamar Hunt, founder of the American Football League, admitted that when it came to naming the first championship football game, his inspiration came from the top-selling toy, which just so happened to be his children’s favorite.
Modern Update: While bouncing balls are unlikely to ever go out of style, SuperBalls were inexpensive gifts that could round out a kid’s holiday haul. Today’s top science-adjacent stocking stuffer? Slime. The main appeal of this trendy goop is its sticky sensation and neon coloring, though physical therapists and psychologists say it has hidden benefits such as encouraging mindfulness and fine motor skills. Slime has moved from the DIY project of its early days to a commercially available goo — you can order a multipack from Play-Doh or give a creative kid their own slime-making kit. (For parents who don’t want to spend the holidays picking pet hair and crumbs from a slime ball, a squishable bead-foam ball is a mess-free alternative.)
Star Wars is one of the most successful sci-fi universes out there, but initially there wasn’t much confidence in its potential. Even creator George Lucas was expecting a box-office flop for his 1977 space opera. After the film’s surprising hyperjump to blockbuster status, Lucasfilm shopped for toy producers to move hit characters like Luke Skywalker, Han Solo, and Princess Leia from the big screen to under the Christmas tree. Lucasfilm settled on Kenner, a smaller manufacturer that crafted odds-and-ends merchandise such as the Escape from Death Star board game, puzzles, and posters. Alas, sure-sellers like action figures couldn’t be made and shipped in the seven months between the film’s release and the winter holidays, thanks to the extensive process of designing, molding, and producing the moveable toys.
In an attempt to still make holiday sales and keep up interest, Kenner debuted a risky idea: the “Early Bird Certificate Package,” a cardboard envelope that included some stickers, a Star Wars Space Club membership card, a cardboard display stand, and a certificate redeemable for four figurines that would be shipped out in early 1978. While some stores refused to stock the packages on the basis that they weren’t actually toys, the certificates sold out in many areas and Kenner’s gamble paid off.
Modern Update: The Star Wars franchise practically sells itself, and toys from its live-action spinoff, The Mandalorian, are still going strong. The six-inch, armor-clad Black Series Mandalorian Figure gets top reviews and is officially licensed by Disney. On the other end of the spectrum, this LEGO Millennium Falcon set, at more than 1,300 pieces, would delight diehard fans who are young or young at heart.
1980s: Cabbage Patch Dolls
We’ve become accustomed to massive Black Friday crowds and fighting in aisles over limited-stock items, but 30-some years ago, seeing parents tear into each other over a toy wasn’t just unheard of; it was truly shocking. That’s why the Cabbage Patch Doll craze lives on in infamy, and hot-selling toys are often compared to the frenzy for the stuffed, one-of-a-kind dolls. While they had been available at some stores across the country earlier, peak demand hit in the winter of 1983. Despite the limited stock, eager-to-please parents scooped up more than 3 million of the dolls by the end of the holiday shopping season, with many shoppers waiting in line for hours to snag one, or nearly rioting when stores ran out. But what made Cabbage Patch Dolls the perfect gift? Probably a mix of exclusivity and marketing. The Cabbage Patch concept centered not on purchasing but on adopting one of its unique dolls, complete with an adoption certificate. And it worked well — by the end of the 1980s, the Cabbage Patch Kids brand had made around $2 billion from its dolls and add-on accessories.
Modern Update: Dolls remain a tried-and-true holiday gift, but the modernized take on gifting a lifelike best friend is less about changing wet diapers (we’re looking at you, Baby Alive) and more about matching the doll to its prospective owner. Our Generation dolls are upsized, measuring 18 inches, and come in both male and female versions with a variety of hair types and skin tones. The dolls also have hobbies, jobs, and interests instead of needing round-the-clock parenting. While similar in size and appearance to the pricier American Girl dolls that were a 1990s staple, Our Generation dolls have become popular in part because of their cost — around $25 compared to American Girl’s $100-plus price tag. And with endless accessories, such as a hot dog cart and even a tiny, ergonomic neck pillow, it’s one doll that can grow with your little one.
1990s: Furby
The 1990s were all about pets — primarily ones that you didn’t have to actually feed, walk, or clean up after. Digital creatures like Giga Pets and Tamagotchi ruled the decade, but one robotic toy took its place as the most loved (and hated) animatronic critter on every kid’s wishlist: Furby. The fur-covered robot could chirp, speak “Furbish,” and dance, and supposedly even learn language and tricks with regular interaction over time. Furby fever had parents stampeding displays only to find limited stock, in part because the gadgets were first released in October 1998, far too close to the holiday shopping season for retailers to build a sizable supply. Within two months, parents had snapped up 1.8 million Furbies, with another 14 million sold the next year. Furby’s run, like that of most fad toys, was short-lived — it ended by the early 2000s, amid a storm of concerns about the robot’s supposed artificial intelligence and potential ability to retain information. (At one point, the National Security Agency banned Furbies over concerns they could act as spying devices.)
Modern Update: Hatchimals are the newest iteration of an animatronic companion. Part of the appeal is waiting for their jumbo-sized eggs to actually hatch (hence the catchy name) and reveal the stuffed robotic plush inside. Despite a limited number first rolling into stores in 2016, Hatchimals unexpectedly became a top holiday contender. Ambitious resellers purchased large quantities that they hawked online at steep markups, creating a Hatchimal black market that the company spoke out against. In the years since, Hatchimals have expanded to include miniature figurines and characters that arrive in plastic eggs but don’t hatch on their own (which is great for parents who are still creeped out by a battery-operated furball).
The year was 2004: Steve Jobs was rocking a black turtleneck, the original iPod had already been out for three years, and Apple’s brightly colored dancing silhouette commercials were getting regular airtime. And that’s when Apple dropped its iPod Mini — the downsized, candy-colored music player that came with 4 (or an upgraded 6) GB of space for all your favorite songs. That holiday season, Apple sold more than 4.5 million iPods across its regular and Mini editions — more than half its total iPod sales for the entire year. Like any hot gift, iPod Minis could be difficult to come by; even with so many in production, many models were put on backorder while desperate gift-givers scoured eBay’s marked-up options. While the Mini (and its whopping $249 price tag) was quickly left behind for improved models, some people still swear by the simple interface and classic click-wheel design (which brings on some good ’00s nostalgia).
Modern Update: The iPod Mini (and nearly every other version of the iPod) faded into the background as faster smartphones began to provide an all-in-one experience. And along with the ability to listen on demand came the ability to pick and choose new music without paying for every individual song piecemeal. Music streaming services such as Spotify and Apple Music have made it easier to listen to all the newest beats and old favorites on loop without shelling out for entire albums. So while you can’t give the gift of an aluminum-backed mp3 player any longer, a music subscription gift card for ad-free listening will be just as prized. (Gift cards to the online music store Bandcamp also allow music fans to directly support the artists they love.) Because what says “Happy Holidays” better than a gift you didn’t have to fight for?
Countless movies are based on real events, and most of them are quick to let you know it. Whether the plot is ripped from the headlines or merely adapted, taking inspiration from real-world happenings can confer a sense of legitimacy — the implication being that even if creative license was taken (and it almost certainly was), the filmmakers are performing a kind of public service by bringing these stories to the big screen. Not all of these movies advertise their pedigree, however, and there’s a good chance you didn’t realize these four movies were based on real events.
Freddy Krueger isn’t real and there have been zero confirmed cases of teenagers being murdered in their dreams (thankfully!), but that doesn’t mean that Wes Craven’s landmark slasher series wasn’t inspired by a real story. Years before dreaming up Elm Street, the horror maestro became fascinated by a series of newspaper articles about refugees from Laos, Cambodia, and Vietnam who were afflicted with nightmares so disturbing that they forced themselves to stay awake — and, in some cases, died upon finally falling asleep.
“It was a series of articles in the L.A. Times; three small articles about men from South East Asia, who were from immigrant families and had died in the middle of nightmares — and the paper never correlated them, never said, ‘Hey, we’ve had another story like this,’” Craven explained in a 2008 interview. Other research has shown that the phenomenon primarily affected Laotian male refugees from the Hmong ethnic group, which fought alongside the U.S. in the Vietnam War and was persecuted in Laos after the war ended. Many later suffered traumatic resettlements in the U.S. In the newspaper articles, there were no reports of a man wearing a striped red-and-green sweater — but the core of the idea was the same.
Whether or not the mobsters in Martin Scorsese’s crime classic are actually good fellas is debatable, but one thing is certain: They were at least based on real fellas. Adapted from Nicholas Pileggi’s book Wiseguy: Life in a Mafia Family, Goodfellas envisions mafioso-turned-informant Henry Hill as a made man whose life of crime represents a fulfillment of his childhood dream — there’s a reason the movie’s first line is, “As far back as I can remember, I always wanted to be a gangster.”
The fact that Scorsese had already directed revered crime pictures such as Mean Streets and Taxi Driver made him reluctant to make another, but coming across Wiseguy was more than enough to change his mind. “I just read your book. I’ve been looking for it for years,” Scorsese told Pileggi over the phone when pitching the idea of adapting it. “Well, I’ve been waiting for this call all my life!” Pileggi replied. The rest, as they say, is history.
If you believe that truth is stranger than fiction, you won’t be surprised to learn that Three Billboards Outside Ebbing, Missouri’s inventive premise was born of more than writer-director Martin McDonagh’s imagination. The Oscar-winning drama stars Frances McDormand as a grieving mother who, months after the rape and murder of her daughter, takes matters into her own hands by calling out law enforcement’s lack of progress on the case with a series of accusatory billboards.
McDonagh revealed how the idea came to him in an interview conducted shortly after the film’s release: “Twenty years ago I was on a bus going through the southern states of America, and somewhere along the line, I saw a couple of billboards in a field that were very similar to the billboards that we see in the start of our story,” he told Deadline in 2018. “They were raging and painful and tragic, and calling out the cops.” McDonagh received an Academy Award nomination for his screenplay, and a number of protest groups have since used similar billboards to make their voices heard.
Plenty of people consider The Exorcist the scariest movie ever made, and the fact that it’s based on a true story only adds to the terror. The actual practice of exorcism is highly controversial, so when writer William Peter Blatty based his 1971 novel on a particularly disturbing episode he’d first heard about in college, it was perhaps surprising that it was so well received. Blatty adapted the story of a 14-year-old boy whose family had believed he was possessed by a demon. A number of Jesuit priests performed the exorcism in 1949, which one account claims was witnessed by at least 48 people.
“The little boy would go into a seizure and get quite violent,” one of the priests recalled; at one point the boy even broke that priest’s nose, and words like “hell” appeared etched into his skin. Skeptics doubt that the teenager was ever actually possessed, of course, and the boy reportedly went on to lead “a rather ordinary life.” Blatty wrote the script for William Friedkin’s hugely successful adaptation of his novel, and the author-turned-screenwriter won an Academy Award.
History is filled with the stories of amazing women, from female pirates to tireless civil rights advocates. And while many of these stories are important feminist firsts, some feature the quirkier side of women’s history — like the famed mystery writer who helped popularize surfing. From women who authored important legal arguments to those whose inventions made our lives easier, here are 25 facts that will help you celebrate Women’s History Month.
On the list of things women don’t get enough credit for, being the first to brew beer might not seem like the most important. But fermented beverages have played a vital role in human culture for almost as long as society has existed, providing nutrients, enjoyment, and often a safer alternative to drinking water before the advent of modern sanitation. Scholars disagree over exactly when beer was first introduced — the earliest hard evidence for barley beer comes from 5,400-year-old Sumerian vessels that were still sticky with beer when archaeologists found them — but one thing has never been in question: “Women absolutely have, in all societies, throughout world history, been primarily responsible for brewing beer,” says Theresa McCulla, who curates the Smithsonian’s American Brewing History Initiative.
Ada Lovelace Is Often Considered the World’s First Computer Programmer
Ada Lovelace followed a path many considered impossible for a woman in the early 19th century. Encouraged by her mother, Lady Byron, Lovelace developed a passion for mathematics at a young age. In 1833, a 17-year-old Lovelace met British mathematician Charles Babbage at a party, and he told her about a calculating machine he’d created called the Difference Engine. Fascinated, Lovelace eventually began a regular correspondence with Babbage.
About a decade later, while translating a French text regarding Babbage’s proposed Analytical Engine — often considered the first mechanical computer — Lovelace added a few notes of her own. “Note G” detailed a method through which Babbage’s creation could calculate complex numbers called Bernoulli numbers. This is often considered the world’s first computer program, making Lovelace the first computer programmer. And while Babbage was the brains behind the machine, Lovelace was the one who truly grasped its wider importance, foreseeing a future where engines could use the “abstract science of operations” to do things beyond mere computation.
In fact, many early computer programmers were women. In the 1940s and ’50s, engineering computers was perceived as a man’s profession, but programming them was considered secretarial. As a result, many women took jobs as programmers — helping Alan Turing crack the Enigma machine during World War II, writing instructions for ENIAC, the world’s first general-purpose computer, and creating the world’s first compiler (a program that translates programming languages into machine languages). According to government data, around 27% of programmers in 1960 were women. In 2013, that number was 26% and falling. Today, many leading universities are working hard to reverse that trend.
As the first female head of a major Hollywood studio — Desilu Productions, which Lucille Ball formed with then-husband Desi Arnaz but took over by herself after their divorce in 1960 — Ball helped produce some of the most influential television shows of all time. She was particularly instrumental in getting Star Trek on the air. There was apparently some trepidation among Desilu board members when it came to the budget of the ambitious series, leaving Ball to personally finance not one but two pilots of the science fiction mainstay. One studio accountant, Edwin “Ed” Holly, even claimed: “If it were not for Lucy, there would be no Star Trek today.”
Not all pirates were men: Ching Shih was a fearless female pirate from China. Following the 1807 death of her husband Cheng I, who was head of the powerful Red Flag Fleet, she unofficially commanded a fleet of 1,800 pirate ships and approximately 80,000 men. She also took control of the Guangdong Pirate Confederation and spent the following years waging battle — and winning — against the Portuguese Empire, the Chinese Navy, and Britain’s East India Company. She’s widely considered one of the most successful pirates of all time.
Before Rosa Parks, Claudette Colvin Refused to Give Up Her Seat on the Bus
Nine months before Rosa Parks was arrested for refusing to surrender her bus seat to a white passenger in Montgomery, Alabama, the same thing happened to 15-year-old Claudette Colvin. So why was the Parks incident the one that ignited the Montgomery bus boycott and transformed the issue into a national story? As Colvin herself later conceded, the then-42-year-old Parks, a secretary for the NAACP, was considered by some to be a more respectable symbol for the boycott, particularly after it was discovered that the unwed Colvin had become pregnant.
Nevertheless, Colvin wound up playing a crucial role as events unfolded: She was named a plaintiff in the 1956 Browder v. Gayle case that challenged the constitutionality of Alabama’s segregated buses and provided the legal backbone for the boycott’s triumph. Colvin left Alabama soon after and spent most of the following decades living anonymously in New York City, though her contributions have finally earned some long-overdue recognition in recent years.
While Ireland is named after the mythical goddess Ériu, there’s only one sovereign nation in the world named for a real-life woman. That distinction lies with St. Lucia, a Caribbean island nation christened in honor of St. Lucy of Syracuse, patron saint of the blind, who died in the early fourth century CE.
St. Lucia was initially called Louanalao (meaning “Island of the Iguanas”) by the Indigenous Arawak people as early as 200 CE. The origins of its current name date to 1502, when shipwrecked French sailors dubbed the place “Sainte Alousie.” It was a common practice at the time to name islands after saints, and legend has it that the sailors reached the island on December 13 — St. Lucy’s feast day. Given the date’s significance, December 13 is now celebrated in the country as the National Day of St. Lucia. The Spanish who arrived around 1511 named the island “Sancta Lucia”; the current name took hold after waves of colonization by the English and French.
While female namesakes are rare on a national level, one woman has lent her name to dozens of smaller locations. The name of Queen Victoria, the U.K.’s reigning monarch from 1837 to 1901, appears in the titles of locations around the globe, such as the provincial capital of British Columbia, Canada, and Zimbabwe’s breathtaking Victoria Falls. You’d be hard-pressed to find an American woman with influence so vast. Even in the U.S., only a handful of places are named for women, including Barton County, Kansas — named after Clara Barton, founder of the American Red Cross — and Dare County, North Carolina, honoring Virginia Dare, the first child of English parents to be born in the New World.
Cleopatra’s legacy is so complicated because it tangles with historical biases against strong, female rulers and the propaganda of the early Roman Empire. Today, most people know Cleopatra as a seductress, one who had romances with two of the most powerful Roman leaders in the first century BCE, and who used her sex appeal to manipulate geopolitics in her favor. However, the source of many of these colorful tales is Octavian’s (later Caesar Augustus’) propaganda machine; he launched the equivalent of a fake news campaign to discredit the foreign queen and his rival Mark Antony. When Octavian proved victorious against Antony and Cleopatra at the Battle of Actium in 31 BCE, the victors became the authors of history, and it has taken millennia for scholars to learn more about the real life of this fascinating final pharaoh.
Amelia Earhart Once Took Eleanor Roosevelt on a Nighttime Joyride
Although her aviation career lasted just 17 years, Amelia Earhart remains one of the most famous people ever to take to the sky. In addition to being renowned for her many firsts — including being the first woman to fly solo across the Atlantic and the first person to fly alone from Hawaii to the mainland U.S. — she’s known for her 1937 disappearance and the many theories it spawned. Less well known but considerably more fun to imagine is the time she took Eleanor Roosevelt on a nighttime joyride from Washington, D.C., to Baltimore on April 20, 1933. The brief flight took place with both of them in their evening wear following a White House dinner party.
“I’d love to do it myself. I make no bones about it,” the First Lady told the Baltimore Sun after the flight. “It does mark an epoch, doesn’t it, when a girl in an evening dress and slippers can pilot a plane at night.” In fact, Roosevelt herself had recently received a student pilot license and briefly took over the controls of the twin-engine Curtiss Condor, borrowed from Eastern Air Transport at nearby Hoover Field. Eleanor’s brother Hall also ditched the dinner party in favor of the flight that night, as did Thomas Wardwell Doe, the president of Eastern Air Transport, and Eugene Luther Vidal (head of the Bureau of Air Commerce) and his wife Nina Gore, parents of author Gore Vidal. When the plane returned after the short journey, the Secret Service guided everyone back to the White House table for dessert. Roosevelt and Earhart remained friends for the rest of Earhart’s life, sharing an interest in women’s causes, world peace, and of course, flying.
Melitta Bentz’s invention is one coffee drinkers now take for granted, but it was revolutionary in the early 1900s. At the time, other home brewing methods required a lot of time and cleanup — not to mention a tolerance for bitter coffee and sludgy grounds at the bottom of your mug. While pricey cloth coffee filters were available, they were used like tea bags, steeping grounds in hot water that produced a subpar cup and a mess. Many coffee connoisseurs brewed their morning java in percolators, but those could leave a burnt taste and failed to filter out smaller grounds.
Bentz, a German woman with an affinity for coffee, was determined to find a better brewing process that didn’t require extensive cleanup. During one experiment, she reached for notebook paper as a potential liner, filling the makeshift filter with coffee grounds. She placed the filter inside a pot she had punched holes in and poured hot water over the grounds, allowing brewed coffee to cleanly drip through to a cup below. With the creation of drip coffee brewing, Bentz began producing the paper filters at home, and was granted a patent for her drip-cup apparatus in 1908. With help from her family, she launched a line of drip-coffee makers and filters in 1909, branding the items with her own first name. Bentz died in 1950, but her company — now run by her grandchildren — produces nearly 50 million coffee filters each day.
The Origin Story of the “Bra-Burning Feminist” Is a Myth
Think of the Swinging ’60s and you might imagine one of the most popular stereotypes: women burning their brassieres to protest society’s rigid rules. The stunt has been referenced in discussions about gender equity for decades — but it turns out it never actually happened at the event most often mentioned in connection with it. Here’s what did: On September 7, 1968, members of the group New York Radical Women gathered outside the Miss America Pageant on New Jersey’s Atlantic City boardwalk to protest the event. Their argument? The pageant degraded women by promoting unrealistic beauty standards and strict social expectations. The protest was also meant to highlight larger issues American women faced, such as being denied their own credit cards or the right to continue working during pregnancy. Protest organizers originally planned to burn items related to their discontent, such as bras and makeup, but local police shut down the stunt, citing safety concerns around a fire on the boardwalk. Instead, the group hauled out a metal “freedom trash can,” which became a disposal site for undergarments, cookware, wigs, and issues of Playboy magazine — all items participants deemed “instruments of female torture.” The gathering also crowned a live sheep in a mockery that compared beauty pageant participants to fairground show animals.
Even without a blaze, bra burning became synonymous with the women’s liberation movement. A New York Post article linking the pageant protest with draft card burning misconstrued events, an error some historians say popularized the belief that feminists were setting fire to their undergarments. And while it’s possible that later demonstrations inspired by the fictitious fire actually torched a bra or two, large-scale bra burnings weren’t recorded events. Some activists believe the lingerie legend overshadowed the event’s larger message, but that it wasn’t all bad — the famed protest helped catapult the women’s equality movement into mainstream conversations.
In 1887, Nellie Bly launched her first undercover story for The New York World, becoming a “girl stunt reporter,” part of a then-popular movement of female reporters who embedded themselves in investigations to expose dangerous working conditions, corrupt public figures, and social atrocities. Bly’s initial investigation involved a 10-day stay at the infamous Blackwell’s Island Asylum in New York City, where women experiencing mental health crises (as well as others sent there for a variety of reasons, including not speaking English) were subjected to cruel “treatments,” rotten food, and abuse. After her release, Bly penned a story that exposed the institution’s horrors and led to public calls for improved conditions, including a grand jury investigation and budget increases to properly house and help patients. It also became one of her most famous books, titled Ten Days in a Mad-House.
State seals are often crimped or stamped on legal documents, lending them authenticity. Yet these small symbols have another role, as miniature visual histories specific to each state, often simultaneously representing hopes for the future. At least that’s how artist Emma Edwards Green viewed the seal she created for Idaho in 1891 — which just so happens to be the only state seal designed by a woman.
Idaho became the 43rd state on July 3, 1890, formed from a territory that had once included land in present-day Montana and Wyoming. Upon statehood, Idaho legislators looked to commission the state seal’s design by way of a competition, with a generous $100 prize (about $3,300 today) for the winning artist. Green, an art teacher who had relocated to Boise after attending school in New York, was in part inspired by the fact that it seemed Idaho would soon give women the right to vote. In March 1891, Green’s work was selected as the winner, beating out submissions from around the country.
The final design, which is also featured on Idaho’s flag, is packed with symbolism. Worked into the design are cornucopias and wheat to represent Idaho’s agriculture, a tree meant to be reminiscent of the state’s vast timberlands, and a pick and shovel held by a miner. Green’s most forward-thinking detail, however, is a man and woman standing at equal heights in the seal’s center, a symbol of gender equality that would eventually come with voting rights for all. True to their word, Idaho legislators passed women’s suffrage in 1896 — five years after Green’s seal became the state’s official symbol — making Idaho the fourth state to enfranchise women, more than 20 years before the 19th Amendment gave the same right to women nationwide.
The Inventor of the Home Security System Was a Nurse
Necessity is the mother of invention, and that can certainly be said of Marie Van Brittan Brown and her home security system. In the mid-1960s, Brown lived in a rough neighborhood in Queens, New York, while working as a nurse. She was often alone at night, so she decided to design her own peace of mind. Her invention featured four peepholes on the front door and a motorized camera that could look through the holes at varying heights. The camera was connected to a television inside the home, and a microphone both inside and outside the door allowed her to interrogate uninvited visitors. For added security, Brown also devised a way to alert police via radio. This ingenious use of cameras and closed-circuit television helped Brown score a patent for her security system in 1969. Today, Brown’s invention is widely regarded as the cornerstone of modern home security systems.
Pauli Murray’s Legal Arguments Were Used in Brown v. Board of Education
Pauli Murray was enormously influential as a lawyer, writer, and teacher. She became California’s first Black deputy attorney general in 1945, as well as the first African American to earn a Doctor of Juridical Science from Yale Law School two decades later. Additionally, the acclaimed scholar saw her legal arguments used in the groundbreaking cases of Brown v. Board of Education (1954), which struck down segregation in public schools, and Reed v. Reed (1971), which extended the protections of the 14th Amendment’s Equal Protection Clause to women.
Publicly critical of the sexism rife within the ranks of the Civil Rights Movement, Murray helped launch the National Organization for Women (NOW) in 1966. Eventually, she found herself out of step with its leadership and stepped away. On her own once again, Murray resigned from her teaching post and entered New York’s General Theological Seminary, en route to one final historic achievement in 1977 as the first African American woman to be vested as an Episcopal priest.
Agatha Christie’s characters have done it all — survived attempted murder, traveled to far-off lands, and solved mystery after mystery. But the bestselling author didn’t just write about adventure; she also sought it out, sometimes on a surfboard. Two years after publishing her first novel, Christie embarked on an international trip with her first husband, Archibald. Their 1922 stop in South Africa included an attempt at surfing, where she may have become the first Western woman to stand up on a surfboard. The globetrotting couple quickly fell in love with the sport, and went on to catch swelling waves off the coasts of Australia, New Zealand, and Hawaii. Christie, in letters to her mother, recounted the tricky experience of learning to surf, describing the sport as “occasionally painful” thanks to a “nosedive down into the sand.” But the writer eventually became more skilled, detailing in her 1977 autobiography that nothing could compete with the rush of approaching shore at high speeds. She also wrote about surfing in her novel The Man in the Brown Suit, in which her protagonist, nicknamed “Anna the Adventuress,” goes surfing in Cape Town.
Christie’s pursuit of the perfect wave was unusual for an Englishwoman of her time. The Museum of British Surfing suggests she and her husband may have been two of the earliest Brits to attempt the activity. However, they did have regal company: Prince Edward, the British royal who would eventually abdicate the throne in 1936 to marry Wallis Simpson, was photographed surfing in Hawaii two years before Christie rented her first surfboard.
Clearing away dinner dishes is easier (and faster) today than it was in 1886, when Josephine Cochrane patented the first mechanical dishwasher. As a frequent host of dinner parties at her Shelbyville, Illinois, mansion, Cochrane was concerned about maintaining her fine dishware’s pristine condition. But as a busy socialite, she didn’t want to do the tedious work of scrubbing each piece herself to ensure it stayed that way; she delegated the task to servants, whose work occasionally caused chips and cracks. Cochrane’s solution was to create a dishwashing unit that kept her costly tableware out of the slippery sink, holding it stationary while jets of water sprayed it clean.
Cochrane, the daughter of an engineer and granddaughter of a steamboat innovator, was likely familiar with inventive tinkering despite lacking formal education in science or math. But after her husband’s death in 1883 left her with looming debt and few resources to pay it off, her dishwashing contraption transformed from a time-saving idea into a path for financial security. Cochrane was awarded a patent for her dishwasher design three years after being widowed and displayed her innovation at the World’s Columbian Exposition of 1893, where visitors marveled at the event’s only machine created by a woman. With exposure from the fair, Cochrane began marketing her contraptions to hotels, restaurants, and hospitals. (The cost was often too much for homemakers.) After her death in 1913, Cochrane’s company was purchased by Hobart Manufacturing Company, the original producer of KitchenAid-brand products.
Mary Katharine Goddard Was the First Known Female Postmaster in Colonial America
Mary Katharine Goddard was among the first female publishers in the U.S., a socially precarious venture for a colonial woman during the country’s fight for independence. Working with her mother, Sarah, and brother, William, Mary Katharine founded multiple publications starting in the 1760s. William frequently traveled between cities to establish new papers, leaving the bulk of news collecting and printing to his sister. In 1774, he appointed Mary Katharine to run The Maryland Journal while he focused on other pursuits (such as lobbying for a national postal service) and served time in debtor’s prison. During the height of the Revolutionary War, Mary Katharine made a name for herself with fiery anti-British editorials. In 1775, she was appointed Baltimore’s first postmaster — likely the first woman to hold such a position in colonial America — and in 1777, Congress commissioned her to print copies of the Declaration of Independence. (Surviving copies feature her printer’s mark at the bottom.) Despite her success, however, Mary Katharine was pushed out of both roles at the war’s end. In 1784, William rescinded her title as publisher, creating a lifelong rift between the siblings. Not long after, she was also removed from her postmaster job on the basis of sex. She wrote to George Washington asking to be reinstated, but the President passed her complaint to the postmaster general, who left her plea unanswered.
Jennifer Lopez Inspired the Creation of Google Images
Jennifer Lopez has worn a lot of memorable dresses on a lot of red carpets over the years, but only one broke the internet to such an extent that it inspired the creation of Google Images. The multi-hyphenate entertainer first wore the plunging leaf-print silk chiffon Versace gown to the 2000 Grammy Awards in L.A., which former Google CEO Eric Schmidt later revealed led to “the most popular search query we had ever seen.” The problem was that the then-two-year-old search engine “had no surefire way of getting users exactly what they wanted: J.Lo wearing that dress.” Thus, in July 2001, “Google Image Search was born.”
Two decades later, to the delight of everyone in attendance, Lopez also closed out Versace’s Spring 2020 show in Milan by wearing a reimagined version of the dress, after other models walked the catwalk to the tune of her hit 2000 single “Love Don’t Cost a Thing.” After a projected montage of Google Image searches for the original dress and a voice saying, “OK, Google. Now show me the real jungle dress,” J.Lo herself appeared in an even more provocative and bedazzled rendition of the gown.
Lyda Conley Was the First Native American Woman to Argue a Supreme Court Case
Lyda Conley’s legacy was preserving that of her ancestors — specifically their final resting place. Conley acted as a staunch (and armed) defender of the Wyandot National Burying Ground, a Kansas cemetery at risk of sale and destruction some 60 years after its creation. The cemetery was established in 1843 following typhoid and measles outbreaks that took hundreds of Wyandot lives; the loss was a particular blow to an Indigenous community that was forcibly relocated thanks to broken treaties with the U.S. government and the cruel Indian Removal Act of 1830. In 1890, Kansas senators introduced legislation to sell the burial ground. Although it failed, the effort encouraged Lyda Conley to attend law school to defend the cemetery in which her own parents, siblings, and grandparents were interred. Conley was admitted to the Missouri Bar in 1902, and within four years put her legal skills to work as the federal government moved to sell the cemetery. Conley and her sister Lena began a legal and physical siege for its protection, building an armed watch station called Fort Conley on the grounds and warning, “Woe be to the man that first attempts to steal a body.” In 1910, her legal fight made its way to the U.S. Supreme Court, where she became the first Native American woman (and third woman ever) to argue a case before the judges. While the court ruled against her, years of media coverage about the cemetery worked in her favor. In 1913, the Kansas Senate passed legislation protecting the cemetery, which was designated a National Historic Landmark in 2017.
Jackie Kennedy Helped Save Grand Central Terminal From Being Demolished
Much like she did in preserving the history of the White House, Jackie Kennedy played a key role in maintaining one of New York City’s most prominent landmarks. In the mid-1970s, developers hatched a plan to demolish part of Grand Central Terminal to build an office tower. The former First Lady was among a group of notable New Yorkers who objected to the plan, and in 1975, she spoke at a press conference at Grand Central’s famed Oyster Bar restaurant to protest the destruction of the beaux arts-style structure. She and other preservationists worked to ensure the building’s protection, which was ultimately assured by the U.S. Supreme Court decision Penn Central Transportation Co. v. New York City. A plaque dedicated in 2014 at the entrance on 42nd Street and Park Avenue honors Jacqueline Kennedy Onassis for her role in saving the indelible Manhattan icon.
And Grand Central Terminal isn’t the only NYC landmark to commemorate her legacy. Located at the northern end of Central Park, where Jackie was known to jog, the Jacqueline Kennedy Onassis Reservoir pays homage to the former First Lady’s contributions to the city. The artificial body of water, constructed between 1858 and 1862, spans 106 acres and was the largest human-made body of water in the world at the time of its creation.
During World War II, movie star Hedy Lamarr and modernist composer George Antheil came up with a “secret communication system” that used “frequency hopping” between radio signals to direct torpedoes without enemy interference. Lamarr and Antheil received a patent in August 1942 and offered their invention to the U.S. military. But the government wasn’t interested in the invention or Lamarr’s intelligence — instead, the actress was informed that her beauty was the best way to help the war effort. Rather than push back against this sexist suggestion, Lamarr went on to sell millions of dollars’ worth of war bonds.
The frequency-hopping system that Lamarr and Antheil invented during World War II was adapted by the U.S. Navy and used during the 1962 Cuban Missile Crisis. It later contributed to technological innovations such as Bluetooth and GPS. Yet Lamarr’s contribution was long ignored. She expressed her feelings about this in a 1990 interview: “I can’t understand why there’s no acknowledgment when it’s used all over the world.” Lamarr was slightly mollified when the Electronic Frontier Foundation recognized her with a Pioneer Award in 1997.
The Legend of Zelda Video Game Was Named for F. Scott Fitzgerald’s Wife
Video games aren’t often associated with literary figures, but The Legend of Zelda has always been unique. Take, for instance, the fact that its title character was named after writer, artist, and Jazz Age icon Zelda Fitzgerald, whose marriage to The Great Gatsby author F. Scott Fitzgerald generated nearly as many headlines as his professional output. Zelda, who’s been described as the first flapper of the Roaring ’20s (and the inspiration for Gatsby’s Daisy Buchanan), was chosen because a Nintendo PR rep suggested that the eponymous princess should be “a timeless beauty with classic appeal” and that Zelda Fitzgerald was one such “eternal beauty.”
Shigeru Miyamoto, the game’s creator, agreed: “She was a famous and beautiful woman from all accounts, and I liked the sound of her name,” he has said. The name chain didn’t end there; actor Robin Williams was such a fan of the series that he named his daughter after the Princess of Hyrule. As for Zelda F. herself, she was — rather fittingly — named for the fictional heroine of a 19th-century novel.
Only One U.S. First Lady Has Ever Been Featured on Paper Currency
Five Presidents are featured prominently on U.S. bills currently in circulation — George Washington, Thomas Jefferson, Abraham Lincoln, Andrew Jackson, and Ulysses S. Grant. Yet only one First Lady has been given the same honor: Martha Washington. She also happens to be the only real-life woman (as opposed to mythical figures representing abstract concepts such as liberty) to have her portrait printed on U.S. paper currency. In 1896, Martha appeared alongside her husband on the back of the $1 note in a design commemorating 120 years of American history, but a decade prior she had her own bill — the U.S. Treasury’s $1 silver certificate. First released in 1886 — 84 years after her death and 17 years after $1 bills began featuring George Washington — the silver certificate could be exchanged for precisely one dollar’s worth of silver. The bills were eventually discontinued in 1957, yet the design featuring Martha remains the second-longest-issued paper money in U.S. history.
Eleanor Roosevelt Wrote a Newspaper Column for Nearly 30 Years
Starting at the very end of 1935 and continuing until her death in 1962, Eleanor Roosevelt kept a regular, nationally syndicated newspaper column called “My Day.” Eventually, it appeared in 90 different U.S. newspapers, detailing both her actions of the day and causes she supported — including ones that perhaps diverged a little from FDR’s views. After her husband’s death, she spoke even more freely about her viewpoints, and chose to keep advocating through her writing instead of running for office herself. Some newspapers dropped her column after she advocated for the election of Adlai Stevenson II in his run against Dwight D. Eisenhower in 1956, leading United Features Syndicate to instruct her to limit her support for candidates, which she did not do. For the majority of the run, Eleanor published six columns a week; only after her health began to decline in the last couple of years of her life did she cut that down to three.
Following four bloody years of the U.S. Civil War, two women called for a “mother’s day” to push for peace. In the summer of 1865, Ann Jarvis created Mothers’ Friendship Days in West Virginia that aimed to bring together Americans from all political backgrounds, and she continued the annual tradition for years. Inspired by Jarvis, Julia Ward Howe — who famously penned the lyrics to “The Battle Hymn of the Republic” — also wrote an “Appeal to Womanhood Throughout the World” in 1870, highlighting men’s role in war and calling on women to resist being “made a party to proceedings which fill the world with grief and horror.” She also tried to establish June 2 as “Mother’s Day for Peace.” However, it wasn’t until 1908 that Anna Jarvis (the daughter of the West Virginia peace activist) celebrated a “Mother’s Day” in May in honor of her deceased mother. Within a decade, the observance became a nationally recognized holiday.
Take a trip down memory lane with this collection of nostalgic TV show facts from around the website. Did you know that M*A*S*H was based on a true story? How about the fact that the pilot of I Love Lucy was lost for over 40 years? Dig deeper into these stories and more with 25 of our favorite vintage TV show tidbits.
The “M*A*S*H” Finale Was Watched by More People Than Any Other Series Finale
After 11 years on the small screen, M*A*S*H aired its series finale on February 28, 1983 — and made history in the process. More than 106 million people tuned in to watch “Goodbye, Farewell and Amen,” making it the most-viewed series finale ever. Until Super Bowl XLIV in 2010, which saw the post-Hurricane Katrina New Orleans Saints defeat the Indianapolis Colts, it was the most-watched television broadcast in U.S. history. No episode of a scripted series has come close in the decades since. The series finale of Cheers earned 80.4 million viewers, Seinfeld got 76.3 million, and Game of Thrones — the most-talked-about show on television for years — had 19.3 million.
More Than 100 Cheesecakes Were Eaten on “The Golden Girls”
On The Golden Girls, there were very few problems that a slice of cheesecake couldn’t solve, from small scuffles to big life crises. Throughout seven seasons, more than 100 cheesecakes were eaten during the ladies’ late-night kitchen table commiserations.
However, if you look closely, you’ll notice that Dorothy rarely takes a bite. In real life, Bea Arthur reportedly hated cheesecake.
Lucille Ball Was Only the Second Woman to Appear Pregnant on Network TV
When Lucille Ball became pregnant in real life, she and her husband and co-star, Desi Arnaz, considered taking a hiatus from I Love Lucy — but then thought it would be an opportunity to break the mold. “We think the American people will buy Lucy’s having a baby if it’s done with taste,” Arnaz said. “Pregnant women are not kept off the streets, so why should she be kept off television? There’s nothing disgraceful about a wife becoming a mother.” Ball ended up being one of the first women to appear pregnant on a major television network and received more than 30,000 supportive letters from fans, despite the fact that the cast wasn’t allowed to say the word “pregnant” on-screen.
Angela Lansbury Wasn’t the First Choice for Jessica Fletcher in “Murder, She Wrote”
It’s nearly impossible to imagine anyone but Angela Lansbury playing Jessica Fletcher, but she wasn’t a shoo-in for the job. Doris Day turned it down; Jean Stapleton (aka Edith Bunker) also declined, partly because she didn’t feel ready to jump into another series so soon after wrapping up the 1970s sitcom All in the Family. “Every time I saw Angela during those years, she’d say, ‘Thank you, Jean,’” Stapleton once said.
Out of all of her roles, Lansbury ended up identifying the most with Fletcher. “The closest I came to playing myself … was really as Jessica Fletcher,” Lansbury told Parade magazine in 2018. However, in 1985 — a year after the show began — she also told The New York Times: “Jessica has extreme sincerity, compassion, extraordinary intuition. I’m not like her. My imagination runs riot. I’m not a pragmatist. Jessica is.”
“Masterpiece Theatre” Is the Longest-Running Prime-Time Drama in the History of U.S. Television
Masterpiece Theatre premiered its first episode on January 10, 1971, following the success of a 1967 adaptation of John Galsworthy’s The Forsyte Saga. Stanford Calderwood, who was then the president of WGBH, Boston’s PBS affiliate, saw that success and wondered whether there might be a growing American appetite for British drama. His instincts proved spot-on. While on vacation in London, he convinced executives at the BBC that a partnership could prove fruitful for both networks; now, 50 years later, American viewers continue to clamor for classic British stories told with beautiful sets and elaborate costumes.
Valerie Harper Almost Didn’t Get the Role of Rhoda on “The Mary Tyler Moore Show” Because She Was Too Pretty
Rhoda Morgenstern, Mary Tyler Moore’s Bronx-born sidekick, was the last major role to be cast in the series, with more than 50 actresses reading opposite Moore for the part. Valerie Harper nailed her audition as Rhoda and even brought her own cloth for washing Mary’s apartment window in her first scene. But the producers weren’t sure she matched their vision.
“She was something we never expected the part to be… which is someone as attractive as she was,” series co-creator Allan Burns said in Mary and Lou and Rhoda and Ted. “But you’ve got to go with the talent.” Director Jay Sandrich felt strongly that Harper was right for the role and suggested she not wear any makeup for her callback. Producers immediately changed their minds when they brought Moore in to read a scene with Harper. Rhoda’s character switched gears a little bit — rather than being unattractive, which is subjective anyway, Rhoda just felt that she was.
“Rhoda felt inferior to Mary, Rhoda wished she was Mary,” Harper later recalled. “All I could do was, not being as pretty, as thin, as accomplished, was: ‘I’m a New Yorker, and I’m going to straighten this shiksa out.’”
“Rubber Duckie” Was a Billboard Hit Song
Of all the catchy and memorable songs on Sesame Street, the only one to ever become a certified Billboard hit was “Rubber Duckie,” which was on the Hot 100 for seven weeks in 1970, topping out at No. 16. The tune was performed by Jim Henson himself, in character as Ernie — and was also nominated for a Grammy for Best Recording for Children that year. Little Richard covered the song in 1994, and an all-star version for National Rubber Duckie Day, featuring Tori Kelly, James Corden, Sia, Jason Derulo, Daveed Diggs, and Anthony Mackie, was released in 2018.
Ron Howard Accepted His “Happy Days” Offer to Avoid the Draft
Ron Howard was ambivalent about accepting an offer to headline what became Happy Days, as he’d already experienced sitcom success with The Andy Griffith Show and was looking forward to starting film school at USC. However, he’d also been saddled with what he called a “horrible draft number,” and given that he stood a better chance of avoiding the Vietnam War through work than a college deferment, he elected to roll the dice with the good-natured ’50s sitcom.
The Theme Song for Each “Seinfeld” Episode Is Different
For the first seven seasons of Seinfeld, every episode started with Jerry Seinfeld doing a stand-up routine. But what only eagle-eared listeners will notice is that the theme song was made to match those monologues, which means every single episode had a slightly different one. Composer Jonathan Wolff used instruments like the bass — plus his fingers and mouth — to improvise the sounds, and synced them to Seinfeld’s stand-up timing to build a simple melody that could easily start and stop for jokes.
“I have no idea how many themes we did for Seinfeld…” he told Great Big Story. “The timing, the length, had to be adjustable in a way it would still hold water and still sound like the Seinfeld theme.”
Mork From “Mork & Mindy” Originated on “Happy Days”
Fans may remember that Mork from Ork initially appeared in Richie Cunningham’s dream during a February 1978 episode of Happy Days, a premise apparently conceived by the 8-year-old son of series creator Garry Marshall. Although this seemed like a terrible idea to the writers, they quickly realized the potential of the situation when the then-little-known actor Robin Williams wowed during his audition and rehearsals. Mork then proved a hit after going toe-to-toe with the Fonz on-screen, prompting Marshall and his cohorts to devise a spinoff series about the character in time for the fall 1978 TV season. Meanwhile, the “My Favorite Orkan” Happy Days episode was reedited for syndication to show that the alien encounter was real.
Only One Actor Appeared in Every “M*A*S*H” Episode
M*A*S*H experienced several significant cast changes, and a few favorite characters were replaced with equally dynamic new ones — a standard practice on long-running shows today, but rare back then. Of the many actors who appeared on the show, Alan Alda (Benjamin Franklin “Hawkeye” Pierce) was the only star to appear in every episode. Over the show’s run, Alda took increasing creative control of the series, directing 31 episodes, including the finale, and co-writing 13. He became the first person ever to win Emmy Awards for acting, directing, and writing for the same show. Loretta Swit (Margaret “Hot Lips” Houlihan) was a close second in terms of longevity; she appeared in all 11 seasons but missed a handful of episodes along the way.
Each of the Four “Golden Girls” Stars Won an Emmy Award
The Golden Girls was an Emmys darling from the start, eventually accumulating 68 nominations and 11 awards, with each of the four leads taking home a trophy at one point. Bea Arthur, Rue McClanahan, and Betty White all received Best Actress nods in 1986, with White winning the honors. The following year, it was McClanahan who clinched the title, and then in 1988, it was Arthur’s turn — as well as Estelle Getty’s, who earned the Supporting Actress honor. During her speech, Arthur noted that her thank-yous were from “the four of us” since “we’ve all won.”
Lucille Ball’s Mom Was at Every Single Taping of “I Love Lucy”
Ball’s mother, DeDe Ball, went to every single taping of her daughter’s sitcom. In fact, her laughter can often be heard coming from the live audience — and she can even be heard saying, “Uh oh!” at times.
Speaking of famous mothers, Cher’s mother, Georgia Holt, appeared in one episode of the show long before Cher became a household name. Holt was a model who made a few TV cameos, including one memorable — but brief — appearance in a 1956 episode of I Love Lucy in which the crew goes to Paris and is baffled by the avant-garde fashion. At the end, Holt walks by as a model wearing an outfit inspired by the potato sack.
The Real Owner of Mary Tyler Moore’s Apartment Building Displayed Political Banners to Keep Producers From Coming Back
The 1892 home that provided the exteriors for Mary’s apartment became so famous that the owners were inundated with visitors and tour buses, and eventually, they’d had enough. When they got word that the crew was coming back to film more exterior shots in 1972, owner Paula Giese displayed a large “Impeach Nixon” banner across the front. (She was a prominent political activist, so it was a two-for-one deal.) It worked. They didn’t get their new shots, and Mary eventually ended up moving.
M*A*S*H was loosely based on the 1970 Robert Altman film of the same name, which was an adaptation of the 1968 novel MASH: A Novel About Three Army Doctors, by Richard Hooker, the pen name of former U.S. Army surgeon H. Richard Hornberger. The Mobile Army Surgical Hospital, or MASH (the asterisks between the letters were a creative design element used in the fictional versions), was first deployed by the U.S. Army during World War II as an attempt to move surgical care closer to wounded soldiers.
The charismatic character of Benjamin Franklin “Hawkeye” Pierce (played by Alan Alda) was created by Hornberger based on his own medical heroics. During the Korean War, Hornberger was assigned to the 8055th MASH, which moved along the 38th parallel, the line now marked by the demilitarized zone dividing North and South Korea. His novel took 12 years to write and five more years to find a publisher, and Hornberger eventually sold the television rights for the remarkably low sum of $500 (still only a few thousand dollars today) per episode.
The Original Name of “Sesame Street” Was “123 Avenue B”
While names like The Video Classroom and Fun Street were tossed around, the most serious contender for the name of what later became known as Sesame Street was 123 Avenue B, since it fit the vibe of the inner-city set of the show. But the name was abandoned because it was an actual street address — and also because of concern that viewers outside New York City might not relate. The show’s writer, Virginia Schone, came up with the name Sesame Street, though it wasn’t immediately embraced, as many worried it would be hard for young kids to pronounce. After a weekend of brainstorming and no better options, it became the official title. “We went with it because it was the least bad title,” co-creator Joan Ganz Cooney told Sesame Workshop.
“Downton Abbey” Is the Most Successful and Popular “Masterpiece Theatre” Miniseries
In its 50-year history, no Masterpiece miniseries has drawn as much buzz as Downton Abbey, which debuted in the U.K. on September 26, 2010, and on PBS the following January. The series, which aired its final season in the U.S. in 2016, chronicled the lives of an aristocratic family and their domestic servants at the fictional Yorkshire country estate of Downton Abbey. It tackled historic events ranging from the First World War to the 1918 influenza pandemic to the Irish War of Independence, all through the lens of the highly hierarchical household. It’s the most nominated non-U.S. series in Emmy history, with a total of 59 nominations and 12 wins. In 2019, a full-length feature film was released due to popular demand, followed by another film in 2022.
The Red Trolley on “Mister Rogers’ Neighborhood” Traveled 5,000 Miles Annually
The beloved children’s television program Mister Rogers’ Neighborhood wasn’t complete without the anthropomorphic Trolley, which helped transport viewers into the Neighborhood of Make-Believe. In a given year of the show, Trolley’s commutes covered 5,000 miles, according to PBS, more than the length of the world’s longest river, the 4,123-mile Nile.
Trolley’s precise origins are somewhat mysterious, but we do know the one-of-a-kind model was hand-built from wood by a Toronto man named Bill Ferguson in 1967, the year before Mister Rogers’ Neighborhood premiered. The TV host’s love for trolleys went all the way back to his own childhood; during one 1984 episode of Mister Rogers’ Neighborhood, he visited the Pennsylvania Trolley Museum and remembered accompanying his dad on long trolley trips. Today, Trolley is on permanent display at the Fred Rogers Center at Saint Vincent College in Rogers’ hometown of Latrobe, Pennsylvania.
The “Happy Days” Theme Song Didn’t Open the Show Until Season 3
The famed Happy Days theme song, written by Norman Gimbel and Charles Fox and originally sung by Jim Haas, wasn’t used for the opening credits in seasons 1 and 2. That spot was reserved for a re-recorded take of Bill Haley’s “Rock Around the Clock,” with the similar-sounding Gimbel-Fox composition on the closing credits. However, an updated version of “Happy Days,” performed by Truett Pratt and Jerry McClain, accompanied the opening credits for season 3, and eventually made its way to No. 5 on the Billboard charts. “Happy Days” was later recorded again by Bobby Arvon and used to open the show for its final season in 1983-84.
“The Mary Tyler Moore Show” Was Likely the First American Sitcom to Feature Birth Control Pills
On The Dick Van Dyke Show, which Moore starred in from 1961 to 1966, the actress and her on-screen husband, Dick Van Dyke, slept in separate beds and couldn’t say the word “pregnant.” However, just a few years later on The Mary Tyler Moore Show, not only did Mary have sex out of wedlock, but she openly took birth control pills. In a 1972 episode — the same year that a Supreme Court decision made birth control available to unmarried women in all states — Mary is having dinner with her father when her mother shouts, “Don’t forget to take your pill!” Mary and her father both yell, “I won’t,” and the embarrassed look on Mary’s face shows that she doesn’t just take a pill, but The Pill.
The Show Idea for “Sesame Street” Started at a Dinner Party
A producer at New York City’s Channel 13 public television station, Joan Ganz Cooney, was hosting a dinner party in 1966 when she struck up a conversation with Lloyd Morrisett, a Carnegie Corporation educator. He told her that one morning he had found his 3-year-old staring at the television’s test pattern, waiting for something to begin. The two started discussing whether young minds could truly learn from the medium, and the concept for what became Sesame Street — and a new model of educational television — was born. It was first described as a preschool for children whose families couldn’t afford one.
Mork’s Spacesuit Was Recycled From an Episode of “Star Trek”
Since Mork was originally meant to be a one-off character, there wasn’t a whole lot of thought put into his appearance; someone simply grabbed a red spacesuit from the Paramount wardrobe collection, added a silver triangle, and the Ork uniform was born. It’s unknown whether anyone at the time caught the uncanny resemblance between Mork’s suit and the one worn by Colonel Green in the 1969 Star Trek episode “The Savage Curtain,” but we do know that Mork & Mindy dipped into the Star Trek archives at least one more time: The spaceman costume worn by Mindy’s father (Conrad Janis) in the “Mork Goes Public” episode of season 1 combined a helmet and suit from two separate episodes of the sci-fi predecessor.
The Pilot for “I Love Lucy” Was Lost for Four Decades
I Love Lucy’s pilot episode, shot on March 2, 1951, couldn’t be found for about 40 years. But one of Desi Arnaz’s collaborators, Pepito Perez, later found a 35-millimeter print of it in his house. Though some of the footage was damaged, most of it aired as part of a 1990 CBS special.
The “Golden Girls” Cast Once Performed for the Queen Mother
Queen Elizabeth II’s mom, the Queen Mother, was such a fan of The Golden Girls that she had the four leads perform at the London Palladium in 1988 during the Royal Variety Performance. The cast performed two of their kitchen table scenes and censored a few jokes so as not to offend the royals in attendance.
That said, the Queen Mum did have a sense of humor. One joke left intact was Dorothy asking Blanche how long she waited to have sex after her husband died, with Sophia wittily interjecting, “Until the paramedics came.” The response made the often-reserved royal laugh out loud.
The First and Last Conversations Between Jerry and George in “Seinfeld” Were the Same
In a full-circle moment, the series’ first scene takes place in a coffee shop, with Jerry telling George that a button on his shirt is too high and that it “makes or breaks” the shirt since it’s in “no man’s land.” And in the very last scene of the finale, with the four of them sitting in a jail cell, Jerry alludes to it again, saying: “The second button is the key button. It literally makes or breaks the shirt.”
As the camera pans back, George says, “Haven’t we had this conversation before?” to which Jerry ends the series with “Maybe we have.”
Did you know that Ben and Jerry learned to make ice cream via a correspondence course? Or that before Lamborghini was famous for luxury cars, it sold tractors? Many of the world’s top brands have fascinating stories, and we’ve collected some of our favorites from around the site for your reading pleasure.
McDonald’s Once Tried Making Bubblegum-Flavored Broccoli
Kids weren’t lovin’ it when McDonald’s tried to add bubblegum-flavored broccoli to Happy Meals. In 2014, the fast-food giant’s then-CEO, Donald Thompson, revealed the bizarre experiment at an event hosted by a venture capital firm. Under pressure to make Happy Meals healthier, the company reflected on how toothpaste and amoxicillin producers had used artificial bubblegum flavoring to make their goods more palatable to children. McDonald’s decided to try a similar tactic with the divisive cruciferous veggie.
“Mickey D’s” food scientists did successfully make broccoli taste like bubblegum, likely by employing a combination of strawberry, banana, and cherry flavors. However, a focus group of kids was confused by the final product, which they enjoyed about as little as standard broccoli (we’re guessing it wasn’t pink). The item was never added to the McDonald’s menu, so parents who want to impress their kids with a tastebud switcheroo will have to settle for cotton candy grapes.
We all remember the Marlboro Man: an able-bodied outdoorsman, usually a cowboy, who enjoyed a hard-earned puff from his cigarette amid a day of honest labor, his steely gaze beckoning us to “come to where the flavor is” in the land of Marlboro Country. Except the real Marlboro Man never smoked — at least not the “original,” an individual by the name of Bob Norris who featured in the brand’s early TV commercials. A Colorado rancher who was offered the job after being seen in a photo with his friend John Wayne, Norris reluctantly became the face of an overwhelmingly successful advertising campaign by the Leo Burnett agency that made Marlboro the world’s top-selling cigarette brand by the 1970s.
But while Norris epitomized the Marlboro Man’s image of rugged individuality, he ultimately proved too principled to last in the role; when his children asked why he was promoting a product they were forbidden to try, he reportedly hung up his Stetson after 12 years of cigarette pitch work. Of course, Norris was an anomaly among his Marlboro brethren. While he lived to the ripe old age of 90, others who followed in his bootsteps learned the hard way what decades of smoking could yield, with several later publicly speaking out against the habit before dying from smoking-related illnesses.
There’s off the map, and then there’s Argleton. The English town was visible on Google Maps until 2009, which is notable for one major reason: No such place exists. So how did it get listed? Though never confirmed by Google, it’s been speculated that Argleton may have been a trap street — a fictitious road used by cartographers to catch anyone copying their work. The reasoning is as simple as it is clever: If a street (or, in this case, town) that you made up ends up on another map, you’ll have caught its creator red-handed in copyright infringement.
Though little more than an empty field in West Lancashire, Argleton once had its own (presumably auto-generated) job listings and weather forecasts. Once its (non-)existence became known on the internet, humorous T-shirts with slogans such as “New York, London, Paris, Argleton” and “I visited Argleton and all I got was this T-shirt” appeared online, too. Google itself was tight-lipped on the subject, releasing a brief statement noting that “Google Maps data comes from a variety of data sources. While the vast majority of this information is correct there are occasional errors.”
The Mall of America Is Owned by Canadians
Baseball, apple pie, and shopping — all three are American favorites. So it may be a bit surprising that one of the country’s largest shopping destinations is overseen by our neighbors to the north. That’s right: The Mall of America is owned by Canadians. Despite its name, the supersized shopping complex — found just outside Minneapolis in Bloomington, Minnesota — was developed by the Triple Five Group, a Canadian retail and entertainment conglomerate. (Notably, while the Mall of America is truly humongous, it was once surpassed in sheer size by the West Edmonton Mall, a Canadian shopping center built by the same company in the 1980s, which reigned for decades as the largest mall in North America.)
In the decades since its opening, the Mall of America has grown, increasing to 5.6 million square feet and stuffed with 520 stores and 60 restaurants. For those who aren’t into shopping, there’s more to do than just wait around in the food court — today, the Mall of America is home to a 13-screen movie theater, an indoor theme park, a mini-golf course, and the largest aquarium in the state of Minnesota.
Today, the name Lamborghini is synonymous with automotive opulence, but the Bologna, Italy-based company has an origin story that’s more humble than you might expect. Born in 1916, Ferruccio Lamborghini served in the Italian Air Force as a mechanic during World War II, learning the ins and outs of some of the most advanced vehicles in the world. Returning home after the war, Lamborghini knew his home country would need to increase agricultural output to recover from the devastation of the conflict. With tractors from established manufacturers (FIAT among them) too expensive for his war-weary compatriots, Lamborghini put his mechanical skills to work and built cheap yet powerful tractors from salvaged surplus military equipment.
Starting with its first tractor, named Carioca, in 1948, Lamborghini Trattori became an immensely successful business. The fortune from the tractor business, along with proceeds from dabblings in air-conditioning and heating systems, provided enough capital for Lamborghini to buy his own Ferrari 250 GT sports car in 1958. Ever the mechanic, Lamborghini was unimpressed with his Ferrari (especially its less-than-luxurious clutch) and even began a feud with Enzo Ferrari himself. So he decided to make his own sports car, and in 1963, Automobili Lamborghini launched a legacy of fine automobile craftsmanship that has lasted for 60 years and counting. (Lamborghini tractors are still made today.)
Pringles Inventor Fredric Baur’s Ashes Were Buried in a Pringles Can
When considering a final resting place, most people ponder the conventional options, such as a coffin or, for those who prefer cremation, an urn. That was not the case for Pringles inventor Fredric Baur, whose devotion to his innovative packaging method (which stacks his perfectly curved creations in a tall tube) was so intense that he had his ashes buried in a Pringles can.
“When my dad first raised the burial idea in the 1980s, I chuckled about it,” Baur’s eldest son, Larry, has said of his father’s wishes. But this was no joke. So after the inventor died in 2008, his children made a stop on their way to the funeral home: a Walgreens, where they had to decide which can to choose. “My siblings and I briefly debated what flavor to use,” Larry Baur added. “But I said, ‘Look, we need to use the original.’” Baur’s ashes now rest, in the can, at his grave in a suburban section of Cincinnati, Ohio.
Nintendo Was Founded Before the Fall of the Ottoman Empire
The Ottoman Empire feels like an entity of a time long past, while the name Nintendo conjures up images of modernity — electronics, video games, arcades, and mustachioed plumbers. However, Nintendo was actually founded before the Ottoman Empire ended, and this period of overlap isn’t measured in a matter of months or even a few years. When the Ottoman sultanate was abolished in 1922 amid the geopolitical reshuffling that followed World War I, Nintendo had already been in business for 33 years.
Of course, this wasn’t the Nintendo that many of us know today — Nintendo didn’t make its first electronic video game until 1975. Founded on September 23, 1889, the company had a humble original mission: selling playing cards, specifically Japanese-style cards called Hanafuda. The company did pretty well, but decided to expand further in later decades. Nintendo struck a deal with Disney in 1959 to create playing cards with Disney characters on them, and in the 1960s, Nintendo sold a series of successful children’s toys, including Ultra Hand and Home Bowling, before becoming the official Japanese distributor of the Magnavox Odyssey — the first commercial home video game console. Seeing the promise of such a machine, Nintendo threw its weight behind this emerging entertainment category. The rest, as they say, is history.
Volvo Gave Away Its Seat Belt Patent to Save Lives
Seat belts are a standard feature in today’s cars and trucks, but it hasn’t always been that way. In the 1950s and ’60s, car manufacturers weren’t required to include safety belts in vehicles. When they were built in, the earliest seat belts were simple two-point restraints that secured across the waist (aka lap belts). While a step in the right direction, lap belts had some downsides — they didn’t protect the upper body during a collision and could even cause injuries during high-speed crashes.
Recognizing these issues, Swedish carmaker Volvo hired Nils Bohlin (a former aviation engineer who helped create pilot ejection seats) as the company’s safety engineer and tasked him with a redesign. Bohlin’s creation — a more comfortable V-shaped belt that stays in position across both the chest and hips — was drafted in under a year, and it’s the style used in cars today. Volvo quickly added the belts to its cars in 1959, before the inventor had even secured a patent. But when he did, Bohlin and Volvo didn’t look to profit off the safety feature. Instead, they released the design publicly, urging all car manufacturers to adopt the upgraded belts. After years of presentations and crash-test-dummy demos, Volvo eventually made headway; the improved belt is now found in cars everywhere and is credited with saving lives around the world.
Michelin Stars Were Originally Connected to an Effort to Boost Tire Sales
In the restaurant business, there is no greater honor than the Michelin star. Awarded on a ranking from one to three, Michelin stars are the standard of greatness when it comes to fine dining. Chefs pin their reputations on them, and having (or not having) them can make or break a business. So it might seem strange to discover that this culinary accolade is intimately entwined with… car tires. In 1900, brothers André and Édouard Michelin, founders of the Michelin tire company, created the Michelin Guide — a free booklet full of useful information for French motorists.
To help raise the guide’s prestige (and also help motorists explore Europe again following World War I), the brothers reintroduced the handbooks in 1920, featuring more in-depth hotel and restaurant information — and instead of being free, they now cost seven francs. Within a few years, Michelin also recruited “mystery diners” to improve its restaurant reviews (they still work undercover), and in 1926, the company began handing out single Michelin stars to the very best restaurants. Five years later, Michelin upped the number of possible stars to three, and it has continued searching for the world’s best food in the near-century since. Today, the guides — and stars — cover more than 30 territories across three continents.
It Took the Editors of the Oxford English Dictionary Five Years Just To Reach the Word “Ant”
If you think reading the dictionary sounds exhausting, try writing one — largely by hand, no less. That’s what the editors of the original Oxford English Dictionary had to do after the Philological Society of London deemed existing dictionaries “incomplete and deficient” in 1857. They had their work cut out for them: In 1884, five years after beginning what they thought would be a decade-long project, principal editor James Murray and his team reached an important milestone — the word “ant.” That year, they began publishing A New English Dictionary on Historical Principles (as it was then known) in installments called fascicles, with the 10th and final fascicle seeing the light of day in 1928. To say that the project’s scope was larger than anticipated would be putting it mildly. What was intended as 6,400 pages spread across four volumes ballooned into a 10-volume tome containing 400,000 words and phrases. The dictionary took so long to finish, in fact, that Murray died 13 years before its completion.
Ben and Jerry Learned How to Make Ice Cream by Taking a $5 Correspondence Course
The founders of the country’s leading ice cream brand spent only a pint-sized sum learning how to make their product. Ben Cohen and Jerry Greenfield, who both grew up on Long Island, New York, became friends in seventh grade back in 1963. Originally, they set their sights on becoming a doctor (Greenfield) and an artist (Cohen). But by their 20s — Greenfield a rejected medical school applicant, Cohen a potter who had dropped out of college — they decided to enter the food industry instead. The duo came close to becoming bagel makers, but realized that producing ice cream was cheaper (bagel-making equipment can be pretty pricey). Their dessert education arrived through a Penn State College of Agricultural Sciences correspondence course, which sent them a textbook in the mail and required only open-book tests.
All of the ice cream was made in a 5-gallon machine, and Ben & Jerry’s shop originally sold eight flavors: Oreo Mint, French Vanilla, Chocolate Fudge, Wild Blueberry, Mocha Walnut, Maple Walnut, Honey Coffee, and Honey Orange. However, as the flavors got wilder — think Chunky Monkey, Cherry Garcia, and Phish Food — many more outposts and a wholesale delivery business followed, as did an IPO. In 2000, Unilever — the parent company of Breyers and Klondike — paid $326 million to acquire Ben & Jerry’s.
Pepsi has been nearly synonymous with cola for more than a century, but it wasn’t always called that. We have pharmacist Caleb Bradham to thank for the bubbly beverage, as well as its original name: Brad’s Drink. Believing that his concoction had digestive benefits, Bradham sold it at his pharmacy in New Bern, North Carolina. Brad’s Drink didn’t last long, however — it was renamed Pepsi-Cola in 1898. The new name was partly derived from the word “dyspepsia,” a technical term for indigestion, and was meant to convey the tasty beverage’s supposed medicinal properties. Bradham trademarked the name in 1903, and the company grew exponentially over the next few years, with 240 franchises opening across 24 states by 1910.
You’d be forgiven for assuming that IKEA is a Swedish word related to furniture. In fact, it’s an acronym that combines the initials of founder Ingvar Kamprad (IK) with the name of the farm where he grew up (Elmtaryd) and a nearby village (Agunnaryd). Kamprad was just 17 when he founded the company in 1943, initially selling small household items — think pens and wallets — rather than beds and sofas. He likely had no idea that there would one day be more than 450 IKEA stores across the globe.
Walt Disney’s Cartoons Were Originally Called “Laugh-O-Grams”
Before founding the animation studio that bears his name, Walt Disney was a commercial artist in Kansas City, Missouri. It was there, around 1919, that he began making hand-drawn cel animations of his own, which were screened in a local theater and dubbed “Laugh-O-Grams.” The studio he subsequently founded carried the same moniker, but it was a short-lived venture — Laugh-O-Gram’s seven-minute fairy tales and other works were popular with audiences, but financial troubles forced Disney to declare bankruptcy in 1923.
Disney, his brother Roy, and cartoonist Ub Iwerks moved to Hollywood the same year and founded Disney Brothers Cartoon Studio, which quickly changed its name to Walt Disney Studios at Roy’s behest. Had it not been for Laugh-O-Gram, however, it’s likely that Disney’s most famous creation would never have been born. The inspiration for Mickey Mouse came from a brown mouse who frequented his Kansas City studio trash basket — a “timid little guy” Disney was so fond of that before leaving for Hollywood, he “carefully carried him to a backyard, making sure it was a nice neighborhood,” at which point “the tame little fellow scampered to freedom.”
Guinness World Records Started Out as a Guinness Brewery Promotion Intended To Help Settle Bar Bets
In 1954, Sir Hugh Beaver, the managing director of Guinness, thought up a way to reduce pub disputes so bartenders could focus on pouring his company’s signature beers. He suspected that every bar could benefit from a book filled with verified facts and stats about subjects that might arise mid-conversation over a drink. Two events in particular prompted his decision: Earlier in the decade, he and fellow guests at a hunt in Ireland had memorably argued about Europe’s fastest game bird, a dispute they had no way to settle. Then, on May 6, 1954, English athlete Roger Bannister became the first person to run a mile in less than four minutes, causing public interest in records-related news to surge. Norris McWhirter had served as the stadium announcer during Bannister’s historic run, and Beaver hired both him and his identical twin, Ross McWhirter — another sports journalist — to assemble The Guinness Book of World Records.
The McWhirter twins spent about three months working feverishly on their 198-page compendium. Although initially meant to be given out for free at bars to promote Guinness, the book became so popular that the company started selling it, to great success. To date, more than 150 million books from the series — eventually renamed Guinness World Records — have been purchased, educating readers in 40-plus languages.
Chef Boyardee Was a Real Person
The world knows him as the jovial-looking fellow whose face has graced untold numbers of ravioli cans, but to those who knew him in life, he was Ettore “Hector” Boiardi — which is to say, Chef Boyardee was a real person. Born October 22, 1897, in Piacenza, Italy, Boiardi was working as an apprentice chef by the age of 11 and founded the company bearing his name in 1928, after he and his family settled in Cleveland. The business began because Boiardi’s restaurant there was so successful that patrons wanted to learn how to make the dishes at home, which was remarkable for the time — Italian food wasn’t nearly as well known (and beloved) as it is today. In fact, Chef Boyardee has been credited with helping to popularize the cuisine in America. There was just one problem: “Boiardi” was difficult for Americans to pronounce, so his products were sold under the phonetic name of Chef Boy-Ar-Dee (since simplified to its current spelling).
The Zelda Video Game Was Named for F. Scott Fitzgerald’s Wife
Video games aren’t often associated with literary figures, but The Legend of Zelda has always been unique. Take, for instance, the fact that its title character was named after writer, artist, and Jazz Age icon Zelda Fitzgerald, whose marriage to The Great Gatsby author F. Scott Fitzgerald generated nearly as many headlines as his professional output. Zelda, who’s been described as the first flapper of the Roaring ’20s (and the inspiration for Gatsby’s Daisy Buchanan), was chosen because a Nintendo PR rep suggested that the eponymous princess should be “a timeless beauty with classic appeal” and that Zelda Fitzgerald was one such “eternal beauty.” The name chain didn’t end there; actor Robin Williams was such a fan of the series that he named his daughter after the Princess of Hyrule. As for Zelda F. herself, she was — rather fittingly — named for the fictional heroine of a 19th-century novel.
The world’s largest coffeehouse chain, Starbucks, almost had a very different name. According to a 2008 Seattle Times interview with the company’s co-founder Gordon Bowker, the famous java chain was once “desperately close” to being called “Cargo House,” a name meant to tie the first store (in Seattle’s Pike Place Market) to the idea of beans coming from far away. Anxious to find a more pleasing moniker, Bowker worked with a brand consultant, who mentioned that words starting with “st” felt especially strong. Bowker ran with the idea, listing every “st” word he could think of.
The breakthrough moment occurred after the consultant brought out some old maps of the Cascade mountains and Mount Rainier — both close to the company’s hometown of Seattle — and Bowker stumbled across an old mining town named “Starbo.” The name lit up a literary reference embedded in his mind: Starbuck, the name of a character from Herman Melville’s 1851 masterpiece Moby-Dick; or, The Whale. Bowker readily admits that the character has nothing to do with coffee, but the moniker stuck, and the company doubled down on the nautical theme by introducing a mythological siren, likely influenced by a seventh-century Italian mosaic, as its now-famous green-and-white logo.
About 200 Feral Cats Roam Disneyland, Where They Help Control Rodents
Spend enough time at Disneyland and you’ll see them. Maybe you’ll spot one snoozing in the bushes near the Jungle Cruise or observing you warily as you ride the tram, but one thing is certain: However many cats you see, there are more out of sight. About 200 feral cats roam the Happiest Place on Earth, where they earn their keep by helping to control the rodent population. The felines were first seen not long after Disneyland opened in 1955, when they took up residence in Sleeping Beauty Castle, and it soon became evident that keeping them around had more advantages than trying to escort them off the premises.
The mutually beneficial alliance even includes permanent feeding stations for the cats, as well as spaying or neutering and vaccinations. Though not official cast members, these adept hunters — who mostly come out at night — have earned a devoted following of their own. There are websites, Instagram feeds, and YouTube videos devoted to them. They’re not quite as popular as the actual rides at Disneyland, of course, but for cat lovers, they’re an attraction all their own.
Salvador Dalí Designed the Chupa Chups Logo
You may not know it by name, but you’re almost certainly familiar with Salvador Dalí’s best-known work, “The Persistence of Memory,” which depicts melting clocks on a bleak landscape. No less famous, albeit in an entirely different way, is the Chupa Chups logo — which Dalí also designed. While the idea of a surrealist collaborating with a lollipop company may sound odd, it begins to make sense when you learn a bit more about the eccentric artist — starting with the fact that he was close friends with Chupa Chups founder Enric Bernat, a fellow Spaniard.
The two met at a café one day in 1969, with Bernat making Dalí aware of his need for a logo and the world-renowned artist quickly taking care of it for him. He did so with great intention, of course: “Acutely aware of presentation, Dalí insisted that his design be placed on top of the lolly, rather than the side, so that it could always be viewed intact,” Phaidon notes. Dalí reportedly designed the instantly recognizable daisy-based logo in less than an hour on that fateful day, and it’s still in use decades — not to mention billions of sales — later.
The First Logo for Apple Featured Sir Isaac Newton Sitting Under an Apple Tree
Apple has always been known for its design. Before its iconic logo resembled an actual apple, however, it featured Sir Isaac Newton sitting under an apple tree. This is, of course, a reference to the legend of Newton formulating his law of universal gravitation after getting bonked on the head by a falling apple — which ranks among history’s best-known “aha!” moments. The more widely accepted version of events is that Newton merely observed a falling apple, but that doesn’t make the event any less fun to ponder. In addition to the drawing, the logo featured a line from poet William Wordsworth: “Newton … a mind forever voyaging through strange seas of thought … alone.” The logo — which debuted when the company was founded in 1976 — was short-lived, however, in part because co-founder Steve Jobs felt the design couldn’t be effectively rendered in smaller versions. Soon, he hired graphic designer Rob Janoff, who came up with the logo now recognized worldwide.
Some Historians Consider Cracker Jack America’s First Junk Food
It all started with Chicago candy and popcorn peddlers Frederick and Louis Rueckheim, German immigrants who crafted a non-sticky caramelized popcorn as a way to stand out from other popcorn vendors. Their version — with a sweet, crunchy coating that was different from the salted popcorn and kettle corn available at the time — became a hit after it was mass-produced in 1896.
Cracker Jack’s early marketing warned prospective customers about the effects of the product. “Do not taste it,” one 1896 article cautioned. “If you do, you will part with your money easy.” Some historians believe that the caramel-coated popcorn and peanut treat jump-started the American snack food industry around the turn of the 20th century. It may even hold the title of the country’s first junk food, though the types of junk food popular today didn’t make their appearances until the 1950s. It was a song, however, that helped cement Cracker Jack’s snack status. In 1908, songwriter Jack Norworth — entirely unknown to the Rueckheims — composed “Take Me Out to the Ball Game” after seeing an advertisement for an upcoming game. The song, which mentions the snack by name, led to a surge in sales that forever linked Cracker Jack with sports.
There Is Only One Remaining Blockbuster Location — In Bend, Oregon
At its peak in 2004, Blockbuster, the wildly successful movie rental chain, boasted 9,094 locations. Today it has just one. Bend, Oregon, is home to the former giant’s last remaining outpost, a status the store attained when its counterpart in a suburb of Perth, Australia, closed in 2019. Originally opened in 1992 as Pacific Video, the location became a Blockbuster franchise store eight years later — and doesn’t look to be closing any time soon. That’s thanks in part to the 2020 documentary The Last Blockbuster, which contributed to the brick-and-mortar store being cemented as a tourist attraction among nostalgia-minded visitors.
There Are Three Mandatory Flavors of Girl Scout Cookies Sold Each Year
Though there have been many changes to the kinds of Girl Scout Cookies sold over the decades, three stalwart flavors are mandated each year: Thin Mints, Do-si-dos (also called Peanut Butter Sandwiches), and Trefoils. None of these varieties existed in their current form in the earliest years of cookie sales, but a version of Thin Mints can be traced back to 1939, when troops started selling a flavor known as “Cooky-Mints.” By the 1950s, shortbread had joined the lineup, alongside the renamed Chocolate Mints and sandwich cookies in vanilla and chocolate varieties. Peanut Butter Sandwiches hit the scene soon after, and by 1966, all three of the aforementioned flavors were among the group’s bestsellers. Other cookies came and went in the decades that followed, but Thin Mints, Do-si-dos, and Trefoils have been staples since the 1970s — and for good reason.
The Name “Snapple” Is a Portmanteau
The brand name Snapple is a portmanteau of two words — “snappy” and “apple.” When the company began in 1972, founders Leonard Marsh, Hyman Golden, and Arnold Greenberg (who ran a health food store in New York City’s East Village) aimed to sell fruit juice-based soft drinks. One early product was a carbonated apple soda called “Snapple.” That original product wasn’t without its issues, however: Some of the bottles would ferment, sending the caps flying. That didn’t deter the trio, who went on to become some of the first to sell soft drinks made with natural ingredients. They officially changed the company’s name from Unadulterated Food Products to “Snapple” in the early 1980s.
Time flies when you’re having fun, and we’ve been having a lot of fun at Interesting Facts. Many facts that are amazing, mind-boggling, awe-inspiring, and just plain interesting have arrived in your inbox — and we’ve collected the 50 most popular from around the site so that you can revisit crowd favorites. From rats giggling when they’re tickled to each human having a unique tongue print, here are some of the most interesting facts we’ve ever run.
Berry classification is a confusing business. People began referring to some fruits as “berries” thousands of years before scientists established their own definitions, some of which are still debated. Today, little effort is made to teach the public about what botanically constitutes a berry, so here’s a bit of help. It’s generally accepted that all berries meet three standards. First, they have a trio of distinct fleshy layers (the outer exocarp, middle mesocarp, and innermost endocarp); second, their endocarps house multiple seeds; third, berries are simple fruits, meaning they develop from flowers with a single ovary.
Blueberries and cranberries are true berries, as their names imply. Other berries may surprise you: Avocados, eggplants, grapes, guava, kiwis, papayas, peppers, pomegranates, and tomatoes are all, botanically speaking, berries. Bananas are berries, too, since they meet all three requirements. The exocarp of a banana is its peel, while the mesocarp is the creamy middle surrounding the seedy, also-edible endocarp.
Jackson, Mississippi’s State Capital, Lies Atop a Dormant Volcano
At first glance, Jackson, Mississippi, is like any other state capital, with its domed capitol building standing squarely in the heart of the city. However, 2,900 feet below the surface lies a surprising secret — an ancient volcano. Although the West Coast and Hawaii are the U.S.’s biggest volcanic hot spots, active volcanoes also dotted the northern Gulf of Mexico region millions of years ago.
One of these volcanoes was the Jackson Volcano, and the city’s Mississippi Coliseum now sits above its ancient crater. Thankfully for the city’s residents, the volcano is extinct and hasn’t erupted since around the age of the dinosaurs. And while Jackson is the only U.S. state capital set atop a volcano, volcanic formations can also be seen within the limits of other U.S. cities, including Portland, Oregon, and Honolulu, Hawaii.
Arizona and Hawaii Don’t Observe Daylight Saving Time
If you’re one of the 61% of Americans who’d like to stop resetting the clock twice a year, it might be time to move to Arizona or Hawaii. Except for the Navajo Nation in Arizona, the Grand Canyon and Aloha states don’t observe daylight saving time, meaning they won’t be falling back in November or springing forward next March. Rather, they live in what must be a permanent state of bliss, never having to remember whether the latest clock change means they’re getting an hour less of sleep the next night or an hour more.
Though polls like the one cited above consistently show that Americans are tired of changing their clocks, making daylight saving time permanent is just as popular as ignoring it altogether — one poll showed 59% of respondents were in favor of the idea. The Senate unanimously passed a bill to do just that in March 2022, though the Sunshine Protection Act, as it’s called, has yet to move forward in the House.
Tenth President John Tyler Still Has a Living Grandson
More than 200 years after the 10th President of the United States was born, one of his grandsons is still alive. As impossible as that may seem, the math — and biology — checks out. John Tyler, who was born in 1790 and became President in 1841 after William Henry Harrison died in office (possibly of pneumonia), had a son named Lyon Gardiner Tyler in 1853. This son was born to the then-60-something Tyler and his second, much younger, wife, Julia Gardiner. Lyon then had two sons of his own in his 70s (also with a much younger second wife), one of whom — Harrison Ruffin Tyler, born in 1928 — is still gracing the Earth in his early nineties. It may make this feat slightly less surprising to know that Tyler had 15 children, more than any other POTUS in U.S. history.
Male Squirrels Get Smarter in the Fall
Autumn heralds the arrival of many things: pumpkin pie, crisp morning air, and, apparently, more intelligent rodents. Male squirrels get smarter in the fall due to their hippocampus (a part of the brain involved in memory) increasing in size during the caching season — the time of year when they gather even more nuts than usual. Interestingly, female squirrel brains don’t show the same effect; researchers speculate that male squirrel brains may change in the fall to function more like female brains do all year long. The slightly bigger brains may help male squirrels remember exactly where they’ve stored their nuts, although scientists are still teasing out how.
When detectives investigate a crime scene in any prime-time cop drama, they’re often on the hunt for one thing: fingerprints. Because these intricate patterns of whorls and lines are exclusive to each individual, fingerprints have been a go-to method for tracking down suspects for more than a century. However, our fingerprints are not unique when it comes to being, well, unique. Our tongues, like our fingerprints, are also specific to each individual. That’s right — people have tongue prints, which vary from one person to another due to both shape and texture. And perhaps surprisingly, the organ has been gaining some popularity as a method for biometric authentication.
While fingerprints can be altered, eyes can be affected by astigmatism or cataracts, and voices can be changed by the all-too-common cold, the human tongue is relatively protected from external factors. Sticking out one’s tongue for a print also involves a layer of conscious control and consent that goes beyond what’s required for retinal scans or even fingerprinting, which could make it a more appealing biometric tool for some.
All Blue-Eyed People Likely Descended From a Single Ancestor
Eyes are said to be the windows to the soul, but they’re also a glimpse at humanity’s genetic past. Scientists estimate that until roughly 6,000 to 10,000 years ago, every Homo sapiens had brown eyes — likely an evolutionary advantage, as the melanin pigment offers some protection from UV radiation. But then, something changed. Sometime during the Neolithic expansion in Europe, an individual was born with a mutation to the OCA2 gene. This gene controls melanin production in the iris, and the mutation caused this person’s eyes to turn blue rather than the usual brown. Because blue eyes can only form as a result of this mutation, scientists theorize that all blue-eyed people — about 10% of the world population — descend from this lone blue-eyed ancestor.
Fingerprints are one of the few parts of the human body that generally never change — in some cases, even after thousands of years. Scientists who study ancient civilizations by way of mummified remains can attest: Mummies have fingerprints. But how?
Mummification works by drying out soft tissue such as skin, halting decomposition and preserving the body, fingerprints included. Recovering the fingertip impressions isn’t easy, but it is possible; the job requires soaking or injecting mummified hands with hydrating solutions that plump the tips. From there, the fingertips are inked and copied in a fashion similar to how modern fingerprints are recorded.
Ketchup Was Originally Made Out of Fish
The ketchup we slather onto hot dogs, burgers, and fries today once had a different purpose: Doctors believed it was best consumed as a health tonic. Ketchup has come a long way from its roots in China as far back as the third century BCE, when cooks fermented seafood to create a salty, amber-colored sauce that resembles modern fish sauce (an anchovy-based condiment that adds umami flavor to many Asian dishes). By around the 16th century, British sailors had taken word of ketchup back to their home country, and British cooks tried to replicate it with their own versions made from walnuts and mushrooms. It’s not clear exactly when tomatoes came on the scene, though the first known tomato ketchup recipe appeared around 1812, published by Philadelphia horticulturist James Mease.
It is sometimes said that there are two types of tickling: knismesis and gargalesis. The former is the “light, feather-like” kind, which doesn’t induce laughter, while the latter is more high-pressure and does cause laughter. And while you may think of humans as the only creatures susceptible to gargalesis, one of our much smaller counterparts is as well: the humble rat. Rats actually love being tickled, especially on their back and belly, and there’s even a specific term for the frolicking they do in between tickles: Freudensprünge, or “joy jumps.” Sadly, rat giggles are too high-pitched for us to hear without special microphones that can reproduce the sound in a lower register. (That doesn’t make videos of rats being tickled any less adorable, however.)
Alfred Hitchcock’s “The Birds” Was Partly Based on a True Story
With apologies to anyone who already found The Birds terrifying while under the impression that it was wholly fictional: Alfred Hitchcock’s avian thriller was partly based on a true story. Said event took place on California’s Monterey Bay in August 1961, when “thousands of crazed seabirds” called sooty shearwaters were seen regurgitating anchovies and flying into objects before dying on the streets. The Master of Suspense happened to live in the area, and called the Santa Cruz Sentinel — which had reported on the strange goings-on in its August 18 edition — for more information. His movie was released two years later, but the bizarre event remained shrouded in mystery long afterward: What would inspire birds to act this way, and were they as malicious as they seemed in Hitchcock’s film?
The truth ended up being both straightforward and a little sad. The scientific consensus is now that the birds were poisoned by toxic algae found in a type of plankton called Pseudo-nitzschia. The birds weren’t attacking anyone; they were disoriented and barely in control of their actions. That explanation is absent from Hitchcock’s thriller, which also drew inspiration from Daphne du Maurier’s short story of the same name. (Hitchcock’s Rebecca was a du Maurier adaptation, too.) A resounding success, The Birds is widely considered one of Hitchcock’s greatest works, alongside Psycho, Vertigo, Rear Window, and North by Northwest.
U.S. Elections Used To Be Held Over a 34-Day Window
As implied by its name, Election Day is, well, a single day. That wasn’t always the case, however: States used to hold elections whenever they wanted within a 34-day period leading up to the first Wednesday in December. This ultimately created some issues, as you might imagine — early voting results ended up holding too much sway over late-deciding voters, for one thing. The current date was implemented by the Presidential Election Day Act of 1845, and federal elections now occur every two years on the first Tuesday after the first Monday in November.
That may sound arbitrary at first, but the date was chosen quite deliberately. American society was much more agrarian in the mid-19th century than it is today, and it took a full day of traveling for many to reach their polling place. Church made weekends impractical, and Wednesday was market day for farmers, so Tuesday proved ideal. November, meanwhile, worked because weather was still fairly mild, and the harvest was complete by then.
Smokey Bear Has His Own ZIP Code
In 1944, working on a commission from the War Advertising Council and the U.S. Forest Service, Saturday Evening Post artist Albert Staehle and writer Harold Rosenberg crafted the reassuring, safety-conscious Smokey Bear, now the face of the country’s longest-running public service campaign, where he frequently shares his famous slogan: “Only you can prevent wildfires.” In 1950, a 5-pound black bear cub rescued from a New Mexico wildfire by Taos Pueblo firefighters was christened “Smokey Bear” as a living homage to the popular protective figure.
This bear spent the rest of his life at the National Zoo in Washington, D.C. There, he, his successor — Smokey II — and their alter ego received up to 13,000 letters, drawings, Christmas cards, and honey shipments each week. To help sort these deliveries, the bears were given their own Washington, D.C., ZIP code: 20252. The ZIP code was decommissioned from around 2007 to 2014, but it was revived for the mascot’s 70th anniversary. The character also has his own Instagram and Twitter accounts, where he shares fire prevention tips with the hashtag #OnlyYou — now a more vital message than ever.
The International Space Station Is the Most Expensive Item Humans Have Ever Created
The most expensive movie ever made is Pirates of the Caribbean: On Stranger Tides, which cost a whopping $410 million. That’s a pretty penny to be sure, but it’s less than half a percent of the most expensive human-made object in history: the International Space Station, whose price tag comes in at $100 billion. Launched in 1998 after more than a decade of careful (and often difficult) planning, the ISS is a collaboration between five space agencies: NASA (United States), Roscosmos (Russia), JAXA (Japan), ESA (Europe), and CSA (Canada). It has been continuously occupied since 2000, with a full-time international crew conducting microgravity experiments and other research.
Portland, Oregon, Was Named in a Coin Toss
What’s the most you’ve ever lost in a coin toss? For Asa Lovejoy, it was the opportunity in 1845 to name the city he’d recently established with Francis Pettygrove. The two decided to settle their disagreement as to what their new land claim should be called with a two-out-of-three coin flip that Pettygrove won. Pettygrove chose “Portland” because he hailed from the city of the same name in Maine; Lovejoy had intended to name the place after his hometown of Boston.
Now known as the Portland Penny, the one-cent piece used in the fateful toss was minted in 1835 and retrieved by Pettygrove after his victory. It remained with him when he founded Port Townsend, Washington, and was eventually given to the Oregon Historical Society, which now keeps it on display.
People Breathe Primarily Out of One Nostril at a Time
The human nose is a biological wonder. It can smell up to 1 trillion odors, trap harmful debris in the air before it enters your lungs, and affect your sex life. But arguably its most important job is to condition the air you breathe before that air enters your respiratory tract. This means warming and humidifying the air before it passes to your throat and beyond.
To do this, the nose undergoes a nasal cycle in which one nostril sucks in the majority of the air while the other takes in the remaining portion. A few hours later (on average), the nostrils switch roles. This cycle is regulated by the body’s autonomic nervous system, which swells or deflates erectile tissue found in the nose. Although we don’t notice this switch throughout the day, if you cover your nostrils with your thumb one at a time, you’ll likely observe that airflow through one is significantly higher than through the other. This is also why one nostril tends to be more congested than the other when you have a cold (the nondominant nostril fills with more mucus).
By Some Accounts, North Dakota Didn’t Technically Become a State Until 2012
North Dakota was admitted to the Union as the 39th state on November 2, 1889, except it kind of sort of wasn’t. Its constitution left out a key detail that, according to some, was enough of a technicality that North Dakota didn’t actually become a state until 2012. A local historian by the name of John Rolczynski first noticed in 1995 that North Dakota’s state constitution failed to mention the executive branch in its section concerning the oath of office, which he felt made it invalid; the United States Constitution requires that officers of all three branches of a state’s government be bound by said oath, and North Dakota’s only mentioned the legislative and judicial branches.
This led to a campaign that included an unanswered letter to then-President Bill Clinton and ended with the successful passage of an amendment to Section 4 of Article XI of the state constitution, which fixed the omission. “It’s been a long fight to try to get this corrected and I’m glad to see that it has,” Rolczynski said at the time. North Dakota had enjoyed all the benefits and responsibilities of statehood for well over a century by that point, of course, but you can never be too thorough.
Leaders have historically used body doubles to thwart would-be assassins, but Queen Elizabeth II’s double served a different — and significantly less bloody — purpose. A big part of being the queen of the United Kingdom was simply showing up. Whether opening a hospital or hosting a foreign dignitary, the queen was always busy. A majority of her events required rehearsals, and that’s where Ella Slack came in.
Slack got the job while working for the BBC’s events department in the 1980s. Although she doesn’t look like Her Majesty, Slack is about the same height and build, so if an event needed to test camera angles or see if the sun would be in the queen’s eyes, Slack was the person for the task. She stood in for the queen more than 50 times, including riding in the royal carriage and attending rehearsals for the opening of Parliament.
A Misheard Song Lyric Is Called a “Mondegreen”
If you’ve confused “Takin’ Care of Business” with “Makin’ Carrot Biscuits,” or “Bennie and the Jets” with “Betty in a Dress,” you’ve been tricked by a mondegreen. As Merriam-Webster explains, this phenomenon occurs when a word or phrase “results from a mishearing of something said or sung.” You can thank American writer Sylvia Wright for the term, which she coined in a 1954 Harper’s essay. When Wright was a child, her mother read to her from the book Reliques of Ancient English Poetry. A favorite entry featured the line, “And laid him on the green,” which Wright misheard as “And Lady Mondegreen.”
A mondegreen occurs when there’s a communication hiccup between the syllables you hear and the meaning your brain assigns to them. Mondegreens are especially common when you hear music but cannot see the singer’s face, like when listening to the radio. They’re also more likely to happen when the singer has an accent.
A Famous Caricaturist Hid the Name of His Daughter in His Drawings for Decades as a Game
Some caricaturists, whether in celebrity restaurants or theme parks, face customers who are less than thrilled with their portraits, but to be drawn by caricaturist Al Hirschfeld was considered an honor. Hirschfeld began working with The New York Times in 1929, often drawing the stars of Broadway and Hollywood, but it wasn’t until the birth of his daughter Nina in 1945 that a now-legendary game began. In many of his drawings following her birth, for the Times and other prominent publications, Hirschfeld hid his daughter’s name “in folds of sleeves, tousled hairdos, eyebrows, wrinkles, backgrounds, shoelaces — anywhere to make it difficult, but not too difficult, to find,” Hirschfeld once said. Next to his signature, the artist included the number of times “Nina” appeared throughout the image.
This tradition inspired an unofficial puzzle for decades, as readers scanned Hirschfeld’s work to find each and every “Nina” — and this included Hirschfeld himself. According to his foundation’s website, the artist became so accustomed to adding his daughter’s name as part of his artistic process that he often had to go back through the piece and find every hidden “Nina” for himself in order to come up with the total count. Hirschfeld continued this tradition for nearly 60 years, until his death at the age of 99 in 2003.
Beer Makes Humans More Attractive to Mosquitos
There are a few ways to avoid the itch-inducing bites of summer’s biggest pest: the mosquito. Wearing long-sleeved apparel and dousing yourself in insect repellent can help, but avoiding some beverages — particularly alcohol — might further protect you. According to a 2010 study of mosquito biting preferences, beer makes humans more attractive to the pesky pests.
Researchers found that Anopheles gambiae, a mosquito species in the genus responsible for transmitting malaria, were more attracted to humans who had consumed beer (compared to those who consumed only water), and the results were evident as soon as 15 minutes after the humans began drinking. It’s unclear why beer primes humans to become bite victims, though some scientists believe it could be partly linked to body temperature; alcohol expands the blood vessels, a process that slightly increases skin temperature and also makes us sweat — two factors that may attract more hungry mosquitoes.
Earth is home to a staggering number of creatures: By one estimate, more than 8.7 million species of plants and animals live on its lands and in its waters. Mammals, however, make up a small fraction of that number — just 6,495 species. If you’re wondering which warm-blooded animals are most numerous, glance to the night sky. That’s where you’ll probably find bats, which account for 21% of all the mammals in the world.
Bats have been around for more than 50 million years, which helps explain why they’re such a fine-tuned part of our ecosystem. Nectar-eating bats are master pollinators of more than 500 plant species (including cacao for chocolate and agave for tequila), thanks to their ability to fly and transport pollen farther than bees. They’re also nature’s bug zappers, keeping mosquito, moth, and beetle populations in check. The flying insect hunters are so effective — eating half their body weight in bugs each night — that scientists credit them with saving U.S. farmers $1 billion in pesticides and crop damage each year. Bats even help combat deforestation by dropping seeds over barren areas: Bat-dropped seeds can account for up to 95% of regrowth in cleared forests in tropical areas, a huge accomplishment for such small creatures.
The National Animal of Scotland Is the Unicorn
America has the eagle, England has the lion, and Scotland has the unicorn. And while the horned mythological creature may not actually exist, the traits it represents certainly do: Purity, independence, and an untamable spirit are all qualities Scotland has long cherished. Unicorns appeared on the country’s coat of arms starting in the 12th century, and were officially adopted as Scotland’s national animal by King Robert I in the late 14th century. For many years, the coat of arms included two of the legendary beings, but in 1603 one was replaced by a lion to mark the Union of the Crowns. Fittingly for the then-newly united England and Scotland, folklore had long depicted the two creatures as butting heads to determine which one was truly the “king of beasts.”
Scottish kings also displayed that fighting spirit, which may be why unicorns were generally depicted in Scottish heraldry as wearing gold chains — only the land’s mighty monarchs could tame them. Unicorns remain popular in Scotland to this day, with renditions found on palaces, universities, castles, and even Scotland’s oldest surviving wooden warship.
Today carrots are practically synonymous with the color orange, but their auburn hue is a relatively recent development. When the carrot was first cultivated 5,000 years ago in Central Asia, it was often a bright purple. Soon, two different groups emerged: Asiatic carrots and Western carrots. Eventually, yellow carrots in this Western group (which may have developed as mutants of the purple variety) developed into their recognizable orange color around the 16th century, helped along by the master agricultural traders of the time — the Dutch. Today, there are more than 40 varieties of carrots of various shapes, sizes, and colors, including several hues of purple.
Some Monarch Butterflies Travel Over 3,000 Miles on Their Annual Migration
No animal on Earth travels quite like the eastern monarch butterfly. Its journey begins in the early days of spring on a few mountains in central Mexico. Millions of the monarchs (Danaus plexippus plexippus) fill the branches of oyamel firs, and as the temperature warms up, they soak in the sun and begin their epic journey northward — a 3,000-mile trip that looks more like a bird’s migration than an insect’s.
But it’s not only the miles that make the butterfly’s journey so remarkable — it’s also the means. A typical monarch butterfly lives for only about four weeks, not nearly long enough to complete the journey to the northern U.S. and Canada. So the migration becomes a multigenerational one. In a typical year, it will take four generations for monarch butterflies to finish the seasonal quest their great-grandparents started. To return south in the fall, a “super generation” — also known as the Methuselah generation (after the long-lived biblical patriarch) because it can live eight times longer than its ancestors — will travel 50 miles a day by riding thermal currents southward before finally resting in the same oyamel firs in central Mexico. All hail the monarch!
Human eyes are entirely unique; just like fingerprints, no two sets are alike. But some genetic anomalies create especially unlikely “windows” to the world, such as gray eyes. Eye experts once believed that human eyes could appear in only three colors: brown, blue, and green, sometimes with hazel or amber added. More recently, the ashy hue that was once lumped into the blue category has been regrouped as its own, albeit rarely seen, color. Brown-eyed folks are in good company, with up to 80% of the global population sporting the shade, while blue eyes are the second most common hue. Traditionally, green was considered the least common eye color, though researchers now say gray is the most rare, with less than 1% of the population seeing through steel-colored eyes.
The Color Red Appears in Nearly Every Shot of “The Shining”
You’d be forgiven for failing to notice some of The Shining’s more intricate details, since there’s a good chance you were covering your eyes with your hands the first time you watched it. Those details really do add to the experience of Stanley Kubrick’s 1980 horror classic, however, including the fact that the color red appears in nearly every shot. Some of these appearances are obvious — that famous scene of blood pouring out of the elevator, the red-walled men’s room where Jack Torrance (Jack Nicholson) freshens up — but many are quite subtle. Did you ever notice that the darts young Danny (Danny Lloyd) plays with are red, for instance, or that a book placed on a table in the opening scene and the dress Wendy (Shelley Duvall) wears are red as well? According to one analysis, the inclusion of the scarlet hue is meant to be a visual nod to Jack’s deteriorating mental condition as the Overlook Hotel takes hold of him.
There’s a Beach in the Maldives That Glows in the Dark
If you were wowed by those glow-in-the-dark stars on your bedroom ceiling as a kid, you may need to book a trip to the Maldives. The small nation of more than 1,000 islands in the Indian Ocean is home to at least one beach, on Mudhdhoo Island, that often glows in the dark — and it’s a completely natural phenomenon. We have ostracod crustaceans (aka seed shrimp) to thank for the effect, as the millimeter-long creatures have the ability to emit a blue light for as long as a minute or more. Though scientists are unsure why they do so, some believe it happens when a “mass mortality” event occurs.
Seed shrimp are far from the only creatures who shine this way: The chemical reactions that create bioluminescence occur in other organisms whose bodies contain luciferin (light-emitting organic compounds; the name comes from the Latin “lucifer,” meaning “light-bearing”). That list also includes fellow ocean-dwellers such as firefly squid and sea sparkles, as well as fireflies, glow-worms, and certain bacteria and fungi on land.
The Snickers Candy Bar Was Named After One of the Mars Family’s Favorite Horses
While names like Hershey’s and 3 Musketeers (which originally included three bars) are fairly straightforward, some candy bar monikers are more elusive. Case in point: What, exactly, is a Snickers? Well, it’s actually a “who” — and not a human “who” at that. The candy bar was named after one of the Mars family’s favorite horses. Franklin Mars founded Mars, Incorporated (originally known as Mar-O-Bar Co.) in 1911, introducing Snickers in 1930; when it came time to name his product, he did what any pet-lover would do, and immortalized his equine friend as only a candy magnate could. (By some accounts, the horse had passed away shortly before the product’s launch.)
At Least 31 Languages Have a Word Very Similar to “Huh”
“Huh” is a humble word, often a near-involuntary linguistic response, but behind this simple interrogatory palindrome is an extraordinary truth — it’s also universal. According to research conducted in 2013 by the Max Planck Institute for Psycholinguistics in the Netherlands, a version of the word can be found in nearly every language on Earth.
Researchers analyzed 31 languages, including Spanish, Mandarin, Icelandic, and Indigenous tongues, and found that every one included a word similar in both sound and function to the English “huh.” For example, in Mandarin it’s “a?”, in Spanish “e?”, in Lao “a?”, and in Dutch “he?” No matter the language, the word involves a relaxed tongue and a rising pitch, and if there’s a sound before the vowel, it’s an “h” or a glottal stop (a consonant sound made by closing the glottis, the space between the vocal folds). Although there is some variation in pronunciation, the word shows staggeringly little difference among languages compared to what might be expected.
Redheaded People May Require More Anesthesia
There are all sorts of (false) rumors and superstitions floating around about redheads: They bring bad luck. They have fiery tempers. They’re more susceptible to pain and hate going to the dentist. On that last count, at least, there’s a decent amount of research that might explain the anecdotal evidence.
A 2004 study found that redheaded subjects required 19% higher dosages of an anesthetic (desflurane) to realize a satisfactory effect. The following year, another study found redheads to be more sensitive to thermal pain, and resistant to the effects of a different injected anesthetic (lidocaine). The apparent difference, for those natural carrot tops, involves the presence of melanocortin 1 receptor (MC1R) gene variants in the pigment-producing cells known as melanocytes. These variants stymie the hormones that would otherwise turn red hair a different shade, while also seemingly influencing secretions related to pain tolerance. However, research doesn’t support the idea that redheads have a lower pain tolerance generally.
As beloved as the crisp fall weather seems to be, English speakers haven’t always paid attention to it … at least not linguistically. Historically, the more extreme seasons have always been named — specifically winter, which was so important that it was used to mark the passage of time by the Anglo-Saxons, who counted their years in winters. But when English speakers of the past referred to summer’s end, they often used the term “harvest,” from the Old English (and ultimately Germanic) haerfest. The first recorded usage of “harvest” to mean a season appears in the 10th century, but the word didn’t stick around in common usage (it was considered outdated by the 1700s).
Eventually, the English language began recognizing the transitional seasons. “Autumn” emerged around the 1300s, taken from the Latin autumnus and French autompne, and slowly pushing out “harvest.” “Fall” cropped up around the 1500s as part of “fall of the leaf,” mirroring the popular phrase “spring of the leaf” used for the vernal equinox, and it’s likely that these phrases were simply shortened to give the seasons their modern names. “Autumn” and “fall” have been used interchangeably ever since.
Istanbul, Turkey, Is Located in Both Europe and Asia
In addition to its more than 2,500-year-old history and fascinating architecture (including the Hagia Sophia, built as a church in the sixth century CE), Istanbul is notable for being split between two continents, Europe and Asia, by a thin ribbon of water called the Bosporus. Around one-third of Istanbul’s residents live in Asia — east of the Bosporus — while the rest live in Europe. The European portion of Turkey is also known as East Thrace or Turkish Thrace (after the ancient Thracian tribes that inhabited the region), while the Asian region is sometimes called Anatolia.
Pepsi has been nearly synonymous with cola for more than a century, but it wasn’t always called that. We have pharmacist Caleb Bradham to thank for the bubbly beverage, as well as its original name: Brad’s Drink. Believing that his concoction had digestive benefits, Bradham sold it at his pharmacy in New Bern, North Carolina. Brad’s Drink didn’t last long, however — it was renamed Pepsi-Cola in 1898.
The new name was partly derived from the word “dyspepsia,” a technical term for indigestion, and was meant to convey the tasty beverage’s supposed medicinal properties. Bradham trademarked the name in 1903, and the company grew exponentially over the next few years, with 240 franchises opening across 24 states by 1910.
An Estimated $58 Million in Loose Change Is Left Behind on Airplanes Each Year
If you think the change in your couch adds up, just try a 747. It’s been estimated that as much as $58 million is left behind on airplanes every year — a princely sum, to be sure, but one that makes sense when you remember how many people are often in the air at any given time. In an average year, the Federal Aviation Administration handles more than 16 million flights — which is to say that you probably won’t become a millionaire by looking through the seats of your next flight as you deplane.
In fact, a great deal of loose change never even makes it off the ground. Nearly $1 million was left behind in security bins in 2018, all of which was collected — and kept — by the Transportation Security Administration. That amount, which the TSA is required to report, has been steadily growing in recent years: $531,000 was left behind in 2012, compared to $960,105 in 2018. If you don’t want to add to that number, you may want to go cashless on your next cross-country flight.
The Last U.S. President With Facial Hair Was William Howard Taft
On Inauguration Day in 1913, mustachioed President William Howard Taft passed the presidential baton to clean-shaven Woodrow Wilson. What Taft couldn’t have known at the time was that his departure began a long streak of clean-shaven faces occupying the Oval Office.
In fact, out of the 46 Presidents in U.S. history so far, only 13 have had any facial hair. Although sixth President John Quincy Adams, eighth President Martin Van Buren, and 12th President Zachary Taylor sported impressive mutton chops, the first serious presidential facial fuzz belonged to 16th President Abraham Lincoln — thanks to an 11-year-old girl whose 1860 letter convinced him to grow out his whiskers. After Lincoln, eight of the next 10 Presidents sported some sort of facial hair.
Green Bell Peppers Are Just Unripe Red Bell Peppers
If you’ve ever found yourself in the grocery store struggling to decide between red and green bell peppers — or even just wondering what the difference is between them — you may be interested to learn that they’re the very same vegetable. In fact, green bell peppers are just red bell peppers that haven’t ripened yet, while orange and yellow peppers are somewhere in between the two stages. As they ripen, bell peppers don’t just change color — they also become sweeter and drastically increase their beta-carotene, vitamin A, and vitamin C content. So while the green variety isn’t quite as nutritious as its red counterpart, the good news is that one eventually becomes the other.
Food on Planes Tastes Different in Part Because of the Cabin Conditions
Airline meals of the past look more appealing — and probably tasted better, too. But that may have less to do with food quality, and more to do with altitude. Turns out, cabin conditions required for today’s high-altitude flights affect our taste buds, making even overly sweetened or salted foods bland. Our sense of taste is heavily impacted by scent, and as many frequent flyers know, air travel can wreak havoc on the mucus membranes inside our noses. Cabin pressure — usually set to the equivalent of about 6,000 to 8,000 feet above sea level — decreases oxygen levels in the blood, which actually dulls the body’s olfactory receptors. The lack of humidity in the air also dries out nasal passages, making taste buds essentially numb, and reducing your perception of saltiness or sweetness by 30%.
To make matters worse, studies show that the constant hum of plane engines also reduces our ability to taste sweet and salty foods, though it may actually enhance umami flavors like soy sauce and tomato juice and seasonings like curry and lemongrass.
There’s a Genus of Spiders Named After Orson Welles
Orson Welles is among the most influential filmmakers of all time, but his impact isn’t confined to the world of cinema and radio. The multihyphenate behind Citizen Kane has even made a splash among biologists — there’s a genus of giant spiders named after him. In total, there are 13 species in the Orsonwelles genus, all of which are found in the Hawaiian islands: six on Kauai, three on Oahu, two on Molokai, and one each on Maui and Hawaii itself (the Big Island).
Several of the creepy-crawlies are named after movies Welles directed and roles he performed: Orsonwelles macbeth, Orsonwelles bellum (named for War of the Worlds, with bellum meaning “war”), Orsonwelles othello, Orsonwelles falstaffius, and Orsonwelles ambersonorum. (The last of these is named for The Magnificent Ambersons, which some say is Welles’ greatest film — sorry, Citizen Kane!) If you consider yourself an arachnophobe, try not to fret too much over the description of these eight-legged creatures as “giant”: They’re only about the size of a thumbtack.
You may love Disney, but you probably don’t love it as much as Jeff Reitz. The 49-year-old brought new meaning to the term “Disney adult” by visiting the Happiest Place on Earth 2,995 days in a row — a streak that only ended when Disneyland shut down during the pandemic. It began as “a joke and a fun thing to do” between him and a friend when the two were in between jobs on New Year’s Eve 2011, and it continued for eight years, three months, and 13 days.
The original plan was to spend every day of 2012 at the park, in part because it was a leap year and Reitz liked the idea of going 366 days in a single year, but he didn’t feel inclined to stop once 2013 rolled around. He became the unofficial record-holder at the 1,000-day mark and was close to reaching 3,000 days before COVID-19 prevented that particular milestone when Disneyland shut down on March 14, 2020.
There Are No Bridges Across the Amazon River
When it comes to the Amazon River, there’s no such thing as water under the bridge. The idiom simply doesn’t apply there, as no bridges cross the Amazon River despite it being at least 4,000 miles long. This isn’t because the idea has never occurred to anyone — it would just be extremely difficult to build any. The Amazon has both a dry season and a rainy season, and during the latter its waters rise 30 feet, causing 3-mile-wide crossings to grow by a factor of 10 as previously dry areas are submerged. The riverbank itself is also in a near-constant state of erosion because the sediment it consists of is so soft, and there’s no shortage of debris floating in the water.
Beyond all those logistical hurdles, there simply isn’t much use for bridges across the massive river. For one thing, there are few roads on either side of the Amazon that need to be connected. The river is, of course, in the middle of a dense rainforest, the vast majority of which is sparsely populated.
The Phrase “Don’t Mess With Texas” Was Created To Discourage Road Littering
Three decades ago, Texas was facing an enormous problem: trash, as far as the eye could see, piled up along its scenic and city roadways. The cleanup was arduous and costly — by the mid-1980s, the Texas Department of Transportation (aka TxDOT) was spending nearly $20 million each year in rubbish removal along highways alone. To save money (and the environment), leaders of the Lone Star State knew they had to get trash under control, which they decided to do with a series of public service announcements. But little did TxDOT know that its cleanliness campaign would become larger than life.
The iconic line, dreamed up by an Austin-based ad agency, initially launched on bumper stickers deposited at truck stops and fast-food restaurants. The first “Don’t Mess With Texas” commercial, which aired at the 1986 Cotton Bowl, homed in on Texans’ love for their land, telling viewers that littering was not only a crime but “an insult” to the state’s landscape. The phrase soon became a rallying cry for Texans, and the advertisement was so popular that TV stations around the state received calls asking for it to be aired again. Within a year, TxDOT estimated that roadside litter had dropped by 29%. The ad campaign continued and is credited with reducing highway trash by 72% in its first four years. The slogan has become only more popular over time, used at protests, declared by presidential candidates, and chanted at football games — all proof that state pride is held deep in the hearts of Texans.
The Last American to Collect a Civil War Pension Died in 2020
By 1956, the last surviving Civil War veteran had died, but the Department of Veterans Affairs continued issuing pension payments for decades to come — up until 2020. Irene Triplett, a 90-year-old North Carolina woman, was the last person to receive a Civil War pension, thanks to her father’s service in the Union Army. Mose Triplett was originally a Confederate soldier who deserted in 1863 and later joined a Union regiment, a move that kept him out of the fight at Gettysburg, where 90% of his former infantry was killed. Switching sides also guaranteed Mose a pension for the remainder of his life, which would later play a role in him remarrying after the death of his first wife.
At age 78, Mose married the 27-year-old Elida Hall — a move historians say was common during the Great Depression, when aging veterans needing care could provide financial security to younger women. The couple had two children, including Irene, who was diagnosed with cognitive impairments that allowed her to qualify for her father’s pension after both parents’ deaths. By the time of Irene’s own passing in 2020, the U.S. government had held up its duty, paying out Mose Triplett’s pension for more than 100 years.
Pink Was Once Considered a Color for Baby Boys, While Blue Was for Baby Girls
Before pink and blue, there was white. For much of the 19th century, most infants and toddlers wore white dresses regardless of their biological sex. But around 1900, childcare experts began to push for a greater distinction between little girls and boys, amid fears that boys were growing up “weaker” and “lazier” than their fathers had. Many U.S. publications and stores responded in part by recommending pink clothing for boys and blue clothing for girls, although some also recommended the opposite color scheme. According to Dressmaker magazine, “Blue is reserved for girls as it is considered paler, and the more dainty of the two colors, and pink is thought to be stronger (akin to red).”
But around World War II, everything changed. Soon pink was heavily marketed as the preferred color for girls, and blue for boys. It’s not entirely clear what led to the switch, and the colors chosen were somewhat arbitrary — the focus was primarily on creating clothes specific for each child in an attempt to curb hand-me-downs, and thus sell more product. Once the 1950s began, hospitals wrapped newborns in pink or blue blankets, based on their sex (today’s standard blankets contain pink and blue stripes).
During the Civil War, a Sack of Flour Was Repeatedly Auctioned Off to Raise Money for Wounded Soldiers
In January 1865, four months before the Civil War’s end, Harper’s Weekly published the story of a peculiar flour sack credited with raising thousands of dollars for injured soldiers. The tale — entirely true — began in Austin, Nevada, the previous year. On the eve of city elections, two wagering men, area merchant Reuel Colt Gridley and Dr. Henry Herrick, placed a bet on the vote’s outcome. The loser would pay up with a 50-pound sack of flour, but not before a dose of public humiliation: Whoever lost had to ceremoniously march down the town’s main strip with the bag, all to the tune of “John Brown’s Body” (a patriotic melody that would later inspire “The Battle Hymn of the Republic”).
Within a day, the losing bettor, Gridley, was being cheered on by his fellow townsfolk — who turned out in numbers to watch the spectacle — as he followed a brass band down the city’s center, flour sack over his shoulder. At the end of his march, he handed the sack to the bet’s winner, Herrick, but not without first recommending it be donated to the Sanitary Commission, a relief agency that provided care for sick and injured Union soldiers. Herrick agreed, and soon after the hefty sack of flour was auctioned for $350. But in an act of gallantry, the winner asked that the sack be sold again, raising another $250. Surrounding towns joined in, and before long Gridley and the “Sanitary Sack of Flour” had gone as far as San Francisco and raised $63,000. Newspapers spread the story, leading the flour sack across the country, raising upwards of $275,000 (more than $4 million today) and ending up as far as New York City. Gridley, who had started the journey as a Confederate sympathizer, returned to Nevada an ardent supporter of the Union; the famed Sanitary Sack returned with him and remains on display in Reno at the state Historical Society Museum.
Philadelphia Cream Cheese Isn’t Actually From Philadelphia
Despite the name, Philadelphia Cream Cheese is definitely not from Philly. The iconic dairy brand secured its misleading name (and gold-standard status) thanks to a marketing ploy that’s been working for more than 150 years … and it’s all because of Pennsylvania’s reputation for impeccable dairy. Small Pennsylvania dairies of the 18th and early 19th centuries were known for using full-fat milk and cream to make rich cheeses — in contrast to New York dairies, which mostly used skim milk — and because the perishables couldn’t be easily transported, they gained a reputation as expensive luxury foods. So when upstate New York entrepreneur William Lawrence began making his skim milk and (for richness) lard-based cream cheese in the 1870s, he needed a name that would entice customers and convey quality despite it being made in Chester, New York, and not Philadelphia. Working with cheese broker and marketing mastermind Alvah Reynolds, Lawrence branded his cheese under the Philadelphia name in 1880, which boosted sales and promoted its popularity with home cooks well into the early 1900s.
Humans May Have Evolved Fingers and Toes That Wrinkle in Water To Help Them Grip Wet Objects
Spend some time in a pool, lake, or other watery location, and it won’t take long to see deep ridges of wrinkles spreading across the pads of your fingers and toes. Despite often earning the unflattering adjective “pruney,” these wrinkles disappear in about 20 minutes once back on land. At first glance, these H2O-induced crevices seem like a simple case of osmosis, in which water floods membranes (in this case our skin) to equalize on both sides. But then why doesn’t the rest of the human body wrinkle when submerged in water?
Today, we know this wrinkling is caused by constriction of the blood vessels (which is also why fingers and toes turn pale at the same time). The leading theory as to why this happens is that our hands evolved to wrinkle in wet environments to improve grip, whether we’re running in a rainstorm or grasping a potential meal in a freshwater stream. Several studies, including one published by Manchester Metropolitan University in 2021, found that grip improved dramatically when hands were especially wrinkly after water submersion. Scientists are still debating the true nature of this involuntary skin response, but at least now you can look upon your deeply creased digits with a new respect — even if they are “pruney.”
For about 37 years of its 245-year history, the U.S. has been without a second-in-command. Before the passage of the 25th Amendment in 1967, there was no procedure for filling the role if a commander in chief died in office. Instead, there just wasn’t a VP if that happened — at least not until the next presidential election. Thanks to this legislative quirk, John Tyler, Millard Fillmore, Andrew Johnson, and Chester Arthur (all VPs under a President who died in office) served their entire presidential terms without a Vice President. Other Presidents have gone without VPs for at least part of their terms, whether through resignation (two) or because their veeps died in office (seven).
Every Film Actor John Cazale Appeared in Was Nominated for Best Picture
There are impressive filmographies, and then there’s John Cazale’s. The actor only appeared in five films during his lifetime, all of which were nominated for Best Picture at the Academy Awards: The Godfather (1972), The Conversation (1974), The Godfather Part II (1974), Dog Day Afternoon (1975), and The Deer Hunter (1978). Even more remarkably, three of them — both Godfathers and The Deer Hunter — won the top prize. The last of these was released after Cazale’s untimely death from bone cancer in March 1978, at which time the 42-year-old thespian was the romantic partner of fellow great Meryl Streep. (He was also in 1990’s The Godfather Part III via archival footage, which didn’t break his streak — that sequel was also up for Best Picture.)
Before Carving Pumpkins for Halloween, People Used To Carve Turnips
In the 19th and early 20th centuries, turnips weren’t just begrudgingly served for dinner, but also used as small lanterns. The durable root crop is often harvested as the weather cools, and in Ireland, that was just in time for Samhain, the Celtic celebration of summer’s end. Ancient Celts believed that the separation between the living world and spirit realm was at its weakest during autumn, making it possible for ghosts and demons to cause mischief. To protect themselves and their homes, superstitious folk across the British Isles would carve frightening faces into produce — sometimes potatoes or beets, but most commonly turnips — as a way to ward off harm. With a lit candle placed inside, the illuminated faces acted as old-world lanterns that banished the unwanted and guided the way along dark paths.
The Sunday funnies are as much a part of the quintessential U.S. newspaper as the crossword puzzle and want ads. And though the number of print newspapers in circulation has decreased significantly over the past few years, comic strips and their characters are still a touchstone of our culture: Calvin and Hobbes, Cathy, Peanuts, Garfield, and Dilbert are just a few of the household names that are rich with nostalgia. Before these colorful comic strips came into being, however, there were decades’ worth of comics depicting everything from a wisecracking police detective to a punk kid with a slingshot to a glamorous newspaper reporter who always got the scoop.
Here, we’ve rounded up some of the longest-running comic strips, many of which have changed hands over the years. Whether we realize it or not, comic strips help capture the essence of the moment, and through them we can glimpse the zeitgeist of a generation — often with a necessary splash of humor.
The Katzenjammer Kids isn’t widely known anymore, but it holds a few Sunday funnies distinctions. Cartoonist Rudolph Dirks was 20 years old when his comic following the mischievous duo of Hans and Fritz, two young troublemakers who get into tiffs with their parents and school officials, first ran. Dirks created the series for William Randolph Hearst’s New York Journal in 1897, but when he took a job at New York World, Hearst kept the name of the comic. This led to a lawsuit and Dirks’ creation of a competing, near-identical strip called The Captain and the Kids, which ran from 1914 to 1979. The Katzenjammer Kids was drawn by a number of other cartoonists until it ceased syndication in 2006 — an amazing 109-year streak. Most notably, Dirks is credited as the first cartoonist to use speech balloons to express character dialogue, a practice that is still very much used today.
Gasoline Alley (1918-present)
Gasoline Alley was created by cartoonist Frank King in 1918 during his tenure at the Chicago Tribune. The premise is pretty straightforward: The strip follows a group of automobile enthusiasts who meet in an alley. It largely revolves around a main character, Walt Wallet, and his circle of family and friends. In 1921, King’s editor asked for the comic to introduce a baby, since he believed that would draw more female readers. To solve the problem of Walt being a bachelor, King had Walt discover a baby on his doorstep; he named him Skeezix (a cowboy slang term for an orphaned calf). And then, King added a dimension that was brand-new to comics: He let the characters age in real time. Skeezix grew into a child and then a teen and eventually enlisted during World War II. Walt also aged, married, had more children and then grandchildren. Gasoline Alley is still going strong; it’s now drawn by cartoonist Jim Scancarelli, who explains how Walt has aged well past 100 with a simple answer: “Walt has good genes.”
Rounding out the trifecta of longest-running, uninterrupted Sunday comic series of all time is Barney Google and Snuffy Smith, the brainchild of cartoonist Billy DeBeck. It started as a daily strip in the sports section of the Chicago Herald and Examiner in 1919, with the title Take Barney Google, F’rinstance. The titular character was a diminutive man with large “banjo” eyes who played poker and bet on horse races and prize fights. A horse named Spark Plug was added in 1923, and in 1934 Barney met Snuffy Smith, a hillbilly moonshiner who has been with him ever since.
Fun fact: The common phrase “googly eyes” actually originated from the comic strip, in reference to Barney’s huge eyes, and a song called “Barney Google (with the Goo-Goo-Googly Eyes)” was released in 1923.
Little Orphan Annie (1924-2010)
For most folks, the mention of Little Orphan Annie conjures up images of an innocent little redhead and her beloved dog, Sandy. It might even cue that catchy song about perseverance and how the sun will most definitely come out tomorrow. But although the strip was positively received over the years, it was also pulled from newspapers nationwide numerous times for its often controversial storylines. Its creator, Harold Gray, was known to use the Daddy Warbucks character as a mouthpiece for his political views, and those plots included everything from calling all political leaders criminals to criticizing the country’s mental health care system. Eventually, the comic ran its course and was discontinued in 2010, having spawned a beloved musical and a number of movies.
Popeye (full name: Popeye the Sailor) was first introduced into the cultural lexicon after appearing in the King Features comic strip Thimble Theatre in 1929. But the muscular sailor proved to be so popular that the strip was renamed Popeye in later years. Popeye’s defining features — a pipe protruding out of his mouth, two anchor tattoos on his forearms, and his love of spinach — have been mainstays in pop culture and have shown up in comic books, video games, TV cartoons, and even a 1980 live-action film starring Robin Williams as Popeye. The popular fast-food chain Popeyes, however, is not tied to the comic, despite previous speculation.
Blondie (1930-present)
Blondie is a comic strip that shows how the (cartoon) nuclear family can shift and adapt over time. The strip, created by Chic Young in 1930, chronicles the daily lives of titular character Blondie Boopadoop, a former flapper turned housewife; her husband, Dagwood Bumstead, a former heir who’s always late for work; and their two teenage children, Alexander and Cookie. Though the characters themselves haven’t aged a day, their lingo and accessories have shifted over the years to get with the times, including the slow modernization of their kitchen, the addition of cell phones, and references to Facebook, modern music, and current TV shows. After Chic died in 1973, creative control for the strip passed to his son, Dean Young; the comic is still going strong, 90 years later.
Dick Tracy the comic strip birthed Dick Tracy the American icon. Cartoonist Chester Gould created the sharp, crime-fighting police detective in 1931, and the titular character is perhaps best known for his square jaw, bright yellow hat and coat, and his super-enviable two-way wristwatch. One of the long-running series’ most memorable periods (besides the Madonna period) was perhaps the “Space Period,” when Dick was fighting crime and tracking down bad guys on the Moon. Presently, the detective continues to fight crime (on Earth) and occasionally makes a cameo in other comic strips, as he did at Blondie and Dagwood’s 75th anniversary party in 2005.
Prince Valiant (1937-present)
Hal Foster was already known for his Tarzan comic strip in 1937 when he approached media mogul William Randolph Hearst with an idea for a comic strip. Foster, who was a fan of the King Arthur and the Knights of the Round Table legends, pitched a strip he called Derek, Son of Thane — an idea that Hearst loved with a title he hated. The comic was renamed Prince Valiant and Foster would go on to depict the young royal’s epic adventures through different time periods, ranging from the late Roman Empire to the High Middle Ages. One notable feature of the strip is that there are no word or thought balloons at all; instead, the story is illustrated in captions situated at the bottom or sides of the panels.
Brenda Starr was originally created as a “girl bandit” character, but creator Dale Messick was encouraged to make the Rita Hayworth-esque Starr a reporter instead so that the Chicago/New York syndicate would pick it up. Not only that, but the creator was using a pen name: Knowing that the publisher had sworn off “women cartoonists,” Dalia Messick switched to the more male-sounding name Dale Messick professionally. But even after it was accepted, Brenda Starr, Reporter still got second-class treatment, at least initially — when it first published in 1940, Brenda was relegated to the Sunday comic book supplement rather than the daily paper. Luckily, Brenda was a star, and the strip was a success long after Messick stopped writing it in 1982.
Beetle Bailey (1950-present)
Regular readers of the Sunday comics will recognize Beetle Bailey for its consistent aesthetics and humor throughout the decades, due in large part to the fact that its creator, Mort Walker, was the illustrator for the strip for 68 years — from its inception in 1950 up to his death in 2018. That made Beetle Bailey one of the longest-running comic strips still produced by its original creator, no small feat. His sons Brian, Greg, and Neal Walker are keeping the strip alive following Mort’s death. The comic strip chronicles the titular character’s antics at Camp Swampy, which is broadly based on the real-life Camp Crowder in Missouri, where Walker was once stationed.
Mischievous little boys tend to do well for comic strips, as evidenced by Hank Ketcham’s Dennis the Menace, a strip that has since sparked a live-action TV show in the 1960s, an animated show in the 1980s, and a series of film adaptations. The muse for the popular series was none other than Ketcham’s young son Dennis, who was just 4 years old when the cartoonist first dreamed up the idea. Supposedly, Ketcham was trying to find the perfect name for his character when his then-wife Alice stormed into his studio and exclaimed, “Your son is a menace!” And thus, Dennis the Menace was born.
B.C. (1958-present)
Like an early Flintstones, B.C. features a group of cavemen and talking animals (including, of course, dinosaurs and other prehistoric creatures). It is often a tongue-in-cheek take on modern technology and woes, with shops in the strip displaying carved stone signs for “retail store,” “wheel repair,” or “psychiatrist.” Cartoonist Johnny Hart created the strip in 1958, and his grandson Mason Mastroianni took over as both the artist and the writer of the strip when Hart died in 2007. The prehistoric comic continues to this day.
Not all fragrance companies have the staying power of Chanel. For example: Chaqueneau, a New Jersey-based perfume company, made headlines in the 1950s for selling its perfumes only to men. “After all, there ought to be something a man can buy for a woman that she can’t buy for herself,” read a Saks Fifth Avenue brochure for one of the aforementioned scents, Chaqueneau-K. “Chaqueneau-K will never be sold to a woman.” For obvious reasons, the brand lacked staying power. But Chaqueneau’s story is by no means indicative of the booming billion-dollar fragrance industry, which is expected to be valued at more than $40 billion annually by 2025.
A vacation to the South of France changed Gabrielle “Coco” Chanel’s young business forever. During her trip, she met Russian Frenchman Ernest Beaux, a second-generation perfumer. Instead of joining the market of mere floral and fruity scents, Chanel desired “an artificial perfume” that was constructed, like a couture gown. Beaux’s mixture contained a healthy pour of soapy-smelling aldehydes; some think this ingredient reminded Chanel of her mother, a laundress she lost at age 12. Each ounce of the fragrance also boasts the essence of 1,000 jasmine flowers and 12 roses, both sourced from the same 50-acre field in Pégomas, France. The first Chanel No. 5 ad featured a flapper-era Chanel sketched by the caricaturist Sem, while the designer was photographed for a 1937 follow-up. For many women, owning Chanel No. 5 remains a rite of passage — a bottle is sold every 30 seconds.
Favored by Rita Hayworth and Mad Men’s Joan Holloway (played by Christina Hendricks), Shalimar takes its name from the Sanskrit word meaning “abode of love.” The direct inspiration for Guerlain’s signature scent comes from gardens commissioned by 17th-century royalty. India’s Shalimar Gardens were masterminded by Mughal Emperor Jahangir, while his son Shah Jahan — overseer of the Taj Mahal’s construction — had a royal refuge built with around 450 fountains in Pakistan. Thus Shalimar’s Baccarat crystal bottle is designed to mirror an Eastern garden basin. The fragrance’s notes include bergamot, leather, and vanilla, a blend that earned praise from Chanel No. 5 architect Beaux. “If I had used that much vanilla, I would have ended up with sorbet or custard,” said Beaux. “But Jacques Guerlain created a masterpiece, Shalimar!” As of 2017, 108 bottles were purchased around the globe every hour.
Ginette “Catherine” Dior had a life worthy of a biopic. A member of the French Resistance during World War II, she was arrested and deported to Germany, where she labored in a concentration camp and factories supporting the Axis effort. Once free, she returned to France to farm flowers. In between, the British and French governments bestowed honors on Catherine for her bravery. But the most sentimental accolade might have come from her older brother, Christian, when his fashion house named Miss Dior perfume after her. Featuring notes of narcissus, iris, and orris root, the fragrance was crafted by Jean Carles and Paul Vacher — after Carles lost his sense of smell (amazingly, he worked from memory). Natalie Portman has fronted the scent since 2010.
When model-actress Shelley Hack posed for the debut Charlie ad, she wore a three-piece suit, loafers, and a bowtie. Another campaign featured a woman toting a briefcase as she pats a man’s bottom. Revlon targeted liberated women seeking to buy their perfume with their own wages. Oprah Winfrey was so riveted that she brought Hack on her show in 2008 to discuss “the Charlie girl.” “I wanted to stride like her with confidence,” Winfrey said. “I wanted to be this fabulous.” Among the scent’s notes are lily of the valley, geranium, and coriander, and its golden bottle is in the collection of the Smithsonian National Museum of American History. While the fragrance predated Charlie’s Angels by three years, Hack eventually co-starred in 25 episodes.
In the late ‘70s, would-be customers for Yves Saint Laurent’s Opium perfume were known to pocket samples and yank posters when a store sold out. They yearned to sniff the cinnamon, sandalwood, and patchouli fragrance feted by Cher and Truman Capote at Studio 54. Yet for decades, Opium’s name and campaign imagery earned condemnation on multiple continents. A 1980 commercial followed supermodel Linda Evangelista’s search for the scent in a crowded Chinese marketplace, wielding a fan of cash. Twenty years later, Britain’s Advertising Standards Authority forced YSL to take down Opium billboards that displayed writhing model Sophie Dahl — wearing only shoes and jewelry — when the photo generated more than 900 complaints. Nonetheless, the label never apologized. The Musée Yves Saint Laurent celebrated its late founder with a 2018 exhibition called “Yves Saint Laurent: Dreams of the Orient.”
The most successful celebrity perfume empire is a tale of two Lizes: Screen legend Elizabeth Taylor (1932-2011) partnered with Elizabeth Arden on a line that bloomed into 16 scents. To promote the first, “Passion,” future-dame Taylor embarked on a month-long American tour in 1987. Then the two-time Best Actress Oscar winner made her love of precious gems accessible to mall-goers with White Diamonds (notes: Egyptian tuberose, jasmine, and carnation). Along with the launch came “White Diamonds: The Movie,” a lilac-hued commercial from the ‘90s, where Taylor offered her earrings to help a strapped poker player, saying, “These have always brought me luck.” The fragrance’s lifetime sales surpassed $1 billion in 2013, and a portion of earnings from each of the star’s perfumes supports the Elizabeth Taylor AIDS Foundation.
Six little words changed movie history forever: “The name is Bond – James Bond.”
Based on the book series by British author Ian Fleming, the debonair spy’s high-octane escapades have captured moviegoers’ imaginations in 27 films over the past 60 years and have grossed more than $7 billion worldwide. Since the character’s first big-screen appearance in 1962, six actors have portrayed 007, with a new Bond expected to be announced soon. While many aspects of the tuxedo-clad Bond have remained constant, each new actor brought his own spin to the character.
Here’s a look back at the evolution of Bond, from his first appearance in theaters to the most recent iteration of the martini-sipping, Aston Martin-driving secret agent.
Sean Connery was the first actor to portray Bond on film, first starring in 1962’s Dr. No. Suave, sophisticated, and the creator of the on-screen Bond that we know today, Connery is still considered to be the 007 by many fans. And given that he’s responsible for making Bond a household name, it’s hard to disagree.
However, there was one person who wasn't sold at first: Fleming. The author initially thought the Scottish performer seemed too brutish and unrefined to portray Bond. Luckily, Fleming’s tune changed after he saw Connery in action.
In addition to Dr. No, Connery played the spy in five subsequent films: From Russia With Love (1963), Goldfinger (1964), Thunderball (1965), You Only Live Twice (1967), and Diamonds Are Forever (1971).
Starring in only one Bond movie, 1969’s On Her Majesty’s Secret Service, Australian actor George Lazenby had the unenviable task of taking up the Bond mantle after Connery left (though Connery returned two years later for one more film). Many believed that Lazenby just wasn’t up to the task. After all, he was young — only 29 at the time — which some critics saw as a detriment to his rendition of the well-traveled and experienced Bond.
After On Her Majesty’s Secret Service, Lazenby claims he was offered a contract to star in six more Bond films but was advised by his agent to turn it down; apparently, his agent feared that the increasingly popular hippie culture of the 1960s and ’70s would render the franchise antiquated and irrelevant.
After Connery officially drank his last sip of his “shaken, not stirred” martini, Roger Moore starred in seven Bond films between ’73 and ’85: Live and Let Die (1973), The Man with the Golden Gun (1974), The Spy Who Loved Me (1977), Moonraker (1979), For Your Eyes Only (1981), Octopussy (1983), and A View to a Kill (1985).
Where critics had been cool on his predecessor Lazenby, they enjoyed Moore’s different take on Bond: smarmy and even a little silly. His portrayal took the franchise in a lighter direction, steering away from the darker tone of the Connery films.
Moore’s role as Bond was certainly popular, and he’s tied with Connery for the highest number of on-screen portrayals of the spy.
A classically trained Shakespearean actor, Timothy Dalton took his Bond duties in a serious direction. Starring in two films — The Living Daylights (1987) and Licence to Kill (1989) — Dalton didn’t stray too far from the source material to make his portrayal as accurate as possible. And that’s exactly what we got: a Bond that was cold, calculating, and more ruthless than any we’d seen prior.
Critical response was divided, as some felt his portrayal was too dark, especially when compared to Moore, but few would argue that he didn’t accurately represent the Bond people know from the books.
But Dalton’s Bond didn’t stay entirely true to the original character — and for good reason. Certain aspects of author Fleming’s dated source material, such as references to casual racism and homophobia, were omitted. These would be just a few of the changes that filmmakers would make to the Bond films to make them more suitable for modern audiences.
Starring in four Bond films — GoldenEye (1995), Tomorrow Never Dies (1997), The World Is Not Enough (1999), and Die Another Day (2002) — Pierce Brosnan’s portrayal was well-regarded among viewers and critics. Seen as a blend of his predecessors, Brosnan brought to the table Connery’s coolness, Dalton’s darkness, and Moore’s wry humor, and created a Bond unlike any other seen on film.
One notable aspect of Brosnan’s portrayal was his stance against Bond’s smoking. Despite the Bond of Fleming’s novels smoking 60+ cigarettes a day, the Irish actor denounced the unhealthy habit and opted to play the character in a way more aligned with his own beliefs. He did, however, smoke a cigar in Die Another Day.
Daniel Craig
The most recent actor — and only blond — to portray Bond on film, Daniel Craig is hailed as one of the most accurate Bonds on the big screen. Craig’s Bond is steely, serious, and charming — exactly what Fleming envisioned in the original novels. The British actor made his Bond debut in 2006’s Casino Royale and starred in four other Bond films: Quantum of Solace (2008), Skyfall (2012), Spectre (2015), and No Time to Die (2021).
Despite the accuracy of his portrayal, he followed the modern tradition of adapting a few of Bond’s less-than-desirable characteristics. Like Brosnan, Craig agreed that it didn’t make sense for Bond to smoke. But it wasn’t for social or political reasons. According to Craig, it was just common sense: “I don’t wish for [Bond] to smoke. Fleming wrote a Bond who smoked 60 cigarettes a day. I can’t do that and then run two-and-a-half miles down a road, it just doesn’t tie in.”
Now that Craig has taken his final bow as 007, there’s much speculation about who will portray Bond next. As befits a character who continues to evolve, there are rumors the new Bond will be Black, with Idris Elba (Luther) and Regé-Jean Page (Bridgerton) topping the list. Other names floated to step into Bond’s shoes include Jamie Dornan (Fifty Shades of Grey), Tom Hardy (Mad Max), and Henry Cavill (Superman).
Interesting Facts
Editorial
Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.
The ubiquity of holiday songs on television, radio, and social media from Thanksgiving through Christmas ensures that we’ll be able to sing these anthems in our sleep. But lesser known are the backstories behind these famous tunes, which share common themes but draw from vastly different sources. From centuries-old standards to modern classics, here’s a look at the origins of six Yuletide favorites.
Legend says that an Austrian priest hastily whipped up this classic for his parishioners to sing after mice chewed through the church organ, but the truth is decidedly less dramatic. Having already crafted a poem titled “Stille Nacht,” Father Joseph Mohr enlisted schoolteacher and musician Franz Xaver Gruber to compose an accompanying melody on guitar for a performance at Christmas Mass in 1818. Enthusiasm for this simple but powerful piece steadily spread across Europe and overseas, which prompted an English translation by New York Episcopal priest John Freeman Young in 1859, before both German and English troops famously sang the song during a WWI Christmas ceasefire in 1914.
This ode to outdoor wintertime fun may have been composed in the warm-weather locale of Georgia — despite the claims of a city in Massachusetts. Either way, historians agree that the song was the work of James Lord Pierpont (uncle of Gilded Age tycoon J.P. Morgan), who copyrighted it in 1857 under the title “One Horse Open Sleigh.” First recorded in 1889, “Jingle Bells” never achieved the massive sales recorded by some of the other standards on this list, though it does have the unique distinction of being performed in space in 1965.
Tin Pan Alley writer Haven Gillespie wasn’t feeling the holiday spirit after attending his brother’s funeral in September 1934, but he nevertheless agreed to write a children’s Christmas song at the urging of his publisher. Riding the subway after the meeting, Gillespie started reminiscing about his mother’s warnings that St. Nick was monitoring his behavior, and within 15 minutes he’d scribbled the lyrics for “Santa Claus Is Coming to Town” for composing partner J. Fred Coots. Popular entertainer Eddie Cantor took it from there by singing the ditty on his Thanksgiving radio show, and a holiday standard immediately took root.
Tasked with penning a story for a department store Christmas giveaway in 1939, copywriter Robert L. May took a page from the 1823 poem “A Visit from St. Nicholas” and created a now-familiar tale about a reindeer who saves the day with his distinct snout. The book reached some 2 million customers, but true fame only arrived after May reclaimed rights to the story and passed it on to songwriter Johnny Marks the following decade. By the time the song landed in the lap of Gene “the Singing Cowboy” Autry in 1949, there was no slowing its rise to the top of the Billboard charts in early 1950 and Rudolph’s ascent to the firmament of Yuletide culture.
First sung by Judy Garland in 1944’s Meet Me in St. Louis, “Have Yourself a Merry Little Christmas” nearly missed out on its prominent introduction to the world. Writer Hugh Martin’s early draft had to be rescued from the trash by partner Ralph Blane, and it was subsequently fleshed out with such depressing lyrics that Garland refused to sing them. Martin grudgingly made the changes to satisfy Garland and director Vincente Minnelli, only to tweak the lyrics again to provide something more “jolly” for a 1957 Frank Sinatra rendition. Both the original and Sinatra versions have since been re-recorded many times over, by artists ranging from James Taylor to Twisted Sister.
Another holiday classic with Hollywood roots, this Irving Berlin-composed reflection on the timeless joys of the season was originally earmarked for an earlier project before surfacing in 1942’s Holiday Inn. And while Berlin felt that “Be Careful, It’s My Heart” would be the film’s biggest hit, it was the Bing Crosby-crooned “White Christmas” that instead grabbed listeners and claimed an Oscar in 1943. But even that achievement barely hints at its impact, as Crosby found an insatiable audience for the song while performing for American troops overseas. Along with fueling a 1954 movie of the same name, as well as covers by Elvis Presley, Bette Midler, Michael Bublé, and many other stars, Crosby’s “White Christmas” stood as the best-selling single of all time until Elton John’s “Candle in the Wind 1997.”
Tim Ott
Writer
Tim Ott has written for sites including Biography.com, History.com, and MLB.com, and is known to delude himself into thinking he can craft a marketable screenplay.