Original photo by R. Wellen Photography/ Shutterstock

Today, riding an elevator is a mundane activity, but little more than two centuries ago, these mechanical contraptions were steam-powered, death-defying wonders. In the years since, these mostly unseen pieces of urban infrastructure have become a key part of what makes modern cities possible. Without them, a city’s upward trajectory would be impossible, and the design of our world would be unimaginably different. Here are six amazing facts about the humble elevator, from its surprisingly ancient origins to the many places it may take us in the future.

Drawing of Archimedes.
Credit: Roger Viollet Collection via Getty Images

Greek Mathematician Archimedes Invented an Elevator in 236 BCE

The elevator is a surprisingly old invention. According to writings from the ancient Roman engineer Vitruvius (the same Vitruvius who inspired Leonardo da Vinci’s “Vitruvian Man”), the Greek mathematician Archimedes invented a primitive elevator back in 236 BCE. Archimedes’ contraption bore little resemblance to today’s people-movers: It worked via manpower, with ropes drawn around a drum that was then turned by a capstan, a large revolving cylinder often used to wind ropes on ships. Although the attribution was written after Archimedes’ death, the invention makes sense for the great Greek thinker, who was famous for his exploration of compound pulley systems. Elevators join the list of other surprising ancient inventions, including such wonders as the world’s first steam engine and the world’s first computer.

Staircase in an old apartment building.
Credit: Alexander Sorokopud/ Shutterstock

Before the Modern Elevator, Top Floors Were Undesirable

Today the most luxurious high-rises are crowned with multimillion-dollar penthouses, but before the rise of elevators (pun intended), the most desirable floors were those closest to the ground. The first building to include elevators at the design stage was the 130-foot Equitable Life Building in downtown Manhattan, built in 1870. Society was slow to adjust to the elevator: the building was designed to look like it had fewer floors than it did, and the insurance company that worked out of the building still occupied the “valuable” lower floors while the custodian enjoyed the upper ones. The era of the penthouse didn’t arrive in full swing until the 1920s, when the decade’s economic boom brought a flurry of construction projects to New York City and other cities around the world.

Elisha Graves Otis shows his first elevator in the Crystal Palace, New York City.
Credit: Bettmann via Getty Images

An American Inventor Created the First Modern Passenger Elevator

A key part of the very first passenger elevator was invented by Elisha Graves Otis, who founded the Otis Elevator Company, a manufacturer still in business today. Otis invented a safety device that would prevent an elevator car from falling if the cable broke. Before Otis’ invention, elevators were dangerous contraptions primarily reserved for moving cargo in factories, warehouses, and mines. In 1854, Otis introduced his “safety elevator” at New York City’s Crystal Palace, also known as the “Exhibition of the Industry of All Nations,” where he asked someone to cut the rope that was holding him up. Once cut, the platform dropped only a few inches before the safety brake caught it. The demonstration helped sway public opinion, showing that elevators could be a safe means of vertical transportation. Today, elevators are considered statistically safer than stairs.

Cora Fossett, female elevator operator on a Cincinnati skyscraper.
Credit: Bettmann via Getty Images

People Once Trained for Years To Be Elevator Operators

Although Elisha Otis invented a safer elevator, that didn’t mean the device was foolproof. For decades, operating an elevator was considered a highly skilled job that required years of study in some parts of the world, such as Germany. In the late 19th century, elevators were operated using “shipper ropes,” and operators were trained in the precise timing of pulling these ropes to arrive at the right floor. A well-trained operator was highly desirable, since they made the difference between a smooth ride and a death-defying jumble of starts and stops.

Over the decades, the job of the elevator operator became increasingly automated. In 1887, American inventor Alexander Miles designed the first automatic elevator doors, after reading about several accidents involving people falling down elevator shafts. But it wasn’t until the 1960s — a little over a century after Elisha Otis introduced the first safety elevator — that automated elevator cars began to replace human operators entirely.

Aerial View of Downtown Shanghai, including the Shanghai Tower.
Credit: owngarden/ Moment via Getty Images

The Fastest Elevator in the World Travels Up to 67 Feet Per Second

In the early days, elevators could only travel at about 40 feet per minute. After some 150 years of innovation, the world’s fastest elevator can now travel 67 feet in a second (or around 46 miles per hour). This elevator is located in Shanghai Tower in China, which also includes the longest continuous elevator run, at 1,898 feet. Originally installed by the Japanese company Mitsubishi Electric in 2015, the elevator got an upgrade in 2016, allowing it to traverse a path from the second-level basement to the tower’s 119th floor in just 53 seconds. The elevator in the CTF Finance Center, also located in China, comes in a very close second, traveling at 65 feet per second.
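As a sanity check on the figures above, here is a quick conversion from feet per second to miles per hour (a throwaway Python sketch; the function name is invented for illustration):

```python
# Convert feet per second to miles per hour:
# 1 mile = 5,280 feet and 1 hour = 3,600 seconds.
def fps_to_mph(feet_per_second):
    return feet_per_second * 3600 / 5280

print(round(fps_to_mph(67), 1))  # Shanghai Tower elevator: 45.7 mph
print(round(fps_to_mph(65), 1))  # CTF Finance Center elevator: 44.3 mph
```

So 67 feet per second works out to 45.7 miles per hour, which matches the “around 46 miles per hour” cited above.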

German Engineers Designed a Sideways Elevator in 2017

Since their invention more than two millennia ago, elevators have done just two things — go up and go down. However, in 2017 a German elevator company began testing an elevator that can travel in any direction. Nicknamed the “Wonkavator” after the multidirectional elevator seen in 1971’s Willy Wonka & the Chocolate Factory, the machine was hailed as “the biggest development in the elevator industry” since the device’s invention. A sideways elevator, though, is only the beginning of what’s in store for the technology’s future. Scientists (and sci-fi writers) have also hypothesized about the feasibility of a space elevator that could ferry future astronauts from the Earth’s surface to outer space — completely forgoing the need for expensive, pollution-belching rockets.

Darren Orf
Writer

Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.

Original photo by Angelica Reyes/ Unsplash

Few brands are as recognizable as Starbucks, a company that began in 1971 as a single Seattle-based store before blossoming into one of the world’s most notable coffee suppliers. Starbucks’ legendary green-and-white logo can be found in most corners of the globe, providing that important morning boost to coffee lovers everywhere. With a legacy over five decades old, Starbucks’ history is as fascinating as its coffee is invigorating. Keep reading to start your day off right with six facts about Starbucks to perk you up.

Starbucks Coffee logo.
Credit: Robert Alexander/ Archive Photos via Getty Images

The Company’s Name Was Inspired by the Novel “Moby-Dick”

While the word “Starbucks” is known by coffee lovers worldwide, the singular version of that word holds a different meaning in the world of literature. The name of the brand was inspired by Herman Melville’s 1851 work Moby-Dick, though that almost wasn’t the case. When deciding on a name for their new company in 1971, the founders of Starbucks briefly considered “Cargo House.” The goal during those early brainstorming sessions was to come up with a name that captured an adventurous spirit and also reflected the storied fishing history of the Pacific Northwest. This in turn led to a suggestion by co-founder Gordon Bowker, who proposed the name “Pequod,” after the ship from Moby-Dick. However, the group decided that going for a “cup of Pequod” didn’t sound particularly appealing, forgot about Moby-Dick for the time being, and went back to the drawing board.

As the brainstorming continued, Bowker claims that his business partner, designer Terry Heckler, mentioned that words beginning with “st” felt powerful. In searching for words beginning with “st,” the group came across an old mining town called “Starbo” on a map of the nearby Cascade Mountains. This reminded Bowker once again of Moby-Dick, and specifically the character Starbuck, who served as the first mate for Captain Ahab. Bowker’s suggestion was a hit with his co-founders, and they tacked on an “s” at the end and officially adopted the name “Starbucks” for their new brand.

View of musician Ray Charles performing.
Credit: Paul Natkin/ Archive Photos via Getty Images

Starbucks Won a Grammy Award in Collaboration With Ray Charles

Starbucks hasn’t only achieved greatness in the world of coffee, but in the music industry, too. Starbucks helped co-produce the 2004 Ray Charles album Genius Loves Company, which proved to be the final studio album by the legendary singer and pianist. The album features 12 awe-inspiring duets with other musical greats, including Willie Nelson, Norah Jones, and B.B. King. It wouldn’t have been possible without the backing of Starbucks, which partnered with Concord Records to produce the album. The result was a smash hit — Genius Loves Company went on to win eight Grammy Awards, including both Record and Album of the Year, and also sold enough copies to go triple-platinum.

Starbucks later acquired the Hear Music record label in 2007, expanding its influence in the world of music. However, despite producing albums for esteemed artists including Kenny G, Paul McCartney, and Carly Simon, the label ultimately fell by the wayside as digital music displaced physical media. Even so, Starbucks began a partnership with streaming service Spotify in 2016, ensuring that the coffee company would remain involved in the music scene to some degree.

Employees of the Central Intelligence Agency walk in front of the Agency's facility.
Credit: Pool/ Getty Images News via Getty Images

There’s a Special Starbucks for Members of the CIA

There are upwards of 30,000 publicly accessible Starbucks locations worldwide, but only one specially caters to members of the Central Intelligence Agency. Located inside CIA headquarters in Langley, Virginia, this Starbucks is only available to those with the highest levels of security clearance. While the store is decorated to look like a normal Starbucks in order to help humanize an otherwise tense job, the experience here is anything but normal. In order to maintain secrecy, receipts merely read “Store Number 1” rather than naming any specific location. Furthermore, baristas — who undergo extensive background checks — are forbidden from writing names on any of the cups, even aliases, in order to protect the confidential identities of CIA agents. Don’t try using your Starbucks rewards card here either, as such perks are banned for fear that they could “fall into the wrong hands.” Despite all these departures from the normal experience, the store remains an immensely popular fixture among CIA employees and boasts long lines at all hours of the day.

Starbucks employee holding a store branded apron.
Credit: Marvin Tolentino/ Alamy Stock Photo

Different Colored Starbucks Aprons Mean Different Things

When Starbucks was founded in 1971, its baristas were known for wearing simple brown grocers’ aprons. In 1987, the company adopted its now-iconic green aprons featuring a brand-new logo, and green remains the norm for the majority of baristas. Though you’re likely to mostly see green aprons at Starbucks locations, other designs pop up from time to time. Some Starbucks aprons serve practical purposes — traditional green aprons embroidered with ASL fingerspelling signify that the barista is fluent in American Sign Language. Others are unique to certain regions, including orange aprons, which are worn in the Netherlands during King’s Day, an annual Dutch celebration on April 27. This seasonality extends to America as well, where red aprons are worn around the December holidays.

The colors black and purple, however, are worn only by the best of the best. The coveted black apron is worn by Starbucks Coffee Masters, who complete the Starbucks Coffee Academy and earn their certification for being extraordinarily passionate and knowledgeable about the product. Even more prestigious is the purple apron, which signifies being a champion barista. These are given to winners of the company’s annual international Starbucks Barista Championship, making it the rarest color of the bunch.

The inside of Starbucks Reserve Roastery, the world's largest Starbucks coffee store.
Credit: NurPhoto via Getty Images

The World’s Largest Starbucks Is Located in Chicago, Illinois

In 2019, the 32,000-square-foot Tokyo Reserve Roastery ceded its “World’s Largest Starbucks” title to a brand-new location on the Magnificent Mile in Chicago, Illinois. Encompassing 35,000 square feet across five stories, the new world’s largest Starbucks provides a different experience on every floor. The first and second stories offer customers the chance to sample Reserve brand Starbucks coffee, enjoy baked goods, and purchase Chicago-themed Starbucks merchandise. Moving up to the third floor, you can find an experiential coffee bar featuring unique nitrogen-infused gelato drinks and special pistachio lattes, among other concoctions. Floor four offers a different kind of Starbucks experience, as it’s home to a bar brewing up decadent alcoholic cocktails. Last but not least, the fifth floor gives customers the opportunity to enjoy their Starbucks beverages in a private rooftop setting. All in all, you’re not going to find a bigger Starbucks anywhere in the world.

Starbucks Coffee pumpkin spice latte to-go on a patio table.
Credit: Patti McConville/ Alamy Stock Photo

Starbucks Invented the Pumpkin Spice Latte

Love it or hate it, the pumpkin spice latte is a part of the American coffee identity. It’s hard to imagine that the drink didn’t exist as recently as the early 2000s, and we have Starbucks to thank for the seasonal treat — they introduced it in 2003.

Pumpkin spice lattes were created by the “Liquid Lab” at Starbucks’ Seattle headquarters, and are considered to be the brainchild of Peter Dukes. Dukes had the idea for the latte back in 2001, at a time when Starbucks was trying to conceive of a fall-themed beverage that would become as popular as its seasonal holiday drinks. Lacking an actual recipe, the testers brought pumpkin pies into the lab, poured espresso over the slices, and found the combination delicious. Once the team re-created that taste in drink form, the result blew up into a worldwide sensation.

Pumpkin spice lattes were first tested in 100 Starbucks stores in 2003 before launching worldwide the following year. The drink went on to sell upwards of 500 million cups in its first 18 years on the market, and has since expanded far beyond Starbucks, becoming an autumnal staple of coffee shops everywhere.

Bennett Kleinman
Staff Writer

Bennett Kleinman is a New York City-based staff writer for Optimism Media, and previously contributed to television programs such as "Late Show With David Letterman" and "Impractical Jokers." Bennett is also a devoted New York Yankees and New Jersey Devils fan, and thinks plain seltzer is the best drink ever invented.

Original photo by Pictorial Press Ltd/ Alamy Stock Photo

Even in a golden age for sitcoms that churned out fan favorites from All in the Family to Taxi, Laverne & Shirley was a breath of fresh air, an anything-goes romp featuring a pair of female leads willing to get down and dirty in a way not seen since the heyday of Lucille Ball. Aided by a pair of lovably goofy upstairs neighbors, a pizza-slinging dad, and a singing boxer (among others), Laverne & Shirley became an instant hit in January 1976 and continued going strong into the ’80s, until the curtain finally dropped following an eight-season run.

Although series creator Garry Marshall lamented the behind-the-scenes stress of the show in his memoir My Happy Days in Hollywood, there was nothing but magic when the cameras rolled on Penny Marshall and Cindy Williams playing off one another as Laverne DeFazio and Shirley Feeney. So let’s sit back with a glass of milk and Pepsi and take a look at the makings of a show about a couple of feisty Milwaukee ladies who did it their own way.

American producer, writer, actor and director Garry Marshall.
Credit: Nancy R. Schiff/ Archive Photos via Getty Images

Laverne and Shirley Were Inspired by Garry Marshall’s Brawling Date

As with Mork & Mindy, Laverne & Shirley originated with the characters’ guest roles on Happy Days, in this case as tough-talking dates for the Fonz and an overwhelmed Richie. According to Garry Marshall’s memoir, when seeking to fill out the backstory of these “girls from the wrong side of the tracks,” he thought back to a night out in Brooklyn in the late 1950s, when his date responded to a rude comment from another woman by engaging her in a full-blown fistfight. That “tough-as-nails quality” permeated his vision of Fonzie’s lady friends, and the favorable audience reaction from that November 1975 episode convinced Marshall that his little sister Penny and her writing partner had the star quality for a spinoff.

Cindy Williams As Shirley Feeney.
Credit: Bettmann via Getty Images

Williams Was Reluctant To Co-Star in the Series

While Penny Marshall was ready to take direction from her big brother, Williams was ambivalent about accepting the offer to co-star in a new sitcom. As a result, producers ended up auditioning others for the part of Shirley, even taping a test scene with another actress named Liberty Williams, although everyone seemed to agree that the original Shirley was the best. Hoping to tip the scale in the right direction, ABC executive Michael Eisner deliberately hid the Liberty Williams test tape and showed the Cindy Williams footage to his bosses for consideration. Meanwhile, Cindy Williams finally ended the suspense and agreed to do the series, paving the way for Laverne & Shirley‘s quick launch a few weeks later.

Penny Marshall Created the Show’s Signature Drink and Chant

Laverne & Shirley‘s creator wasn’t the only one who saw his formative influences funneled into the show’s fabric. According to Penny Marshall’s autobiography My Mother Was Nuts, the milk and Pepsi concoction was spawned after a stint at a Jewish summer camp, where kosher dietary restrictions prevented the pairing of milk and meat for meals and most kids drank Pepsi instead. Forbidden to drink Pepsi at home until her milk was finished first, Marshall eventually found that she enjoyed combining the two. Additionally, the distinct “Schlemiel! Schlimazel! Hasenpfeffer Incorporated!” chant that kicks off the intro was one that was oft-repeated by Marshall and her friends as they walked to school in the Bronx, although the actress had no clue where the Yiddish-infused sing-along came from.

American actresses Penny Marshall, as Laverne De Fazio, and Cindy Williams as Shirley Feeney.
Credit: Silver Screen Collection/ Moviepix via Getty Images

Boo Boo Kitty Was Born From a Missed Line

Not to be left out, Williams also had a real-world-inspired contribution to the proceedings when she forgot a line early in the show’s run. As told in her own memoir, Shirley, I Jest, during one rehearsal she was supposed to comment on the pile of dust beneath a bed, but instead pulled out a stuffed cat that was lying there, had a flashback to one of her mother’s pets, and exclaimed, “Oh, look what I found, Laverne. It’s Boo Boo Kitty!” The stuffed animal soon became a series stalwart, but that created an entirely new problem, as producers were unable to find a backup version in case the regular one was lost or destroyed. It wasn’t until an autograph signing years later, when two women approached with their own Boo Boo Kitty dolls, that Williams learned that the playthings were sold along with pajama bags at J.C. Penney in the 1960s.

David Lander and Michael McKean circa 1979.
Credit: Images Press/ Archive Photos via Getty Images

Lenny and Squiggy Predated “Laverne & Shirley”

The dim but devoted Lenny and Squiggy, played by Michael McKean and David Lander respectively, originated during the pair’s college years at Carnegie Mellon University in the 1960s. They revived the characters for a Los Angeles comedy troupe called The Credibility Gap, and Garry Marshall then gave the young comedians a chance to write themselves into the show. Per Penny Marshall, a couple of tweaks were made to the characters: Lander’s Anthony Squiggliano became Anthony Squiggman, because there was already an abundance of Italians on the show. Garry Marshall also shot down the idea of the neighbors serving as equally matched foils and love interests to the stars. “There has to be someone lower than the two of you,” he told his sister. “That’s Lenny and Squiggy.”

Stars of the TV series Laverne and Shirley, actresses Penny Marshall and Cindy Williams.
Credit: Bettmann via Getty Images

Williams’ Pregnancy Ended Her Run on the Show

Viewers may recall that the final season of Laverne & Shirley was largely filmed without Shirley, a development triggered by Williams’ pregnancy. In Williams’ telling, producers wanted her to work on her scheduled due date and refused to accede to requests for more reasonable hours.

According to her co-star, Williams’ husband had an ever-changing list of demands that became increasingly difficult to accommodate. The impasse left Marshall to carry the now-flailing show by herself (albeit with a doubled salary), and also drove a wedge between the two women that persisted until after Williams’ divorce in 2000. Fortunately, the passing years and hard feelings didn’t extinguish the chemistry they shared as performers, and the flames flickered back to life when they reunited for an episode of Sam & Cat in 2013.

Tim Ott
Writer

Tim Ott has written for sites including Biography.com, History.com, and MLB.com, and is known to delude himself into thinking he can craft a marketable screenplay.

Original photo by DutchScenery/ iStock

Some colors tend to come and go as fashion dictates, but a few have been chosen by humans for very specific, utilitarian purposes. Whether it’s about leveraging the advantages or limitations of human sight, or just evoking a particular emotional response, civil engineers and designers have used color to shape our world in ways you may not expect. Here are the stories behind the colors of five everyday objects — and why these hues are perfect for their assigned tasks.

School buses in a parking lot.
Credit: pyzata/ Shutterstock

School Buses

Glimpse a fleet of buses parked at any U.S. public school, and you’ll notice they’re all the same deep yellow — and it’s been that way for nearly a century. In an effort to standardize school bus construction around the country, thus ideally making buses both safer and cheaper to mass-produce, school transportation officials met at Columbia University in 1939 to settle on a universal color for these vehicles. Fifty shades were hung up on the walls, ranging from lemon to deep orange. The color that was finally selected — known today as National School Bus Glossy Yellow, or Color 13432 — was chosen for its ability to stand out from the background. Education officials didn’t know it at the time, but our eyes are practically wired to notice Color 13432: the shade stimulates two of the three types of cones in the human eye, sending double the signal to the brain compared to many other colors. That’s one reason a big yellow school bus is just so hard to miss.

Interior of Modern Film Studio with Green Screen and Light Equipment.
Credit: Rashevskyi Viacheslav/ Shutterstock

Green Screens

In the age of computer-generated imagery, the green screen is nearly as ubiquitous as the film camera. The technique that uses green screens, called chroma keying, has been around since the early days of film, but why is the screen green, exactly? Turns out, the choice has more to do with human skin tones than with the color itself. Most human skin is essentially some shade of orange, and because green (and in some cases blue) is far away from that hue, a “chroma keyer” can replace everything of that color with the background image without affecting the human in the middle. This also explains why meteorologists can’t wear green on St. Patrick’s Day: they would “disappear” on newscasts, because the chroma keyer would replace their green-hued clothes along with the green screen.
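The keying logic described above can be sketched in a few lines of Python. This is a toy per-pixel illustration, not how production keyers actually work (real systems process whole images, with soft edges and spill suppression); the pixel values and the `margin` threshold are invented for the example:

```python
# Toy chroma-key sketch: a pixel counts as "green screen" green when its
# green channel clearly dominates both its red and blue channels.

def is_chroma_green(pixel, margin=60):
    """Return True if the (r, g, b) pixel reads as green-screen green."""
    r, g, b = pixel
    return g > r + margin and g > b + margin

def chroma_key(foreground, background):
    """Keep foreground pixels, swapping green ones for the background."""
    return [
        bg if is_chroma_green(fg) else fg
        for fg, bg in zip(foreground, background)
    ]

# A skin-toned (orange-ish) pixel survives; a green-screen pixel is replaced.
fg = [(210, 140, 90), (30, 220, 40)]   # actor pixel, green-screen pixel
bg = [(10, 10, 120), (10, 10, 120)]    # sky-blue replacement background
print(chroma_key(fg, bg))              # [(210, 140, 90), (10, 10, 120)]
```

Note that the skin-toned pixel passes through untouched, since its green channel never dominates, which is exactly why a color far from skin tones was chosen for the screen.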

Traffic light against the blue sky.
Credit: microstock3D/ Shutterstock

Traffic Lights

Traffic lights today help motorists navigate busy intersections, but this helpful technology is actually a direct descendant of railroad signal lights. Before the dawn of the automobile, railroads used the color red to mean “stop” because red has the longest wavelength in the visible spectrum — meaning it can be seen from farther away than any other color. This was (and is) especially important on the rails, because trains can take a mile or more to come to a stop. Initially, green meant “caution” and white meant “all clear,” but after some conductors mistook starlight for an all-clear signal, green eventually replaced white.

In the very early years of the automobile era, traffic lights were only two colors — green and red. The first yellow light wasn’t introduced until 1920, and the three-way traffic light we know today wasn’t patented until 1923. Yellow became the “caution” color because it’s the second-easiest color to spot, after red. Originally, yellow was also used for stop signs, since it was easier to see at night, but the invention of reflective materials and non-fading dyes soon saw red stop signs spread throughout the country.

Scrub nurse preparing tools for operation
Credit: ChaNaWiT/ Shutterstock

Surgical Scrubs

Walk into any hospital (or watch any medical drama), and surgeons are almost always wearing bluish-green scrubs. Because blue and green sit far from red on the color spectrum, these cooler colors help refresh a surgeon’s eyes when operating on a patient (whose insides are essentially various shades of red). Because surgeons are visually focused on red-hued environments, glancing at a white background (the chosen hospital color of times past) can leave a ghostly green afterimage, much like what your eyes experience after a camera flash. If the surrounding environment is green, however, those afterimages simply blend into the background.

Top down view on comercial airplane docking in terminal.
Credit: Piotr Mitelski/ Shutterstock

Airplanes

Although today’s Boeing 737s and Airbus A320s can feature colorful airline logos, the majority of the plane is painted white — and for good reason. Because white reflects all wavelengths in the visible spectrum, it’s the most reflective color, which helps keep airliners cool, especially when taxiing on runways. Airplanes are usually cooled by air drawn in through the engines during flight, or by external units hooked up at the gate. However, if a plane experiences a long delay on the tarmac with its engines idle, things can get toasty really quickly. So airlines will use any trick in the book, including lowering sun shades, opening air vents, and yes, painting planes white, to help keep passengers in the cabin as comfortable as possible.

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.

Original photo by diignat/ Shutterstock

The most beloved children’s books are also remembered fondly by grown-ups, but there’s always something new to learn about them. Which rhyming classic, now a standard part of any nursery, had dismal sales at first? What was Where the Wild Things Are originally about — and where did the titular Wild Things come from? Which author started her iconic tales by writing letters?

From innovative illustrations to a bestseller written on a $50 bet, these six facts about favorite children’s books will send you straight to the library for a little rereading.

An old copy of Goodnight Moon, an American children's book.
Credit: Felix Choo/ Alamy Stock Photo

“Goodnight Moon” Wasn’t a Huge Success at First

Goodnight Moon by Margaret Wise Brown is one of the most famous children’s books of all time — but it was never a big success during the author’s lifetime. It sold just 6,000 copies when it first came out in the fall of 1947, and reviews were middling to mixed. So what happened?

One possible reason for its initial popularity problem could be Anne Carroll Moore, former head of New York Public Library’s children’s services and a wildly influential figure in the children’s literature world. If Moore hated a book, it made an impact that reverberated far beyond the Empire State, and she thought that Goodnight Moon was cloyingly sentimental. The book eventually became popular probably by word of mouth, but it took a long time: The title sold 4,000 copies in 1955, 8,000 copies in 1960, then nearly 20,000 in 1970, and only went up from there. It has never been out of print. Even the New York Public Library finally put it in circulation in 1972 — although the delay likely kept it off their Top Checkouts of All Time list.

At least Goodnight Moon was in good company. While Moore made many library innovations that we take for granted today, including the very idea of having a space for children at a library, she had some controversial opinions on the books themselves. She had an intense professional relationship with author E. B. White that eventually became adversarial, and hated Stuart Little with a passion — although her effort to ban it from libraries and schools got severe pushback from other parts of the literary community. She wasn’t a huge fan of Charlotte’s Web, either, but her influence had waned by the time it was published.

An assistant holds a first edition of The Tale of Peter Rabbit by Beatrix Potter.
Credit: Carl Court/ Getty Images News via Getty Images

“The Tale of Peter Rabbit” Started as a Letter to a Sick Child

At age 27, author Beatrix Potter wrote an eight-page letter, hoping to cheer up the sick 5-year-old child of her former governess. In it, she told a story of Peter Rabbit and his siblings Flopsy, Mopsy, and Cotton-Tail. Potter loved making watercolor images of animals, so she illustrated the tale before sending it off.

Publishing was not part of the original plan, but after getting an overwhelmingly positive response to her letter, she decided to send the story around to publishers. After getting rejected at least six times, Potter published The Tale of Peter Rabbit independently in 1901. The next year it was picked up by a major publisher and became a hit, and the rest is history.

The Tale of Peter Rabbit isn’t the only Potter story to start this way. Others, including The Tale of Squirrel Nutkin, The Tale of Mr. Jeremy Fisher, and some other Peter Rabbit books, were also based on illustrated letters sent to children.

Little Blue Engine graphic from the children's book.
Credit: The Protected Art Archive/ Alamy Stock Photo

Nobody Knows Who Actually Wrote “The Little Engine That Could”

Today, there’s a standard edition of The Little Engine That Could, immortalized as a standalone children’s book in 1930. That version credits the story “as retold by” Watty Piper, a pseudonym for children’s book publisher Arnold Munk — because the tale actually dates back far enough that it’s practically considered a folktale.

One Little Engine enthusiast even found a version published in Sweden in 1903. In 1906, a minister used a version of the story, complete with “I think I can” and “I thought I could,” as a parable in a sermon published in a newspaper. By 1920, the story was already in wide circulation. The 1930 version’s closest relative is The Pony Engine, published in a children’s magazine by educator Mabel Bragg in 1916. There was a legal battle in the 1950s over whether another author published a similar version in a series of newsletters in the early 1910s.

The original author’s identity remains unknown, and, with traces dating back about 120 years, anyone with direct knowledge of the story’s beginnings is almost certainly dead. At this point, maybe it’s a collaborative work, anyway.

Author/illustrator Maurice Sendak standing by a life-size scene from his book.
Credit: James Keyser/ The Chronicle Collection via Getty Images

Maurice Sendak’s “Wild Things” Were Originally Horses

It’s hard to imagine a world without Where the Wild Things Are, but if author and illustrator Maurice Sendak had just been a little better at drawing horses, things could have turned out much differently.

“At first,” Sendak told the LA Times in 1993, “the book was to be called ‘Where the Wild Horses Are,’ but when it became apparent to my editor I could not draw horses, she kindly changed the title to ‘Wild Things,’ with the idea that I could at the very least draw ‘a thing’!”

Now tasked with drawing “things,” Sendak turned to his extended family for inspiration. As a child, he dreaded when his “hideous, beastly relatives,” with what he described as bad breath, blood-stained eyes, and giant yellow teeth, would show up for dinner, ready to squeeze and pinch him.

“So I drew my relatives,” Sendak continued. “They’re all dead now, so I can tell people.”

Children's book, The Snowy Day, on an open page.
Credit: Amy Sussman/ WireImage via Getty Images

The Illustrations in “The Snowy Day” Are Mixed-Media Collages

The Snowy Day by Ezra Jack Keats was groundbreaking in many ways when it was first published in 1962: It was one of the first, if not the first, American full-color children’s books to feature a Black protagonist, for example, and one of just a handful to feature an urban landscape. The collaged illustrations, which earned Keats a Caldecott Medal, were fresh and innovative, using a combination of cloth, paper, and paint to create Peter and a snow-covered New York City.

Keats had illustrated children’s books before, and typically used only paint, which was the original plan for The Snowy Day. Instead, he fell into collage, making paper cutouts for the buildings, adding fabric embellishments, and dressing Peter’s mother in oilcloth. He also used homemade snowflake stamps and applied India ink with a toothbrush to complete the look. He continued to use collage for all his future works.

The book, Green Eggs and Ham by Dr Suess.
Credit: Ben Molyneux/ Alamy Stock Photo

“Green Eggs and Ham” Has a 50-Word Vocabulary

Early in his career, Theodor “Dr. Seuss” Geisel was a little wordier; his first book, And to Think That I Saw It on Mulberry Street, is a bit of a mouthful in the title alone. So how did we get from there to, say, Hop on Pop?

For The Cat in the Hat, Geisel’s publisher challenged him to limit his vocabulary to just 225 words chosen from a 348-word early reader vocabulary list, making it both easy and exciting for very young children learning to read. He picked the first two rhyming words he saw, “cat” and “hat,” and built the entire plot from there. The finished product came in 11 words over the limit, at 236.

Soon afterward, Geisel’s publisher gave him a new, more difficult challenge — write a book using only 50 words — and bet him $50 he couldn’t do it. This time, the author stuck to the limit, and the result was Green Eggs and Ham.

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.

Original photo by solidcolours/ iStock

You’ve probably heard that your fingerprints are unique, and that no one else in the entire world shares the same pattern of ridges as you have on your fingertips. This certainty is so absolute that fingerprints have been used as a means of identifying people for thousands of years — Chinese societies used them for this purpose possibly as early as 300 BCE. But beyond their individuality, fingerprints continue to be a mystery to scientists and are constantly being studied. That being said, here are a few facts we do know.

Two women high five on a mountaintop.
Credit: by-studio/ iStock

Your Fingerprints Contain Whorls, Arches, and Loops

Technically, the patterns of ridges on your fingertips are called dermatoglyphs, from the Greek roots derma (skin) and glyph (carving). Fingerprints are the impressions left by the dermatoglyphs, though people often refer to both the ridges and their impressions as fingerprints.

The ridges follow three universal patterns: loops, whorls, and arches. Loops are curved ridges that fall back on themselves in an elongated C shape, with the open ends of the C pointing either toward or away from the thumb. Whorls are ridges in concentric circles or spirals. Arches look like the contour of a mountain. Of the three types, loops are the most common (60% of fingerprints), followed by whorls (35%), and arches (5%).

Close-up of a handprint being brushed for evidence.
Credit: D-Keine/ iStock

It’s Almost Impossible To Change or Eliminate Fingerprints

Fingerprints don’t change pattern or naturally disappear over the course of a person’s life, and it’s incredibly difficult to get rid of fingerprints, as some famous criminals could attest. The gangster John Dillinger attempted to hide his identity by burning off his fingerprints with acid (it didn’t work). At least two other Depression-era murderers tried to obliterate their prints with knife cuts. All found out the hard way that it’s almost impossible to eliminate fingerprints on purpose. The ridges will eventually grow back into the same patterns.

However, there are a few scenarios in which someone may lack them. People with a genetic condition called adermatoglyphia are born without ridges in the skin of their fingertips, palms, or soles of their feet. Skin conditions like psoriasis and repetitive manual labor can also change or wear down the ridges. Some chemotherapy drugs cause hands to swell and blister, resulting in the loss of fingerprints.

Pregnant girl in a white dress on a background of a river.
Credit: Mariia Kokorina/ iStock

Fingerprints Form Before You’re Born

Studies have suggested that genetic and environmental factors guide the formation of ridges on the fingers, hands, toes, and feet between the third and sixth months of fetal growth. Genes that control the development of dermal layers in these body parts seem to dictate the size, shape, and pattern of the ridges. The chemical balance in the mother’s uterus probably plays a role, too. All of these influences add up to the creation of a set of dermatoglyphs that is totally unique — even identical twins have different fingerprints. It’s worth noting, though, that despite studying fingerprints for a few centuries, scientists still don’t agree on exactly how they form.

A fingerprint under a magnifying glass.
Credit: solidcolours/ iStock

The Function of Fingerprints Is Still a Mystery

Scientists also don’t agree on why we have fingerprints in the first place, but they have some theories. For decades, biologists assumed that dermatoglyphs evolved to give people a better grip on things. Two University of Manchester researchers tested this idea in 2009 by running hard plastic sheets over their fingertips and measuring the amount of friction. They found that their ridges actually decreased the contact area between the fingertip and plastic, reducing grip power.

More recent research has suggested that fingerprints improve our sense of touch. For a 2021 study, researchers at Sweden’s Umeå University recorded 12 participants’ nerve responses while finely textured cards were run over their fingertips. The responses revealed hotspots of sensitivity that matched the ridge patterns of the subjects’ fingerprints — supporting the idea that they enhance our tactile feeling.

A baby gorilla inside the Virunga National Park, the oldest national park in Africa.
Credit: leonardospencer/ iStock

Animals Have Unique Fingerprints, Too

One reason that scientists thought fingerprints helped our grip ability is that other primates who climb and hang on to trees also have them. Gorillas, orangutans, and chimpanzees have dermatoglyphs on their fingers and toes that are unique to each individual.

Koalas also have fingerprints that look remarkably human, even though the two species are only distantly related. Biologists believe their dermatoglyphs represent convergent evolution, in which unrelated organisms independently develop similar traits.


Original photo by CSA Images/ iStock

You’ve read about them. You’ve sung about them. You’ve watched movies about them. For centuries, American folklore has sent our imaginations into overdrive with the tales of men who conquered the dangers of the wild terrain with their strength, wits, and superhuman gifts. Some were based on the archetype of a character, others were embellishments of real people, but all served to entertain and inspire by displaying abilities beyond the reach of normal mortals. Here are eight hyperbolized heroes who outlived the confines of their era to survive as legends for later generations to admire.

Paul Bunyan carrying a log.
Credit: CSA Images/ iStock

Paul Bunyan

Few creations can match the prowess of Paul Bunyan, the titan of the North Woods who palled around with Babe the Blue Ox and was responsible for the formation of landmarks such as the Grand Canyon and the Great Lakes. For all the obvious hyperbole, the character may have been based on the real-life French Canadian lumberjacks Bon Jean and Fabian Fournier, the latter well known to the workers who traded tales at logging camps in the late 1800s.

Bunyan stories first appeared in print just after the turn of the century, but it was a marketing campaign by the Red River Lumber Company that introduced the behemoth woodsman to the masses during World War I. Collected stories soon appeared in book form, establishing a mythical mainstay that remains larger than life through the monuments in his honor that populate the northern landscape.

Davy Crockett holding a gun and a hat.
Credit: Bettmann via Getty Images

Davy Crockett

There’s no question that Davy Crockett, a three-term U.S. congressman from Tennessee, was a real man, if not the “half horse, half alligator” he allegedly claimed to be. Regardless, the folksy, bear-hunting lawmaker with scant formal education was an anomaly among his well-bred peers and was already a celebrity by the time the first Davy Crockett’s Almanack appeared in print in 1835.

The legend received another jolt when he was killed at the famed 1836 Battle of the Alamo, and by the 1840s, the Almanack was featuring more outlandish stories of its hero handily fighting off bears and alligators. Still, Crockett may well have faded into memory, were it not for his mid-1950s revival by way of the Disney TV series and movies that had children everywhere wearing coonskin caps and singing about the “king of the wild frontier.”

Johnny Appleseed 5 cent stamp on American flag.
Credit: NoDerog/ iStock

Johnny Appleseed

The black sheep of the American folklore canon, Johnny Appleseed achieved immortality not through acts of cunning or bravery, but by way of his ragtag clothing and peaceful rapport with all living creatures as he scattered his wares across the land. Ironically, the man behind the myth — John Chapman — did display immense courage, fortitude, and resourcefulness by traipsing thousands of miles across the eastern wilderness and establishing orchards to aid settlers in the first half of the 1800s.

A zealous proponent of the Church of the New Jerusalem who had no permanent home and largely refused to sleep indoors, the eccentric Chapman was already famous by the time he died in 1845. But his fame lived on through the exaggerations that became associated with his memory via the written “recollections,” poems, and children’s stories that circulated in the decades afterward.

Mike Fink, Last of the Keelboatman.
Credit: Niday Picture Library/ Alamy Stock Photo

Mike Fink

Another flesh-and-blood man who saw his celebrity swell as the frontier mythos gained steam, Mike Fink earned renown as a keelboatman on the mighty Mississippi River in the early 1800s. Tall and powerful, he allegedly boasted he could “outrun, outshoot, throw down, drag out, and lick any man in the country,” though his brashness and heavy-drinking ways may well have contributed to his death in 1823.

Five years later, the first Fink tale appeared in The Western Souvenir, giving rise to several decades’ worth of stories that focused more on his reputation for practical jokes than shows of strength. While his legend dimmed by the end of the century, Fink also received a lifeline from Disney when he was presented as the arch-foe-turned-ally in 1956’s Davy Crockett and the River Pirates.

an illustration of the fictional folkloric character of Pecos Bill.
Credit: Blank Archives/ Hulton Archive via Getty Images

Pecos Bill

The personification of the rough-and-tumble cowboy who tamed the Old West, Pecos Bill supposedly was raised by coyotes, single-handedly invented the modern methods of ranching, and could be seen riding a cyclone when not astride his bucking horse, the Widow-Maker. Such an indomitable character was more than a match for any enemy, though at least one account says the end came after he saw a Yankee dressed as a cowboy and laughed himself to death.

The first published Pecos Bill stories appeared around World War I from the hand of Edward O’Reilly, who insisted he heard the outlandish tales as a child, though historians have since thrown that claim into doubt. Whatever his origin, Pecos Bill’s adventures are more than wild enough to earn him a distinguished place in the tall-tale pantheon.

John Henry, an American folk hero.
Credit: Science History Images/ Alamy Stock Photo

John Henry

One of the few Black heroes of American folklore, John Henry was said to be the strongest steel driver toiling on the construction of the Chesapeake and Ohio Railway in the 1870s. When a steam-powered drill was introduced, Henry took it as a challenge to demonstrate man’s superiority over machine, winning the duel but working himself to death in the process.

As with other legends, historians have sought to uncover the source of the tale, with some claiming to have pinpointed a real John Henry and others determining that he was a composite of the anonymous hands who undertook the backbreaking labor. Regardless, his story struck a chord with audiences through the printed page and screen, and especially through the African American blues tradition that gave rise to work songs like “The Ballad of John Henry.”

Sailing old ship in a storm sea.
Credit: muratart/ Shutterstock

Old Stormalong

A Bunyanesque character for the seafaring set, Old Stormalong stood 30 feet tall, according to some accounts, tangled with the Kraken of Norse mythology, and commanded a ship so large it had hinged masts to avoid colliding with the moon. It’s unknown who — if anybody — served as the model for the character, whose origins trace back to North Atlantic sea shanties of the 1830s and ’40s. While those early work songs presented “Old Stormy” as more of an everyman sailor, it was Frank Shay’s Here’s Audacity! American Legendary Heroes (1930) that brought him to life as a titanic superman of the surf, clearing the decks for his inclusion among the famous outsized figures of the genre.

 The character of Mose, a New York Irish Bowery B'hoy and volunteer fireman.
Credit: Everett Collection Historical/ Alamy Stock Photo

Big Mose

Unlike his rural counterparts, Big Mose was a hero of urban origins: A firefighter in New York City’s Bowery district, he supposedly stood 8 feet tall, boasted hands the size of Virginia hams, and uprooted streetcars and lampposts with ease. Once again, this was a character inspired by a real person, a volunteer fireman, printer, and brawler named Moses Humphreys. And while the oral retellings were codified through the works of Ned Buntline, the Mose legend took root through a series of wildly popular stage plays in the 1840s and ’50s. Big Mose may not be as well known today as Davy Crockett and Johnny Appleseed, but his myth was every bit as formidable as the others’ during his pre-Civil War heyday.

Tim Ott
Writer

Tim Ott has written for sites including Biography.com, History.com, and MLB.com, and is known to delude himself into thinking he can craft a marketable screenplay.

Original photo by United Archives GmbH/ Alamy Stock Photo

It’s been a quarter century since the historic epic Titanic, directed by James Cameron, hit the big screen on December 19, 1997, and became one of the highest-grossing films of all time. A technical and artistic marvel, Titanic forever changed the cinematic landscape with its groundbreaking set design and special effects, won over audiences with its romantic plotline, and catapulted the careers of now A-listers Kate Winslet and Leonardo DiCaprio.

The film portrays the tragic sinking of the R.M.S. Titanic during its maiden voyage across the Atlantic in April 1912, from the perspective of two young passengers of different social classes — Rose DeWitt Bukater (played by Winslet) and Jack Dawson (played by DiCaprio) — who fall in love onboard and are forced to navigate the deadly disaster unfolding in the background. Even 25 years later, the film holds multiple records and is etched in pop culture memory. But how much do you know about what went into making Titanic? From the massive, custom-designed set to Kate Winslet’s famous improvised scene and the film’s controversial ending, discover seven facts about Titanic in celebration of the film’s 25th anniversary.

Director James Cameron raises his Oscar after winning in the Best Director Category.
Credit: TIMOTHY A. CLARY/ AFP via Getty Images

No Film Has Won More Academy Awards Than “Titanic”

Titanic swept the 1998 Academy Awards, winning 11 of the 14 awards for which the film was nominated, including Best Picture and Best Director. That matched a previous record set in 1960 by the religious epic Ben-Hur, starring Charlton Heston. Only one other film since then has equaled Titanic’s awards haul — The Lord of the Rings: The Return of the King in 2004 — but as of 2022, none has exceeded it. In addition, no film to date has secured more than Titanic’s 14 nominations, a record the film shares with the 1950 comedy All About Eve and 2017’s La La Land.

Titanic was not only an awards success but also a box-office juggernaut. It held the worldwide record for highest lifetime gross for more than a decade. The current record-holder is 2009’s Avatar, also directed by James Cameron, but Titanic still holds the No. 3 spot, just behind Avengers: Endgame. Of course, it doesn’t hurt that Titanic kept playing in theaters for nearly 10 months after it was released.

A view of the Titanic film boat.
Credit: TCD/Prod.DB/ Alamy Stock Photo

The Watery Set of “Titanic” Held Nearly 17 Million Gallons of Water

The film’s epic story required an elaborate custom-designed set that cost an estimated $20 million to build. The set’s primary feature was a horizon tank, a specialized tank that allows filmmakers to film an ocean scene with a view of the horizon without plunging actors into the middle of an actual ocean. There are only a handful of these tanks in existence worldwide.

According to the technical journal JOM, the tank built for the film at Fox Baja Studios near Rosarito, Mexico (where the majority of the shots were filmed), was the largest shooting tank in the world at the time, containing nearly 17 million gallons of salt water. This allowed for a 774-foot-long set of the ship itself, although not everything was filmed in the giant tank. The dining room and the staircase, for example, were constructed on a hydraulic platform at the bottom of an interior tank, and were designed to be flooded with water piped in from the ocean.

After shooting wrapped on Titanic, Fox continued to use the giant tank for other sea epics, including Pearl Harbor (2001) and Master and Commander: The Far Side of the World (2003), until the studio was sold in 2007 to a group of local investors. Since then, projects filmed there have included the second season of Fear the Walking Dead and the Netflix project Selena: The Series.

Rose lying down in Titanic film.
Credit: AJ Pics/ Alamy Stock Photo

A Utah Video Store Once Charged $5 To Make “Titanic” More Family-Friendly

Titanic earned its PG-13 rating in particular with two famously sexy scenes between Rose and Jack — one in which Rose poses nude so Jack can draw her portrait, and another where the pair steam up the backseat of an automobile in the cargo hold. Even though all we see in the latter scene is a hand against a steamy window, it was too much for some viewers — so the owner of a small video store in Utah came up with a creative solution. Sunrise Family Video in American Fork, about 25 miles northwest of Provo, began charging customers $5 to edit one or both of the racy scenes out of their home VHS copies. For an extra $3, employees would cut out any other scene customers wanted removed. The service was apparently popular, even after Paramount Pictures threatened legal action — by September 1998, the wait time for the service was five weeks.

Director James Cameron stands on the set of the movie "Titanic".
Credit: MERIE WALLACE/ AFP via Getty Images

James Cameron Drew the Iconic Nude Portrait of Kate Winslet’s Character

In the film’s famous portrait scene, Rose instructs Jack to “draw me like one of your French girls.” But it wasn’t actor Leonardo DiCaprio who sketched the portrait of Rose reclining in her suite wearing only the “Heart of the Ocean” diamond — it was, surprisingly, James Cameron.

The director, also a talented sketch artist with a background in life drawing, used reference photos of Winslet to make the finished product, which he wanted to get exactly right. “I figured it was time to put all that time I spent doing life drawing to work,” he reflected in his book Tech Noir: The Art of James Cameron. In the film, the sketch eventually ends up in Cal’s safe, but in real life, it ended up in the hands of the highest bidder, going for a reported $16,000 at auction in 2011.

White powder on a black background.
Credit: Jair Fonseca/ Shutterstock

A Mysterious Poisoning Incident Plagued the Crew During Filming

While filming in Dartmouth, Nova Scotia, in August 1996, more than 50 people working on Titanic — including star Bill Paxton, producer Jon Landau, and director James Cameron — were sent to the hospital after eating a late-night meal and experiencing confusion, nausea, and other strange symptoms. (Kate Winslet and Leonardo DiCaprio weren’t filming in Nova Scotia at the time.) It certainly didn’t help that the dish responsible for the incident — a chowder that crew members alternately described as lobster, clam, or mussel — was apparently quite delicious.

It was later determined that the cause of the incident wasn’t food poisoning, but rather that someone had spiked the chowder with PCP, a hallucinogen also known as angel dust. Paxton, Cameron, and set painter Marilyn McAvoy have all recalled the ensuing chaos in the press over the years. Cameron got lost on a set that he’d built himself and was later stabbed with a pen by another crew member feeling the effects. At one point, there was even a conga line. The person responsible has never been found, even though local police apparently investigated the incident for more than two years. Cameron suspected a disgruntled crew member who had been fired the day before for starting trouble with the caterers.

Call in Titanic film.
Credit: Maximum Film/ Alamy Stock Photo

Spitting On Cal Was Kate Winslet’s Idea

In a memorable scene from the film, Rose’s fiancé Cal (played by Billy Zane) grabs her arm as she attempts to run away from him, so she spits on him. The original script called for Rose to stick Cal with a hat pin — but Kate Winslet decided to spit in his face instead. Cameron called the move “genius,” a callback to the scene in which Jack taught her to “spit like a man.” Winslet reportedly swished K-Y Jelly around in her mouth beforehand for maximum effect.

Zane, however, wasn’t as thrilled about the change, especially after filming multiple takes. “There are few things you remember as well as being spat upon, let [alone] 17 times,” Zane told Entertainment Tonight in 1997. Still, he “felt being on the receiving end of that juice was better than preparing it in one’s mouth prior to launch.”

Jack in Titanic film.
Credit: Maximum Film/ Alamy Stock Photo

James Cameron Insists That Jack’s Death Was Necessary

One of the most controversial scenes in Titanic is Jack’s watery death at the end. Many contend that there was plenty of room for him to survive on the wooden board that Rose was floating on in the icy waters after the ship sank. The TV series MythBusters even aired an episode on the topic in 2012, running multiple simulations to determine whether Jack’s survival was possible. In most scenarios, the show found that Jack’s death was inevitable, as the weight of the two bodies would have sunk the board too low in the water.

To James Cameron, however, the question misses the point entirely. “The script says Jack died. He has to die,” he said in response to the MythBusters episode in 2012. “So maybe we screwed up and the board should have been a little tiny bit smaller, but the dude’s goin’ down.” Cameron doubled down on his stance in a 2017 interview with the Daily Beast: “Look, it’s very, very simple: You read page 147 of the script and it says, ‘Jack gets off the board and gives his place to her so that she can survive.’ It’s that simple.”


Original photo by Krakenimages.com/ Shutterstock

You can never have too many conversation-starters, and there are plenty of fascinating, wild, or just plain strange facts to go around. Sure, you could store some in the back of your mind for your next big dinner party or family holiday, but others refuse to be contained. They’re just too silly, weird, or absolutely confounding.

For example, did you know that one of the world’s most recognizable (and fully inanimate) landmarks grows in the summer, or that the dot on a lowercase “i” has a name? What about the fact that bananas are actually berries, or that one of our most beloved pets can be allergic to people? Get the full stories on these tidbits and more facts that are just too good to keep to yourself.

A women's face glowing in a dark room.
Credit: h heyerlein/ Unsplash

Humans Glow

Bioluminescence, the strange biology that causes certain creatures to glow, is usually found at the darkest depths of the ocean where the sun’s light doesn’t reach. While these light-emitting animals seem otherworldly, the trait is actually pretty common — in fact, you’re probably glowing right now.

According to researchers at Tohoku Institute of Technology in Japan, humans have their own bioluminescence, but at levels 1,000 times lower than our eyes can detect. This subtle human light show, viewable thanks to ultra-sensitive cameras, is tied to our metabolism. Free radicals produced as part of our cells’ respiration interact with lipids and proteins in our bodies, and if they come in contact with fluorescent chemical compounds known as fluorophores, they can produce photons of light. This glow is mostly concentrated around our cheeks, forehead, and neck, and is most common during the early afternoon hours, when our metabolism is at its busiest.

A veterinarian examines a sick cat.
Credit: Yekatseryna Netuk/ Shutterstock

Cats Can Be Allergic to People

If you love cats but can’t have one of your own because you’re allergic, the feeling may be mutual. It isn’t common, but cats can be allergic to people. The condition is rare in part because we humans usually bathe regularly and thus don’t shed as much dead skin or hair as other animals (and it’s somewhat unclear how much of a problem human dander may be for felines). That said, cats are fairly sensitive to chemicals and sometimes have a negative reaction to certain perfumes, laundry detergents, and soaps. Allergic reactions in cats look much the same as the ones humans get — they may manifest as sneezing, runny noses, rashes, hives, or other uncomfortable symptoms. In rare cases, cats can even be allergic to dogs. (Maybe that’s why some of them don’t get along.)

A close-up of the eye of a reindeer.
Credit: James McKay/ Shutterstock

Reindeer Eyes Change Color With the Seasons

Rudolph’s nose may have been red, but his eyes were blue — except in the summer, when they would have been golden. That’s because reindeer eyes change color depending on the time of year, which helps them see better in different light levels. Their blue eyes are approximately 1,000 times more sensitive to light than their golden counterparts, a crucial adaptation in the dark days of winter. Only one part changes color, however: the tapetum lucidum, a mirrored layer situated behind the retina. Cats have it, too — it’s why their eyes appear to glow in the dark.

Chocolate chip cookies with flaky salt on a cooling rack.
Credit: Elena Veselova/ Shutterstock

Chocolate Chips Were Invented After Chocolate Chip Cookies

Ruth Wakefield was no cookie-cutter baker. In fact, she is widely credited with developing the world’s first recipe for chocolate chip cookies. In 1937, Wakefield and her husband, Kenneth, owned the popular Toll House Inn in Whitman, Massachusetts. While mulling new desserts to serve at the inn’s restaurant, she decided to make Butter Drop Do pecan cookies (a thin butterscotch treat) with an alteration, using semisweet chocolate instead of baker’s chocolate. Rather than melting in the baker’s chocolate, she used an ice pick to cut the semisweet chocolate into tiny pieces. Upon removing the cookies from the oven, Wakefield found that the semisweet chocolate had held its shape much better than baker’s chocolate, which tended to spread throughout the dough during baking to create a chocolate-flavored cookie. These cookies, instead, had sweet little nuggets of chocolate studded throughout. The recipe for the treats — known as Toll House Chocolate Crunch Cookies — was included in a late 1930s edition of her cookbook, Ruth Wakefield’s Tried and True Recipes.

The cookies were a huge success, and Nestlé hired Wakefield as a recipe consultant in 1939, the same year they bought the rights to print her recipe on packages of their semisweet chocolate bars. To help customers create their own bits of chocolate, the bars came prescored in 160 segments, with an enclosed cutting tool. Around 1940 — three years after that first batch of chocolate chip cookies appeared fresh out of the oven — Nestlé began selling bags of Toll House Real Semi-Sweet Chocolate Morsels, which some dubbed “chocolate chips.” By 1941, “chocolate chip cookies” was the universally recognized name for the delicious treat. For her contributions to Nestlé, Wakefield reportedly received a lifetime supply of chocolate.

Angry bull in Spain.
Credit: alberto clemares exposito/ Shutterstock

Bulls Can’t Actually See the Color Red

If the very idea of bullfights makes you see red, you’re not alone — even though bulls themselves can’t actually see the color. As is the case with other cattle and grazing animals such as sheep and horses, bulls’ eyes have two types of color receptor cells (as opposed to the three types that humans have) and are most attuned to yellows, greens, blues, and purples. This condition, a kind of colorblindness known as dichromatism, makes a bullfighter’s muleta (red cape) look yellowish-gray to the animals.

So why are bulls enraged by the sight of matadors waving their muletas? The answer is simple: motion. The muleta isn’t even brought out until the third and final stage of a bullfight. The reason it’s red is a little unsavory — it’s actually because the color masks bloodstains. In 2007, the TV show MythBusters even devoted a segment to the idea that bulls are angered by the color red, finding zero evidence that the charging animals care what color is being waved at them and ample evidence that sudden movements are what really aggravate the poor creatures.

Close-up of dipping french fries into ketchup.
Credit: Atsushi Hirao/ Shutterstock

Ketchup Was Originally Made Out of Fish

If you asked for ketchup thousands of years ago in Asia, you might have been handed something that looks more like today’s soy sauce. Texts as old as 300 BCE show that southern Chinese cooks were mixing together salty, fermented pastes made from fish entrails, meat byproducts, and soybeans. These easily shipped and stored concoctions — known in different dialects as ge-thcup, koe-cheup, kêtsiap, or kicap — were shared along Southeast Asian trade routes. By the early 18th century, they had become popular with British traders. Yet the recipe was tricky to recreate back in England because the country lacked soybeans. Instead, countless ketchup varieties were made by boiling down other ingredients, sometimes including anchovies or oysters, or marinating them in large quantities of salt. One crop that the English avoided in their ketchup experiments was tomatoes, which for centuries were thought to be poisonous.

Across the Atlantic, Philadelphia scientist James Mease created the first tomato-based ketchup recipe in 1812. More than half a century later, Henry J. Heinz founded his food company in Sharpsburg, Pennsylvania, initially selling pickles, horseradish, and more. The first commercial tomato ketchups — including Heinz’s 1876 product — relied on chemicals to preserve their freshness and color, including formalin and coal tar. But around 1904, chief Heinz food scientist G.F. Mason devised an all-natural blend that included tomatoes, distilled vinegar, brown sugar, salt, and spices. With the signature formula now established, the brand was able to meet the growing U.S. demand for hot dogs, french fries, and hamburgers.

Close-up of a red stop sign on the road.
Credit: Aleksandr Kadykov/ Unsplash

The Inventor of the Stop Sign Never Learned How To Drive

Few people have had a larger or more positive impact on the way we drive than William Phelps Eno, sometimes called the “father of traffic safety.” The New York City-born Eno — who invented the stop sign around the dawn of the 20th century — once traced the inspiration for his career to a horse-drawn-carriage traffic jam he experienced as a child in Manhattan in 1867: “There were only about a dozen horses and carriages involved, and all that was needed was a little order to keep the traffic moving,” he later wrote. “Yet nobody knew exactly what to do; neither the drivers nor the police knew anything about the control of traffic.”

After his father’s death in 1898 left him with a multimillion-dollar inheritance, Eno devoted himself to creating a field that didn’t otherwise exist: traffic management. He developed the first traffic plans for New York, Paris, and London. In 1921, he founded the Washington, D.C.-based Eno Center for Transportation, a research foundation on multimodal transportation issues that still exists. One thing Eno didn’t do, however, is learn how to drive. Perhaps because he had such extensive knowledge of them, Eno distrusted automobiles and preferred riding horses. He died in Connecticut at the age of 86 in 1945 having never driven a car.

Statue of a unicorn in Scotland.
Credit: Jule_Berlin/ iStock

The National Animal of Scotland Is the Unicorn

America has the eagle, England has the lion, and Scotland has the unicorn. And while the horned mythological creature may not actually exist, the traits it represents certainly do: Purity, independence, and an untamable spirit are all qualities Scotland has long cherished. Unicorns appeared on the country’s coat of arms starting in the 12th century, and were officially adopted as Scotland’s national animal by King Robert I in the late 14th century. For many years, the coat of arms included two of the legendary beings, but in 1603 one was replaced by a lion to mark the Union of the Crowns. Fittingly for the then-newly united England and Scotland, folklore had long depicted the two creatures as butting heads to determine which one was truly the “king of beasts.”

Scottish kings also displayed that fighting spirit, which may be why unicorns were generally depicted in Scottish heraldry as wearing gold chains — only the land’s mighty monarchs could tame them. Unicorns remain popular in Scotland to this day, with renditions found on palaces, universities, castles, and even Scotland’s oldest surviving wooden warship.

Harvested Green Bell Peppers.
Credit: ZeynepKaya/ iStock

Green Bell Peppers Are Just Unripe Red Bell Peppers

If you’ve ever found yourself in the grocery store struggling to decide between red and green bell peppers — or even just wondering what the difference is between them — you may be interested to learn that they’re the very same vegetable. In fact, green bell peppers are just red bell peppers that haven’t ripened yet, while orange and yellow peppers are somewhere in between the two stages. As they ripen, bell peppers don’t just change color — they also become sweeter and drastically increase their beta-carotene, vitamin A, and vitamin C content. So while the green variety isn’t quite as nutritious as its red counterpart, the good news is that one eventually becomes the other.

Cotton Candy being created.
Credit: Just2shutter/ Shutterstock

A Dentist Helped Invent the Cotton Candy Machine

When folks learn that one of cotton candy’s creators cleaned teeth for a living, jaws inevitably drop. Born in 1860, dentist William J. Morrison became president of the Tennessee State Dental Association in 1894. But Morrison was something of a polymath and a dabbler, and his varied interests also included writing children’s books and designing scientific processes: He patented methods for both turning cottonseed oil into a lard substitute and purifying Nashville’s public drinking water. In 1897, Morrison and a fellow Nashvillian — confectioner John C. Wharton — collaborated on an “electric candy machine,” which received a patent within two years. Their device melted sugar into a whirling central chamber and then used air to push the sugar through a screen into a metal bowl, where wisps of the treat accumulated. Morrison and Wharton debuted their snack, “fairy floss,” at the Louisiana Purchase Exposition of 1904 (better known as the St. Louis World’s Fair). Over the seven-month event, at least 65,000 people purchased a wooden box of the stuff, netting Morrison and Wharton the modern equivalent of more than $500,000.

Young woman applying oil onto her hair.
Credit: Pixel-Shot/ Shutterstock

Human Hair Contains Traces of Gold

Gold is present in low levels throughout the Earth. It’s been found on every continent except Antarctica, as well as in the planet’s core, the oceans, plants, and in humans, too. The average human body of about 150 pounds is said to contain about 0.2 milligrams of gold, which we excrete through our skin and hair. Babies less than 3 months old tend to have more gold in their hair than older people, thanks to the precious metal being passed along in human breast milk. And while no one’s suggesting we should mine the gold in hair or breast milk (as far as we know), researchers are studying whether gold — and other metals — might be recovered from human waste.

Bananas laying on grass.
Credit: Waldemar/ Unsplash

Bananas Are Technically Berries

Berry classification is a confusing business. People began referring to some fruits as “berries” thousands of years before scientists established their own definitions, some of which are still debated. Today, little effort is made to teach the public about what botanically constitutes a berry, so here’s a bit of help. It’s generally accepted that all berries meet three standards: First, they have a trio of distinct fleshy layers (the outer exocarp, middle mesocarp, and innermost endocarp); second, their endocarps house multiple seeds; third, berries are simple fruits, meaning they develop from flowers with a single ovary.  

Blueberries and cranberries are true berries, as their names imply. Other berries may surprise you: Avocados, eggplants, grapes, guava, kiwis, papayas, peppers, pomegranates, and tomatoes are all, botanically speaking, berries. Bananas are berries, too, since they meet all three requirements. The exocarp of a banana is its peel, while the mesocarp is the creamy middle surrounding the seedy, also-edible endocarp. With seeds growing on the outside, blackberries, raspberries, and strawberries are, confusingly given their names, neither berries nor simple fruits. Instead, they are called aggregate fruits, because they grow from multiple ovaries of the same flower.

People walking through Grand Central Terminal.
Credit: Afif Ramdhasuma/ Unsplash

Grand Central Terminal Is Radioactive

Next time you find yourself arriving at Grand Central Terminal, feel free to inform the person sitting next to you that the architectural landmark is radioactive — and, once their expression changes, be sure to also tell them that it’s only by a harmless amount. Located in midtown Manhattan, New York’s most-beloved transportation hub (sorry, not sorry, Penn Station) was built between 1903 and 1913 out of granite, which contains higher levels of uranium than most other stones. Still, the levels aren’t all that high: The average person is exposed to 360 millirems of radiation per year, 300 of which come from natural sources, and Grand Central employees would absorb about 120 millirems at work over the course of a year.

A beekeeper examining bees.
Credit: Unsplash+ via Getty Images

Bees Can Recognize Human Faces

Humans have known about bees for a long time: 8,000-year-old cave paintings in Bicorp, Spain, show early humans scaling trees to collect honey. But modern scientists wanted to know if bees recognize us, which is why researchers have put the insects’ microscopic brains to the test. In a 2005 study, honey bees were trained to memorize pictures of human faces by scientists who rewarded them for correct matches with droplets of sugar water. While a bee’s-eye view isn’t as clear as our own gaze, the buzzing insects were able to correctly differentiate between faces up to 90% of the time — even two days after first seeing them, and when the sweet incentives were removed.

The emerging research into bee brains shows that not all living creatures need the complex brain systems humans have in order to recognize and recall environmental differences, but some researchers say that’s not entirely shocking. The Apis mellifera (aka the European honey bee) can visit up to 5,000 flowers in one day, distinguishing between buds that give off beaucoup nectar and those that don’t. So, it makes sense that bees have some form of working memory. And unlocking how bee brains work has practical applications for both us and them: Tech developers may be able to fine-tune artificial intelligence systems (in part by understanding how such tiny brains work so efficiently), and entomologists can better focus on supporting these crucial insects, which are responsible for an estimated 80% of food crop pollination.

spaghetti cactus hanging plant growing in an indoor greenhouse.
Credit: David Jalda/ Shutterstock

There’s Only One Species of Cactus Found Wild Outside the Americas

There are nearly 2,000 known species of cacti, all of which are native to the Americas alone — except for one. That would be Rhipsalis baccifera, also known as the spaghetti cactus or the mistletoe cactus, which grows wild in India, Sri Lanka, and Africa, as well as parts of the Americas. Even stranger than the idea of a single cactus species thousands of miles away from its prickly relatives is the fact that scientists aren’t exactly sure how R. baccifera ended up in the Eastern Hemisphere to begin with. The epiphytes (also called air plants) are remarkably resilient, able to survive without soil by drawing moisture from the air, and the many theories attempting to explain their broad distribution fit their strange nature.

One explanation is that birds snacked on the white berries containing R. baccifera’s seeds somewhere in South America before flying all the way to Africa, where they passed those seeds and essentially planted the cactus on the other side of the world. Problem is, scientists don’t know of any berry-snacking birds that could have actually made that journey. Another theory suggests that sailors used the cactus, with its fetching long green branches, to decorate their living quarters while journeying across the Atlantic from Brazil, then left them behind upon arriving in Africa. Yet another theory posits that the plant could have existed way back when Africa and the Americas were part of one supercontinent called Gondwana — which then broke up around 184 million years ago, leaving a little cactus on both sides. However, it’s unlikely the plant existed all those years ago. The truth is, we’ll probably just have to embrace the mystery of it all.

Lemons floating in pool water.
Credit: puhhha/ Shutterstock

Lemons Float, but Limes Sink

Few flavors complement each other like lemon and lime, with many a refreshing treat (hello, Sprite!) combining both for maximum effect. The two citrus fruits have some key differences, however, including the fact that limes sink while lemons float. You may have noticed this if you’ve ever put lime and lemon slices in a glass of water or cocktail, and the reason is simple: Objects only float if they’re less dense than the liquid they’re placed in, and while both limes and lemons have densities close to that of water, limes are denser than their yellow counterparts. That remains true whether the lemon or lime in question is whole, peeled, or sliced — a lemon will always float, and a lime will always sink.
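The float-or-sink rule described above comes down to a simple density comparison, which can be sketched in a few lines of Python. The density figures here are rough illustrative assumptions (about 0.99 g/cm³ for a lemon and 1.03 g/cm³ for a lime, versus 1.00 g/cm³ for water), not measured values:

```python
# Whether a fruit floats comes down to its density relative to water.
# These densities are rough illustrative figures, not measurements.
WATER = 1.00  # g/cm^3
fruits = {"lemon": 0.99, "lime": 1.03}

for name, density in fruits.items():
    verdict = "floats" if density < WATER else "sinks"
    print(f"A {name} ({density} g/cm^3) {verdict}.")
```

Because both fruits sit so close to the density of water, even a small difference in composition is enough to tip one above the line and the other below it.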

That’s not the only difference between these citrus fruits, of course. Whereas lemons grow well in moderate climates, limes fare better in tropical and subtropical areas. Limes also tend to be smaller, which helps distinguish them from lemons even when they sometimes take on a yellowish hue as they ripen. And though the two are almost identical on a nutritional level, lemons are sweeter — which is probably why you can think of a lot more lemon-flavored candies than lime-flavored ones.

Green palm trees against a blue sky.
Credit: Tania Richardson/ Unsplash

Palm Trees Are Not Native to L.A.

There are an estimated 75,000 palm trees in Los Angeles, all of which have one thing in common: They aren’t native there. Despite being an L.A. icon on par with the Hollywood sign and Dodger Stadium, the tropical tree is no more a native Angeleno than, well, the Dodgers. Not unlike the Hollywood sign, palms were originally a marketing technique for developers hoping to attract newcomers to the area in the late 19th century. They got the idea from the French Riviera — another area palms aren’t actually native to — where like-minded developers had successfully used them just a few decades before to cultivate an image of glitz and glamour. In addition to being beautiful, palms are surprisingly easy to uproot and transport from their native tropical and subtropical environments in the Middle East, Mexico, and elsewhere, so tens of thousands of them were planted all across the California city that had once been desert scrubland.

It seems fitting that one of Los Angeles’ most enduring symbols was essentially a branding strategy chosen for its aesthetic appeal, doubly so because palm trees’ association with the city was (and is) further cemented by their ubiquity in the many films shot there. After all, most of the directors, actors, and studio executives who made Hollywood what it is today weren’t originally from the City of Angels either.

A reddish dung beetle on top of its meticulously rolled ball of dung.
Credit: Villiers Steyn/ Shutterstock

Dung Beetles Navigate Using the Milky Way

We tend to think of dung beetles as lowly creatures, right down to their name. In spite of their earthbound status, however, they do something downright cosmic that no other animal we know of does: navigate using the Milky Way. While “dancing” atop their balls of dung, they orient themselves by looking up at the night sky, catching a glimpse of the bright strip of light our humble galaxy generates, and then moving relative to its position. They do this by taking what scientists call “celestial snapshots” and storing the images in their tiny little dung-beetle minds, a surprisingly fast process that allows them to hightail it away from the dung piles they scavenge. (As for daytime gathering, they move using special photoreceptors in their eyes that allow them to see a symmetrical pattern of polarized light emanating from the sun.) Doing so quickly is imperative — there’s a lot of competition for dung out there, and daddy dung beetles must hurry to bury the excrement, which they later feed to their babies. The insects roll off in straight lines away from the dung piles, which seems to minimize the likelihood of meeting other members of their species and getting into a dung-related squabble.

There are around 8,000 species of dung beetles on Earth, 600 of which roll such balls; the others burrow directly beneath the piles of dung and store their quarry in tunnels. Most of them prefer the dung of herbivores, who tend not to digest their food that well. And while most dung beetles are lucky enough to live under dark skies that help them see the Milky Way, light pollution is a growing concern that could throw off their celestial compasses — that is, unless we become more considerate of our dung-rolling neighbors.

Western lowland gorilla, Gorilla gorilla, group mammals on grass.
Credit: Erni/ Shutterstock

The Scientific Name for the Western Lowland Gorilla Is “Gorilla Gorilla Gorilla”

Living things are categorized by a taxonomy system you may have learned about in school, starting with kingdom (e.g., animals, plants) and ending with species (e.g., Homo sapiens). Scientific names are usually expressed using their last two or three categories: genus, species, and, if there is one, subspecies. The western lowland gorilla is in the genus Gorilla, which contains the largest apes. It’s also the species gorilla, which refers to western gorillas (the eastern gorillas are G. beringei). (Western gorillas aren’t the only creatures to have an identical genus and species name; a red fox is Vulpes vulpes.)

Gorilla gorilla also contains two subspecies: G. gorilla diehli, the Cross River gorilla, and G. gorilla gorilla, the western lowland gorilla. You might say the western lowland gorilla is the most gorilla: It’s in the genus Gorilla, the species gorilla, and the subspecies gorilla, making its scientific name Gorilla gorilla gorilla.

View of the Eiffel Tower in front of a purple sky.
Credit: Anthony DELANOIX/ Unsplash

The Eiffel Tower Grows a Little Each Summer

When the Eiffel Tower was first built in 1889, it was the tallest building in the world at 312 meters tall, or a little more than 1,023 feet. Today, it’s around 60 feet taller because of the radio and TV antennas at its peak, and while nothing’s going to make it the tallest building in the world again, its exact height varies by a few inches depending on the time of year. That’s thanks to a scientific phenomenon called thermal expansion. In general, when a substance heats up, its atoms become more active and move farther apart, making its volume larger. Some substances are more sensitive to thermal expansion than others, including metals like iron. Because the Eiffel Tower is made up of almost pure iron — and there’s a lot of it — hot weather leads to some different measurements. In the summer, the tower not only grows (by as much as 6 inches), but also gets a little lopsided; because the sun only hits one side, it tilts ever-so-slightly away from the sun.
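The 6-inch figure squares with a back-of-the-envelope linear-expansion calculation. This sketch assumes a typical expansion coefficient for iron of about 12 × 10⁻⁶ per °C and a winter-to-summer temperature swing of roughly 40 °C; both are illustrative values, not figures from the article:

```python
# Linear thermal expansion: delta_L = alpha * L * delta_T
ALPHA_IRON = 12e-6  # per degree Celsius (typical value for iron; an assumption)
HEIGHT_M = 312      # the tower's original 1889 height in meters

def expansion_inches(delta_t_celsius):
    """Height change in inches for a given temperature swing."""
    delta_l_meters = ALPHA_IRON * HEIGHT_M * delta_t_celsius
    return delta_l_meters * 39.37  # meters to inches

# A winter-to-summer swing of around 40 degrees Celsius:
print(round(expansion_inches(40), 1))  # about 5.9 inches
```

That lands right around the “as much as 6 inches” the article cites, which is why the tower’s height is always reported with a seasonal caveat.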

Close view of two cute small baby guinea pigs.
Credit: Andrii Oleksiienko/ Shutterstock

It’s Illegal to Own Just One Guinea Pig in Switzerland

The “dignity of living beings,” including animals, is enshrined in the Swiss Constitution, so Switzerland has some incredibly detailed animal protection laws, including a provision that social creatures cannot be kept alone. This includes not only guinea pigs, but many other animals, including mice, chinchillas, parrots, and lovebirds. Other animal no-gos in Switzerland include extreme breeding, cropping dog ears, and, with very few exceptions, animal testing.

So what happens if you have two guinea pigs and one dies? You could get another guinea pig, which would require buying a ton of subsequent guinea pigs as each one dies, or just hope nobody rats you out. Alternatively, one Swiss animal lover actually started a service that lets you rent a friend to keep your remaining guinea pig company for the rest of its life.

Isolated abstract i box during a sunny cloudy day.
Credit: no_limit_pictures/ iStock

The Dot Over the Small Letter “I” Is Called a “Tittle”

Remember to cross your t’s and tittle your i’s! Those little dots over letters such as the lowercase “j” are called “tittles,” a term that dates back to the 12th century. It can also refer to any other modifying marks on a letter, known as diacritic marks — that includes things like the two dots of an umlaut, the accent over the “e” in fiancé, the squiggly line (also known as a cedilla) under the “c” in façade, or the tilde over the “n” in piñata or jalapeño. In its earliest use, it referred specifically to the character ÷, which was once used as an abbreviation for the Latin word est, but is now often used as a division sign.

A puppy with sad face causing forehead wrinkles in fur.
Credit: Stacey Welu/ Shutterstock

Sad Puppy Face Evolved Separately in Dogs

You know that look dogs get when they’re requesting attention, some of your dinner, or just a little eye contact — the one that pulls at your heartstrings? That look requires two muscles that connect the brow to either edge of the eye. Most domesticated dogs have these muscles, but their closest wolf relatives don’t. Older breeds that are a little closer to wolves may just have one.

Humans respond much more positively when animals have features, such as widening eyes, that remind us of human infants, and dogs use those muscles far more often when a human is paying attention. Dogs branched away from wolves 33,000 years ago when they started their relationship with humans, so it’s likely that those muscles evolved specifically because they gave dogs an advantage when interacting with human beings.

Four yellow squeaky duckies in water.
Credit: Kathriba/ Shutterstock

Thousands of Plastic Yellow Duckies Were Lost at Sea — and Found All Over the World

One day in January 1992, a crate slipped off a cargo ship into the Pacific Ocean while en route to Tacoma, Washington, from Hong Kong. This isn’t particularly unusual — thousands of shipping containers fall into the ocean each year — except that it was full of roughly 29,000 floating bath toys in four shapes and colors. Blue turtles, red beavers, green frogs, and yellow ducks, each around 3 inches long, emerged from their disintegrating cardboard packages and started drifting.

Ten months later, hundreds of them started washing up in Alaska, but many of them continued their oceanic journey. In the early 2000s, they hit the shores of New England. Some took a southern turn early on and ended up in Hawaii, while others traveled as far as Europe. Researchers ended up using them to study current patterns, and according to calculations by oceanic scientists, some of them may have circumnavigated the globe, while others likely became part of a Texas-sized convergence of lost plastic known as the Great Pacific Garbage Patch.

OMG, the well known expression and abbreviation for Oh My God with an exclamation mark.
Credit: Thinglass/ Shutterstock

“OMG” Predates the Internet

After decades of text messaging and home internet use, acronyms like “LOL” (“laughing out loud”), “IMO” (“in my opinion”), and “FTW” (“for the win”) have made their way into “IRL” (“in real life”) speech — but it may surprise you to know that one of the most common ones, “OMG,” predates even the earliest forms of the Internet. It even had the same meaning: “Oh my God.”

Its first recorded use dates back to 1917 in a letter to Winston Churchill from British admiral and former sea lord John Fisher, who was expressing some annoyance around naval tactics. “I hear a new order of Knighthood is on the tapis,” he wrote. “OMG (Oh! My! God!) — Shower it on the Admiralty!!”

It’s worth noting that, while “OMG” usage tends to be associated with younger people, Fisher was in his mid-70s when he authored the letter.

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.

Original photo by Jenny Pace/ Unsplash

Whether it’s an old fashioned or a classic daiquiri, every spirited sip got its start somewhere — though mixologists may argue about the true origins of these famous concoctions. (New York and London, for example, both lay claim to creating the first cocktail.) Here are 10 of our favorite cocktails and the bars that made them famous. Cheers!

A fresh cocktail shaker pouring the famous negroni cocktail.
Credit: Rinck Content Studio/ Unsplash

Negroni (Florence, Italy)

In 1919, Count Camillo Negroni bellied up to the bar at Café Casoni and asked for something stronger than his usual Americano (Campari, club soda, and vermouth). Fosco Scarselli obliged, replacing the club soda with gin, and the Negroni was born. While the ownership and name have changed a few times, you can still visit the original space on Piazza della Libertà, now known as Caffè Lietta. (Our advice for mixing the perfect version at home? Put Stanley Tucci in charge of the bar.)

Boozy rum Hemingway daiquiri with lime and grapefruit.
Credit: Brent Hofacker/ Shutterstock

Daiquiri (Havana, Cuba)

Ernest Hemingway had more than one favorite bar, but in Cuba, it was El Floridita. The bar was founded in Havana’s Old Quarter in 1817, and it was already an institution as la cuna del daiquiri — the cradle of the daiquiri — when the famous author walked in. After sampling the original, Hemingway requested “more rum, less sugar” from legendary barman and owner Constantino Ribalaigua. You can still order a Papa Doble, Hemingway’s favorite, while sitting next to his life-sized statue.

Close-up of a smokey old fashion drink on top of a bartending book.
Credit: Bon Vivant/ Unsplash

Old Fashioned (Louisville, Kentucky)

Kentucky gentlemen know from bourbon, so it’s no surprise that this Don Draper-approved cocktail hails from the Bluegrass State. Dubbed an “old fashioned” for the squat tumbler in which it’s served, this potion consisting of bourbon, sugar, bitters, and orange peel is said to have been invented at the private Pendennis Club in Louisville before making its way to New York’s Waldorf Astoria Hotel.

A group of bloody mary's on an outside table ready for a party.
Credit: mitchellpictures/ iStock

Bloody Mary (Paris, France)

Everyone argues about this one, but most cocktail historians agree that the bloody mary (appetizingly nicknamed “the bucket of blood”) was born in 1920s Paris, when bartender Ferdinand “Pete” Petiot began experimenting with vodka at Harry’s New York Bar. The spirit, which he found tasteless, was popularized by Russian émigrés fleeing the revolution. Some canned tomato juice and a few spices later, he arrived at the brunch staple we know and love today. Butch McGuire’s in Chicago takes credit for the celery stick swizzle, but the angel who added a slice of crispy bacon remains a mystery.

A refreshing French 75 cocktail with lemon and champagne.
Credit: Brent Hofacker/ Shutterstock

French 75 (Paris, France)

Boozy and bubbly, this cocktail of gin, champagne, and lemon is named after a 75-millimeter World War I field gun and carries a combat-worthy kick. The invention of legendary barman and cocktail book author Harry MacElhone (who brought Harry’s New York Bar to Paris), the French 75 is essentially a Tom Collins, but with champagne replacing the original’s club soda topper.

A shot of a hand reaching for a full martini glass.
Credit: Amy Shamblen/ Unsplash

Martini (California or New York)

The “shaken or stirred” debate has nothing on the origin of America’s most iconic cocktail, which is vigorously argued by both of the nation’s coasts. The historic town of Martinez, California, swears the gin-and-vermouth classic was created as a celebratory Champagne replacement for a gold miner who struck it rich. New Yorkers insist it’s solely the invention of the bar staff at the Knickerbocker Hotel, named after the Martini in Martini & Rossi vermouth. As for us? We’ll think about it while we have another.

A classic Sazerac cocktail with cognac, bourbon, absinthe, bitters, sugar and lemon zest.
Credit: 5PH/ iStock via Getty Images Plus

Sazerac (New Orleans, Louisiana)

Creole apothecary Antoine Peychaud is said to have served up a melange of his own bitters and his favorite cognac (Sazerac-de-Forge et fils) in a coquetier, or egg cup, in 1838. Over the years, rye whiskey replaced the cognac, and an antiques store replaced the apothecary at 437 Royal Street, but you can still sip a fine version at the Roosevelt Hotel’s historic Sazerac Bar.

A single margarita cocktail on bar counter.
Credit: Tarik Kaan Muslu/ Shutterstock

Margarita (Mexico)

Would a daisy by any other name taste as good? When the tequila is flowing, memories get fuzzy and the tales grow taller with every round. Regardless of whether this icy delight was invented by a barman-turned-milkman at the now-defunct Tommy’s in Juarez or at the still-kicking Hussong’s Cantina in Ensenada, this refreshing blend of tequila, Cointreau, and lime was popularized by Southern California liquor distributors who introduced agave-based spirits north of the border — and we’re forever grateful.

Refreshing zombie tiki cocktail with pineapple and mint on top.
Credit: Brent Hofacker/ Shutterstock

Zombie (Hollywood, California)

Along with the fog cutter and many, many more Polynesian-inspired cocktails, we owe the invention of the zombie cocktail to a man named Ernest Gantt. He returned from bumming around the South Seas post-Prohibition, dubbed himself Don the Beachcomber, and opened the world’s first tiki bar in 1934. Heavy on rum, fruit juices, and fun, these potent potables offer a kitschy taste of vacation. While the original Don’s is long gone, zombie aficionados can still live the dream at Hollywood’s Tiki-Ti, serving nostalgia (and mai-tais) since 1961.

Homemade rye bourbon manhattan with a cherry garnish.
Credit: Brent Hofacker/ Shutterstock

Manhattan (New York, New York)

One legend says that this cocktail was first served at a party for Sir Winston Churchill’s mother, Lady Randolph Churchill, at New York City’s Manhattan Club. That venerable lady can no longer confirm or deny, but the Manhattan Club still defends its claim to this heady combination of whiskey, vermouth, and bitters. While the original site at 96 Fifth Avenue now holds an apartment building, and the social club was dissolved in 1979, you can make this venerable cocktail at home, courtesy of another Manhattan institution, The New York Times.
