Original photo by Maximum Film/ Alamy Stock Photo

Rising to prominence by way of a quarter-century run as the host of an eponymous talk show, Oprah Winfrey has undertaken an extraordinary career journey that made her, among other things, the first woman to own and produce her own talk show, a 2013 recipient of the Presidential Medal of Freedom, and one of the few celebrities famous enough to be known solely on a first-name basis. Here are six more facts about the wildly successful screen personality, entrepreneur, and philanthropist appropriately known as the “Queen of All Media.”

Portrait of young Orpah.
Credit: ARCHIVIO GBB/ Alamy Stock Photo

Her Real First Name Is “Orpah”

According to her birth certificate, we’ve been saying the media queen’s first name incorrectly all this time. Born in January 1954 in Kosciusko, Mississippi, the future TV host was named “Orpah,” after a woman in the Bible’s Book of Ruth. However, the unusual name immediately caused confusion among her family, who adopted a slightly different version of the moniker for their newest member. As she later explained in an early audition tape, “No one knew how to spell in my home, and that’s why it ended up being Oprah.” Her birth certificate still reads “Orpah,” but to everyone who knows her — which now includes millions of fans around the world — she’ll always be “Oprah.”

Al Gore talks with US talk show host Oprah Winfrey.
Credit: LUKE FRAZZA/ AFP via Getty Images

Oprah Initially Wanted No Part of Daytime Talk Television

A few years after beginning her TV career as a news anchor at age 19, Oprah was dismayed to learn that executives at Baltimore’s WJZ-TV wanted her to co-host a daytime talk show. (She reportedly worried that she “wouldn’t be taken seriously as a journalist.”) According to Kitty Kelley’s Oprah: A Biography, the newscaster begged her bosses to reconsider and left their meeting “with tears in her eyes” when she realized she had no other option. But something clicked while she conducted interviews during the August 1978 debut of People Are Talking, and Oprah realized that “this is what [she] was meant to do.”

Roger Ebert and Oprah Winfrey.
Credit: WENN Rights Ltd/ Alamy Stock Photo

A Date With Roger Ebert Sent Her Career in a New Direction

Fans are well aware of Oprah’s long-term relationship with Stedman Graham, but lesser known is her brief but consequential dating history with famed movie critic Roger Ebert. During one dinner together in the mid-1980s, Oprah revealed her uncertainty about how to handle offers to take her show into national syndication. Ebert, by then already a TV veteran as co-host of At the Movies, did some quick calculations that showed the staggering amount of money she stood to earn from a syndication deal. The financial tip wasn’t enough to save their fledgling romance, but it did point Oprah in the right direction and pave the way for the syndicated launch of The Oprah Winfrey Show in September 1986.

The Color Purple screen grab with Oprah.
Credit: Photo 12/ Alamy Stock Photo

She Made Her Film Debut With an Assist from Quincy Jones

Her early television success notwithstanding, Oprah remained hopeful of realizing her dream to become an actress. She finally got the opportunity she’d been waiting for in the mid-1980s, when music producer Quincy Jones, who was trying to pull together a big-screen adaptation of Alice Walker’s Pulitzer Prize-winning novel The Color Purple, was captivated by the then-still-relatively-unknown TV host and recommended her to his casting agent. Despite her lack of professional acting experience, Oprah wound up with the part of Sofia, a choice that was validated when she earned a Best Supporting Actress Academy Award nomination for her performance in the 1985 film. She went on to found her own production company, Harpo Productions, through which she continued to satisfy her acting ambitions, with roles in movies including Beloved (1998) and Selma (2014). She also earned acclaim for her performance in The Butler (2013), and starred alongside Reese Witherspoon and Mindy Kaling in 2018’s flashy film adaptation of Madeleine L’Engle’s A Wrinkle in Time.

The cover of an Oprah's Book Club book.
Credit: Tim Boyle/ Getty Images News via Getty Images

Oprah Acted on an Employee’s Tip To Launch Her Book Club

Among the most popular recurring segments of her show, Oprah’s Book Club grew from a love of reading shared with an intern named Alice McGee. After several years of bonding over the books they were enjoying, McGee, who had risen to become a senior producer, suggested to Oprah that they open up the literary discussion to audience members. Although her team was hesitant to support the idea at first, they determined that it could work if audiences were given enough time to read a book. So on September 17, 1996, Oprah’s Book Club hit the airwaves with her recommendation of Jacquelyn Mitchard’s The Deep End of the Ocean.

In the 15 years that followed, Oprah recommended some 70 books, from Toni Morrison’s Paradise and Edwidge Danticat’s Breath, Eyes, Memory to John Steinbeck’s East of Eden and Leo Tolstoy’s Anna Karenina. The segment encountered a few bumps along the way — most famously when the September 2005 selection A Million Little Pieces, by James Frey, was revealed to have been partially fabricated, despite being marketed as nonfiction — but its success and influence were undeniable. Authors whose books were chosen often saw massive increases in sales, a boost that became known as the “Oprah Effect.” The original Book Club ended with her show in 2011, but recent years have seen newer iterations in O, The Oprah Magazine and on Apple TV+.

Stephen Colbert and Oprah speak onstage.
Credit: Kevin Mazur/ Getty Images Entertainment via Getty Images

Oprah Despises Chewing Gum

Oprah has famously shared a list of her favorite things almost every year since the 1990s, first on her eponymous talk show, and then in O, The Oprah Magazine and, more recently, on her Oprah Daily website. One thing you’ll probably never see on the list? Chewing gum. As she shared during a 2018 appearance on The Late Show With Stephen Colbert, she “intensely” dislikes the stuff. Her aversion stemmed from a grandmother who left gum all around the house. “She would put it on the bedpost,” Oprah recalled. “She would put it in the cabinet. She would put it everywhere around. And so as a child, I would bump into it, and it would, like, rub up against me.” The media queen has since attempted to eradicate any traces of gum from her life, with limited success: “It’s barred at my offices, nobody is allowed, but when I go out into the world, I can’t bar it,” she told Colbert. “It’s a thing. It creeps me out.”

Tim Ott
Writer

Tim Ott has written for sites including Biography.com, History.com, and MLB.com, and is known to delude himself into thinking he can craft a marketable screenplay.

Original photo by Timothy R. Nichols/ Shutterstock

Few figures loom as large in American history as the Founding Fathers. Although wrapped in myth and shrouded in legend, these leaders lived fascinating lives molding a set of fractious colonies into a new nation. And although their stories have been meticulously detailed — through their own writings as well as centuries of biographies and classroom textbooks — not everything about them is well known. Which famous general lost more battles than he won? Which two Founding Fathers died on the same day? Which one invented a strange musical instrument? Here are seven little-known facts about the men who created a nation.

Benjamin Franklin, Thomas Jefferson, John Adams, Robert Livingston, and Roger Sherman.
Credit: ullstein bild Dtl via Getty Images

John Adams and Thomas Jefferson Died on the Same Day

John Adams and Thomas Jefferson, bitter political rivals and, at times, close friends, died on the very same day — July 4, 1826, 50 years after the adoption of the Declaration of Independence. The two were among the last survivors of the original revolutionaries who helped forge a new nation after breaking with the British Empire. During their presidencies, the two diverged on policy and became leaders of opposing political parties, but at the urging of another Founding Father, Benjamin Rush, Adams and Jefferson began a correspondence around 1812 that lasted the rest of their lives. On his deathbed at the age of 90, Adams reportedly spoke the last words “Jefferson still lives,” but he was mistaken — Jefferson had died five hours earlier at Monticello, his Virginia estate.

Portrait of James Madison, 4th President of the United States.
Credit: Bettmann via Getty Images

James Madison Was the Shortest President in U.S. History

Although James Madison’s signature doesn’t adorn the Declaration of Independence, as the nation’s fourth President and chief architect of the Bill of Rights, he’s widely regarded as one of the most influential Founding Fathers. Madison left an outsized mark on early U.S. history despite his small stature: At just 5 feet, 4 inches tall, he is the country’s shortest President thus far. That makes Madison a full foot shorter than America’s tallest President, Abraham Lincoln (and no, that height doesn’t include Lincoln’s signature stovepipe hat).

Portrait of John Hancock.
Credit: Stock Montage/ Archive Photos via Getty Images

John Hancock Was Accused of Smuggling

On May 24, 1775, John Hancock became the presiding officer over the Second Continental Congress. A little more than a year later, his signature became famous when he wrote his name in grandiose letters, taking up some 6 square inches, on the Declaration of Independence. (Legend says Hancock wanted the king to be able to see it without spectacles.) However, Hancock was also known as an importer, and — at least when it came to British tea — was accused of being a smuggler. The British seized his sloop Liberty in 1768 on suspicion of smuggling, a move that instigated a riot. Luckily, fellow Founding Father and lawyer John Adams helped get Hancock cleared of all charges — charges that rested on flimsy evidence in the first place.

Bottles of Samuel Adams beer.
Credit: Justin Sullivan/ Getty Images News via Getty Images

Sam Adams Might Never Have Brewed Beer

Sam Adams was the most influential member of the Sons of Liberty, a loosely organized political organization that formed in opposition to the Stamp Act in 1765. But to many Americans, he’s also the name behind one of the most successful beer brands in the U.S. The company says it picked the name because its founder, Jim Koch, “shared a similar spirit in leading the fight for independence and the opportunity for all Americans to pursue happiness and follow their dreams.” That’s just as well, because it’s not clear whether Sam Adams ever actually brewed beer. After his father’s death in 1748, Adams inherited his malt house, a facility where grains are converted into malt that’s then sold to brewers. But within only a few years, the business was bankrupt and the malt house itself was crumbling; the whole family estate was then put up for auction. Adams proved more effective as a political firebrand than as a “maltster.”

American General and later the first President of the United States, George Washington.
Credit: MPI/ Archive Photos via Getty Images

George Washington Lost More Battles Than He Won

General George Washington embodies the phrase “losing the battle but winning the war,” because during the American Revolution, he lost more battles than he won. Despite some experience fighting alongside the British army during the French and Indian War, Washington had little experience fielding a large fighting force, and the Continental Army was filled with soldiers who were far from professional fighters. However, Washington’s resilience, determination, and long-term strategy eventually won the day. According to Washington’s aide Alexander Hamilton, the plan was simple: “Our hopes are not placed in any particular city, or spot of ground, but in preserving a good army … to take advantage of favorable opportunities, and waste and defeat the enemy by piecemeal.” Washington, also aided by competent generals such as Nathanael Greene and assisted by the French Navy, decisively ended British ambitions in the colonies at the Battle of Yorktown in 1781.

Czech glass harmonica from the first half of the nineteenth century.
Credit: Print Collector/ Hulton Archive via Getty Images

Benjamin Franklin Invented a Musical Instrument Used by Mozart and Beethoven

In the mid-1700s, while representing colonial interests in London, Benjamin Franklin encountered a popular musical performance — singing glasses. Intrigued by the beautiful sound of a wet finger on glass, Franklin developed an instrument known as a “glass armonica” in 1761. Working with a glassblower in London, Franklin altered the thickness of glass bowls, interlocked along a rod, in order to produce a range of pitches.

Far from being one of Franklin’s odder ideas (like his failed phonetic alphabet), the glass armonica was an 18th-century sensation. Some of the era’s greatest composers, including Wolfgang Amadeus Mozart and Ludwig van Beethoven, wrote music for the instrument. However, it was largely forgotten by the 1820s — many musicians complained of dizziness and other symptoms after playing it, with some blaming lead poisoning or the instrument’s vibrations as the cause. Today, a few musicians still practice the subtle, ethereal art of the glass armonica.

The U.S. Coast Guard Cutter Eagle makes its way along the Hudson River.
Credit: Drew Angerer/ Getty Images News via Getty Images

Alexander Hamilton Was Captain of One of the Oldest U.S. Army Regiments in Existence

Alexander Hamilton is known for many things — he was the prolific writer behind the Federalist Papers, the first secretary of the treasury, the creator of the U.S. Coast Guard, and the inspiration for one of Broadway’s biggest musicals. What’s less celebrated about Hamilton is his military career, though when fighting broke out, the eager immigrant from the Caribbean island of Nevis joined the cause. On March 14, 1776, Hamilton was named captain of the New York Provincial Company of Artillery, and soon fought in the battles at Kip’s Bay and White Plains, among others. Hamilton slowly climbed up the military ladder, first serving as General George Washington’s aide and then as commander of a light infantry battalion at the decisive Battle of Yorktown. However, it’s his original artillery company that holds a singular distinction. Known today as 1st Battalion, 5th Field Artillery Regiment, Hamilton’s former artillery unit is one of the oldest active regiments still serving in the U.S. Army.

Darren Orf
Writer

Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.

Original photo by Moviestore Collection Ltd/ Alamy Stock Photo

The Mary Tyler Moore Show is one of the most influential and groundbreaking sitcoms in the history of television. The series follows Mary Richards, played by Mary Tyler Moore, as a 30-something working as an associate producer on a local TV news program and navigating single life in Minneapolis, Minnesota.

Its portrayal of an unmarried working woman finding satisfaction outside of home and family — and openly enjoying sex and dating — was unheard of in the early 1970s, but it’s not just the show’s forward-thinking writing that made it a classic. Its heartfelt relationships, witty writing, and relatable conflicts made the sitcom a hit with audiences and critics alike and cemented Moore (already an established TV star when the show debuted) as an enduring cultural icon. It went on to win 29 Emmy awards during its seven-season run on CBS. Here are eight things you might not know about the sitcom.

Scene from the final episode of The Mary Tyler Moore Show.
Credit: Bettmann via Getty Images

Lou Grant Was Ed Asner’s First Comedic Role — And He Almost Blew It

The late Ed Asner is remembered now as a strong, versatile actor, but at the time, some CBS executives questioned whether he’d be up for a prominent role in a comedy series, according to Jennifer Keishin Armstrong’s 2013 book Mary and Lou and Rhoda and Ted. Although he was casting director Ethel Winant’s first choice, the producers kicked around some other candidates before signing off on him, like The Odd Couple’s Jack Klugman, fellow cast member Gavin MacLeod, and Second City alum Shelley Berman, who later had a small role as one of Mary’s dates.

Winant pointed to Asner’s role as a journalist in the political drama Slattery’s People as evidence that he had the right vibe. With everybody on board, they brought Asner in for an audition — and he completely biffed it, hitting the famous line, “You’ve got spunk… I hate spunk,” with a dramatic fervor that turned everybody off, according to Armstrong.

The decision had already been made not to cast him when Asner, rather than getting in his car and leaving, turned around and walked right back into the studio. “You just sat there on your asses and let me bomb like that?” he said. “I was terrible. And you know it was terrible and you were too polite to tell me. Don’t be so f******g polite. Tell me what you want in this character.”

They worked through the character for half an hour, then did a second try at the reading. He’d won everyone over except for Moore, but according to Asner, producers boldly told her, “That’s your Lou Grant,” and she was sold.

Mary Tyler Moore talking on the phone in a scene.
Credit: Bettmann via Getty Images

Mary Was Originally Supposed To Be a Divorcée Working for a Gossip Columnist

When the show was in development in the late 1960s, producers James L. Brooks and Allan Burns had pitched Mary as a recent divorcée, writes Armstrong. Moore, who had gone through a divorce herself, was on board. The original vision differed in another major way, too: Mary was supposed to be the assistant to a snappy gossip columnist in Los Angeles.

They were less attached to the Los Angeles idea, so that was quickly scrapped in favor of a TV newsroom in Minneapolis. As for the divorcée bit, network executives balked at the premise, and Brooks and Burns considered throwing in the towel; but they didn’t like the optics of quitting, so they went back to the drawing board and came up with a concept they felt played to Mary’s strengths: She was setting off to the big city on her own after a big breakup rather than a doomed marriage.

Scene from the last episode of The Mary Tyler Moore Show.
Credit: Bettmann via Getty Images

Producers Loved the Fashion Possibilities of Minneapolis

While re-evaluating the show’s premise and setting, Brooks and Burns landed on Minneapolis for a few reasons, according to Mary and Lou and Rhoda and Ted: It stood out against the New York- and Los Angeles-dominated media environment, they loved the dynamic of the city being huge for Mary and tiny for New Yorker Rhoda, and the bad weather could provide plot points and unique visuals.

Another thing Minneapolis could provide: coats, and lots of them. The costume department certainly took full advantage, because Mary’s coats went on to become iconic.

Gavin MacLeod and Betty White on The Mary Tyler Moore Show.
Credit: Archive PL/ Alamy Stock Photo

Betty White Was Supposed to Be in Only One Episode

One of the series’ most beloved roles didn’t come around until season four, and it was only supposed to be temporary. Betty White played Sue Ann Nivens, the outwardly sweet host of a show called “The Happy Homemaker” who harbored an aggressive, sex-crazed side. In her first episode, she wantonly tries to seduce Phyllis Lindstrom’s (Cloris Leachman) husband, leading to escalating tension as Phyllis and Sue Ann film a cooking segment about a chocolate soufflé.

White was a longtime friend of Moore — and a big fan of the show — before being offered the role, and the episode was such a hit that she was brought on as a regular. The morning after that first episode, according to Armstrong, Moore came to White’s doorstep with a real-life soufflé.

Screen-grab from a scene in The Mary Tyler Moore Show.
Credit: Moviestore Collection Ltd/ Alamy Stock Photo

It’s Likely the First American Sitcom to Feature Birth Control Pills

On The Dick Van Dyke Show, which Moore starred in from 1961 to 1966, the actress and her on-screen husband, Dick Van Dyke, slept in separate beds and couldn’t say the word “pregnant.” However, just a few years later on The Mary Tyler Moore Show, not only did Mary have sex out of wedlock, she openly took birth control pills.

In a 1972 episode — the same year that a Supreme Court decision made birth control available to unmarried women in all states — Mary is having dinner with her father when her mother shouts, “Don’t forget to take your pill!” Mary and her father both yell, “I won’t,” and the embarrassed look on Mary’s face shows that she doesn’t just take a pill, but The Pill.

Its realistic portrayal of the sex lives of women in the 1970s walked a fine line for the audiences of the time, with a lot of it hiding in quips like that one. But the series was still open about what it was doing. “I’m hardly innocent,” Mary says in one episode. “I’ve been around. Well, maybe not around, but I’ve been nearby.”

Actress Valerie Harper who appears in TV series The Mary Tyler Moore Show.
Credit: Bettmann via Getty Images

Valerie Harper Almost Didn’t Get the Role of Rhoda Because She Was Too Pretty

Rhoda Morgenstern, Mary’s Bronx-born sidekick, was the last major role to be cast in the series, with more than 50 actresses reading opposite Moore for the part. Valerie Harper nailed her audition as Rhoda and even brought her own cloth for washing Mary’s apartment window in her first scene. But the producers weren’t sure she matched their vision.

“She was something we never expected the part to be… which is someone as attractive as she was,” Burns said in Mary and Lou and Rhoda and Ted. “But you’ve got to go with the talent.” Director Jay Sandrich felt strongly Harper was right for the role and suggested she not wear any makeup for her callback.

Producers immediately changed their minds when they brought Moore in to read a scene with Harper. Rhoda’s character switched gears a little bit — rather than being unattractive, which is subjective anyway, Rhoda just felt like she was unattractive.

“Rhoda felt inferior to Mary, Rhoda wished she was Mary,” Harper later recalled. “All I could do was, not being as pretty, as thin, as accomplished, was: ‘I’m a New Yorker, and I’m going to straighten this shiksa out.’”

Actress Mary Tyler Moore in rehearsal for The Dick Van Dyke Show.
Credit: Earl Theisen Collection/ Archive Photos via Getty Images

The Real Owner of Mary’s Apartment Building Displayed Political Banners to Keep Producers From Coming Back

The 1892 home that provided the exteriors for Mary’s apartment became so famous that the owners were inundated with visitors and tour buses, and eventually, they’d had enough. When they got word that the crew was coming back to film more exterior shots in 1972, owner Paula Giese displayed a large “Impeach Nixon” banner prominently across the front. (She was a prominent political activist, so it was a two-for-one deal.)

It worked. They didn’t get their new shots, and Mary eventually ended up moving.

Portrait of Edward Asner, Ted Knight and Mary Tyler Moore
Credit: Bettmann via Getty Images

The Character Ted Baxter Was Based on News Anchor Jerry Dunphy

Jerry Dunphy was a legendary news anchor in the Los Angeles market, known for his head of white hair, and his signoff: “From the desert to the sea to all of southern California.” His style became so well-known in broadcast journalism that he often played a small role as a newscaster in movies, too. He inspired the egotistical, dim-witted Ted Baxter (Ted Knight), although Dunphy had a better head on his shoulders — and much better ratings. Dunphy also inspired the character Kent Brockman on The Simpsons.

Sarah Anne Lloyd
Writer

Sarah Anne Lloyd is a freelance writer whose work covers a bit of everything, including politics, design, the environment, and yoga. Her work has appeared in Curbed, the Seattle Times, the Stranger, the Verge, and others.

Original photo by RapidEye/ iStock

It’s sometimes said that soccer is the world’s game, but if there’s any activity that offers something close to universal appeal, it’s probably chess, with a reported 600 million-plus regular adult participants. Given the minimal space and physical exertion needed, it’s easy to see why, although chess can definitely count as a mental workout. While most of us would fail at executing an acrobatic scissor kick, it’s somewhat easier (at least physically) to learn to strategically move chess pieces to opportune spots while slowly reducing an opponent to a quivering mess. Read on to learn five fun facts about the ultimate battle board game, but be sure to keep a watchful eye on your king.

Close-up of the antique chaturanga game board with pieces.
Credit: SvetlanaSF/ iStock

Chess Evolved From the Indian Board Game Chaturanga

Although the exact rules of the sixth-century Indian board game chaturanga are lost to time, enough is known to label it a clear forerunner to chess. Like its more recent relative, chaturanga was a simulated war game that involved moving pieces of differing attacking capabilities, with the end goal of capturing the opponent’s king-like piece. Chaturanga was eventually adapted into shatranj (or chatrang) by Persian players, and this was the version of the game that spread across Europe. With the introduction of a few key changes, including the transformation of a somewhat punchless king’s minister into the mighty queen, the modern form of chess was born in the 15th century.

A painting of Benjamin Franklin playing chess with Lady Caroline Howe.
Credit: Smith Collection/Gado. Archive Photos via Getty Images

Benjamin Franklin Helped Popularize Chess in the U.S.

A man of immense intellectual capacity, Benjamin Franklin was unsurprisingly drawn to the challenges that arose from the chessboard. He wrote of playing as far back as 1733, and later was often seen competing in public venues during his time as minister to France in the 1770s and ’80s. His treatise “The Morals of Chess,” which surfaced in 1779, is considered the first published work on the subject by an American author. Although he was perhaps admired more for his enthusiasm than pure talent, Franklin’s endorsement of the game boosted its popularity in his home country, and eventually led to his induction into the U.S. Chess Hall of Fame in 1999.

World Chess Champion Boris Spassky, of Russia.
Credit: Jeff Goode/ Toronto Star via Getty Images

Many Chess Champions Have Come From Russia

After Austria’s Wilhelm Steinitz won the first official World Chess Championship in 1886, the top ranks of international chess became increasingly dominated by Russian-born competitors. Along with delivering the game’s first female champion in Vera Menchik, the talent pool from this area of the world produced a series of men’s champions of almost exclusively Soviet/Russian nationality during the second half of the 20th century. The one player to break the Soviet grip was American Bobby Fischer, who defeated Boris Spassky for the championship in 1972, although he was stripped of the title in 1975 for refusing to follow the federation’s rules.

Close-up of check and mate, a chess game concept.
Credit: turk_stock_photographer/ iStock

There Are More Possible Chess Games Than Atoms in the Observable Universe

This is the sort of tidbit that pops up on internet searches without much of an explanation, but it’s valid if you follow the logic proposed by mathematician Claude Shannon, sometimes called the “father of information theory.” According to his 1950 paper “Programming a Computer for Playing Chess,” there are approximately 1,000 unique possibilities for each coupled pair of white-then-black moves, which adds up to 10 to the power of 120 possible games — aka the “Shannon Number” — for a game lasting 40 such turns. As there are an estimated 10 to the power of 80 atoms in the observable universe, that would be checkmate for the Shannon Number in this comparison of incomprehensibly enormous figures.
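For anyone who wants to see the arithmetic spelled out, here is a back-of-the-envelope check using only the figures quoted above: roughly 1,000 (that is, 10^3) options per paired white-then-black turn, compounded over 40 such turns.

\[
\left(10^{3}\right)^{40} = 10^{120} \;\gg\; 10^{80} \approx \text{atoms in the observable universe}
\]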

World Chess Champion Garry Kasparov moves a knight.
Credit: STAN HONDA/ AFP via Getty Images

The 1997 Deep Blue-Kasparov Match Marked a Turning Point for Computer Chess

While Shannon and fellow geniuses like Alan Turing were fixated on chess-playing computers as far back as the 1950s, the landmark moment in this field arrived in May 1997, when the IBM supercomputer Deep Blue defeated world champion Garry Kasparov in a six-game match. Although the humbled champ suspected that human intervention was involved because of an unusual sacrifice offered by his opponent — later explained as a bug in the software — the rapid development of computing power soon obliterated any hope of humans retaining the edge over machines. Writing of his experience playing computers in 2010, Kasparov casually mentioned that anyone could buy a “home PC program that will crush most grandmasters.” In fact, it’s now been at least 15 years since a human beat a computer in a chess tournament.

Tim Ott
Writer

Tim Ott has written for sites including Biography.com, History.com, and MLB.com, and is known to delude himself into thinking he can craft a marketable screenplay.

Original photo by Moviestore Collection Ltd/ Alamy Stock Photo

Readers love a rags-to-riches story — which could be why “Cinderella” has such a cultural hold on us, even centuries after the tale was first recounted. Most versions of the famous fairy tale follow the same pattern: A destitute girl yearning for a better life makes a magical friend and gets a boost into better circumstances thanks to a shoe. But not every detail of the fictional servant-turned-queen’s background is predictable — here are six fascinating facts you might not know about the “Cinderella” folktale and movie.

Detail Rhodopis on Plinth by Charles Francis Fuller.
Credit: Frank Nowikowski/ Alamy Stock Photo

The First Cinderella Story May Have Come From Ancient Greece

The ball-gown-bedazzled Cinderella we know today is far from her origins, which may have been in ancient Greece. Some researchers point to the tale of Rhodopis, a story recorded by Greek geographer Strabo around the first century BCE, as a possible beginning. In that account, Rhodopis is a courtesan whose shoe is stolen by an eagle and dropped into the lap of an Egyptian pharaoh. Seeing the shoe as an omen from the gods, the royal sends soldiers throughout the kingdom to track down the shoeless woman, who eventually becomes his wife. However, not everyone agrees that the tale of Rhodopis is truly the first “Cinderella” story. Some historians say that Strabo’s brief description of the tale is only similar to today’s version in that it hinges on a shoe; the centuries-old version lacks a fairy godmother, cruel stepmother, and other key components we now think of as standard.

View of a Cinderella poster.
Credit: swim ink 2 llc/ Corbis Historical via Getty Images

There Are More Than 700 Versions of the Story

Whether or not Rhodopis was the first Cinderella, she certainly wasn’t the last. Fairy tales with similar shoe-based plots have cropped up worldwide — some librarians count more than 500 versions found in Europe alone, while global counts are as high as 700.

Culture has played a heavy role in each story’s details. One Italian rendition renames the princess “Zucchettina” because she was born inside of a squash. In the Danish tale, Cinderella (there called “Askepot”) wears rain boots, a detail particularly fine-tuned to Denmark’s rainy climate. However, in the version most popular today, first penned by French author Charles Perrault in 1697, “Cendrillon” is eventually found by her prince thanks to a glass slipper — the first version of the tale to include such a delicate shoe.

Cinderella's glass slipper at Disney's "Cinderella" Library of Congress National Film Registry Ball.
Credit: Kris Connor/ Getty Images Entertainment via Getty Images

The Famed Glass Slipper May Have Been a Political Statement

Perrault’s choice to cast Cinderella’s sparkling shoes from glass may have been less about fashion and more about politics, according to some academic researchers. Historian Genevieve Warwick at the University of Edinburgh believes that the detail was actually meant in part to poke fun at Louis XIV, king of France from 1643 to 1715. During his reign, Louis XIV (who was responsible for developing Versailles into a lavish palace) was known for donning extravagant clothing, particularly shoes. Perrault, who worked as a secretary overseeing construction at Versailles — known for its Hall of Mirrors — and the Louvre, with a particular hand in their glasswork, may have added the glass slipper detail as a bit of satire, mocking the increasingly ostentatious and impractical French fashions of the time; after all, it would be incredibly difficult to actually dance in shoes made of glass.

Yet there may have also been a layer of economic nationalism: Perrault was in charge of setting up a royal glassworks for France, which meant the nation no longer needed to be dependent on the glassmakers of Venice. Warwick thinks Cinderella’s transformation may have been read by contemporary readers as a metaphor for France’s newfound self-sufficiency, and its ability to make the king’s beloved luxury products for itself.

Walt Disney working in his studio.
Credit: Bettmann via Getty Images

Walt Disney Sketched His First Cinderella Nearly 30 Years Before The Feature Film

Disney’s feature-length adaptation of “Cinderella” premiered in 1950, though the illustrator actually began tinkering with the story some three decades before. At Laugh-O-Gram, Disney’s first studio in Kansas City, the artist tested out his animation skills on fairy tales. In 1922, the young animator produced a silent, seven-minute version of “Cinderella” in which her only friend was a cat who helped with housework, and her fairy godmother sent her off to the ball in flapper attire and a car instead of a pumpkin. That same year, Disney also put out cartoon shorts of “Little Red Riding Hood” and “Beauty and the Beast” (which the company would successfully return to in 1991).

Cinderella screen-grab from 1950.
Credit: LMPC via Getty Images

“Cinderella” Saved Walt Disney From Bankruptcy

Cinderella was Walt Disney’s sixth full-length animated film (following Snow White and Bambi, among others), but it was the project that finally solidified his studio’s success. Disney and a team of animators spent six years developing Cinderella before its 1950 premiere, and the production wasn’t just a major investment of time — it was a huge financial gamble. World War II had slowed the studio’s projects and Disney had racked up nearly $4 million in debts to keep the business running; Cinderella cost around $2 million to produce and would likely have shuttered Disney’s business if it flopped. Luckily, the film grossed more than $4 million at the box office and gained three Oscar nominations for its soundtrack, which helped usher in a new era for Disney’s studio.

Actress Julie Andrews poses with a glass slipper in the role of Cinderella, circa 1957.
Credit: Silver Screen Collection/ Moviepix via Getty Images

Rodgers and Hammerstein’s Adaptation Was Their Only TV Musical

Broadway superstars Richard Rodgers and Oscar Hammerstein II wrote 11 musicals during their partnership, though the duo created only one specifically for television viewers: Cinderella. The 90-minute production featured actress Julie Andrews in the leading role, to glowing reviews. Rodgers and Hammerstein’s sole TV musical debuted on March 31, 1957, and drew more than 100 million viewers — more than 60% of American households tuned in. Like the everlasting story, Rodgers and Hammerstein’s version has been remade for TV and stage time and again in the decades since it aired.

Nicole Garner Meeker
Writer

Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.

Original photo by Try_my_best/ Shutterstock

We spend more than a third of our lives unconscious, sleeping in beds (or elsewhere) to prepare our minds and bodies for the day ahead. Although this activity takes up a significant portion of daily life, scientists are still discovering fascinating attributes of the human sleep-wake cycle, developing a more nuanced understanding of dreams, and coming to grips with the devastating effects of sleep deprivation and disorders. These five facts delve into the science of sleep.

Man comfortably sleeping in his bed at night, with a dreaming cloud above.
Credit: Fer Gregory/ Shutterstock

12% of People Dream in Black and White

Dreams are an important mechanism of the human mind. What seems like a series of random thoughts and events is actually the brain trying to make sense of the day, remembering things that are important, forgetting things that aren’t, and overall preparing our biological computers for tomorrow. While most people dream in full color, around 12% of the population is tuned to TCM (so to speak), and often experiences dreams in only black and white. The analogy to television is an apt one, as researchers discovered in 2008 that people under the age of 25 almost never dreamed in monochrome, while members of the boomer generation and older had dreams devoid of color roughly a quarter of the time. Although it is difficult to prove definitively that TV is to blame, the number of people who reportedly dream in grayscale has slowly fallen over subsequent decades.

Portrait of tired young man sleeping while sitting at dining table in kitchen.
Credit: Prostock-Studio/ iStock

Poor Sleep Reduces a Human’s Pain Threshold

Having a poor night’s sleep comes with a multitude of real-world side effects, including sluggishness, irritability, and poor concentration. Over the long term, things get even more dire, as poor sleep can contribute to obesity, high blood pressure, and an overall weaker immune system. Sleep can also have a surprising correlation with how much pain a human can withstand. In 2015, a National Sleep Foundation poll discovered that two out of every three people experiencing chronic pain also suffered from sleep deprivation. Statistics like this inspired scientists from UC Berkeley to figure out how exactly sleep is entwined with pain tolerance. After studying two dozen healthy young adults, the researchers realized the neural mechanisms that evaluate pain signals and activate appropriate relief measures are disrupted when someone doesn’t get enough sleep. Just another reason (among many) that you should always try to get a good night’s rest.

Above view of smiling woman sleeping in bed.
Credit: skynesher/ iStock

Not Every Person Needs the Same Amount of Sleep

Some people seem to tick along just fine on five hours of sleep while others can’t even think straight on anything less than nine hours. That’s because the common recommendation of getting eight total hours of sleep is really an average — not a rule. Although a common indicator for how much sleep you need is often based on age (for example, kids need more sleep than adults because they’re still growing), differences also occur from person to person. Scientists have identified a small subset of humans who require less than six hours to feel well rested because these sleep champions actually have a mutated gene that codes for certain receptors that affect the sleep-wake cycle. These people experience higher-quality sleep that takes up less time than the average human needs to spend getting shut-eye.

An alarm clock, sleeping pills, an eye mask and a black board reading rem sleep.
Credit: Ben Gingell/ Shutterstock

Your Muscles Are Paralyzed During REM Sleep

Dreaming occurs during a process known as REM (rapid eye movement) sleep. The name comes from the physical movement of our eyes while experiencing dreams. During these bouts of REM sleep, of which there are four to six per night, brain activity changes and causes paralysis in our muscles. This normal effect of REM sleep is what’s known as muscle atonia, and it’s designed to keep humans from injuring themselves in their sleep. However, sometimes a person’s muscles still retain function during REM sleep and can cause a person to act out their dreams. This is known as REM sleep behavior disorder, and can be a real danger to the dreamer, or in some cases, the dreamer’s partner.

The reverse is also possible, as sleep paralysis occurs when someone wakes from REM sleep only to discover that they can’t move their body or speak. Both of these sleep disorders (along with many others) are types of parasomnias.

A stressed women sitting next to her bed.
Credit: PonyWang/ iStock

Extreme Sleep Deprivation Can Lead to Psychosis

While being a poor sleeper can have serious side effects, getting no sleep at all can be downright deadly. Throughout the day, our bodies burn energy and create a byproduct in the brain known as adenosine. The buildup of this nucleoside is what causes us to feel sleepy. In fact, caffeine works by blocking adenosine from binding, making us more alert as a result. While we sleep, a waste clearance system known as the “glymphatic system” essentially removes this buildup of adenosine while using cerebrospinal fluid to flush toxic byproducts from the central nervous system. After sleeping the required eight (or so) hours, the brain is refreshed and ready for the day ahead. However, if someone puts off going to sleep for a long period of time, adenosine builds up in the brain and eventually disrupts functions such as visual processing, triggering hallucinations; in rare cases, extreme and prolonged sleep deprivation can even prove fatal. In other words, spending one-third of our lives in bed may seem like a waste of time, but sleeping may be the most important thing we do every day.

Darren Orf
Writer

Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.

Original photo by Michelle Bridges/ Alamy Stock Photo

Some have days of the week named after them and others are Marvel superheroes, but many Norse gods haven’t been thought about much outside of academic circles since the Icelandic poet and historian Snorri Sturluson wrote about them in the 13th century. That’s a shame, since the pantheon of Norse mythology extends far beyond the likes of Thor and Odin — and includes such deities as the crone who brought the god of thunder to one knee in a wrestling match. (Snorri wrote the Prose Edda, which, alongside the Poetic Edda, whose author is unknown, remains the foundational text for modern understanding of Norse mythology.) Here are five lesser-known mythological figures Snorri wrote about, and why they’re worth knowing about — or perhaps even worth a movie or comic book of their own.

Vintage illustration of Thor being defeated by Elli.
Credit: Michelle Bridges/ Alamy Stock Photo

Elli

Few mythical figures, whether gods or otherwise, can claim to have held their own against Thor. Even fewer can say they beat him, but the giantess Elli is one of those who can rightfully make the boast.

Admittedly, there’s some trickery involved in the story. The giantess, considered the personification of Old Age, is said to beat Thor in a wrestling match while the god of thunder visits the giant king Utgard-Loki in his castle. As part of a series of tests of strength, Thor agrees to wrestle Utgard-Loki’s nurse — a challenge he accepts without realizing his opponent’s true identity. Thor struggles throughout the contest until Elli forces him to one knee, at which point Utgard-Loki declares the match over, and commends Thor for faring as well against old age as he did. The tale is recounted in the Gylfaginning section of the Prose Edda, and sadly marks Elli’s only mention in the text.

Heart is locked in a cage, similar to Norse mythology Lofn.
Credit: oatintro/ iStock

Lofn

Norse mythology tends to evoke images of strength, battle, and violence. One exception is Lofn, a kind of matchmaker who specializes in forbidden love affairs. She’s described by Snorri as “so gracious and good to call on that she gets permission from Alfodr [Odin] or Frigg for the intercourse of people, men and women, although otherwise it would be banned or forbidden.” Also known as “The Comforter,” the goddess of love and gentleness has a special fondness for small and/or helpless beings. “Lof,” meaning “praise,” is derived from her name.

Víðarr on horseback.
Credit: Historic Images/ Alamy Stock Photo

Víðarr

Sometimes known as the Silent God, Víðarr (also anglicized as Vidar and Vithar) is the son of Odin and the jötunn (a being akin to a giantess) Gríðr — making him Thor’s half-brother. He’s often associated with vengeance, and with good reason: Odin’s ultimate fate is to be killed by the wolf Fenrir during Ragnarök, the “Twilight of the Gods” that’s essentially Norse mythology’s end of the world; Víðarr’s destiny, meanwhile, is to avenge his father by slaying Fenrir. Víðarr is also one of the few gods who survives Ragnarök (at least in some accounts), though beyond his actions during these cataclysmic events, little is written about him other than a mention of his status as the second-strongest god after Thor.

Víðarr is mentioned in both the Prose Edda and Poetic Edda, with the latter describing his most important deed thusly:

“Then comes Sigfather’s | mighty son,

Vithar, to fight | with the foaming wolf;

In the giant’s son | does he thrust his sword

Full to the heart: | his father is avenged.”

The giantess Angrboda.
Credit: Ivy Close Images/ Alamy Stock Photo

Angrboða

With a name that’s been translated as “she who brings sorrow” and “grief-bringer,” Angrboða has a lot to live up to. For better and (mostly) for worse, she does. A giantess (jötunn) and one of the trickster god Loki’s lovers, she ultimately gives birth to three monsters: Fenrir, the wolf fated to kill Odin during Ragnarök; Hel, who rules over the dead; and Jörmungandr, the serpent who encircles the entire world and is Thor’s archnemesis. The mother of monsters is indirectly responsible for some of Norse mythology’s most catastrophic events, though there’s no indication that Angrboða herself is evil — after birthing that terrible trio, she’s mostly known to reside in Jötunheim (the land of the giants) on her lonesome without any contact with either Loki or the monstrous spawn they had together. Some people’s children, as they say.

Drawing of Hoenir.
Credit: Alto Vintage Images/ Alamy Stock Photo

Hoenir

Hoenir — whose name is spelled several different ways (Hönir is also common) — works alongside Odin and Loki to create the first humans, Ask and Embla, by imbuing two pieces of driftwood with “essential gifts” whose exact properties remain a matter of debate centuries later. Here’s how the moment is described in Völuspá (Prophecy of the Seeress), the first poem in the Poetic Edda:

“They had no breath,

they had no soul,

they had neither hair nor voice,

nor a good appearance.

Odin gave them breath,

Hoenir gave them a soul,

Lodur / Loki gave them hair

and a good appearance.”

Here’s where it gets confusing. Hoenir’s gift imbues the humans with óðr, an untranslatable Old Norse word that can encompass everything from understanding to poetic inspiration to frenzy on the battlefield. But since óðr is the root of Odin’s name and another Norse tale suggests that humans derive their óðr from Odin himself, some consider this mention of Hoenir to be simply an extension of Odin. Hoenir remains important not despite this ambiguity but because of it — much of Norse mythology is murky and ambiguous, with few figures embodying those qualities quite like he does.

Michael Nordine
Staff Writer

Michael Nordine is a writer and editor living in Denver. A native Angeleno, he has two cats and wishes he had more.

Original photo by Juanmonino/ iStock

Plenty of dishes have names that have nothing to do with their ingredients: No frogs are harmed in the making of toad in the hole, sweetbreads are neither sweet nor bread, and puppies are definitely not included in hot dogs. Boston cream pie is delicious, but not pie — and Welsh rabbit (aka “rarebit”) is vegetarian. The culinary misdirections continue when it comes to dishes containing place names. Here are five foods with names that are miles from the places where they actually originated.

Aerial view of a Hawaiian pizza slice.
Credit: Vasin Lee/ iStock

Hawaiian Pizza

While the war over pineapple as a pizza topping divides the world, the controversy originated nowhere near the Aloha State. Hawaiian pizza, the savory pie combining the salty umami of ham (or Canadian bacon) with the sweetness of pineapple, was the product of a Greek immigrant operating a restaurant in Ontario, Canada. Sam Panopoulos added Hawaiian-brand canned pineapple as a novelty topping in 1962, and the combination (along with the ’60s fascination with all things “tiki”) slowly gained popularity. In 1999, Hawaiian even became the most popular pizza style in Australia, accounting for 15% of sales.

Meatballs in a pan with cream sauce.
Credit: Yulia_Kotina/ iStock

Swedish Meatballs

Springy and savory, these meatballs are practically synonymous with Sweden — but everyone’s favorite IKEA offering is likely based on a dish from the Ottoman Empire. King Charles XII of Sweden was impressed by the entree while in exile in what is now Moldova during the early part of the 18th century. The meatballs, called köttbullar in Sweden, may be derived from the spiced lamb and beef recipe for köfte, a signature dish in Turkish cuisine. The Swedes substituted pork for lamb, and the dish is traditionally served with a silky sour cream-based gravy atop a bed of mashed potatoes or egg noodles and accompanied with tangy lingonberry jelly.

Baked Alaska Ice Cream Cake.
Credit: Katheryn Moran/ Shutterstock

Baked Alaska

In 1867, the U.S. bought 375 million acres from Russia, land that would become Alaska. The purchase also inspired Delmonico’s chef Charles Ranhofer in New York to create a confection he dubbed “Alaska, Florida.” Spice cake was topped with a dome of banana ice cream — an expensive and exotic luxury at the time — then crowned with a layer of meringue toasted to a golden brown. A simplified version called “Alaska Bake” showed up in a Philadelphia recipe book in 1886, and within a few years “baked Alaska” was being offered on several menus around New York. Since then, baked Alaska has become a celebratory sweet, and the fancy dessert is a favorite for birthdays and other special occasions.

Meat cutlet with boiled egg, pieces on a dark wooden background.
Credit: Iaroshenko Maryna/ Shutterstock

Scotch Eggs

The pub food and picnic staple known as a Scotch egg is a popular snack across the U.K., but its origins may lie far from the British Isles. Along with curries and chutney, British soldiers returning from the occupation of India may have imported nargisi kofta — a dish of shelled hard-boiled eggs wrapped in spiced ground lamb, deep-fried, and served with an aromatic tomato sauce. Iconic department store Fortnum & Mason claims to have invented the British version in 1738, but the northern England county of Yorkshire maintains that the “Scotch” in the name came from eatery William J Scott & Sons, where the original version was wrapped in fish paste and the treats were nicknamed “Scotties.” Modern versions are usually coated in sausage and rolled in breadcrumbs before being deep-fried.

A set of fresh sushi rolls with salmon, avocado and black sesame seeds.
Credit: Andrei Iakhniuk/ Shutterstock

California Roll

Many Americans’ first introduction to sushi comes in the form of a California roll, but the approachable offering probably doesn’t come from Japan via the Golden State (although a couple of Los Angeles chefs do claim credit, and the origin is somewhat uncertain). Chef Hidekazu Tojo studied in Osaka before emigrating to Vancouver, B.C., in 1971. Noting that his new customers were intimidated by raw fish and seaweed, Tojo reversed the traditional roll process, encasing the unfamiliar ingredients inside a layer of rice. The “inside-out” rolls were popular with guests from California and also included avocado — popular in dishes from the state — which led to the name. At Tojo’s own restaurant, they’re simply known as “Tojo rolls.”

Cynthia Barnes
Writer

Cynthia Barnes has written for the Boston Globe, National Geographic, the Toronto Star and the Discoverer. After loving life in Bangkok, she happily calls Colorado home.

Original photo by Science History Images/ Alamy Stock Photo

Louis Armstrong changed the face of jazz in the 20th century, with enduring hits such as “West End Blues,” “Hello, Dolly!” and “What a Wonderful World.” Born in 1901, the influential trumpeter and vocalist started playing gigs as a child in New Orleans, and long after his death in 1971, remains a giant of the genre.

Satchmo (as he was lovingly nicknamed) had a long and rich career, but was he always a singer? Which enduring hit went unnoticed for decades? And how did he revolutionize the trumpet? Take a journey through the “wonderful world” of Louis Armstrong with these seven amazing facts about his life.

Publicity photo of American jazz trumpeter Louis Armstrong.
Credit: JP Jazz Archive/ Redferns via Getty Images

Louis Armstrong’s Childhood Nickname Was “Dippermouth”

Long before “Satchmo” came along, Armstrong was known in his childhood home in the Storyville district of New Orleans as “Dippermouth,” or “Dipper” for short. He supposedly got the moniker from his wide smile as a child, although the nickname later came to be associated with his embouchure (the way a player puts their mouth around an instrument).

Armstrong’s mentor, King Oliver — a fixture in the Storyville jazz scene during Armstrong’s youth — recorded a song in 1923 called “Dippermouth Blues,” which he co-wrote with Armstrong. Dipper himself would later go on to record his own version in 1936.

Armstrong in the band at the Colored Waifs Home in New Orleans.
Credit: Pictorial Press Ltd/ Alamy Stock Photo

Armstrong Honed His Skills in a “Waif’s Home”

After firing off six blanks at a New Year’s Eve party in New Orleans in 1912, 11-year-old Armstrong was arrested and sent to the Colored Waif’s Home for Boys, an institution that was part juvenile detention center, part orphanage, and part reform school. It was his second stay at the home — according to recently uncovered records, Armstrong did a brief stint there when he was only nine, after he and five of his friends were arrested for being “dangerous and suspicious characters,” a charge used often at the time to detain people without cause.

By the time of his second stay, the Waif’s Home had hired a music teacher and started a band program. Under the tutelage of instructor Peter Davis, Armstrong learned the bugle and the cornet, and spent some time as the bandleader. Early on, he showed a skill for harmonizing and improvising that seemed beyond his years. It was far from his first exposure to the instruments, but it was the first time he received proper training. Armstrong started playing gigs after his release from the home in 1914.

Famed jazz trumpeter Louis "Satchmo" Armstrong.
Credit: Bettmann via Getty Images

Armstrong Didn’t Start Out as a Trumpet Player

While he is remembered today for his distinctive voice and rich trumpet solos, the trumpet wasn’t Armstrong’s original instrument of choice — even years into his career. Satchmo rose to prominence in his mid-teens playing the cornet, which is similar to a trumpet but smaller and with a few subtle differences.

For example, a trumpet is a cylindrical brass instrument, meaning the tube stays the same diameter for most of its length, while a cornet’s tube is conical, gradually widening from the mouthpiece to the bell, which gives it a mellower tone. From the 1800s to the mid-1900s, the cornet was a standard part of a brass or jazz ensemble, as well as a popular solo instrument. While trumpets were also played, they weren’t typically solo instruments.

Armstrong, however, has been credited with reinventing the trumpet in the public consciousness. In the mid-1920s, as Armstrong tells it, the bandleader of an orchestra he played with said he “looked funny… with that stubby cornet.” The band’s other brass player played the trumpet, and Armstrong thought two trumpets sounded better together. He started playing the trumpet as he would the cornet, with extensive improvisation and crowd-pleasing solos. He wasn’t the only jazz musician doing this, but as he rose to national prominence, his inventive style helped change public opinion about what a trumpet could sound like.

Group portrait of American jazz musician Louis Armstrong and his orchestra.
Credit: Charles Peterson/ Archive Photos via Getty Images

He Used to Play in a Silent Movie Orchestra

In the mid-1920s, Armstrong played with Erskine Tate’s Vendome Orchestra — the same band that inspired him to pick up a trumpet. The big band ensemble was one of the early players on Chicago’s jazz scene and performed at the Vendome Theatre, providing accompaniment and intermission entertainment for silent films. Armstrong not only played jazz solos, but also performed operatic arias.

Louis Armstrong pictured moisturizing his lips while traveling on a train.
Credit: Pictorial Parade/ Archive Photos via Getty Images

Lip Injuries Were a Common Ailment for Armstrong

There was one bad habit Armstrong picked up at the Waif’s Home: a poor embouchure that took a heavy toll on his lips. Bad form is especially risky with brass instruments such as cornets and trumpets, because shifting one’s embouchure is fundamental to playing a melody, requiring near-constant lip and tongue movement.

According to Armstrong, the damage started early in his career. “In my teens, playing in that honky tonk all night long, I split my lip wide open,” Armstrong recalled in a 1966 Life interview. “Split my lip so bad in Memphis, there’s meat still missing. Happened many times. Awful. Blood run all down my shirt.”

While some of his peers sought professional help and even plastic surgery, Satchmo treated his lips with home remedies. He had a special salve he’d apply to his lips, and when calluses built up, he’d shave them down himself with a razor and take some time away from performing. One particularly nasty split in 1935 took him offstage for a year. While embouchure overuse syndrome is common among brass players, it’s associated with Armstrong perhaps more than any other musician. Some doctors even use the term “Satchmo syndrome” for a tear in the lip muscle.

Jazz trumpeter Louis Armstrong performs at the Newport Jazz Festival in 1970.
Credit: Tom Copi/ Michael Ochs Archives via Getty Images

Armstrong Insisted on Adding Singing to His Act

Armstrong is almost as well known today for his distinctive, gravelly singing voice as he is for his trumpet skill. While he formed a vocal quartet with other kids in his neighborhood and sang in a choir at the Waif’s Home, Satchmo built his early career on the cornet and later the trumpet, not singing.

In 1924, he joined the Fletcher Henderson Orchestra, then a big name in the New York City music scene, for an engagement at the Roseland Ballroom. Armstrong asked repeatedly to sing, yet recalled that Henderson wasn’t interested. But according to jazz drummer Kaiser Marshall, Satchmo found a way to sneak it in anyway: Roseland would host a Thursday revue of amateur performers (similar to an open mic), and one night Armstrong went on stage and performed “Everybody Loves My Baby,” both on cornet and vocals. Marshall recalled that “the crowd surely went for it … from then on they used to cry for Louis every Thursday night.”

Cover of vinyl album What A Wonderful World by Louis Armstrong.
Credit: EyeBrowz/ Alamy Stock Photo

“What A Wonderful World” Took 20 Years to Reach the U.S. Charts

Armstrong’s most popular song, “What a Wonderful World,” topped the British music charts upon its 1967 release, staying at No. 1 for 13 weeks. The inspiring tune was a hit elsewhere in Europe and in South Africa, too, but because the president of Armstrong’s record company, Larry Newton, disliked the song, the record was never actually promoted in the United States. According to the song’s co-writer and producer, Bob Thiele, it sold fewer than 1,000 copies in the U.S. after its initial release.

But in 1987, 16 years after Armstrong’s death, “What a Wonderful World” was featured in the film Good Morning, Vietnam. Only then did the song reach the Billboard Hot 100, where it peaked at No. 33. The original album was re-released and certified gold.

Armstrong was drawn to the song because it reminded him of Corona, Queens, where he and his last wife, Lucille, settled down permanently in 1942. “Lucille and I, ever since we’re married, we’ve been right there in that block,” Armstrong said in 1968, according to the Louis Armstrong House. “And everybody keeps their little homes up like we do and it’s just like one big family. I saw three generations come up on that block. And they’re all with their children, grandchildren, they come back to see Uncle Satchmo and Aunt Lucille. That’s why I can say, ‘I hear babies cry/I watch them grow/they’ll learn much more/than I’ll never know.’ … So when they hand me this ‘Wonderful World,’ I didn’t look no further, that was it.” Since then, the song has become a timeless classic, featured in many other films and shows and covered by artists such as Rod Stewart and Stevie Wonder.

Sarah Anne Lloyd
Writer

Sarah Anne Lloyd is a freelance writer whose work covers a bit of everything, including politics, design, the environment, and yoga. Her work has appeared in Curbed, the Seattle Times, the Stranger, the Verge, and others.

Original photo by John Gaffen 2/ Alamy Stock Photo

On October 16, 1923, Walt Disney formally agreed to send a new series of short films to a New York distributor, thereby launching the Disney Brothers Cartoon Studio. Needless to say, a few things besides the company name have changed since then, as Disney has gone from a bare-bones operation to the creator of groundbreaking talking funnies, the steward of iconic characters and franchises, and, finally, the overlord of a sweeping enterprise with interests all over the globe. With 100 years of movie magic in the rearview mirror, here’s a look at six facts about all things Disney.

a sketch by Ub Iwerks of Mickey Mouse.
Credit: JIM WATSON/ AFP via Getty Images

Animator Ub Iwerks Was Integral to Disney’s Early Success

Disney’s history is filled with the hard work of unsung geniuses, but none was as integral to the company’s foundational success as Ub Iwerks. Indispensable to Walt Disney since their days together at Kansas City’s Laugh-O-Gram Studio, Iwerks joined his friend in Hollywood in the 1920s to produce a groundbreaking live-action/animated series of short films called the Alice Comedies. Iwerks remained loyal to Disney after a distributor stole their creation, Oswald the Lucky Rabbit, and hired away the studio’s animators. He’s credited with sketching the very first Mickey Mouse, and single-handedly animated the landmark 1928 Mickey cartoon Plane Crazy, with an output that reached 700 drawings in a single day. Although personal and creative differences prompted Iwerks to branch out on his own in 1930, he returned to the fold 10 years later as a special effects expert, and went on to bolster the studio’s animation capabilities with his innovations in optical printing and xerography.

FROZEN II scene in 2019.
Credit: TCD/Prod.DB/ Alamy Stock Photo

Disney Has Created More Than 800 Films

More than 800 feature films have been made under the Disney banner since Snow White and the Seven Dwarfs hit theaters in 1937. The studio’s first full-length, live-action feature was Treasure Island, in 1950. Its first R-rated flick was Down and Out in Beverly Hills, developed under the then-recently inaugurated Touchstone Pictures subsidiary in 1986. Disney’s highest-grossing entry was (unsurprisingly) a Marvel movie, 2019’s Avengers: Endgame, while its highest-grossing animated feature also arrived that year with Frozen II.

DER FUEHRER'S FACE.
Credit: RGR Collection/ Alamy Stock Photo

Disney Essentially Served as a Media Branch of the U.S. Military During World War II

After the attack on Pearl Harbor led to the requisitioning of Disney’s Southern California studio as an anti-aircraft base in late 1941, the company turned its focus to supporting the war effort. Several films produced during this time were used to train Army and Navy personnel; others, like Der Fuehrer’s Face (1943), were propaganda fare that portrayed stereotyped and inept versions of enemy leaders such as Adolf Hitler and Benito Mussolini. Additionally, the studio designed more than 1,200 insignia for various military units and helped raise funds by permitting its characters to appear on war bonds. All told, Disney was devoting more than 90% of its output to war-related material by 1943, enabling the studio to weather lean financial times and survive to deliver the next wave of classics, which included Cinderella (1950) and Peter Pan (1953).

Crowds walking around the Disneyland theme park in Anaheim, California.
Credit: Archive Photos/ Archive Photos via Getty Images

Disneyland’s Disastrous Grand Opening Was Dubbed “Black Sunday” by Employees

Although Walt Disney’s long-gestating dream of a theme park was realized with the televised grand opening of Disneyland in Anaheim, California, in July 1955, the disaster that unfolded was better suited for a nightmare. Most attractions remained unopened despite the rushed construction, and the sweltering heat transformed the fresh asphalt of Main Street, USA, into a sticky mess. Meanwhile, overcrowding from thousands of counterfeit tickets contributed to a 7-mile backup on the Santa Ana Freeway, and resulted in the park’s restaurants running out of food. But Disney remained unbowed by what was internally dubbed “Black Sunday,” and apparently so did the paying public: Disneyland surpassed 1 million in attendance just seven weeks later, and the company eventually doubled down on the theme park experience with the unveiling of Florida’s Walt Disney World in October 1971.

Promotional portrait of cast members of ‘The Mickey Mouse Club’ television show.
Credit: Pictorial Parade/ Archive Photos via Getty Images

Disney Kick-Started the Careers of Numerous Celebrities

The House of Mouse has nurtured an impressive roster of young talents since Annette Funicello emerged as an original Mouseketeer in 1955. A teenaged Kurt Russell became a Disney film regular in the 1960s, before subsequent incarnations of The Mickey Mouse Club fueled the rise of mega pop stars Britney Spears, Justin Timberlake, and Christina Aguilera, along with A-list actor Ryan Gosling. Miley Cyrus and Olivia Rodrigo both starred on their own Disney shows before becoming chart-topping singers, while fellow Disney alums Zac Efron, Demi Lovato, Selena Gomez, and Zendaya achieved fame as musicians, actors, or both. And then there’s Steve Martin, who didn’t appear in a Disney feature until 1991’s Father of the Bride, but nevertheless learned to perform in public as a Disneyland employee from ages 10 to 18.

American film production label owned by Disney & Marvel Studios.
Credit: SOPA Images/ LightRocket via Getty Images

Disney Is a Very, Very Big Business

It’s been a long time since Disney was merely a studio of ink-stained animators and noisy voice actors, but even its visionary founder would likely be staggered by its multifaceted presence across numerous businesses today. Along with resorts in Paris, Tokyo, Hong Kong, and Shanghai, the Mouse Kingdom oversees a line of cruise ships, Hollywood Records, the Adventures by Disney travel company, and the Steamboat Ventures venture capital firm. Among its media subsidiaries, Disney owns 20th Century Studios, ABC, National Geographic, Lucasfilm, and the massive cash cow that is Marvel Studios. Altogether, the century-old conglomerate was valued at just under $150 billion as of September 2023.

Tim Ott
Writer

Tim Ott has written for sites including Biography.com, History.com, and MLB.com, and is known to delude himself into thinking he can craft a marketable screenplay.