Original photo by Laura Stone/ Shutterstock

May Day has been a day of celebration in Europe for centuries, if not longer. The festival most likely arose out of ancient rites asking gods for fertile crops and healthy livestock. In medieval and Renaissance Europe, laborers welcomed spring with a day of drinking and dancing. Today, May Day is a public holiday in more than 180 countries around the world. These facts cover both its ancient origins and more modern symbolism honoring workers.

Young women bathing their faces in dew from the grass on May Day morning.
Credit: Topical Press Agency/ Hulton Archive via Getty Images

The Scots Believed May Day Dew Had Mystical Powers

The Druids, the religious leaders of the ancient Celts, made fires on the hilltops to honor the sun at dawn on May 1 in a festival called Beltane. They sprinkled themselves with May Day dew, considered a kind of holy water that would bring health and good fortune.

Scots, mainly young women, kept part of this tradition alive, ascending the nearest hill around 4 a.m. on May 1 to wash their faces in May Day dew. The desired outcome? A lovely complexion. As late as 1968, some 2,000 people climbed up Arthur’s Seat, an ancient volcano in Edinburgh, for the rite: “The summit of the hill was crowded with people old and young, huddled together trying to keep warm in the crisp, clear morning air,” by one account. In more recent years, however, the numbers who brave the cold for the sake of beauty have dwindled.

Children take part in the annual May Day Fete.
Credit: Bettmann via Getty Images

Maypole Ribbon Dancing Began in Victorian Times

You may have seen modern-day versions of old May Day traditions, with a May Queen crowned and maypole dances. Nowadays, you’ll likely see dancers each holding a ribbon that is interwoven around the pole.

But May Day dancing didn’t always include ribbons. Medieval Celts stripped a tree and wrapped it in flowers on Beltane. In the ensuing British tradition, villagers danced around a tall tree or a pole, still decorated with flowers. The first known maypole dance featuring people holding ribbons appeared in an 1836 play at the Royal Victoria Theatre in London. Afterward, villages picked up the idea and created their own variations.

Governor William Bradford interrupting the revelers at Merrymount.
Credit: Bettmann via Getty Images

May Day Celebrations Were Controversial in the American Colonies

For the devout Pilgrims who came to the New World, debaucherous May Day celebrations were forbidden — which is a large part of why the holiday has never been celebrated in the U.S. as it is in Europe. Yet a small group of traders came over around the same time as the Puritans to make money, not to escape religious persecution. They settled near Plymouth in a camp called Merry Mount. For May Day 1628, they set up an 80-foot-tall pine maypole crowned with deer antlers. One man, Thomas Morton, brewed a barrel of beer and invited Indigenous young women to the celebration.

Scandalized, the colony’s Puritan governor, William Bradford, declared that Morton had “revived and celebrated the feasts of the Roman goddess Flora,” linked to “the beastly practices of the mad Bacchanalians.” He placed Morton in the stocks — a device restraining one’s legs or feet — and sent him home to England. Nathaniel Hawthorne wrote a short story about the event, declaring that “May, or her mirthful spirit, dwelt all the year round at Merry Mount.”

A depiction of the protest that became known as Evil May Day.
Credit: Print Collector/ Hulton Archive via Getty Images

London Erupted in Riots on May Day

May Day hasn’t always been merry. In 1517, more than 1,000 angry Londoners rioted on what would be called “Evil May Day.” While the upper classes under King Henry VIII enjoyed luxuries like silks, spices, and oranges imported from abroad, poor Londoners felt that foreign workers were taking their jobs. When city and royal officials charged nearly 300 Londoners with treason in the aftermath of the riots, Queen Catherine of Aragon begged her husband on her knees to show mercy. Nearly all of the people charged were pardoned.

Illustration of the Haymarket Riot in Chicago.
Credit: Bettmann via Getty Images

May Day Commemorates the Haymarket Riot in Chicago

Nearly 400 years after “Evil May Day,” early May saw another worker protest that turned violent. It began with calls for a nationwide general strike (in support of an eight-hour workday), to occur on May 1, 1886. On May 3 in Chicago that year, police attacked and killed several picketing workers at a plant. At a protest meeting in Haymarket Square the following day, someone threw a bomb at the police, who opened fire; in the ensuing riot, seven police officers and four workers died. By August, eight men had been convicted of their supposed role in the bombing. Yet many considered them martyrs to the workers’ cause.

To commemorate this occasion, in 1889 an international group of Socialist organizations and unions declared May 1 a day to support workers. The date became especially important in the Soviet Union and the Eastern Bloc, where it was marked with major parades. May Day is still celebrated as International Workers’ Day, although in the U.S. workers and laborers are more likely to be honored on Labor Day.

Students observing Vappu outside in Finland.
Credit: Patrick BERTRAND/ Gamma-rapho via Getty Images

The Finns Do It Up

The Finns observe Vappu from April 30 to May 1. The holiday honors St. Walpurga, and is also a celebration of the working class and students. Adults pull out their old white hats from high school graduation, wear costumes, and drink champagne or nonalcoholic mead while eating doughnuts at picnics in the parks. In Helsinki, the fun begins at 6 p.m. on April 30, when students gather at the Market Square to wash and put a white cap on the head of an Art Nouveau statue of a mermaid.

Temma Ehrenfeld
Writer

Temma Ehrenfeld has written for a range of publications, from the Wall Street Journal and New York Times to science and literary magazines.

Original photo by Science History Images/ Alamy Stock Photo

On April 15, 1947, Jackie Robinson trotted out to first base for the Dodgers at Brooklyn’s Ebbets Field, thereby erasing the color line that had kept Black players out of the top level of professional baseball for 63 years. While his success at opening the door for the great Black, Latino, and Asian players who followed is well known — and commemorated every April 15 by Major League Baseball — the achievement tends to overshadow other areas of his remarkable life. Here are six lesser-known facts about this American icon.

Portrait of American baseball player Jackie Robinson as a young boy, sitting on a chair.
Credit: Hulton Archive via Getty Images

A Young Robinson Was Known for Getting Into Trouble

According to Arnold Rampersad’s Jackie Robinson: A Biography, the young Robinson ran with a group of troublemakers in his native Pasadena, California, known as the Pepper Street Gang, who shoplifted and got into fights. He was also once arrested for swimming in the city reservoir. Fortunately, the charismatic teen had enough guidance to avoid spoiling the opportunities his athletic gifts would bring. The responsible adults in his life included a Methodist minister, a local mechanic who organized events to keep kids off the streets, and Robinson’s mother, Mallie, who ran the home as a single mother with a firm but loving hand when not cleaning houses to support her five children.

Jackie Robinson leaps through the air at a college track meet in the Los Angeles Coliseum.
Credit: Bettmann via Getty Images

Jackie Robinson Was the First Student-Athlete to Letter in Four Sports at UCLA

After rewriting the record books at Pasadena Junior College, Robinson continued to dazzle spectators as UCLA’s first four-sport letterman. One of the top football players in the country, he helped the Bruins to an undefeated season in 1939, and twice led the nation in punt-return yardage. Robinson also won the NCAA long-jump title in 1940 as part of the track-and-field squad, and twice topped the Pacific Coast Conference Southern Division in scoring as an undersized basketball star. Surprisingly, baseball was easily his worst sport in college, as Robinson hit just .097 in his lone season for the UCLA baseball team.

Jackie Robinson testifying in a courtroom.
Credit: Bettmann via Getty Images

Jackie Robinson Was Court-Martialed After Refusing to Move to the Back of a Bus

About 11 years before Rosa Parks made history by refusing to budge from her seat, Second Lieutenant Robinson of Fort Hood, Texas, did the same while riding a military bus in July 1944. (Robinson had been drafted into the Army in April 1942.) The incensed driver summoned military police to settle the matter, and Robinson’s staunch insistence that he had done nothing wrong fueled additional charges of insubordination. Fortunately, the jury determined that he had acted appropriately throughout the incident, thanks in part to the shaky testimony of the prosecution’s witnesses, and Robinson was found not guilty of all charges the following month.

Filming the first scenes of Robinson’s picture, The Jackie Robinson Story.
Credit: Bettmann via Getty Images

Jackie Robinson Starred in a Movie About Himself (in the Middle of His Career)

Following his successful entry into the major leagues, Robinson sought to capitalize on his fame with a book, 1948’s My Own Story, and a version adapted for the big screen, starring himself. However, the top Hollywood studios were loath to accept a project centered on the triumphant tale of a Black man; one allegedly wanted to add a part in which Robinson learns his exciting style of play from a white coach. The movie rights eventually went to the financially unstable Eagle-Lion Films, and while the untrained star was understandably a little stiff in his non-baseball-playing scenes, The Jackie Robinson Story (1950) proved popular enough with audiences to earn back five times its original investment.

Jackie Robinson and Martin Luther King talking.
Credit: Bettmann via Getty Images

Jackie Robinson Remained Engaged in Civil Rights After Retiring From Baseball

Although Robinson left baseball behind to become a vice president with the Chock full o’Nuts Corporation in 1957, he wasn’t content to simply sit in an executive suite as the battle for equality raged on. Along with authoring a syndicated column that gave him a prominent voice on social and political matters, the ex-athlete became a board member of the NAACP and frequently appeared alongside Dr. Martin Luther King Jr. at fundraising events and demonstrations. Additionally, Robinson sought to help out Black entrepreneurs as a founder of the Freedom National Bank of Harlem in 1964, and he later launched the Jackie Robinson Construction Company with an eye toward building housing for low-income communities.

Robinson standing outside of the Baseball Hall of Fame in Cooperstown, New York.
Credit: Icon Sportswire via Getty Images

Robinson’s Hall of Fame Plaque Initially Left Out His Integration of MLB

When Robinson earned baseball’s highest individual honor with induction into the Hall of Fame in 1962, his plaque noted several batting and fielding statistics, but nothing about his role in breaking the color barrier. This apparently was at the behest of Robinson himself, who wanted to be judged by his qualifications as a ballplayer and not by any special racial designation. Nevertheless, this glaring omission was corrected in 2008, with the blessing of his wife and daughter, and his plaque inscription was amended to close with: “Displayed Tremendous Courage and Poise in 1947 When He Integrated the Modern Major Leagues in the Face of Intense Adversity.”

Tim Ott
Writer

Tim Ott has written for sites including Biography.com, History.com, and MLB.com, and is known to delude himself into thinking he can craft a marketable screenplay.

Original photo by nemke/ iStock

April is a time of change. With the vernal equinox in the recent rearview mirror in the Northern Hemisphere, nature is slowly stirring from its months-long slumber, preparing to soon be in full bloom. April also has outsized importance compared to other months: The ancient Romans tied the month to the goddess Venus because of its beautiful and life-affirming effects, and for thousands of years the month was seen as the true beginning of the year. Today, April is full of moments of mischief, reverence, and a budding excitement for the warmer times ahead. These six facts explore the history of the month, and why it’s sometimes considered one of the best times of the year.

The month name April written on isolated wooden cubes.
Credit: OleynykO/ Shutterstock

The Word April Comes From the Latin “Aperire,” Meaning “To Open”

When it comes to the names of months, April is a bit of an outlier. Other months are either intimately tied to Roman history and culture — whether named after Roman gods (January, March, June, etc.), rituals (February), or leaders (July and August) — or are related to Latin numbers (September to December). April, however, is simply derived from the Latin aperire, which means “to open.” This is likely a reference to the beginning of spring, when flowers “open” as the weather warms.

Although April’s name isn’t etymologically tied to Roman culture, April (or Aprilis, as the Romans called it) was a month dedicated to the goddess Venus, known as Aphrodite in the ancient Greek pantheon. On the first day of April, Romans celebrated a festival known as Veneralia in honor of the goddess of love, beauty, and fertility. This has led some scholars to wonder if the month’s name was actually Aphrilis in reference to the goddess.

Cute little bunny sleeping in a basket beside Easter eggs in the meadow.
Credit: Sasiistock/ iStock

The Anglo-Saxons Called It “Ēosturmōnaþ” — the Root of “Easter”

One of the most important holidays in April (and occasionally March) is the celebration of Easter, which marks the death and resurrection of Jesus. Much like Christmas, this holiday has pagan origins, and its name is derived from the Anglo-Saxon term for the month, “Ēosturmōnaþ.” That name literally meant “Ēostre’s month,” a reference to the West Germanic spring goddess of the same name. The only known historical text mentioning Ēostre comes from the Venerable Bede, a Christian monk who lived in the eighth century CE and who mentions the goddess (and the festivals dedicated in her name) in his work The Reckoning of Time. Because so little evidence of Ēostre exists, some wonder if the goddess was a complete invention of Bede’s, but whether she was real or not, Ēostre remains the namesake of April’s holiest days for Christians.

April 2022 calendar with pen on multicolored background.
Credit: Gam1983/ iStock

April Used To Be the Second Month of the Year

You know how September, October, November, and December mean the seventh, eighth, ninth, and tenth month, respectively? (If you didn’t, you do now.) There was a time when those names actually made sense, and that time was around the eighth century BCE in ancient Rome, when legend says the 10-month Roman calendar was created by Romulus — the mythical founder of Rome.

Romulus’ calendar began in the month of March (Romulus himself was said to be the son of the Roman god Mars), making April the second month of the year. But Romulus’ calendar had many shortcomings (chief among them being that it was about 60 days too short), so around the eighth century BCE, legendary Roman king Numa Pompilius decided to create two new months, known today as January and February. By at least 450 BCE, both had moved to the start of the year, and April has been the fourth month of the year ever since.

Wet poppy meadow just after an April shower.
Credit: PippW/ Shutterstock

The “April Showers” Saying Dates Back to 16th-Century England

The surprisingly resilient phrase “April showers bring May flowers” first appeared in English poet Thomas Tusser’s 1557 work A Hundreth Good Pointes of Husbandrie, which contained both poetry and practical advice. The book was widely read throughout England, and scholars believe it was possibly the most popular book of poetry during the Elizabethan era. In the book, Tusser writes, “Swéete April showers, Doo spring Maie flowers.” Of course, the validity of such a phrase depends very much on where you live. In the U.S., for example, April is only the fifth-wettest month on average, with June often being the wettest overall. Today, the phrase is read less as a literal claim about April’s rainfall and more as a statement on the value of patience and persistence.

Lyrids meteor shower over Austria.
Credit: Thomas Kronsteiner/ Getty Images News via Getty Images

The Oldest Recorded Meteor Shower Occurs in April

Occurring between April 16 and 25 (and peaking on April 22), the Lyrids are the oldest recorded meteor shower still visible in the night sky today. Chinese astronomers recorded a Lyrids meteor shower back in 687 BCE, and 10 to 20 meteors can be seen per hour during the shower’s peak. The Lyrids are named after the constellation Lyra, which is where the meteors seem to streak from in the night sky. However, these meteors don’t actually come from that region of space; they’re the result of the Earth crossing the debris-strewn path of comet C/1861 G1 Thatcher, which was named after its discoverer, A. E. Thatcher. Although observers in the Southern Hemisphere can glimpse a few meteors, the best meteoric displays are reserved for the Northern Hemisphere.

Man wearing sneakers with tied together laces, closeup.
Credit: New Africa/ Shutterstock

We’ll Likely Never Know the Origin of April Fools’ Day

One of the oddest annual traditions on the modern calendar falls on the first day of April, otherwise known as April Fools’ Day. Once a day reserved for harmless pranks pulled on friends and family, April Fools’ Day now reaches into the furthest depths of the internet, with multimillion-dollar brands and corporations getting in on the fun. Although the tradition is certainly an oddity, it’s stranger still that no one is exactly sure where April Fools’ Day comes from. Some historians think that when France moved to the Gregorian calendar in the 16th century, those who still celebrated the new year in April (having not gotten the memo, willfully or otherwise, about the calendar change) were labeled “April fools.” Others have tied the tradition to Hilaria, an ancient Roman festival held in late March, and still other theories abound. A more modern version of April Fools’ Day took root in 18th-century Britain before evolving into the mischief holiday we know today.

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.

Original photo by Gustavo Frazao/ Shutterstock

On April Fools’ Day, it’s hard to separate fact from fiction, as mischievous pranksters pull lighthearted gags around the world. On some occasions throughout history, April Fools’ pranks have even blossomed into ruses that fooled hundreds, if not thousands, of people. Whether it’s corporations, magazines, or broadcasting networks getting in on the fun, here are six famous and funny April Fools’ Day pranks.

Children from St Ann's Primary School have fun recreating the iconic Spaghetti-Tree hoax.
Credit: Handout/ Getty Images News via Getty Images

The Spaghetti Tree

On April Fools’ Day 1957, the BBC informed viewers in England that there was a purported “spaghetti farm” in Switzerland where pasta grew on trees. The network aired a fabricated film segment featuring Swiss women harvesting spaghetti from an orchard, narrated by war correspondent Richard Dimbleby — a veteran broadcaster known for his straightforward demeanor, which only added to the prank’s believability. In his narration, Dimbleby noted that the annual harvest was expected to be particularly bountiful thanks to the eradication of the “spaghetti weevil,” which was considered to be the plant’s main predator.

At the time, many Brits were unfamiliar with Italian food, and hundreds of viewers called the BBC asking how they could grow spaghetti-producing plants of their own. In response, phone operators instructed the gullible callers to “place a sprig of spaghetti in a tin of tomato sauce and hope for the best.” The prank proved to be so believable that it even fooled BBC staff members, including the network’s director-general, who consulted several independent sources to research the purported farm before concluding that it was all a ruse.

New York Mets in awe of fictional character Hayden Sidd Finch during spring training.
Credit: Lane Stewart/ Sports Illustrated via Getty Images

Sidd Finch, the Flamethrowing Baseball Prospect

In 1985, the sports world was abuzz about a newly discovered pitching prospect for the New York Mets, named Hayden “Sidd” Finch. Finch was the focus of a lengthy Sports Illustrated magazine profile written by George Plimpton, though there was a catch. The article ran on April 1, 1985, and it was entirely fabricated.

Plimpton’s article introduced the nation to a “pitcher, part yogi and part recluse,” nicknamed after Siddhartha Gautama, aka Buddha. Finch’s shocking scouting report claimed that Sidd was capable of throwing a baseball 168 miles per hour, and was also pretty good at playing the French horn. The article was made more believable with quotes from well-known Mets players and coaches, as well as a detailed series of photos showing Finch in full uniform playing his French horn by the seaside and wearing only one shoe on the pitcher’s mound. In reality, photographer Lane Stewart had recruited a friend, Joe Berton, to stand in as Finch for the piece. Berton attended Mets’ Spring Training posing as Finch, interacting with other players and signing autographs for fans who were none the wiser.

Former Pres. Richard Nixon sporting black tie, attending White House dinner.
Credit: Diana Walker/ The Chronicle Collection via Getty Images

Richard Nixon’s 1992 Presidential Campaign

If you were listening to NPR’s Talk of the Nation program on April 1, 1992, then you likely got the shock of a lifetime. During that episode, disgraced former U.S. President Richard Nixon — who had resigned the presidency in 1974 — announced that he would seek the office in the 1992 election. Listeners were understandably confused, and called in to profess their opposition to the unexpected Nixon campaign. Later, though, they came to realize it was all a prank.

The ruse was concocted by NPR’s John Hockenberry, who had teamed up with political impersonator Rich Little. Little took to the airwaves doing his best Richard Nixon voice to announce his new campaign slogan: “I never did anything wrong, and I won’t do it again.” Given that Nixon had already been elected President twice, he was ineligible to run again, but that didn’t stop mayhem from ensuing before NPR copped to the prank after a few minutes of fun.

Close-up of the Liberty Bell, shown from the cracked side, on its display in Philadelphia, PA.
Credit: Bettmann via Getty Images

The Taco Liberty Bell

On April 1, 1996, the fast-food chain Taco Bell announced that they had acquired one of America’s most historic relics, the Liberty Bell. Taco Bell went all out by purchasing full-page advertisements in seven major newspapers, including the Philadelphia Inquirer. In those ads, the fast-food chain announced that they had not only bought the Liberty Bell but officially renamed it “The Taco Liberty Bell.” Furthermore, they claimed that they had cut a deal with the government through which the purchase of the landmark would help the country pay off the national debt.

The prank resulted in furious Americans calling the National Park Service to express their outrage. Despite public backlash, White House press secretary Mike McCurry joined in on the fun, jokingly announcing, “Ford Motor Company is joining today in an effort to refurbish the Lincoln Memorial. … It will be the Lincoln Mercury Memorial.” Joking aside, the gag was no small financial commitment for Taco Bell, as the advertisements cost $300,000 — though the prank ultimately generated an estimated $25 million in free publicity, and sales spiked by $600,000 the next day.

Game show hosts Alex Trebek and Pat Sajak pose on the set of “Jeopardy!”
Credit: Amanda Edwards/ Getty Images Entertainment via Getty Images

Alex Trebek and Pat Sajak Switch Places

In 1997, game show viewers were understandably thrown off when Alex Trebek, the host of Jeopardy!, switched places with Pat Sajak, the host of Wheel of Fortune. On that evening’s Jeopardy! broadcast, Sajak emerged from behind the curtain to host a traditional episode of the show. Despite the Jeopardy! broadcast’s relative normalcy, it was on Wheel of Fortune where things got particularly wacky.

After Trebek showed up to the mild confusion of the Wheel of Fortune audience, he welcomed Lesly Sajak, Pat’s wife, to assist with turning the letters. Furthermore, there were no normal contestants that night. Instead, Pat Sajak and his usual co-host Vanna White played for charity. Sajak and White were tasked with solving humorous phrases such as “PAT I’D LIKE TO SOLVE THE PUZZLE” and really long words such as “SUPERCALIFRAGILISTICEXPIALIDOCIOUS.” In 2022, two more television staples followed in their footsteps, as Jimmy Fallon and Jimmy Kimmel switched up hosting duties for their respective late-night comedy shows on April Fools’ Day.

A logo sign is posted in front of a building on the Google campus.
Credit: Justin Sullivan/ Getty Images News via Getty Images

Google Gulp

Many of us have a thirst for knowledge, but Google once bottled that desire inside a seemingly tangible yet entirely fictitious product. On April Fools’ Day 2005, the tech giant unveiled the beta version of Google Gulp, a soft drink jam-packed with Google’s “Auto-Drink” technology. Google claimed that the drinks were meant to increase each drinker’s cognitive ability, because the beverage had “a DNA scanner embedded in the lip of [the] bottle reading all 3 gigabytes of your base pair genetic data in a fraction of a second… to achieve maximum optimization of your soon-to-be-grateful cerebral cortex.” However, Google made the beverage hard to come by, stating that the only way to get a new bottle was to turn in an already used Gulp Cap — none of which existed, of course.

Google Gulp was just one in a long line of memorable pranks from the company. In 2013, they “unveiled” Google Nose, allowing users to smell what they were searching for. In 2018, Google hid the character Waldo on Google Maps, and the following year they added the mobile game “Snake” to that very same service. Funnily enough, one of Google’s most beloved products, Gmail, was launched on April 1, 2004. The email service seemed too good to be true at first and led many people to believe it was a hoax, before users ultimately realized it was a very real product.

Bennett Kleinman
Staff Writer

Bennett Kleinman is a New York City-based staff writer for Optimism Media, and previously contributed to television programs such as "Late Show With David Letterman" and "Impractical Jokers." Bennett is also a devoted New York Yankees and New Jersey Devils fan, and thinks plain seltzer is the best drink ever invented.

Original photo by Alberto Masnovo/ Shutterstock

As mid-March approaches, you’ll no doubt hear the oft-repeated saying “Beware the ides of March.” It’s a strangely archaic phrase that doesn’t make much sense to modern ears without some important historical context, as well as the ins and outs of ancient moon-based calendars — what are “ides,” anyway? Here are six amazing facts about this famous phrase, and its relation to arguably one of the most important moments in Western history.

Soothsayer warning Julius Caesar of the Ides of March.
Credit: UniversalImagesGroup via Getty Images

The Phrase Comes From William Shakespeare

In Act 1, Scene 2 of William Shakespeare’s Julius Caesar, Roman politician (and future assassin) Marcus Junius Brutus and the play’s eponymous character are approached through a crowd by a soothsayer who has a warning — “Beware the ides of March.” The two Romans dismiss the fortuneteller as a “dreamer” and go about their business as usual. Of course, the warning proved deadly accurate; for the Romans, the “ides” was the middle of the month, and Julius Caesar was famously assassinated on March 15, 44 BCE.

Roman historians say that in reality (not just in Shakespeare’s fictionalized version), the soothsayer’s name was Spurinna. He was a member of the Etruscans, an ancient people often associated with divination, and served as a haruspex — someone who inspects the entrails of sacrificed animals for clues about the future. However, there’s no record of Spurinna pinpointing the ides of March specifically; instead, he warned Caesar to be wary of the following month in general, a period that would end on March 15. Scholars believe this was likely just a calculated guess, as Roman politicians were already turning against Caesar, who had been named dictator for life, and the famed military leader was leaving the capital for another military campaign on March 18. If Caesar was going to be assassinated, it would likely be in the month of March.

Ancient calendar in the National Roman Museum.
Credit: Lev Tsimbler/ Alamy Stock Photo

The “Ides” Were Part of Rome’s Archaic Lunar Calendar

Although the phrase “the ides of March” carries with it a sinister connotation because of the bloody business done on that day two millennia ago, the ides — along with the nones and kalends — are simply ancient markers of the moon’s phases that were part of Rome’s lunar calendar. “Kalends” referred to the new moon (or first of the month), “ides” meant the middle of the month (the 13th in some months and the 15th in others), and “nones” referred to the quarter moon. For a time, the ides of March was actually the beginning of the new year in Rome.

A statue of the famous Roman ruler Julius Caesar.
Credit: Bettmann via Getty Images

Caesar Himself Got Rid of the Lunar Calendar Entirely

Although the ides of March is closely associated with Julius Caesar, the famous Roman leader was directly responsible for tossing out the old, lunar-based calendar entirely. In 45 BCE, Caesar — after consulting top mathematicians and astronomers — instituted the solar-based Julian calendar, a timekeeping system remarkably similar to the calendar we use today.

To implement the new system, Caesar created what has since become known as “the year of confusion,” in which the year 46 BCE lasted for 445 days so the new Julian calendar could begin on January 1. One scholar even argues that this drastic change could’ve been seen by conspiratorial senators as an attack on Roman tradition, and the assassins might’ve purposefully selected the “ides of March” as a symbolic gesture against Caesar and his reforms.

Reenacting the assassination of Julius Caesar (Ides of March) in Rome, Italy.
Credit: Anadolu Agency via Getty Images

Every Year Romans Reenact Caesar’s Assassination on March 15

Every year (barring worldwide pandemics), Romans reenact the murderous drama that unfolded near the Curia of Pompey two millennia ago. (A curia is a structure where Roman senate members would meet.) However, it wasn’t until 2015 that members of the Roman Historical Group got the chance to recreate Caesar’s final moments on the exact spot where it happened, after finally getting access to the ruins of the curia itself.

The reenactment generally unfolds in three parts — first the senators’ accusations, then Caesar’s assassination itself, and finally the speeches of Brutus and Mark Antony. In an interview with NBC News, the Caesar impersonator said this annual bit of theater is about honoring the ancient leader, because “Rome wouldn’t have been as great without him.”

An engraving of a bust of Julius Caesar.
Credit: Bettmann via Getty Images

Caesar Was Deified as a Roman God

Although the Roman pantheon was largely borrowed from ancient Greece, Rome added a few deified originals of its own. One of the most important was the two-faced Janus, the god of doorways and transitions and the namesake of the month of January. But Rome also deified many of its most important leaders, and named months after some of them. After Caesar’s death on the ides of March, a Roman cult known as divus Julius pushed for Caesar’s official divinity. Caesar’s adopted heir, Octavian (known to history as Augustus), later became Rome’s first emperor and similarly received the divinity treatment. The effects of this Roman imperial cult can be seen in today’s calendar, as July and August are named for the two ancient rulers.

The death of Julius Caesar in the Roman Senate.
Credit: Leemage/ Corbis Historical via Getty Images

The Location of Caesar’s Murder Is Now a Cat Sanctuary

The Curia of Pompey used to be home to the hustle and bustle of toga-wearing senators going about the business of empire, but it’s now the domain of cats. First excavated in 1929, under Benito Mussolini’s rule, the Largo di Torre Argentina houses the remains of the curia where Caesar met his end, as well as the ruins of several temples. However, today the Colonia Felina di Torre Argentina takes care of more than 100 cats that prowl the ancient grounds. Although visitors can glimpse the ruins from street level some 20 feet above ground, only cats are usually allowed to slink among the grounds where the ides of March earned its infamous reputation.

Darren Orf
Writer

Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.

Original photo by Selene Da Silva/ Shutterstock

Humans have been building structures pretty much since the evolution of our species. Every corner of the planet has seen houses, temples, mounds, palaces, or pyramids in a variety of shapes and sizes, but few of them have withstood the elements over the centuries. These seven human-made structures are currently considered to be the earliest on each continent. But with ongoing archaeological excavation and advances in dating techniques, the list will likely change in the near future.

Borchgrevink’s hut, among the Adélie penguin colony.
Credit: Farjana.rahman/ Shutterstock

Antarctica — Huts on Cape Adare (1899 CE)

Two tiny huts were imported from Norway to Antarctica near the turn of the 20th century by polar explorer Carsten Borchgrevink during his Southern Cross expedition. The wood-framed huts were anchored to the ground with cables so they didn’t topple over in the Antarctic wind. Borchgrevink never dismantled the huts, which were reused by other explorers over the years. In 1990, the huts were repaired and documented, and the earliest structures on the frozen continent still stand.

The inland fort on West Wallabi island.
Credit: stewart allen/ Alamy Stock Photo

Australia — Wiebbe Hayes Stone Fort (1629 CE)

Inhabited for at least 40,000 years, Australia boasts numerous archaeological sites with evidence of cave dwellings, fish traps, burials, and more. But the oldest extant building created by settlers is a fort built in 1629 by the survivors of the Batavia shipwreck. After a mutiny and a massacre, some passengers were marooned on West Wallabi Island. Fashioning defensive walls and shelter out of local stone, they overpowered the mutineers. After the traitors were executed, the remaining Batavia passengers were rescued, and the fort was abandoned.

Step pyramid of Djoser.
Credit: Liya_Blumesser/ Shutterstock

Africa — Pyramid of Djoser (c. 2650 BCE)

This continent is the cradle of humanity, where numerous human ancestor species arose. While new research suggests our ancestors might have created wood structures there as far back as 500,000 years ago, the best-preserved example of African architecture is much more recent. A six-tier, four-sided pyramid was created as a tomb for Pharaoh Djoser and his family, becoming the inspiration for all subsequent royal Egyptian burials. The pyramid remains today, in the cemetery complex at Saqqara, which was used for over three millennia.

A huge mound in Louisiana, United States.
Credit: BHammond/ Alamy Stock Photo

North America — Watson Brake (c. 3500 BCE)

Although archaeologists are still debating what constitutes the oldest structure in North America, all agree that Indigenous peoples’ mound-building tradition dates back thousands of years. Watson Brake in northern Louisiana is a series of 11 mounds connected by ridges, likely used for several centuries as a kind of base camp for hunter-gatherers in the warmer months of the year. Attempts to date mounds have only recently become possible, though, so many of these structures may be even older than Watson Brake, as suggested by controversial new research claiming a 9000 BCE date for mounds on the campus of Louisiana State University.

Mnajdra Neolithic Temple in Malta.
Credit: Cavan-Images/ Shutterstock

Europe — Megalithic Temples of Malta (c. 3600 BCE)

The oldest extant structures in Europe are a dozen enormous stone temples discovered on the tiny island nation of Malta in the middle of the Mediterranean Sea. Ġgantija, the earliest of the temples, was built by a giantess, according to Maltese folklore, and there is some evidence that animal sacrifices were made there to a fertility deity. Little else is known about this prehistoric temple-building culture, but thanks to its central location, Malta has been a key naval base throughout recorded history.

A mound with a natural lagoon at Huaca Prieta.
Credit: Library Book Collection/ Alamy Stock Photo

South America — Huaca Prieta (c. 5700 BCE)

Long before pyramids were created in Egypt, people were erecting them on the north coast of Peru. In 2017, archaeologists dated the early layers of the site of Huaca Prieta to about 14,500 years ago. The 100-foot-tall ceremonial structure appears to have been created there 7,800 years ago, suggesting people had begun settling down there by then. Other contenders for earliest architecture on the continent include the Peruvian sites Sechin Bajo, whose plaza and pyramids go back to about 3500 BCE, and the Citadel of Caral, a large urban center inhabited around 3000 BCE.

Ancient site of Göbekli Tepe in Şanlıurfa, Turkey.
Credit: acsen/ Shutterstock

Asia — Göbekli Tepe (c. 9500 BCE)

Often called the world’s oldest temple, the ancient site of Göbekli Tepe was discovered in southeastern Turkey in 1994. Linked to the dawn of agriculture, as humans moved from foraging to farming, the site includes circular structures with massive stone pillars, many of which were decorated with human and animal figures. There are also domestic structures, quarries, and cisterns or wells at the site. Although it is unclear if the prehistoric stone structures were actually temples, it is clear that Göbekli Tepe is the oldest evidence yet of permanent human settlement.

Kristina Killgrove
Writer

Kristina Killgrove is a science communicator with a Ph.D. in anthropology. She has written for numerous media sites, including Live Science, Mental Floss, and Forbes.

Original photo by 5 second Studio/ Shutterstock

Who was St. Valentine? Although we don’t know for sure, we do know he wasn’t a patron of romantic love. St. Valentine’s Day was originally an occasion honoring one of several Christian martyrs. It took centuries before the day was linked to love — and even longer before anyone dreamed up a heart-shaped box of chocolate.

Saint Valentine (approx. 176 - Rome, 14 Feb. 273) Bishop of Terni and Martyr.
Credit: Fototeca Storica Nazionale/ Hulton Archive via Getty Images

Historians Don’t Know for Sure Who St. Valentine Was

The history we have comes from the Bollandists, a group of Belgian Jesuit scholars who began publishing an encyclopedic text called the Acta Sanctorum in 1643. They searched archives to document the lives of past saints, organizing their research as part of a calendar that associates each day with a saint. For February 14, they listed several martyrs called “Valentini.”

The Bollandists uncovered tales of two Christians who were decapitated on February 14 during the reign of the Roman Emperor Claudius Gothicus in the third century. One legend concerns a Roman priest named Valentinus, and the other a bishop of Terni in the province of Umbria, Italy. As the Bollandists note, it’s likely the two tales actually refer to the same person. There’s also a legend that a St. Valentine performed illegal marriages for the emperor’s soldiers, but no real evidence to back it up.

A skull enshrined in a glass reliquary and attributed to St. Valentine.
Credit: Anadolu Agency via Getty Images

Churches and Monasteries Say They Have Bits of St. Valentine’s Body

An entire skull purported to be St. Valentine’s is still on display in one church, Santa Maria in Cosmedin in Rome. Churches in Madrid, Dublin, Prague, Malta, Birmingham, Glasgow, and on the Greek isle of Lesbos have also claimed to have a bit of the saint’s skull, or other relics said to come from his body (including his heart).

Relics of martyrs allow Christians to experience a sense that these great souls are still in their community, according to Lisa Bitel, a historian of medieval Europe and religious studies at the University of Southern California. But these bones did not (and don’t) have a special significance for lovers.

The feast of Lupercalia.
Credit: DEA / ICAS94/ De Agostini via Getty Images

Rome Had a Raucous Fertility Festival in Mid-February

The pagan festival of Lupercalia began as a rural ritual of sacrificing goats and dogs. It turned into an urban carnival in Rome, with young men running through the streets whipping onlookers with strips of new goat leather. Romans believed this whipping brought luck to pregnant women.

Around 496 CE, a new pope, Gelasius I, abolished the festival, which had lasted despite a long series of laws over the previous 150 years banning pagan religious rites. He also declared February 14 to be a celebration of St. Valentine. However, Bitel argues that “there is no evidence that the pope purposely replaced Lupercalia with the more sedate cult of the martyred St. Valentine.”

A scene from The Canterbury Tales by Geoffrey Chaucer.
Credit: Fine Art Photographic/ Corbis Historical via Getty Images

Geoffrey Chaucer Linked Valentine’s Day to Romance

So how did Valentine’s Day become associated with romance? You can thank (or blame) famed English poet Geoffrey Chaucer, whose 14th-century poem “Parliament of Fowls” describes a group of birds gathering on “seynt valentynes day” to select their mates.

Within a few decades, fashionable people had begun sending love notes and, soon, original poems to their sweethearts on the saint’s day. By 1415, a French duke imprisoned in the Tower of London called his wife his sweet Valentine. Nearly 200 years after that, Shakespeare had Ophelia sing a song based on the folkloric idea that the first girl a man saw on Valentine’s Day would be his true love. By the sentimental Victorian era, Valentine’s Day had become a time for sweethearts to shower each other with cards and gifts, decorated with hearts, rosebuds, and baby Cupids symbolizing romance.

Visual drawing of Cupid, circa 1807.
Credit: Heritage Images/ Hulton Fine Art Collection via Getty Images

Cupid Was Originally a Hunk

The chubby winged child we often see today first appeared in art and poetry as a handsome youth — the Greek god of love, Eros. Sometimes considered the son of the god of war, Eros caused lots of trouble.

The Romans were the ones who reimagined him as Cupid, the son of Venus (the goddess of love and beauty, known in ancient Greece as Aphrodite). The enchanted gold-tipped arrows of his bow, bringing love and lust, were said to pierce humans and gods alike. Picturing him as a child was a way of “limiting the power that love was thought to have over us,” says Catherine Connors, a classics professor at the University of Washington.

Chocolate hadn’t yet entered the picture, though.

Valentine’s Day chocolate gift box and rose.
Credit: Afro Newspaper/Gado/ Archive Photos via Getty Images

Valentine’s Chocolates Arrived in the 19th Century

Chocolate did have an early link to marriage. The ancient Maya, who first brewed cacao beans, used their unsweetened drink in marriage ceremonies. Sweet chocolates for Valentine’s Day began just as you’d guess — as a way to sell chocolate.

Candymaker Richard Cadbury saw a marketing opportunity in the mid-19th century. His company had developed a manufacturing process that made drinking chocolate tastier. It produced leftover cocoa butter, which he used for bonbons, then called “eating chocolate.” Cadbury himself designed gorgeous boxes to put them in. Later, he put the symbolic Cupids and roses on a heart-shaped box, and a tradition was born.  

In the United States, Russell Stover and Whitman’s brought Valentine’s chocolates within reach for millions. Russell Stover bought out Whitman’s, and they now sell the “Secret Lace Heart,” chocolates that come in a heart-shaped box covered with red satin and black lace reminiscent of lingerie.

Temma Ehrenfeld
Writer

Temma Ehrenfeld has written for a range of publications, from the Wall Street Journal and New York Times to science and literary magazines.

Original photo by Heritage Image Partnership Ltd/ Alamy Stock Photo

There’s nothing like experiencing the sun, sand, and surf of a really good beach vacation — and if there’s some history to explore along the way, even better. Some of the most iconic beaches in the United States have weird origins, forgotten wonders, quirky curiosities, and stories that changed the shape of the country.

Which seaside locale once had an elephant-shaped red-light district? What formerly domesticated animals run wild in Key West, Florida? How did Atlantic City get its famous boardwalk? These facts might make your next beach trip just a little more fascinating.

A view of the giant elephant on Coney Island.
Credit: Alexander Alland, Jr. / Corbis Historical via Getty Images

Coney Island Once Had a 12-Story Elephant-Shaped Building

Today, Coney Island’s most recognizable features include the Wonder Wheel and the defunct Parachute Jump, but for several years, its most unmistakable structure was a 12-story wooden elephant dubbed Elephantine Colossus. The structure reached its full height (between 122 and 175 feet tall, depending on the source) in 1884, a couple of years before the Statue of Liberty went up, making it one of the first things to greet visitors approaching the New York shoreline. An outsized howdah — a kind of saddle with a canopy, frequently seen on elephants — served as its observation deck, and visitors could peer out of its glass eyes.

The building had other functions, too. Its front legs, each 18 feet in diameter, housed a tobacco shop and a diorama, while its back legs held spiral staircases to the upper floors. The Elephantine Colossus had originally been built as a hotel, with 31 rooms filling up its body, including a grand hall, a museum, and a gallery, although it was more useful as a concert hall and general amusement destination. Eventually, the elephant developed a seedy reputation, with rent-by-the-hour rooms in what one historian called a “tin, elephant-shaped red-light district.”

Sadly, the elephant’s reign over Coney Island was short-lived, and it burned down spectacularly in September 1896, although with little damage to the surrounding businesses. According to a Brooklyn Daily Eagle article at the time, witnesses first noticed the blaze through the Colossus’ eyes. “[In] twenty minutes, the huge beast of wood and tin collapsed,” read the report, “first falling to its knees with what sounded almost like a groan of agony, and then rolling over into a shapeless mass, where it smouldered and burned until it was finally drowned into submission by the fire department.”

A rooster in a shady alley of travel destination, Key West, Florida.
Credit: Boogich/ iStock

Key West Is Full of Feral Chickens

A number of famous folks have lived on the distant Florida island of Key West, including the writers Ernest Hemingway, Shel Silverstein, and Tennessee Williams, musician Jimmy Buffett, and President Harry S. Truman. But locally, its most famous residents might be the feral chickens, which make up a significant part of everyday life on the key.

Key West, which is close to Cuba, saw a huge influx of residents from Cuba and the Caribbean in the 1800s. They brought along their Cubalaya chickens, which are equally useful for meat, eggs, and fighting. Even more chickens came with displaced families during the Cuban Revolution in the 1950s.

Key West banned cockfighting in 1970, around the same time meat and eggs were becoming more widely available in grocery stores. Cubalaya chickens are excellent foragers, and as keeping them in coops became less of a necessity, a feral population grew around the island. The chickens are now a beloved staple of local life.

The local government has tried to address the chicken population a couple of times; once, they hired a chicken wrangler to capture the animals and transport them to free-range farms. After suspicions grew that the chickens were actually being killed at the farms, that plan was scrapped, and residents threw a four-day ChickenFest to celebrate their neighbors. Now, the island’s wildlife center sees to the welfare of the chickens.

A vintage postcard of the Atlantic City boardwalk.
Credit: Nextrecord Archives/ Archive Photos via Getty Images

The First American Boardwalk Was in Atlantic City

The boardwalk is an American beach staple, inspiring popular songs and gracing iconic beaches on the Pacific and Atlantic alike. But the beach boardwalk as we know it started in Atlantic City, New Jersey, which erected its first boardwalk in 1870.

The town got an influx of tourists thanks to its beautiful beaches and a railway line, and a hotelier and a railroad conductor had the idea to build a wooden walkway to keep sand out of both the hotels and railcars. The first boardwalk was about a mile long, 8 feet wide, and just a foot off the ground; it was designed to be packed up at the end of the season and put back out once the tourists came back. As Atlantic City grew to be more than a summer destination, the boardwalk grew bigger and more permanent. A second boardwalk, built 10 years later, was 14 feet wide. By 1890, the boardwalk was a full 24 feet wide with railings to keep pedestrians from falling to the beach below, and by 1900, Boardwalk was considered an official city street.

Today, the Atlantic City Boardwalk is more than 4 miles long and is part of 32 miles of boardwalk in New Jersey alone.

Haystack Rock on Cannon Beach in Oregon.
Credit: Barbara Alper/ Archive Photos via Getty Images

Cannon Beach Was Named for a Disappearing Cannon

Cannon Beach, Oregon, is one of the Pacific Northwest’s most recognizable beaches thanks to its starring role in several films, most notably The Goonies (1985). It’s also located near one of the most dangerous stretches of water: the Columbia River Bar, where the massive river meets the Pacific Ocean. Large ships need to hire a bar pilot to guide them through safely. In the 1840s, the captain of a ship sent by President James K. Polk got impatient and decided to try it on his own. The ship wrecked, and people started picking up the goodies that washed ashore.

A local mail carrier found one of the ship’s cannons, pulled it up, and left it on the beach so he could come back to it — only to find the tide had changed by the time he came back, taking the cannon with it. A few years later, someone else found the cannon, but before they could pull it up from the beach (it weighed about a ton), the cannon disappeared again. This happened several times over the next 50 years or so.

At the time, Cannon Beach was called Ecola, which means “whale” in Chinook, but it kept getting confused with another Oregon town called Eola. In 1922, the beach was rechristened with its current name.

A view of part of Venice's Canal Historic District.
Credit: ferrantraite/ iStock

Venice Beach Was Developed With Miles of Canals

Venice Beach has its own rich history and identity today, but at the turn of the 20th century, it was a resort modeled after Venice, Italy — hence the name. “Venice of America,” developed by tobacco millionaire Abbot Kinney, originally had seven canals dredged from the marshlands, wrapped around four islands and meeting in a saltwater lagoon at one end. (Six more canals popped up to the south, in a development unrelated to the resort.) Like its namesake, this Venice had gondola rides, and, since the development was not designed for automobiles, homeowners along the canals navigated with their own watercraft.

Eventually, visitors expected to be able to reach Venice by car, and after a lengthy legal battle, the city of Los Angeles filled in all the original canals and turned the former lagoon into a traffic circle. Six southern canals, which encircle three rectangular islands, were eventually preserved as a historic district in 1982, and still survive today. They’re not always easy to spot from the surface, but they’re easy to see on Google Maps.

Sarah Anne Lloyd
Writer

Sarah Anne Lloyd is a freelance writer whose work covers a bit of everything, including politics, design, the environment, and yoga. Her work has appeared in Curbed, the Seattle Times, the Stranger, the Verge, and others.

Original photo by Masheter Movie Archive/ Alamy Stock Photo

While film historians differ on the exact years of Hollywood’s golden age (it may have stretched from the late 1920s to either the 1950s or ’60s), what is clear is that these years weren’t always so golden for Black actors, who often struggled to earn opportunities and recognition. But despite the tough environment of those times, some gifted artists still made a name for themselves with standout performances as singers, dancers, comedians, and more. Here’s an assortment of fascinating facts about nine Black stars from yesteryear who faced serious obstacles in their paths yet lived lives worthy of the Hollywood spotlight.

Hattie McDaniel with her Academy Award.
Credit: Bettmann via Getty Images

Hattie McDaniel Accepted Her Historic Oscar in a Segregated Nightclub

Two months after she was forced to miss the December 1939 Atlanta premiere of Gone With the Wind (no Black actors were allowed to enter the segregated venue), Hattie McDaniel became the first Black actor to claim an Academy Award with her acceptance of the Best Supporting Actress prize. But even that historic moment nearly failed to come to fruition, as producer David O. Selznick reportedly had to intervene to get the actress accepted into the segregated Cocoanut Grove nightclub in Los Angeles, where the Oscars were held. These two events perfectly encapsulated McDaniel’s career — she made the best of the maid and mammy roles that came her way, but was constantly reminded of her second-class standing in a pre-civil rights America. Ignoring her critics in the Black press, McDaniel later made more history in 1947 by taking over the lead role of The Beulah Show to become the first Black star of a radio program.

Publicity still of Stepin Fetchit.
Credit: John Kisch Archive/ Moviepix via Getty Images

Lincoln Perry, aka Stepin Fetchit, Was the First Black Actor to Become a Millionaire

With his appearance in the 1927 silent feature In Old Kentucky, Lincoln Perry delivered a performance as a comically sluggish character who was too clueless to handle even the simplest tasks. Later adopting the stage name Stepin Fetchit, Perry continued performing his “laziest man in the world” bit to great fanfare as Hollywood transitioned to talkies, his success making him the first Black actor to earn more than $1 million from his craft. Both the stardom and fortune were gone by World War II, and the Stepin Fetchit schtick today can seem an all-too-painful reminder of the degrading subservience forced on Black folks from an earlier time. Still, Perry has his defenders, who argue that his signature character was more of a trickster than a doormat, and a lengthy list of credits underscores that the man behind the inertia was a legitimate movie star.

Actress and singer Etta Moten Barnett.
Credit: Afro Newspaper/Gado/ Archive Photos via Getty Images

Etta Moten Barnett Was One of the First Black Artists to Perform at the White House

While Etta Moten Barnett’s windowsill performance of “My Forgotten Man” in Gold Diggers of 1933 spanned just 80 seconds, it was an eye-opening moment from an industry that rarely provided such a dignified platform for its Black contributors. It also earned her a formal invite to sing at President Franklin Roosevelt’s birthday party in 1934, an event often erroneously reported as the first time a Black artist had been summoned to perform at the White House (though it may well have been the first such invitation since the 19th century). While her career in Hollywood was relatively brief, Barnett later shone on Broadway in what became her signature role in Porgy and Bess, before spending her later years as an unofficial ambassador to Africa and celebrated philanthropist.

Dancers Harold and Fayard Nicholas on the Eiffel Tower.
Credit: Bettmann via Getty Images

The Nicholas Brothers Had No Formal Dance Training

If you’re unfamiliar with the acrobatic theatrics of the Nicholas Brothers, take a few minutes to watch their famous number from Stormy Weather. Amazingly, this dazzling duo had no formal training as dancers. Fayard, the older of the two, studied the techniques of star performers during a childhood spent following his musician parents on the vaudeville circuit, and he later taught the younger Harold how to dance and sing. While the brothers’ full array of talents never received a proper showcase on the big screen, their peerless moves ensured their visibility in high-profile vehicles for stars like Gene Kelly, and led to fulfilling careers that stretched well beyond Hollywood’s golden age.

Photo of Ethel Waters.
Credit: GAB Archive/ Redferns via Getty Images

Ethel Waters Was the First Black Performer to Star in a TV Show

In June 1939, executives at the National Broadcasting Company decided to test the public appetite for the fledgling medium of television with a variety special. The result, The Ethel Waters Show, made its 42-year-old headliner the first Black performer to star in a TV program. Waters, who had previously helped integrate Broadway in 1933 with Irving Berlin’s As Thousands Cheer, went on to become the second Black woman to earn an Academy Award nomination following her performance in 1949’s Pinky. She then briefly starred in the TV adaptation of Beulah, before her guest role in a 1961 episode of Route 66 made her the first Black woman to receive a Primetime Emmy nomination.

Photo of American actor and singer Paul Robeson.
Credit: GAB Archive/ Redferns via Getty Images

Paul Robeson Sang in 25 Languages

There were few things Paul Robeson couldn’t do in the public sphere: a multisport star at Rutgers University and pioneering member of the National Football League, he eventually became an in-demand leading man in stage productions of The Emperor Jones, Show Boat, and Othello, along the way bringing his drawing power to Hollywood. Topping his list of talents may well have been his powerful baritone singing voice and linguistic capabilities, which enabled him to deliver a vast repertoire of songs in as many as 25 languages to audiences around the world. However, Robeson’s global voice was silenced when his passport was revoked in 1950 because of his barely disguised communist sympathies, and the aging performer never again reached the commanding heights of his younger years.

American blues singer and actress Lena Horne sings on a stage.
Credit: Hulton Archive via Getty Images

Lena Horne Was the First Black Actress to Sign a Long-Term Contract With a Major Hollywood Studio

Two years after McDaniel’s historic walk from the back of the Cocoanut Grove, a signal that Hollywood’s stiff codes of segregation were loosening came when Lena Horne became the first Black actress to sign a long-term contract with a major studio (MGM). Although she refused to play the stereotypical domestics and prostitutes, Horne still found most of her roles lacking. Except for some performances among all-Black casts in films like 1943’s Cabin in the Sky and Stormy Weather, the actress generally landed isolated parts that could easily be cut from prints screened in the Jim Crow South. Adding to her discontent, her outspoken activism led to her being blacklisted from film and television for much of the 1950s. But unlike her friend Robeson, Horne survived the lean years by keeping her recording career alive, and she returned to the public eye in the 1960s to chart her own distinct course as a singer, actress, and activist.

Bill 'Bojangles' Robinson in the musical film 'Stormy Weather'.
Credit: Underwood Archives/ Archive Photos via Getty Images

Bill “Bojangles” Robinson’s Final Film Was Loosely Based on His Life Story

Another sign of progress came when venerable song-and-dance star Bill “Bojangles” Robinson headlined Stormy Weather, a major studio production based on his life. Granted, the story was short on the specifics of a performing career that began in the 19th century, made him a vaudeville star by World War I, and thrust him into the Hollywood spotlight as Shirley Temple’s sidekick by the mid-1930s. Regardless, the feature provided a showcase for his still-formidable dancing abilities, as well as the talents of other Black performers such as Horne, Cab Calloway, Fats Waller, and the Nicholas Brothers. It was a fitting final film for Robinson, a beloved entertainer who reportedly drew half a million visitors to witness his funeral parade after his death in 1949.

Portrait of Actor Sidney Poitier.
Credit: Bettmann via Getty Images

Sidney Poitier Was Kicked Out of His First Audition Because of a Heavy Bahamian Accent

Born in Miami but raised in the Bahamas, Sidney Poitier was sent packing from his first audition for New York City’s American Negro Theater (ANT) in 1946 because of his inexperience and heavy accent. Fearful of being stuck in his dishwashing job, the teenager saved up to buy a radio and spent his free time mimicking the voices he heard in news broadcasts and advertisements. Poitier eventually latched on with the ANT, his improved diction and natural charisma helping to override the missteps of his on-the-job training as an actor. By 1950, when he wowed national audiences with his film debut in No Way Out, the thick accent and deer-in-the-headlights expression had been replaced by the easy eloquence and steely presence that would make Poitier the first Black winner of the Best Actor Academy Award in 1964.

Tim Ott
Writer

Tim Ott has written for sites including Biography.com, History.com, and MLB.com, and is known to delude himself into thinking he can craft a marketable screenplay.

Original photo by Sergei Brik/ Shutterstock

Since it was first celebrated in the late 19th century, Groundhog Day has been a fun — albeit scientifically dubious — annual tradition. Every February 2, revelers gather to learn whether we’re in for a lengthy winter or early springtime, a verdict determined by several “prophetic” rodents around the country. Whether you place your trust in Punxsutawney Phil or Staten Island Chuck, here are six facts about Groundhog Day that bear repeating.

Punxsutawney Phil is held up for the crowd to see during Groundhog Day ceremonies.
Credit: Brett Carlsen/ Getty Images News via Getty Images

Punxsutawney Phil Holds Several Official Titles — and a Royal Namesake

There’s no more celebrated creature on Groundhog Day than Punxsutawney Phil, the most popular resident of the small Pennsylvania town for which he’s named and where the first official Groundhog Day festivities occurred in 1887. While “Punxsutawney Phil” may be a mouthful to say all by itself, the rodent’s official name is actually “Punxsutawney Phil, Seer of Seers, Sage of Sages, Prognosticator of Prognosticators and Weather Prophet Extraordinary.” According to the Punxsutawney Groundhog Club, the name has a regal connotation: Phil was named after a “King Phillip,” although it’s not clear which one. However, it’s entirely possible that Phil was actually named after Queen Elizabeth II’s husband, Prince Philip. In 1953, Punxsutawney buried a pair of groundhogs that had been named Elizabeth and Philip, after the royal couple, and it was eight years later that the name “Punxsutawney Phil” first appeared in local records.

While it seems logical to assume that there have been many groundhogs named Punxsutawney Phil since then, local lore tells a different story. Tradition has it that each summer, at the town’s Groundhog Picnic, Phil is fed a magical elixir known as “Groundhog Punch” that’s said to extend his life for another seven years. And when he’s not making annual weather forecasts, Phil relaxes at home in the town library with his wife, Phyllis.

People gathered together for Candlemas ceremony.
Credit: Keystone-France/ Gamma-Keystone via Getty Images

Groundhog Day Stems From a Holiday Called Candlemas

Though Groundhog Day was created on American soil, it was inspired in large part by an ancient Christian tradition known as Candlemas, which was brought to the Pennsylvania region by German settlers. Like Groundhog Day, Candlemas is celebrated annually on February 2; it commemorates the day the Virgin Mary went to Jerusalem’s holy temple to be purified 40 days after the birth of Jesus, and to present Jesus to God as her firstborn. Candlemas also features the blessing and distribution of candles, which are burned to represent how long winter will last each year.

Likewise, Candlemas was associated with the prognostication of spring’s arrival. One old English rhyme states, “If Candlemas be fair and bright / come, Winter, have another flight; If Candlemas brings clouds and rain / go, Winter, and come not again.” One major difference between Candlemas and Groundhog Day, however, is that the former was known for a creature called the Candlemas Bear, whose emergence from hibernation signaled the coming of spring. Germans also used hedgehogs for the same purpose. The bear and hedgehog were later swapped for the groundhog as the newer holiday took shape in America.

A groundhog eating grass.
Credit: Alice Cahill/ Moment via Getty Images

Groundhog Day Celebrations Once Involved Eating Groundhogs

When the first Groundhog Day occurred on February 2, 1887, in Gobbler’s Knob, Pennsylvania, groundhogs were celebrated not only for their predictive abilities but also for their delicious flavor. In the 1880s, groundhog meat was the dish of choice at the local Punxsutawney Elks Lodge, the same lodge responsible for conceiving the original Groundhog Day ceremony as well as an annual summer groundhog hunt. Locals loved the taste of the small rodent, saying it was “like a cross between pork and chicken.” They would also indulge in celebratory potables like “Groundhog Punch,” an unusual concoction known to contain vodka, milk, eggs, and orange juice.

Groundhog meat continued to be served as a regional delicacy into the 20th century, with a recipe for “Groundhog, Punxsutawney Style” published in a 1958 cookbook to raise money for a local hospital. However, the hunting portion of the holiday ultimately faded in popularity, as locals opted to enjoy the animal more for its cuteness than for its taste.

A groundhog casting his shadow.
Credit: Bettmann via Getty Images

Groundhog Day Predictions Were Censored During WWII

According to Bill Cooper of the Punxsutawney Groundhog Club, the only year Groundhog Day hasn’t been celebrated since its inception is 1943, at the height of World War II. During the war, Americans were careful not to divulge weather information that might aid their enemies: A nationwide edict prevented newspapers from printing even basic sky conditions, forcing them to be vague about whether a given day was nicer or gloomier than others. That mandate extended to the 1942 Groundhog Day celebration; that year, the prediction stated, “War clouds have blacked out parts of the shadow.”

A groundhog predicting an early spring.
Credit: Bettmann via Getty Images

Punxsutawney Phil’s Predictions Are Less Accurate Than a Coin Flip

He may be heralded as the most prophetic rodent in the world, but Punxsutawney Phil’s annual predictions are far from accurate. According to records, Phil has forecast a longer winter 107 times compared to just 20 early springs (records are missing for nine additional years). When those predictions are measured against the weather that actually followed, he’s been correct only around 39% of the time — making him a less reliable barometer than a coin flip.

Phil has a bit of competition when it comes to weather forecasting. Staten Island Chuck — a resident of New York’s Staten Island Zoo — has an accuracy rate of over 80%. Chuck went on a hot streak, making a correct prediction every Groundhog Day from 2010 through 2021, with the exception of 2017. So while Phil is undeniably more famous, Chuck may have the edge when it comes to actually foreseeing the future.

Bill Murray and Andie MacDowell in a scene from the film 'Groundhog Day'.
Credit: Archive Photos/ Moviepix via Getty Images

Tom Hanks Was Considered for the Lead Role in the Film “Groundhog Day”

The 1993 film Groundhog Day established the holiday as a nationwide phenomenon, and while it’s hard to imagine anyone but Bill Murray in the lead role, he was nearly beaten out by another famous actor. Director Harold Ramis wanted Tom Hanks to portray newsman Phil Connors, though it was ultimately decided that Hanks was “too nice” to play the curmudgeonly part. Other actors considered for the role included Chevy Chase, Kevin Kline, and Michael Keaton, the latter of whom was offered the part but turned it down because he didn’t “understand” the movie. Keaton expressed regret over that decision in a 2014 interview, though given Murray’s memorable performance, it all worked out to the delight of audiences.

Bennett Kleinman
Staff Writer

Bennett Kleinman is a New York City-based staff writer for Optimism Media, and previously contributed to television programs such as "Late Show With David Letterman" and "Impractical Jokers." Bennett is also a devoted New York Yankees and New Jersey Devils fan, and thinks plain seltzer is the best drink ever invented.