Original photo by 66 north/ Unsplash

There’s more to Antarctica than cold weather and penguins, though it does have plenty of both. And while we’ve learned much about the elusive continent since it was first discovered around 200 years ago, it maintains an air of mystery like few other places on Earth. Interesting facts abound when it comes to Antarctica — here are eight of them.

A section of the West Antarctic Ice Sheet with mountains.
Credit: Mario Tama/ Getty Images News via Getty Images

Antarctica Is the World’s Largest Desert

The word “desert” tends to evoke images of extreme heat, cacti, and vast expanses of sand. The technical definition is less fanciful: an area that receives no more than 25 centimeters (10 inches) of precipitation per year. With that in mind, it’s perhaps less surprising that Antarctica is the world’s largest desert. At 5.5 million square miles, it edges out both the Arctic (5.4 million square miles) and Sahara (3.5 million) deserts, with the Arabian and Gobi deserts rounding out the top five. Antarctica receives only about 6.5 inches of precipitation in a given year, almost all of it as snow.

King penguins in the snow in South Georgia, Antarctica.
Credit: elmvilla/ iStock

Antarctica Is Also the Coldest, Windiest, Driest, and Highest Continent

Antarctica is a land of extremes, and it ranks first among the seven continents on several scales. In addition to being the coldest continent, it’s also the windiest, driest, and highest one. The coldest Antarctic temperature (and thus the coldest on Earth) was recorded at Vostok Station in July 1983 at -128.6°F. The highest wind speed recorded on the continent was at the Dumont d’Urville station in July 1972 at 199 mph. The average elevation is 8,200 feet — by comparison, the average elevation in the U.S. is a measly 2,500 feet.

Clocks in an airport showing the time of different major cities.
Credit: ymgerman/ Shutterstock

There’s No Official Time Zone in Antarctica

What time is it in Antarctica right now? There are a lot of different ways to answer that question, as the world’s fifth-largest continent doesn’t have an official time zone. Instead, some research stations (there are about 50 permanent stations on the continent) are synced up to the local time in the countries they hail from, while others observe the local time of whichever country is closest (for example, the Palmer Station, an American outpost, keeps Chile Summer Time, or CLST). Daylight Saving Time complicates matters further, with stations such as Troll (from Norway) switching from Greenwich Mean Time (GMT) to Central European Summer Time (CEST) when the clocks change in Europe.
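Curiously, several of these station clocks are encoded in the IANA time zone database, including Troll’s European-style DST switch. A minimal sketch in Python (assuming Python 3.9+, where `zoneinfo` is in the standard library, and available system tzdata):

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # stdlib since Python 3.9

# One instant in UTC, viewed on two different station clocks.
utc_jan = datetime(2024, 1, 15, 12, 0, tzinfo=ZoneInfo("UTC"))
utc_jul = datetime(2024, 7, 15, 12, 0, tzinfo=ZoneInfo("UTC"))

troll = ZoneInfo("Antarctica/Troll")    # Norwegian station
palmer = ZoneInfo("Antarctica/Palmer")  # American station, keeps Chilean time

# Troll follows GMT in the European winter and CEST in the European summer,
# so its clock shifts with Europe's even though local conditions don't change.
print(utc_jan.astimezone(troll))   # offset +00:00 in January
print(utc_jul.astimezone(troll))   # offset +02:00 in July
print(utc_jan.astimezone(palmer))  # follows Chile's clock
```

The same lookup works for any of the `Antarctica/*` zone names shipped with the tz database.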

Emilio Marcos Palma is cradled by his father after becoming the first baby born in the Antarctic.
Credit: Horacio Villalobos/ Corbis Historical via Getty Images

At Least 11 Babies Have Been Born in Antarctica

On January 7, 1978, something happened that had never happened before: A human was born in Antarctica. His name was Emilio Marcos Palma, and his parents were part of Esperanza Base, an Argentine research station. Ten more babies came into the world there throughout the rest of the decade and into the ’80s, all of them either Argentine or Chilean, with some commentators suggesting this was a concerted effort from both countries to strengthen their respective claims to Antarctica. Because all 11 survived, Antarctica technically has the lowest infant mortality rate of any continent: 0%.

A view of a crystal blue water flooding an ice cave.
Credit: DCrane/ Shutterstock

Antarctica Has a Lake So Salty It Doesn’t Freeze

Antarctica is known for its permafrost, but at least one part of it never freezes: Deep Lake, which is so salty — about 10 times saltier than the ocean, putting it on a similar level as the Dead Sea — that it stays liquid even at extremely low temperatures. It’s considered one of the planet’s least productive ecosystems, as the cold and hypersalinity prevent almost all life from thriving there (although it is home to a collection of extremophiles — organisms that thrive in the most extreme conditions on Earth). Deep Lake sits 180 feet below sea level and only gets saltier at increased depths.

A rocky ice mountain in the Weddell Sea of Antarctica.
Credit: 66 north/ Unsplash

Antarctica Is Bigger Than the United States

Though most map projections don’t convey it very well, Antarctica is big — really big. With an area of 5.4 million square miles, it’s the fifth-largest continent, ranking ahead of both Europe and Australia, and roughly one and a half times the size of the United States.

Big cruise ship in the Antarctic waters.
Credit: Volodymyr Goinyk/ Shutterstock

No One’s Sure Who Discovered Antarctica

Long before a human set foot on Antarctica, explorers were obsessed with learning more about the Antarctic Circle. The circle was first crossed in 1773 by Captain James Cook, but it took another 47 years before Antarctica was actually seen by human eyes. The question of who can actually lay claim to that achievement remains disputed more than 200 years later, with Russian explorer Fabian Gottlieb von Bellingshausen reporting having seen “an ice shore of extreme height” on January 27, 1820, and Edward Bransfield of the Royal Navy describing “high mountains covered with snow” on January 30 of the same year.

What’s known as the Heroic Age of Antarctic Exploration wouldn’t begin until the end of the 19th century, with Norwegian explorer Roald Amundsen and his team first reaching the South Pole on December 14, 1911 — a feat matched just five weeks later by Brit Robert Falcon Scott.

Flags of original signatory nations of the Antarctic Treaty at the South Pole, Antarctica.
Credit: Colin Harris / era-images/ Alamy Stock Photo

Antarctica Is Officially Dedicated to Peaceful Purposes

Though some countries have tried to claim it for their own, Antarctica doesn’t belong to any nation, government, or other entity. That was made official when 12 countries — Argentina, Australia, Belgium, Chile, France, Japan, New Zealand, Norway, South Africa, the Soviet Union, the United Kingdom, and the United States — signed the Antarctic Treaty on December 1, 1959. That this happened during the Cold War is no coincidence — the treaty was, among other things, an arms-control agreement setting aside the entire continent as a scientific preserve where no military activity is allowed.

A total of 54 countries now abide by the agreement, which has three key provisions: “Antarctica shall be used for peaceful purposes only,” “freedom of scientific investigation in Antarctica and cooperation toward that end … shall continue,” and “scientific observations and results from Antarctica shall be exchanged and made freely available.” All signatories have abided by the treaty, and Antarctica remains a hub of important research today.

Michael Nordine
Staff Writer

Michael Nordine is a writer and editor living in Denver. A native Angeleno, he has two cats and wishes he had more.

Original photo by chris_tina/ Shutterstock

From Grape Nuts to Lucky Charms, breakfast cereal exists on a broad spectrum that ranges from nutritious to cavity-inducing. These toasty, ready-to-eat grains have been around since the 1860s and have been a staple of American mornings since the 1950s. Between convenience and clever marketing, they’ve become ubiquitous among children and adults alike. But there’s a lot you might not know about them. For example, what’s the deal with cereal box prizes? How did Cheerios get their name? Do Froot Loops actually have different flavors? These eight facts are part of a complete breakfast.

A little boy pouring cereal into a bowl.
Credit: PeopleImages.com – Yuri A/ Shutterstock

The First Cereal Prize Was (Probably) a Book

Prizes included with boxes of sugary cereals used to be a mainstay of product marketing. Notable examples include cheap plastic toys, baseball cards, and even a video game on CD-ROM — Chex Quest, a standalone game built on the Doom engine.

What’s likely the earliest example was a little more literary, though. That honor is usually given to Kellogg’s, which offered a book to customers who checked out at the grocery store with two boxes of Corn Flakes in 1910. The book, The Funny Jungleland Moving Picture Book, featured horizontal flaps that could be moved to create different pictures and stories.

Prizes started appearing inside cereal boxes in the 1920s, when Malt-O-Meal began packaging whistles at the bottom of the box.

Aerial view of Cheerios in a bowl of milk.
Credit: freestocks/ Unsplash

Cheerios Used to Be Called Cheerioats

There are few, if any, cereals more iconic than Cheerios, but if you thought the name came from their round shape, you’d be mistaken. When the brand originally launched in 1941, they were called Cheerioats. In 1945, Quaker Oats claimed that it had exclusive rights to “oats” for its oatmeal — laughable in today’s oat-heavy market — and General Mills dropped the “at” from the end of the name. As of 2018, Cheerios is the bestselling cereal in the United States (just above Honey Nut Cheerios in second place), so General Mills really came out ahead in the end.

Cereal bowl and spoon with milk.
Credit: Linda Studley/ Shutterstock

Some Common Cereal Is Magnetic

It’s incredibly common for cereal to be fortified with extra vitamins and minerals, including iron. Just like any other iron — whether it’s in a skillet or a fence — the iron added to breakfast cereal is magnetic. Cereals with a lot of iron in them (like fortified cornflakes) even react to magnets when they’re floating in liquid. Some whole flakes contain enough iron to respond to a magnet on their own, but for a more in-depth, science fair-style experiment, you could try crushing up cereal and seeing how much pure iron you can pull out of it.

A bowl full of Froot Loops cereal.
Credit: Sergio Rojo/ Shutterstock

Froot Loops Are All the Same Flavor

The O’s of Froot Loops come in a variety of fruity colors, as if they each represent a different fruit flavor. However, the color is the only real difference between those O’s, because the flavor is the same throughout the box. You may still taste a difference between the colors, but it’s probably because your vision tells you to expect something different.

Speaking of fruity misconceptions, it’s always been spelled “Froot Loops” — contrary to a popular belief that the name changed because of a lawsuit over the cereal’s lack of real fruit.

Boxes of Kellogg's Frosted Flakes cereal.
Credit: SAUL LOEB/ AFP via Getty Images

Tony the Tiger Beat Other Animals to Become the Frosted Flakes Mascot

Imagine for a second that the Frosted Flakes slogan isn’t “they’re grrrrrreat,” because the mascot is not a tiger, but a kangaroo, and the kangaroo makes more of a coughing sound. When Kellogg’s launched Frosted Flakes in 1952, it experimented with several mascots — including Katy the Kangaroo, Elmo the Elephant, and Newt the Gnu — to see which one would be most popular with consumers. Tony turned out to be the favorite across demographics, and Katy, Elmo, and Newt are now just distant memories.

Close-up of lucky Charms cereal marshmallows.
Credit: Sergio Rojo/ Shutterstock

Pink Hearts Are the Only Original Lucky Charm Marshmallow Left

If you haven’t had Lucky Charms since you were a kid, you may be in for a surprise, because General Mills makes adjustments to its lineup every so often. Today’s cereal has a whopping eight marshmallow shapes (they’re called “marbits”), and when a new one comes along, another steps out. But Lucky Charms launched in 1964 with just four marbits: green clovers, pink hearts, orange stars, and yellow moons. Now the moons are blue, the stars are yellow-and-orange shooting stars, and the green clovers are part of a hat. The pink hearts are the only ones that remain close to their original form.

Other shapes have come and gone completely, like the blue diamond, pot of gold, crystal ball, and green tree. The most recent addition is the purple unicorn, which replaced the hourglass.

Wheaties featuring Michael Jordan.
Credit: Keith Homan/ Alamy Stock Photo

Michael Jordan Has Appeared on More Wheaties Boxes Than Anyone Else

Wheaties, aka the Breakfast of Champions, has existed since 1924 and has featured athletes on its boxes since 1934; Lou Gehrig was the first. In over 90 years of sporty branding, there have been a few repeats, but Michael Jordan has graced the front of the box the most, at 19 times over 30 years. The five-time NBA MVP and Space Jam star most recently appeared on a box design commemorating the cereal’s 100th anniversary.

An advertisement for Rice Krispies cereal.
Credit: Picture Post via Getty Images

There’s a Secret Fourth Rice Krispies Elf

Everybody knows about elves Snap, Crackle, and Pop, named for the sounds the cereal makes when it mingles with milk. The trio have been promoting Rice Krispies in one form or another since the 1930s, starting with Snap as a solo act, before Crackle and Pop joined him in 1941. But few remember the fourth cereal brother, a nonverbal space-elf named Pow, who appeared for a very brief time in the early 1950s. He appeared in only two commercials, riding a hovercraft and drawing attention to the cereal’s “power from whole grain rice.” According to Kellogg’s, he was never meant to be an “official character.”

Sarah Anne Lloyd
Writer

Sarah Anne Lloyd is a freelance writer whose work covers a bit of everything, including politics, design, the environment, and yoga. Her work has appeared in Curbed, the Seattle Times, the Stranger, the Verge, and others.

Original photo by PictureLux / The Hollywood Archive/ Alamy Stock Photo

Judy Garland, sometimes billed as “the world’s greatest entertainer,” accomplished a lot during her short and storied life, from her childhood vocal performances and breakout role in The Wizard of Oz to her dramatic comeback in A Star Is Born. Known for her giant voice, even at an early age, and magnetic stage presence, Garland won the hearts of showbiz executives, other entertainers, and adoring fans alike.

Today, Garland is well known not just for her performances but for a volatile behind-the-scenes life that made her career both successful and inconsistent — leading to tabloid scandals, breaks from public life, and, eventually, epic resurgences. Despite those slower periods, Garland managed to create an outsized legacy in just 47 years alive on this planet. Where did her stage name come from? Why is her only Academy Award pint-sized? What happened later in her career? These six facts about Judy Garland will have you strolling down a yellow brick memory lane.

Judy Garland costumed in ruffles for her first major role in the Kiddie Revue in Los Angeles.
Credit: Bettmann via Getty Images

She Started Performing When She Was 2 Years Old

Judy was born to a pair of vaudeville performers and theater operators, and by the time she came along, her two older sisters had already started appearing onstage — so in some ways, showbiz was inevitable. After begging her parents to let her perform, she got her big debut at the family’s theater when she was just 2 years old. She had been tasked with singing her favorite holiday song, “Jingle Bells,” and got so excited that she sang it more than once in a row.

This started a new era of Judy and her sisters performing as a trio, although she emerged quickly as the standout of the group. While all three were talented, and even appeared together in the 1929 short film The Big Revue, it was Judy who caught the attention of performers and promoters on the road. At just 13 years old, she signed to Metro-Goldwyn-Mayer, and started going to school on the MGM lot with other child stars, including Mickey Rooney.

Hoagy Carmichael, the film actor is shown here at the piano.
Credit: Bettmann via Getty Images

The Stage Name “Judy” Came From a Hoagy Carmichael Song

Judy’s legal name was Frances Ethel Gumm, after her parents, Frank and Ethel. The couple had expected a boy after having two girls, and planned to name him Frank Jr., so Frances was both a compromise and an inside joke. In everyday life, she was simply known as “Baby” or “Baby Gumm.”

The last name “Garland” came about while she and her sisters, then known as the Gumm Sisters, were touring. “Gumm Sisters” didn’t exactly roll off the tongue, and a popular comedian emceeing a series of performances came up with “Garland Sisters.”

“Judy,” however, didn’t come until later, and for a time she was known professionally as Frances Garland. The first name came along after one of her older sisters decided to go by a stage name. Sick of both “Baby” and “Frances,” she picked her own fresh moniker from Hoagy Carmichael’s latest hit, “Judy.” She was especially drawn to one line: “If she seems a saint but you find that she ain’t, that’s Judy.”

She encountered some family resistance to the new name, but refused to respond to anything but “Judy” as soon as she’d made her decision, so it stuck pretty quickly.

Photo of Garland singing on a MGM show.
Credit: Bettmann via Getty Images

MGM Made Her Wear Nose-Altering Accessories

Garland rose to superstardom with her doe-eyed look, but in her days at MGM, she was considered, however unfairly, a kind of ugly duckling compared to the more willowy starlets in the MGM stable. In her earlier years, when the priority was preserving her childlike look, she carried rubber discs in a small carrying case, along with caps for her teeth. She’d insert the discs in her nose to give it a more upturned look. Because the studio wanted to keep her looking as young as possible, her breasts were also often bound.

Once she was a little older and starring in less-childlike roles, such as Esther Smith in Meet Me in St. Louis, she started wearing a canvas and metal corset that required two people on either side to pull the strings tight. (It’s a wonder she was still able to sing.)

The Tin Man (Jack Haley), Dorothy (Judy Garland) and the Scarecrow in a feature film.
Credit: FPG/ Moviepix via Getty Images

“The Wizard of Oz” Helped Earn Her an Itty-Bitty Academy Award

While she was nominated a few times, Garland’s only Academy Award came in 1940, and it was actually a miniature version of the iconic statuette. Garland was one of just a handful of people to win the special award known as the “Juvenile Oscar,” first awarded to six-year-old Shirley Temple in 1935.

The award typically celebrated a young actor’s achievement in the previous year, and in 1939 Garland had starred in two films: Babes in Arms and The Wizard of Oz. At the time she accepted the award, presented by her former classmate and previous Juvenile Oscar recipient Mickey Rooney, she was just a few months shy of her 18th birthday. The award really does look tiny with a teenager holding it — and even tinier next to full-size Academy Awards, like the one her daughter Liza Minnelli won for Cabaret in 1973.

The Juvenile Oscar wasn’t awarded every year, so it took a special situation to warrant the special trophy. Just 12 were awarded in the 26 years it existed; the last one was awarded in 1961 to Hayley Mills, who appeared in Pollyanna the year before. A 16-year-old Patty Duke won a regular Best Supporting Actress award two years later.

Photograph of Judy Garland and Bing Crosby singing into a microphone in a studio.
Credit: Bettmann via Getty Images

She Was Under 5 Feet Tall

Another thing that set her apart from other singer-actresses at the time was her height. Garland stood just 4 feet, 11 inches. When she was a child actress, she was still around the same height as her frequent costar Mickey Rooney, but was already noticeably shorter than other MGM stars such as Deanna Durbin. Ever notice how the famous ruby slippers from The Wizard of Oz have enough of a heel to add a couple of inches?

And Garland didn’t get any taller with age. Because she had a particularly short upper body, the height difference was a little more noticeable when she was seated next to someone — high heels can’t do much for you if you’re not standing up.

Judy Garland performing on stage at the Fontainebleau Hotel.
Credit: Ray Fisher/ The Chronicle Collection via Getty Images

In 1959, She Was Told She’d Never Work Again

After decades of overwork, substance abuse, and mental health struggles, Garland was in pretty rough shape by the time her late 30s rolled around. In late November 1959, she was admitted to New York’s Doctors Hospital with a barrage of symptoms both physical and mental, severe enough to be life-threatening. Over the next three weeks, her future seemed uncertain, and doctors drained 20 whole quarts of fluid from her body. After a bevy of tests, the lead physician on her case said that this would be the end of her career, permanently — she was to limit physical activity for the rest of her life, and “under no circumstances” could she work again.

Garland apparently responded “whoopee!” before collapsing back into bed. She’d later tell LIFE that the news made her feel “relieved”: “The pressure was off me for the first time in my life.”

She did stop working… for a time. She was in the hospital for a total of five months, then four more recovering at her Beverly Hills home. By the end of the summer, she was excited about music again.

In a career full of comebacks, this may have been her greatest. Garland embarked on a national tour that got rave reviews, and took a serious, non-singing role in the drama Judgment at Nuremberg that earned her an Oscar nomination. After delivering consistent, showstopping performances across the country, Garland recorded her famous live album, Judy at Carnegie Hall. The show was legendary: Despite a near lack of promotion, it sold out within hours. Audience members left their seats to crowd the stage (she asked them to sit down so others could see, and they obliged). After multiple encores, fans lingered at the stage entrance for an hour and a half.

After it was released, the album won four awards at the 1962 Grammys, two of them credited to its star performer: Album of the Year (beating the soundtracks to both West Side Story and Breakfast at Tiffany’s) and Best Female Solo Vocal Performance. It also took Best Engineering and Best Album Cover.

Garland would continue to have ups and downs for the last several years of her life, but the proclamation her doctor made in 1959 certainly didn’t hold.

Sarah Anne Lloyd
Writer


Original photo by ronstik/ iStock

Houseplants adorn our homes with beautiful foliage, but they’re not just for decoration. They enrich our lives by giving us something to care for and can be a rewarding hobby, especially for those who focus on complex plants like orchids. It’s possible that houseplants even reduce indoor air pollution (though this is controversial) and can dampen noise from the street.

While plants may seem pretty straightforward from a distance, we’re learning more every day about their inner workings — like if and how they respond to music. Home in on specific species and there are all kinds of things to learn, such as how some trick pollinators into coming their way, while others can survive for years without water. These six facts about houseplants might have you running to the nursery for some new additions, or at least thinking differently about your leafy roomies.

Close-up of a girl spraying water in her lucky bamboo plant.
Credit: sansubba/ iStock

Lucky Bamboo Is Not Real Bamboo

Lucky bamboo has great cultural significance, especially when given as a gift or placed to improve feng shui. It’s often sold in stunning sculptural arrangements, like braids and twirls, and adorned with colorful ribbon, although it grows just fine as straight stalks in soil or plain water. Lucky bamboo thrives in all kinds of settings, and even does fine in low light, so it’s a mainstay of bookshelves and cubicles all over the world.

While it looks a little like true bamboo, it’s not related to the panda food. Its scientific name is Dracaena sanderiana, and it’s more closely related to dragon trees, ti plants, and snake plants. Like real bamboo, it’s super easy to grow, but unlike real bamboo, it’s not going to invade your entire garden.

close up of a decomposed fly in half opened venus flytrap.
Credit: Andreas Häuslbetz/ iStock

Poaching a Venus Flytrap Is a Felony

Venus flytraps are incredibly popular houseplants for both pest control and sheer curiosity appeal, but in the wild, their habitat is itty-bitty. They grow only within a 75-mile radius in damp savannas, mostly located in North Carolina, and they’re in danger of losing that very specific habitat, which depends on periodic fire to keep it open and sunny. Venus flytraps are considered a “species of special concern” in North Carolina, and people digging them up for use as houseplants is a major threat to their population. As of 2015, only about 35,000 of them were left growing in the wild.

In 2014, the state made poaching them a felony, punishable by up to 29 months in prison and steep fines. The U.S. Fish and Wildlife Service recommends checking flytraps before you buy them; plants grown in a nursery will likely be consistent in size and have uniform soil free from weeds.

Young tangerine or kumquat tree with fruits in a wicker pot.
Credit: t.sableaux/ Shutterstock

Citrus Plants Can Grow Indoors

Those of us who don’t live in perpetually sunny climates may not be able to grow lime trees in our backyards, but that doesn’t mean giving up on a citrus tree dream. In colder climates, many varieties make great houseplants, as long as they’re a variety that takes to it; you’re not going to grow a giant orange tree in your living room, but dwarf trees can be a great option near sunny windows. Acidic fruits, like lemons and limes, need less heat to ripen, making them slightly better candidates for indoor growing, but some varieties of satsuma, citron, and kumquat are also great choices. Speaking of fruit…

Monstera deliciosa plant closeup including fruit bodies.
Credit: prill/ iStock

Monstera Plants Produce Tasty Tropical Fruit

As a houseplant, Monstera deliciosa (also known as split-leaf philodendron or Swiss cheese plant) is better known for its giant, heart-shaped leaves marked by beautiful patterns of notches and holes. Indoors, they typically stop there, but in their native Central American tropical forests, these already larger-than-life plants eventually climb nearby trees and reach heights of up to 70 feet, with large blossoms that resemble peace lilies. Eventually, those flowers become scaly fruit that tastes like a cross between a banana, pineapple, and mango — hence, deliciosa.

Above view of woman hands holding Rose of Jericho, Selaginella lepidophylla.
Credit: FotoHelin/ iStock

Resurrection Plants Can Go Seven Years Without Water

Even for those who have never been able to keep a houseplant alive, Selaginella lepidophylla — also known as a resurrection plant, the Rose of Jericho, or, confusingly, the False Rose of Jericho — is a pretty low lift. They’re native to the Chihuahuan Desert in northern Mexico and parts of Texas, New Mexico, and Arizona, and have an incredible survival strategy for the dry heat. They allow themselves to dry out, and then bounce around as tumbleweeds until they find somewhere damp to settle down. Once hydrated, they spread out their fern-like fronds and turn green. If that area dries out, they just curl back up into a ball and repeat the cycle. Once dormant and dead-looking again, the plant can survive for up to seven years.

Selaginella lepidophylla does not need to be rooted to come back to life, and it only takes a couple of hours to go from dormant to vibrant. As houseplants, they are nearly impossible to kill. Just make sure nobody mistakes them for actually being dead — their biggest household hazard is becoming accidental trash.

Close-up of female hands holding an orchid plant.
Credit: Maryviolet/ iStock

Some Orchids Use Trickery To Reproduce

Orchids are bizarre plants, and there are a lot of them. With more than 25,000 varieties, they make up around 10% of the world’s plant species, and many of them have their own set of tricks. When it’s time to spread pollen, a number turn to mimicry to lure in beneficial insects. Some release pheromones that smell like female insects, while others go so far as to visually imitate their pollinators — that is, specialized petals lure in male bees or wasps, who “mate” with the flower before moving on to another, picking up pollen in the process.

Sarah Anne Lloyd
Writer


Original photo by Dom Slike/ Alamy Stock Photo

Few figures in American history are as well regarded as Martin Luther King Jr. is today. A civil rights leader who worked tirelessly in the fight for justice and equality, even as he was threatened and attacked for doing so, King organized and participated in countless marches and protests to combat racial discrimination, laying the groundwork for important victories such as the Civil Rights Act of 1964 and the Voting Rights Act of 1965. His name is synonymous with the movement, and his message — most famously expressed in his “I Have a Dream” speech at the March on Washington for Jobs and Freedom in 1963 — continues to resonate and inspire today. But for all we know about his trailblazing activism, there are still a few details about his life that may surprise you. Here are six lesser-known facts about MLK.

Close-up of the Reverend Dr. Martin Luther King, Jr.
Credit: Bettmann via Getty Images

His Birth Name Was Michael

When Martin Luther King Jr. was born on January 15, 1929, his name wasn’t what we know it to be today. According to MLK’s original birth certificate, filed on April 12, 1934, his given name was Michael King Jr. His switch to a new name had to do with his father, who served as senior pastor at Atlanta’s Ebenezer Baptist Church. In 1934, King Sr. traveled to Germany, where he witnessed the budding rise of hate-fueled Nazism throughout the country. Germany was also where, in 1517, theologian and monk Martin Luther wrote his Ninety-Five Theses, which in turn inspired the Protestant Reformation. That movement held great significance to King Sr., who, upon returning to the States, chose the name “Martin Luther” for both himself and his son. MLK Jr. would rise to prominence under this new name, though he didn’t officially amend his birth certificate until July 23, 1957, when the name “Michael” was crossed out and the words “Martin Luther Jr.” were printed next to it.

Dr. Martin Luther King, Jr. receives honorary degree at Hofstra University.
Credit: Newsday LLC via Getty Images

He Received a “C” at School for Public Speaking

Although he’s known now for being a prolific public speaker, MLK Jr. wasn’t always appreciated for his eloquence. In fact, while attending Crozer Theological Seminary in Chester, Pennsylvania, King received a “Pass,” a “C+,” and a “C” in his public speaking course during the 1948-49 school year. This proved to be an anomaly, though; by the end of King’s time at the seminary, he was a straight-A student, class valedictorian, and student body president. He later attended Boston University, where he got his Ph.D. in systematic theology at the age of 25 in 1955, thus earning the title of doctor.

Slideshow of Martin Luther King Jr. at the Grammys.
Credit: Kevin Winter/ WireImage via Getty Images

He Was a Three-Time Grammy Nominee

King was not a musician, but the spoken-word recordings of his most famous speeches earned him several Grammy nominations. The first came in 1964 at the 6th Annual Grammy Awards, where “We Shall Overcome (The March On Washington… August 28, 1963)” was nominated for Best Documentary, Spoken Word, Or Drama Recording (Other Than Comedy). Two other nominations were bestowed upon him posthumously, at the 11th Grammy Awards in 1969 for his recording of “I Have A Dream” (in the Best Spoken Word Recording category), and in that same category for “Why I Oppose the War in Vietnam” at the 13th Grammys in 1971. The latter was his first and only win, but his “I Have A Dream” speech was later voted into the Grammy Hall of Fame in 2012.

Martin Luther King, sculpted by Tim Crawley.
Credit: John Stillwell – PA Images via Getty Images

London’s Westminster Abbey Features a Statue of MLK

In 1998, a statue honoring Dr. King was unveiled at Westminster Abbey in London, a city where he famously spoke in 1964 while visiting Europe to accept his Nobel Peace Prize. The statue was among a group of 10 of the 20th century’s most celebrated Christian martyrs, which were installed above the Great West Door in niches that had stood vacant for 35 years. Queen Elizabeth II presided over the unveiling, which also honored notable religious figures such as El Salvador’s Archbishop Oscar Romero and Franciscan friar Maximilian Kolbe of Poland. Designed by Tim Crawley, the statues are made of French Richemont limestone and weigh almost a ton each.

Of course, Westminster Abbey is far from the only place to honor King artistically. There are several statues and memorials in the U.S., too, though perhaps none is more prominent than the Stone of Hope, a 30-foot-tall granite statue of King unveiled on D.C.’s National Mall in 2011.

The Star Trek crew.
Credit: Bettmann via Getty Images

He Was a Huge Fan of “Star Trek”

MLK was not only a huge fan of Star Trek but also a pivotal figure in the career trajectory of one of the show’s most beloved actors. Star Trek was the only program King allowed his children to stay up late to watch, in large part because of the character Uhura, played by African American actress Nichelle Nichols. King viewed Nichols’ role as one of the few examples of equality on television — a belief that he expressed to Nichols upon meeting her at a fundraiser for the NAACP. When the show’s first season ended in 1967, Nichols was leaning toward departing Star Trek for a role on Broadway. In the end, however, she was swayed by King’s passionate words about her power and influence as a role model for Black women, and decided to remain a member of the cast.

Civil rights activist Dr. Martin Luther King with his wife Coretta Scott.
Credit: Hulton Deutsch/ Corbis Historical via Getty Images

King and His Wife Spent Their Honeymoon at a Funeral Parlor

MLK met the woman who would become his wife, Coretta Scott, in Boston in January 1952. They married the next year, on June 18, 1953, on Scott’s parents’ lawn in Alabama, though their ensuing honeymoon took an unusual turn. After being turned away from several whites-only hotels in Marion, a deeply segregated town, MLK and his wife were invited by a friend to spend their wedding night in the back room of a funeral parlor. It wasn’t until five years into their marriage that the pair took a more traditional honeymoon trip to Mexico.

Bennett Kleinman
Staff Writer

Bennett Kleinman is a New York City-based staff writer for Optimism Media, and previously contributed to television programs such as "Late Show With David Letterman" and "Impractical Jokers." Bennett is also a devoted New York Yankees and New Jersey Devils fan, and thinks plain seltzer is the best drink ever invented.

Original photo by e'walker/ Shutterstock

More than 2 million people visit Mount Rushmore each year, making the towering presidential monument South Dakota’s biggest tourist attraction. The sculptor behind the visionary memorial, Gutzon Borglum, called it “a shrine to democracy,” with the carved granite faces of Presidents George Washington, Thomas Jefferson, Theodore Roosevelt, and Abraham Lincoln symbolizing the founding of the nation and its timeless values. From its controversial location to its starring role in a classic Hitchcock caper, here are six fascinating facts you might not know about Mount Rushmore.

View of Sculptured Faces of Former Presidents at Mount Rushmore.
Credit: Bettmann via Getty Images

Mount Rushmore Has Been Controversial From the Start

The idea for a monument in the Black Hills of South Dakota generated controversy even before the first blast of dynamite took place. The 1868 Treaty of Fort Laramie, signed by the U.S. government and Sioux nations, reserved the Black Hills for the exclusive use of the Sioux peoples. The mountain that became Mount Rushmore is a sacred site in Lakota culture (part of the Sioux); its name translates to “the Six Grandfathers,” representing the supernatural deities that the Lakota believe are responsible for their creation. But within a decade, the U.S. had broken the treaty, leading to skirmishes including the U.S. defeat at the Battle of Little Bighorn in 1876. The federal government used its loss in the battle to justify its occupation of the Black Hills — a seizure the Supreme Court later ruled unconstitutional in 1980.

Despite this bloody history, South Dakota state historian Doane Robinson proposed the Black Hills as the site for a new tourist attraction. He contacted sculptor Gutzon Borglum, who had recently been working on a monument to Confederate leaders on Stone Mountain in Georgia, and invited him to South Dakota. In 1925, Borglum began designing the colossal monument at Mount Rushmore (renamed for New York lawyer Charles E. Rushmore, who traveled to the Black Hills to review legal titles of properties in 1884). Once the project was approved and funded by Congress, carving commenced in October 1927.

Workman on Mt. Rushmore repairing Lincoln's nose.
Credit: Bettmann via Getty Images

Each Head on Mount Rushmore Is Over Five Stories Tall

With such a large canvas, chipping away with chisels was not going to cut it. Borglum used dynamite to blast away 90% of the granite rock face, which left between 3 and 6 inches of granite to be carved more finely. Workers suspended in slings from the top of the monument drilled a series of closely spaced holes to weaken the rock, then removed the excess chunks by hand. The detailed facial features were achieved with hand tools that left perfectly smooth surfaces.

In all, nearly 400 men and women spent over 14 years working on the monument. When the entire sculpture was completed in 1941, each presidential head measured about 60 feet tall. The Presidents’ eyes are roughly 11 feet wide, their noses measure about 21 feet long, and their mouths are around 18 feet wide. And because Mount Rushmore’s granite erodes extremely slowly — about 1 inch in 10,000 years — those features won’t shrink any time soon.

Faces of Presidents George Washington, Thomas Jefferson, Theodore Roosevelt & Abraham Lincoln.
Credit: Harold M. Lambert/ Archive Photos via Getty Images

The Four Presidents Represent Specific Aspects of American History

The four Presidents depicted on Mount Rushmore were chosen for their key roles in American history. The carved face of George Washington, completed in 1930, is the most prominent figure on the memorial and represents the founding of the nation. Thomas Jefferson, dedicated in 1936, stands for the growth of the United States, thanks to his authorship of the Declaration of Independence and his roles in the Louisiana Purchase and the Lewis and Clark expedition. Borglum chose the figure of Abraham Lincoln, dedicated in 1937, to represent American unity for his efforts to preserve the nation during the Civil War. Theodore Roosevelt, finished in 1939, symbolizes the development of the United States as a world power (he helped negotiate the construction of the Panama Canal, among other achievements) and as a champion of workers, owing to his fight to end corporate monopolies.

View of Mount Rushmore.
Credit: Bettmann via Getty Images

Some Have Proposed Additions to Mount Rushmore

Various additions to Mount Rushmore’s pantheon have been suggested over the decades. In 1937, a woman named Rose Arnold Powell enlisted the help of First Lady Eleanor Roosevelt in her proposal to add the head of Susan B. Anthony, but Congress ultimately refused to allocate funds. Other suggestions have included Presidents John F. Kennedy, Ronald Reagan, Franklin D. Roosevelt, and even Barack Obama. However, the mountain’s original sculptor, Gutzon Borglum, insisted that the rock could not support further carving, and modern engineers have backed up that claim.

View Mount Rushmore Hall of Records.
Credit: Science History Images/ Alamy Stock Photo

There’s a Semi-Secret Vault Behind the Sculpture

Borglum’s initial plans for Mount Rushmore called for a grand Hall of Records within the mountain behind Lincoln’s head. The 100-foot-long hall was planned to house the Declaration of Independence, U.S. Constitution, and other documents key to the nation’s history. Borglum envisioned the hall decorated with busts of notable Americans and a gigantic gold-plated eagle with a 38-foot wingspan over the entrance. But the twin tragedies of Borglum’s death in 1941 and the start of World War II put an end to his vision. The half-excavated chamber sat empty until 1998, when officials placed 16 panels explaining the history of the monument and its sculptor — along with the words of the Bill of Rights, U.S. Constitution, and Declaration of Independence — in a box and sealed it inside the vault. The chamber remains closed to the public.

Hanging from a cliff at Mount Rushmore in 'North By Northwest', directed by Alfred Hitchcock.
Credit: Silver Screen Collection/ Moviepix via Getty Images

The National Park Service Didn’t Want Hitchcock to Film “North by Northwest” at Mount Rushmore

The climax of Alfred Hitchcock’s 1959 thriller North by Northwest has Roger Thornhill (played by Cary Grant) and Eve Kendall (played by Eva Marie Saint) dangling over the precipice of Mount Rushmore as a villain tries to push them off. However, prior to filming, the National Park Service worried that the filmmakers would desecrate the monument. The agency made Hitchcock agree to strict rules, including a promise not to shoot any violent scenes at Mount Rushmore or have live actors scrambling over the Presidents’ faces, even on a soundstage mock-up.

But the director failed to keep his word. Days before the film’s premiere, the park service issued a terse statement calling the movie “a crass violation of its permit” and saying it wanted to “make the record clear on what the agreement provided and who failed to live up to it.” Still, the editor of the local Sioux Falls newspaper recognized the movie’s legacy. “Obviously,” he wrote, “this picture is worth its weight in gold to Rushmore from a publicity viewpoint.”

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.

Original photo by Lifestyle pictures/ Alamy Stock Photo

Few films have had as profound an impact on cinema as the original Star Wars and the multibillion-dollar franchise it inspired. For nearly 50 years, fans have been dressing up as Jedi, stormtroopers, and Sith, and imagining their own adventures in a galaxy far, far away. In fact, the films have had such a cultural impact that May 4 (“May the Fourth Be With You”) is essentially an official holiday for Star Wars fans the world over. Here are seven little-known facts about Star Wars, exploring both the production of the films and the inspiration behind the saga’s most iconic characters.

Star Wars movie site in Tunisia.
Credit: Education Images/ Universal Images Group via Getty Images

Filming the Original “Star Wars” Almost Caused an International Conflict

Although Star Wars is famously set in a galaxy far, far away, George Lucas used real-world sets and locations to stand in for extraterrestrial locales throughout the original trilogy. Scenes set on the ice planet Hoth in The Empire Strikes Back were filmed near the town of Finse, Norway, while scenes on the forest moon of Endor made use of the giant redwoods near Crescent City, California.

One of the most iconic locations in all nine films is the Skywalker homestead on the desert planet of Tatooine. Lucas decided to shoot these scenes, which kick off the entire Star Wars franchise, in the desert of Tunisia (though parts were also filmed in Death Valley, California). In the mid-1970s, Tunisia had a tense relationship with the Libyan government, run by Muammar Gaddafi, and Star Wars was filmed in Nefta, Tunisia, not far from the Tunisian-Libyan border. The biography George Lucas: A Life details how the Libyan government originally perceived the production as a military buildup along the border, mistaking a Jawa Sandcrawler for military hardware. Libyan inspectors even crossed the border to confirm that these otherworldly vehicles posed no real military threat. Thankfully, the matter was resolved peacefully.

Darth Vader at the European premiere of "Star Wars: The Rise of Skywalker".
Credit: Gareth Cattermole/ Getty Images Entertainment via Getty Images

Darth Vader’s Look Is Based on a Real Japanese Samurai

The inspiration behind the original Star Wars famously came from a variety of sources. The iconic title crawl that sets up the space drama in the film’s opening seconds echoes 1930s adventure serials like Flash Gordon and Buck Rogers. The space battles between TIE fighters and X-Wings are a direct reference to WWII dogfighting, and the concept of the Jedi was likely lifted from the pages of Frank Herbert’s Dune.

But the most iconic character in the entire saga is undoubtedly Darth Vader, and his look is based on a very real historical figure — a Japanese samurai warlord named Date Masamune. Ralph McQuarrie, the concept artist behind the original trilogy of films, was influenced by Japanese samurai armor, and especially the jet-black armor of Masamune, who was born in 1567. The helmets are the most alike, but McQuarrie also borrowed the extended neck piece from Masamune’s armor. Vader’s helmet includes additional influences from helmets worn by the German army during WWII, all used to create the most ominous villain the galaxy (and moviegoers) have ever seen.

Harrison Ford, as Han Solo, on the set of Star Wars: Episode IV.
Credit: Sunset Boulevard/ Corbis Historical via Getty Images

“I Have a Bad Feeling About This” Is Said in Every “Star Wars” Film

The entire Star Wars saga is filled with little Easter eggs and references to other characters and events throughout the franchise. One that can be easily missed is the phrase “I have a bad feeling about this,” said in every single Star Wars film (and sometimes even uttered multiple times). The phrase is also found in one-off live-action films, animated TV shows, video game series, and novels, and has become a kind of “in-joke” among Star Wars creators.

Notably, The Last Jedi, the eighth film in the Star Wars saga, appears to be the only exception, as no character seemingly utters the famous phrase on screen. But director Rian Johnson confirmed that BB-8 actually delivers the line in binary, after which Poe Dameron, played by Oscar Isaac, retorts, “Happy beeps here, buddy, come on.”

a close up of Porg in Star Wars.
Credit: PictureLux / The Hollywood Archive/ Alamy Stock Photo

“The Last Jedi” Invented Porgs To Digitally Mask Real-Life Puffins

One of the most important locations in Rian Johnson’s The Last Jedi is the remote island on the planet Ahch-To, where a disgruntled Luke Skywalker spends his self-imposed exile and subsequently trains an adamant Rey. These scenes were shot on a very real Irish island called Skellig Michael. Although perfect for creating a much-needed sense of isolation, the island is also a wildlife preserve for puffins. The birds became a real problem during the many scenes filmed on the island, as they constantly flew into shots and disrupted production. By law, The Last Jedi crew couldn’t interfere with them, so according to Jake Lunt Davies, a creature concept designer on the film, the team designed an in-universe creature to live on the island and digitally replaced any puffins that wandered into a shot. Hence, Porgs were born.

Members of the American Pop group N'Sync pose in front of a green screen.
Credit: Mikki Ansin/ Archive Photos via Getty Images

‘N Sync Was Almost in “Attack of the Clones”

Turn back the clock to 2001, and pop culture was obsessed with both the new Star Wars prequel franchise and the boy band ‘N Sync. At the behest of George Lucas’ daughter (along with the daughter of producer Rick McCallum), the members of ‘N Sync were offered minor roles during the final battle on Geonosis. Justin Timberlake and Lance Bass declined the invitation, supposedly too tired from touring, but the other three band members — Joey Fatone, JC Chasez, and Chris Kirkpatrick — donned Jedi robes and shot their scenes for the film. The moment was particularly special for Fatone, who had an entire room of his house dedicated to Star Wars memorabilia. Sadly, the footage wasn’t used in the final cut, and the blink-and-you’ll-miss-it cameo instead became a little-known piece of Star Wars history.

Darth Vader accepts the Ultimate Villain award from George Lucas onstage.
Credit: Kevin Winter/ Getty images Entertainment via Getty Images

The Original “Star Wars” Almost Wasn’t Made

It’s almost unfathomable that a movie studio would pass up the opportunity to make Star Wars, but in the mid-1970s, George Lucas’ little indie film was perilously close to never being made. Lucas first tried to get the rights to Flash Gordon in order to make his own big-screen version, but when he was unable to secure a deal, he decided to make his own space adventure. Once he had the idea, he needed the money, but United Artists, Universal, and even Disney (which later bought the franchise rights for $4.05 billion in 2012) all passed on funding the film.

Finally, 20th Century Fox agreed to finance the project, not because they thought the film would be any good, but mostly to secure a relationship with the up-and-coming director. With an initial budget of only $8 million (eventually bumped up to $11 million) and plenty of disasters during filming and post-production, Star Wars was born from both financial and artistic adversity, yet it has gone on to inspire generations of fans around the globe.

Darren Orf
Writer

Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.

Original photo by Diana Parkhouse/ Unsplash

Whether you love bugs or they give you the heebie-jeebies, they are everywhere — and they’re fascinating. Some are smaller than a grain of sand, while others can be mistaken for a sizable stick. While many are major nuisances, plenty of them are cute, beautiful, or even helpful. Each one has a unique life cycle and thrives in a different environment.

Which common pollinator communicates by shaking its booty? How far can butterflies travel? How much can ants really carry? What bug has the highest body count? These seven intriguing insect facts will have you thinking differently about your exoskeletal friends… for better or for worse.

Two ants on a branch lifting a heavy plant.
Credit: dikkyoesin1/ RooM via Getty Images

Ants Can Carry 10 to 50 Times Their Body Weight

Estimates vary on how much hardworking ants can actually carry, but the consensus is that it’s a lot — anywhere from 10 to 50 times their own body weight. Because ants are so tiny, their muscles are thick relative to their body size, giving them a disproportionate amount of strength. One 2014 study suggests that an ant’s neck joint can withstand pressure of up to 5,000 times its own body weight.

In the big picture, however, the numbers are still pretty small: Individual worker ants generally weigh 1 to 5 milligrams, so while it’s pretty impressive that a 5-milligram creature can carry perhaps 250 milligrams (about a quarter of the weight of a jellybean), they’re not exactly going to be robbing any museums.

Bees on a honeycomb.
Credit: BigBlueStudio/ Shutterstock

Honeybees Communicate With Dance

When honeybees find a really, really great stash of nectar, they’re eager to share the news with their hivemates, and they give their directions in a very cute (and stunningly accurate) way. Once a worker bee finds an ideal flower, she returns to the hive and performs the “waggle dance.”

After getting her siblings’ attention by standing on top of them and vibrating, she hops down and wags her abdomen while walking a straight line, then circles around and repeats the movement. The direction of the line communicates the direction of the nectar source in relation to the sun, and the length of the waggle run conveys the distance from the hive. Her fellow bees sense every vibration, and get a secondary signal from the lingering scent of the pollen.

The dance can reference distances nearly 4 miles away with surprising accuracy, although it’s more difficult to give precise directions when the source is relatively nearby, when the bee is sleepy, or when humans interfere. Fortunately, one study suggests that bees may be able to assess the reliability of each dance, losing interest if the dancer seems disoriented.
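For the numerically inclined, the dance’s direction-and-distance code can be sketched in a few lines. The calibration below (roughly one second of waggling per kilometer of distance) is a commonly cited ballpark used purely for illustration; real values vary by colony and species, and the function name here is invented for this sketch:

```python
# Illustrative sketch of the waggle dance "code" (ballpark numbers only).
# Assumptions: ~1 second of waggling corresponds to ~1 km of distance, and the
# waggle run's angle from vertical equals the food's bearing relative to the sun.

def decode_waggle(duration_s: float, angle_from_vertical_deg: float,
                  sun_azimuth_deg: float) -> tuple[float, float]:
    """Return (distance_km, compass_bearing_deg) encoded by one waggle run."""
    distance_km = duration_s * 1.0  # ~1 km per second of waggling (rough)
    bearing_deg = (sun_azimuth_deg + angle_from_vertical_deg) % 360
    return distance_km, bearing_deg

# A 2.5-second run angled 40 degrees right of vertical, with the sun due south
# (azimuth 180), points hivemates ~2.5 km away on a compass bearing of 220.
print(decode_waggle(2.5, 40.0, 180.0))  # (2.5, 220.0)
```

A bee angling her run left of vertical would simply subtract from the sun’s bearing instead, which is why the same simple rule covers every direction of the compass.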

Chan's megastick (Phobaeticus chani) resting on a leaf in the jungle of Borneo.
Credit: Paolo Pako/ Shutterstock

The Longest Insect Measures Nearly 2 Feet Long

Stick bugs, sometimes known as walking sticks, tend to be bigger than other insects, but in parts of Southeast Asia, that can be a bit of an understatement. The world’s longest known insect, Phobaeticus chani, familiarly called Chan’s Megastick, measures 22 inches long with its legs outstretched, and 14 inches in body length alone.

The only known specimen was found around the 1970s by a local collector in Borneo, but it wasn’t acknowledged as a possible new species until a Malaysian naturalist saw the collection in 1989. It was passed off to British scientists soon after (and now lives at the Natural History Museum in London), but wasn’t recognized as a record-holder until 2008. It’s a testament to the insect’s camouflage abilities that it took so long for it to be discovered; Chan’s Megastick likely lives high up in the forest canopy, easily blending in as, well, a very large stick.

While it’s the longest insect recorded, it’s not alone in its giganticness. The previous record-holder, also a stick bug from Borneo, was less than an inch shorter. Currently in second place is a 21-inch stick bug discovered by Belgian entomologists at Vietnam’s Tay Yen Tu Nature Reserve in 2014.

Close-up of a ladybug on a plant.
Credit: Diana Parkhouse/ Unsplash

One Ladybug Can Eat 75 Insects Per Day

Lady beetles may be one of the most adorable insect species on the planet, but they’re also very effective predators. A single adult ladybug can eat up to 75 insects a day (up to 5,000 in its lifetime), and during its two-week larval stage, each one eats around 350 to 400 more.

Their absolute favorite food is aphids, a common garden pest that, in large numbers, can spread disease and cause major damage to plants (while also attracting droves of ants, which farm aphids for their sugary excretions). Ladybugs will also eat other pests such as fruit flies, mites, and thrips. Because of this, they’re one of the more common “beneficial insects” used by gardeners as natural pest control.

luna moth on a lilac.
Credit: Kevin Collison/ Shutterstock

Luna Moths Have No Mouths

Luna moths can be stunning creatures, instantly recognizable for their wide span of pale green, almost iridescent wings. What’s not quite as obvious is that they have no mouth, and no digestive system, either.

As caterpillars, they eat ravenously, spending a month munching on leaves before building their cocoon, where they spend three weeks. In their adult stage, they rely entirely on the energy stores they built up as caterpillars, and they live for only about a week. During this time, their top priority is mating — although tricking bats out of eating them is a close second.

Portrait of a butterfly on a butterfly-bush against a clear blue sky.
Credit: Nick Biemans/ Shutterstock

Painted Lady Butterflies Can Travel 7,500 Miles in a Single Migration

Painted lady butterflies (Vanessa cardui), sometimes known as cosmopolitan or thistle butterflies, can be found all over the world — and each year, their populations travel an impressive distance. In the spring, they fly northward to Europe, and in the late summer, they start their journey back down to sub-Saharan Africa. The whole journey is around 7,500 miles round-trip, and involves crossing both the Sahara and the Mediterranean Sea. Like the similar but not-quite-as-long migration of monarch butterflies, the trip occurs over several butterfly generations, although the occasional extra-sturdy bug stays alive for the whole return trip.

The American lady (Vanessa virginiensis), a similar species of butterfly that’s also known as the American painted lady, travels impressive distances as well, sometimes overwintering in the American South and traveling well into Canada during warmer months. On the West Coast, they’re known to travel from western Mexican deserts all the way up into the Pacific Northwest.

Little girl has skin rash allergy and itchy on her arm from mosquito bite.
Credit: Kwangmoozaa/ Shutterstock

Mosquitoes Are the World’s Deadliest Animal

Which animal counts as the most dangerous in the world depends on which metric you’re using, but going by pure annual body count, mosquitoes win by a large margin. By transmitting severe diseases such as malaria, dengue fever, Zika virus, and West Nile virus as they feed on human blood, the tiny pests are responsible for around 725,000 deaths each year. Certain mosquito species even prefer humans to other animals, and unsurprisingly, those are the species that tend to spread diseases affecting humans.

It’s not just semantics: Mosquitoes have been called the world’s deadliest animal by both the CDC and the Gates Foundation. Some argue that mosquitoes should be disqualified from the list because they don’t attack humans, per se — they don’t turn to deliberate violence because of a perceived threat, and technically it’s the pathogens they carry that do the killing — but the issue of mosquito culpability is perhaps more of a philosophical quibble.

Sarah Anne Lloyd
Writer

Sarah Anne Lloyd is a freelance writer whose work covers a bit of everything, including politics, design, the environment, and yoga. Her work has appeared in Curbed, the Seattle Times, the Stranger, the Verge, and others.

Original photo by stellalevi/ iStock

Alcatraz Island, known colloquially as “The Rock,” was once the most notorious prison in the United States. Located 1.25 miles offshore from San Francisco, the island saw Civil War prisoners in the 1860s, mob bosses in the 1930s, and much more. Today, it’s one of the Bay Area’s most popular tourist attractions, and an on-island museum tells the story of the prison’s past. These seven facts span the many ages of Alcatraz and reveal how it became one of the most infamous sites in American history.

Landing pelican with extended wings and mountains in the background.
Credit: Sebastian Jakimczuk/ iStock

The Word “Alcatraz” Means “Pelican” in Archaic Spanish

In 1775, Spanish explorer Juan Manuel de Ayala became the first European to sail into San Francisco Bay. He named the bay and its islands, including one he called “Alcatraces.” Although the island’s name was anglicized over the decades, the original is widely believed to mean “pelican” or “strange bird.” The island was once a particular hot spot for California brown pelicans (Pelecanus occidentalis californicus), which were so plentiful in the 19th century that one French observer noted that a group of pelicans taking flight created winds like a hurricane. Although the birds’ numbers dwindled sharply due to hunting and the use of DDT over the decades, the species rebounded in the latter part of the 20th century and was removed from the Endangered Species List in 2009.

Military ship at Alcatraz.
Credit: CHROMORANGE / Bastian Kienitz/ Alamy Stock Photo

Before Becoming a Prison, Alcatraz Was a Military Outpost

Although Alcatraz is known as one of America’s most infamous prisons, its first official U.S. role was as a military outpost. With California joining the U.S. in 1850, two years after being ceded by Mexico, and with hundreds of thousands of people flooding the state as part of the California Gold Rush, the U.S. military needed to protect San Francisco Bay. Alcatraz, along with Fort Point and Lime Point, formed a “triangle of defense” that guarded the bay’s entrance. At one point, the U.S. even installed 100 cannons on the 22-acre island, making it the most heavily armed military outpost in the Western U.S. But by the end of the 1850s, the first prisoners had been brought to the island, and Alcatraz played host to both Confederate prisoners and Union deserters during the Civil War.

Ruins of the Warden's House stand beside Alcatraz Island Lighthouse.
Credit: Robert Alexander/ Archive Photos via Getty Images

Alcatraz Was Home to the First Lighthouse on the U.S. West Coast

During the island’s days as a military outpost, the U.S. constructed a lighthouse to serve vessels crisscrossing the busy shipping lanes of San Francisco Bay. Although the tower was built by 1852, the Fresnel lens — a compact lens designed to make lighthouses brighter — didn’t arrive until 1854. Luckily, the delay didn’t cost Alcatraz the impressive accolade of hosting the first lighthouse constructed on the West Coast of the United States. The original structure was damaged beyond repair following the catastrophic 1906 San Francisco earthquake, but a replacement was built in 1909 and still operates to this day.

View of a long corridor inside a cell block at Alcatraz penitentiary.
Credit: Handout/ Archive Photos via Getty Images

Prison Life at Alcatraz Wasn’t Always Bad

Alcatraz became a federal prison in 1934, after being transferred to the U.S. Department of Justice and the Federal Bureau of Prisons. It was designed as a maximum security penitentiary meant for the most difficult inmates in the federal system, and was partly an attempt to show the public that the government was being tough on the widespread crime of the 1920s and ’30s.

Although Alcatraz cut an intimidating figure, some prisoners reported that the experience wasn’t so bad. The first warden of Alcatraz made sure the food was good to dissuade rioting, and a menu in the 1940s even included “bacon jambalaya, pork roast with all the trimmings, or beef pot pie Anglaise.” Prisoners lived one man to a cell, which wasn’t a certainty in other federal prisons, and had basic rights to food, shelter, clothing, and medical care. Through good behavior, prisoners could earn privileges that included work on the island and even playing music. In fact, conditions at Alcatraz far surpassed those at some other federal prisons, and occasionally inmates around the country even requested transfers to “The Rock.”

Gangster Al Capone wearing an overcoat in Chicago.
Credit: Bettmann via Getty Images

Al Capone Wrote Love Songs While an Inmate at Alcatraz

Arguably the prison’s most famous inmate was Al Capone, who was known at Alcatraz as Prisoner 85. Although a ruthless mob leader who ran the Italian American organized crime syndicate known as the Chicago Outfit, Scarface was finally put behind bars for tax evasion in 1931. In a few instances, he resorted to violence when provoked, but he mostly spent time playing banjo in the prison band the Rock Islanders, and writing love songs. In 2017, Capone’s handwritten lyrics to one song, titled “Humoresque,” sold at auction for $18,750. The lyrics included such memorable lines as “You thrill and fill this heart of mine, with gladness like a soothing symphony, over the air, you gently float, and in my soul, you strike a note.” Capone was eventually released from prison in November 1939, after more than seven years behind bars, by which time he was in ill health due to an untreated case of syphilis.

Aerial view of Alcatraz.
Credit: Chris Szwedo/iStock

No One Has Ever Escaped From Alcatraz (Probably)

Of the 14 escape attempts at Alcatraz, 13 ended in clear failure; the lone exception was a daring 1962 attempt (forever immortalized in the 1979 film Escape From Alcatraz) whose outcome remains unknown. On June 12, 1962, an early morning bed check at the prison revealed that three inmates were missing from their beds, and in a made-for-Hollywood twist, they’d been replaced by papier-mâché heads constructed in secret to fool the night guards.

After hacking together homemade life vests (an idea they got from the DIY magazine Popular Mechanics), the escapees took to the water and headed across the bay toward San Francisco. The FBI discovered the vests on Cronkhite Beach and found other bits of evidence (including letters sealed in rubber) scattered throughout the bay, but the authorities never found any sign of the men living in the U.S. or abroad, and concluded they most likely drowned in the bay’s frigid waters. The FBI closed the case on December 31, 1979, though the U.S. Marshals Service has continued to investigate.

Native Americans occupying Alcatraz Island.
Credit: Bettmann via Getty Images

Native Americans Occupied Alcatraz

One problem with running a prison on an island is that it can be pretty expensive to maintain, and so in March 1963, the century-old military outpost-turned-penitentiary closed its doors — but that wasn’t the end of its story.

In November 1969, a group of Native Americans led by activist Richard Oakes traveled to Alcatraz and began an occupation of the island that lasted 19 months. The group referenced the 1868 Treaty of Fort Laramie, which allowed Native people to repossess retired or abandoned federal land, as the basis for their seizure. They issued a proclamation that included a letter to the “Great White Father and All His People,” which highlighted the hypocrisy of the U.S. government’s treatment of Native Americans both past and present. Over the following months, the occupation grew in size to as many as 600 people, before numbers began to dwindle in January 1970. The government cut off electrical and water supplies to the island, food became scarce, and in June 1971 U.S. marshals forcibly removed the final 15 occupiers from the island.

A highly publicized moment of Indigenous activism, the protest brought considerable attention to the plight of America’s Native peoples. In 1970, President Richard Nixon even ended the U.S. government’s decades-long termination policy, an effort to dissolve tribal governments and forcibly assimilate Native Americans into mainstream society. The occupation of Alcatraz was a pioneering intertribal protest, and part of a rich history of modern Native American activism.

Darren Orf
Writer

Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.

Original photo by Photo 12/ Alamy Stock Photo

Upon its premiere in 1951, I Love Lucy became an immediate hit. Audiences were charmed by the sitcom, which revolved around Lucy’s harebrained schemes to enter show business, much to her husband Ricky’s chagrin. To this day, the show remains as iconic as ever, cementing Lucille Ball as a comedy legend and television pioneer. And though she may be best remembered for her wacky on-screen antics, fiery red hair, and larger-than-life comedic presence, Ball was equally influential for her work behind the scenes as the first female head of a Hollywood production company. Here are some little-known facts about the life of the legendary trailblazer.

Film still from Gone With the Wind.
Credit: United Archives/ Hulton Archive via Getty Images

Lucille Ball Auditioned for the Role of Scarlett O’Hara in “Gone With the Wind”

In 1939, Ball — along with 1,400 other hopefuls — auditioned for one of the most celebrated roles in Hollywood history. Her audition proved to be disastrous, however, as she showed up soaking wet and tipsy, the result of running through a rainstorm after having one too many drinks to ease her nerves. But that isn’t Ball’s only Gone With the Wind connection. In an ironic twist, she would go on to own many of the movie’s sets. In 1957, her production studio, Desilu Productions, purchased 33 soundstages (among other things) from RKO Pictures, including the exterior of the Tara plantation.

Actress Lucille Ball and her husband, actor Desi Arnaz, circa the 1950s.
Credit: Archive Photos via Getty Images

Lucy and Desi Were Television’s First Interracial Couple

When CBS first offered Ball an opportunity to star in a TV show based on her radio program “My Favorite Husband,” she agreed to do so under one condition: She insisted her real-life husband, Desi Arnaz, play her television spouse. The network initially refused, claiming that audiences wouldn’t be receptive to an interracial couple, especially given Arnaz’s thick Cuban accent. But Ball proved them wrong by embarking on a nationwide tour with Arnaz. The pair charmed crowds around the country with their vaudevillian act. Only then did CBS agree to cast the couple, since fans couldn’t get enough of the duo.

Lucille Ball & Desi Arnaz In 'I Love Lucy'.
Credit: Hulton Archive via Getty Images

Lucille Ball Was One of the First Women To Appear Pregnant on Network TV

Pregnant characters are commonplace now, but in the 1950s, Lucy’s television pregnancy was groundbreaking. Both CBS and the show’s sponsor, Philip Morris, were so concerned about airing this seemingly suggestive idea that they had the production studio work with various religious organizations to determine how to most sensitively express this supposedly controversial plot point. Ultimately, the producers agreed to avoid the word “pregnant,” going with the euphemism “expecting” (and similar terms) instead. The then-radical six-episode pregnancy arc paid off, as more than 44 million people tuned in on January 19, 1953, to see Lucy welcome her son Little Ricky. The episode, titled “Lucy Goes to the Hospital,” aired the same day Ball actually gave birth to Desi Arnaz Jr. via a planned cesarean section.

Lucy Almost Drowned Filming the Famous Grape Scene

Plenty of people are familiar with the classic grape-stomping episode of I Love Lucy. But not everyone knows that filming the scene proved dangerous. The Italian actress who appeared alongside Lucy spoke little English. She was given instructions to act out a fight via an interpreter, but the details may have gotten lost in translation. As Ball recounted on The Dick Cavett Show, “I got into the vat … and she had been told that we would have a fight. I slipped and, in slipping, I hit her accidentally and she took offense, until she hauled off and let me have it…. She’d get me down by the throat! I had grapes up my nose, in my ears, and she was choking me, and I’m really beating her to get her off… she didn’t understand that she had to let me up once in a while. I was drowning in these grapes!”

On the set of the TV series Star Trek.
Credit: Sunset Boulevard/ Corbis Historical via Getty Images

She Helped Get “Star Trek” on TV

As the first female head of a major Hollywood studio — Desilu Productions, which she co-founded with Arnaz and took sole control of in 1962, two years after their divorce — Ball helped produce some of the most influential television shows of all time. She was particularly instrumental in getting Star Trek on the air. There was apparently some trepidation among Desilu board members when it came to the budget of the ambitious series, leaving Ball to personally finance not one but two pilots of the science fiction mainstay. One studio accountant, Edwin “Ed” Holly, even claimed: “If it were not for Lucy, there would be no Star Trek today.” Lucille Ball truly allowed the show to live long and prosper.

A newspaper story about actress Lucille Ball's 1936 voting registration as a Communist.
Credit: Bettmann via Getty Images

She Had a Communist Past

During the 1950s, at the height of Senator Joseph McCarthy’s anti-Communist crusade, many celebrities faced accusations of Communist loyalties. Many had their reputations tarnished, but remarkably, Ball emerged unscathed despite being questioned by both the FBI and the House Committee on Un-American Activities in 1953. It turns out Ball had registered as a Communist when she signed up to vote in 1936. However, she only did so to pacify her grandfather and his political leanings. Desi Arnaz came fiercely to his wife’s defense, and even phoned FBI Director J. Edgar Hoover directly to clear her name. Arnaz once famously quipped, “The only thing red about her was her hair, and even that was not legitimate.”

American actress Lucille Ball (1911 - 1989) at the wheel of a white convertible, circa 1955.
Credit: Silver Screen Collection/ Moviepix via Getty Images

She Starred in 5 Different TV Shows

While she’s best known for her role as Lucy Ricardo in I Love Lucy, Ball starred in four other television shows over the course of her career. Following I Love Lucy came The Lucy-Desi Comedy Hour, which ran from 1957 to 1960. Then came The Lucy Show, which premiered in 1962 and ran for six seasons, reuniting her with former co-star Vivian Vance. Next was Here’s Lucy in 1968, a true family affair that starred her real-life children Lucie Arnaz and Desi Arnaz Jr. In 1986, she staged a TV comeback with Life With Lucy, one of the most poorly received sitcoms of all time, in which child star and future musician Jenny Lewis played her granddaughter. Only eight episodes made it to air before the show was canceled, a rare misstep in an otherwise illustrious career.

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.