In many countries, a baby’s first birthday marks a joyous milestone for parents, honoring the many months of sleepless nights and hard work involved in welcoming a new family member. But in some places — like South Korea — babies are already considered 1 year old at birth. Korean culture calculates age in three different ways, and the oldest and most traditional way (often called “Korean age”) may have gotten its start by accounting for the time spent in utero, rounding up a nine-month gestation to a full year.
“Happy Birthday to You” was originally written for kindergarteners.
Teacher Patty Smith Hill and her sister, pianist Mildred Jane Hill, are credited with composing the song, though its original words were “good morning to you,” welcoming students to class. The duo crafted the simple melody so it would be easy for young singers to learn.
Under this measurement, everyone gains another year of age on January 1, regardless of their actual birth date — meaning it’s possible for a baby born on December 31 to turn 2 years old the following day. Yet individual birthdays are still recorded and celebrated; in fact, South Korea has used the “international age” system that counts age by date of birth for medical and administrative purposes since 1962. A third age-counting method acts as a compromise between accuracy and culture: Babies are born at age 0, but gain a year on New Year’s Day.
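The three counting methods differ only in their starting age and in when a year is added. As a minimal sketch (the function names are hypothetical, and modern South Korea's Gregorian calendar dates are assumed), the arithmetic looks like this:

```python
from datetime import date

def international_age(birth: date, today: date) -> int:
    """Standard age: completed years since the date of birth."""
    age = today.year - birth.year
    # Subtract one if this year's birthday hasn't happened yet.
    if (today.month, today.day) < (birth.month, birth.day):
        age -= 1
    return age

def korean_age(birth: date, today: date) -> int:
    """Traditional 'Korean age': 1 at birth, plus 1 every January 1."""
    return today.year - birth.year + 1

def year_counting_age(birth: date, today: date) -> int:
    """Compromise method: 0 at birth, plus 1 every January 1."""
    return today.year - birth.year

# A baby born December 31 is already 2 the next day by Korean age,
# but still 0 by international age:
baby = date(2022, 12, 31)
print(korean_age(baby, date(2023, 1, 1)))         # 2
print(international_age(baby, date(2023, 1, 1)))  # 0
print(year_counting_age(baby, date(2023, 1, 1)))  # 1
```

Note how the traditional method never consults the month or day of birth at all — only the calendar year matters, which is exactly why a New Year's Eve baby "ages" twice in two days.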
Knowing someone’s age is culturally important in Korea; it’s tied to language, impacting how people address their elders and interact on social occasions. However, the traditional method of determining age does cause some confusion when it comes to administering medications, vaccinations, and health care procedures that are determined by one’s years, and has even fueled legal disputes. In December 2022, the South Korean government passed laws that standardized the use of international age, meaning many Koreans will technically become one to two years younger.
In Korean culture, blood types are used to determine compatibility.
Knowing your blood type is just as important as knowing your age in South Korea, where many people believe it can make or break a relationship. For nearly 100 years, Koreans have associated personality traits with blood types in the same way believers of astrology use birth dates to understand someone’s identity. People with Type A blood supposedly have a hard time trusting others but are highly creative, while Type Bs are known for being passionate and independent. People with AB blood types are categorized as rational introverts, while Type Os are often considered natural leaders. Many scientists say there’s no known link between a person’s blood type and their personality, though the idea has taken hold in Korean pop culture, featured as a plot point in books and movies.
Nicole Garner Meeker
Writer
Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.
top picks from the Inbox Studio network
Interesting Facts is part of Inbox Studio, an email-first media company. *Indicates a third-party property.
Most rats live their lives entirely unseen by humans. As kings of the background, they often scurry through human environments just out of sight or after dark, looking for leftover morsels. But researchers believe rats might be picking up more than just our food crumbs — they could also be picking up on the beats in our music. A study published in 2022 suggests rats may have a humanlike sense of rhythm, which they express by bopping their heads to the beat. Scientists once believed that few animals (aside from some birds) were beat-sensitive, but rats exposed to music made minuscule head movements that were picked up by tiny wireless motion detectors. The researchers hypothesized that rats would prefer faster jams thanks in part to their rapid heartbeats, though surprisingly, lab rats synced up best with music in the 120 to 140 beats-per-minute range — just like humans.
Magawa, a trained African giant pouched rat, sniffed out more than 100 explosives in Cambodia before dying in 2022. Five years of mine-detecting work landed him a medal, though he wasn’t the only rat with a job; an organization called APOPO has trained hundreds to detect land mines.
Humans have long wondered if animals respond to music the way we do. Charles Darwin examined the relationship between animals and music, believing rhythm could be found throughout nature and may have been a precursor to music. Today, some experts believe only certain species have the ability to really respond to changing beats — notably bats, birds, dolphins, and elephants, which all have the complex ability to learn and repeat new sounds. However, some studies show other animals interact with music, too; one experiment found that pigs exposed to certain music became playful and wagged their tails. Additionally, many farmers report their cows are calmer when the radio is on, with one supporting study reporting that dairy cows produce 3% more milk while listening to slow tunes (fast music had no effect). And when it comes to our best pet companions, music is known to soothe anxious dogs in shelters and adoption centers, though felines — known for being a bit finicky — couldn’t care less about human music. However, they do respond positively to tailored tunes that use beats and frequencies similar to their own meows.
Rats are found on every continent except Antarctica.
Rats are picky eaters.
If you’ve ever been hesitant about trying a new food, you have something in common with rats, which are known for being picky eaters. These discerning rodents are cautious for good reason — they’re unable to vomit, so avoiding potential poisoning is top of mind with every new food they find. Wild rats are known to test out new foods in small amounts, taking a few nibbles and waiting hours to see if they have any unfortunate side effects before diving into their scavenged meals. And just like humans, rats appear to develop more sophisticated palates as they age; younger rats seem to prefer sugary treats, though they eventually learn to enjoy more bitter flavors.
Nicole Garner Meeker
Writer
Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.
For centuries, getting around by horse and cart was the standard mode of transportation. By the 1800s, however, these hay-powered haulers were causing problems on busy city streets. As more people moved into cities, the number of horses dramatically increased, and with so many equines on the roads — New York City had around 150,000 horses in 1890 — public health concerns emerged over disease and mountains of manure. Horse travel, frankly put, was dirty compared to traveling by horseless carriage, aka the first electric vehicles. Marketed as clean, quiet, and easy to drive, early electric cars, which resembled traditional carriages, became so popular that by 1900 they accounted for around one-third of all automotive vehicles on roadways.
Ferdinand Porsche, founder of the luxury car brand, created the first hybrid vehicle, powered by both a gas engine and a battery. Dubbed “Semper Vivus” (“Forever Alive” in Latin), the car sported a 926-pound battery and was renamed “Mixte” when it hit the market in 1901.
The earliest known full-sized electric car was designed by Robert Anderson, a Scottish inventor who built his version in the 1830s, though that car (and many of its successors) didn’t go very far; at the time, batteries were rudimentary and couldn’t be recharged. It took about three decades for electric car batteries to improve, and starting in 1881, battery-operated buses began ferrying passengers in Paris, Berlin, London, and New York. A few years later, Iowa chemist William Morrison applied for a patent for his electric carriage, which could travel around 50 miles on one charge at a top speed of 20 miles per hour. By 1897, the top-selling car in the U.S. was powered by battery, though electric vehicles would hold the market for a relatively short time. By 1913, manufacturer Henry Ford had fine-tuned the mass production of gas-powered cars, dropping their price and helping to usher in a new era of private transportation.
The first vehicle Henry Ford built was called the “Quadricycle.”
Henry Ford’s wife, Clara, preferred driving electric cars.
While anyone of means could purchase an electric car at the turn of the 20th century, many models were marketed particularly to women as “ladies’ cars,” tied to a belief — however offensive — that they were easier to drive than steam- and gas-powered alternatives. Early advertisements appealed to social norms of the time, suggesting that women could attend to their errands and social events without dirtying their attire. The ads had an element of truth — electric cars didn’t produce fumes and were quieter than gas-powered vehicles. That’s part of the reason even Henry Ford’s wife, Clara, preferred to drive one. (Clara set about her business in a Detroit Electric car, and purchased a new model every two years.) Despite the gendered advertising, electric vehicles did offer women the freedom to travel without anyone’s help, and many high-profile women carried keys to their own battery-powered vehicles, including five first ladies: Helen Taft, Ellen Wilson, Edith Wilson, Florence Harding, and Grace Coolidge.
Interesting Facts
Editorial
Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.
Cracker Jack’s early marketing warned prospective customers about the effects of the product. “Do not taste it,” one 1896 article cautioned. “If you do, you will part with your money easy.” Some historians believe that the caramel-coated popcorn and peanut treat jump-started the American snack food industry around the turn of the 20th century. It may even hold the title of the country’s first junk food, though the types of junk food popular today didn’t make their appearances until the 1950s. It all started with Chicago candy and popcorn peddlers Frederick and Louis Rueckheim, German immigrants who crafted a nonsticky caramelized popcorn as a way to stand out from other popcorn vendors. Their version — with a sweet, crunchy coating that was different from the salted popcorn and kettle corn available at the time — became a hit after it was mass-produced in 1896.
The “Take Me Out to the Ball Game” songwriter had never seen a baseball game.
Jack Norworth’s most famous tune was scrawled on an envelope during a subway ride in 1908, though the writer wasn’t on his way to a baseball game — he’d never even been to a professional one. Music publisher Albert von Tilzer, who penned the melody, also hadn’t seen a game.
It was a song, however, that helped cement Cracker Jack’s snack status. In 1908, songwriter Jack Norworth — entirely unknown to the Rueckheims — composed “Take Me Out to the Ball Game” after seeing an advertisement for an upcoming game. The song, which mentions the snack by name, led to a surge in sales that forever linked Cracker Jack with sports. Four years later, the Rueckheims sweetened their popcorn business with a marketing gimmick that would eventually be replicated by cereal brands, fast-food restaurants, and candymakers for decades to come: a toy in every box. By 1916, Cracker Jack was the bestselling snack worldwide.
Before Sailor Jack, Cracker Jack’s original mascots were teddy bears.
Popcorn was once banned in movie theaters.
It may feel like popcorn and movies have always gone hand in hand — except that at one time, they were a contested combo. Americans experienced a 19th-century popcorn boom; by 1848, it was common fare at circuses and street fairs. But when movie theaters emerged in the early 1900s, they tried to align themselves with upscale stage theaters, creating lavish interiors with fine carpets and furniture that could be dirtied by patrons’ snacks. Plus, theater owners believed popcorn munching would be too distracting at a time when filmgoers were focused on reading silent film intertitles. (Neither argument seemed to deter theatergoers, who snuck in snacks anyway.) Everything changed when the Great Depression hit; by the 1930s, theaters were seeing as many as 90 million visitors per week, and serving inexpensive popcorn was one way to generate extra money in a rough economy. That change worked in favor of big-screen owners (and popcorn producers): By 1945, more than half of the popcorn eaten in the U.S. was sold at movie theaters.
Nicole Garner Meeker
Writer
Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.
The jaws of a crocodile are a marvel of evolution. With a second jaw joint unlike anything found in mammals, a crocodile can spread the force of its tremendous bite throughout its mouth. In fact, crocodiles have the most powerful chomp in the animal kingdom, at 3,700 pounds per square inch for a saltwater crocodile — 30 times the force of a human bite. But that’s not the only interesting thing about a crocodile’s mouth: Its tongue can’t get caught between those devastating jaws, because it’s permanently rooted to the floor of the mouth. The tongue is also held in place by a membrane attached to the roof at the back of the mouth, which keeps the throat closed when the animal is submerged.
Crocodiles are found on every habitable continent except Europe. As the planet started cooling some 50 million years ago, cold-blooded crocodiles retreated to warmer climates, leaving Europe (and also Antarctica) without any native crocodilians.
A crocodile’s immobile tongue isn’t a new trait — its most famous ancient relative, the Tyrannosaurus rex, also couldn’t move its tongue (a fact Jurassic Park got very wrong). Researchers in 2018 compared the T. rex’s hyoid bones, the bones responsible for supporting the tongue, to those of modern birds and alligators, and found that the dinosaur’s tongue was rooted in place much like those of modern crocodilians. The king of dinosaurs likely had an immovable tongue for similar reasons: With a bite that delivered an estimated 12,800 pounds of force — more than three times that of even the crocodile — T. rex anatomy kept crucial body parts (i.e., the tongue) out of the way of the most powerful bite to ever walk the Earth.
Giraffe tongues are unique in the animal kingdom because they’re prehensile.
Crocodiles actually do cry “crocodile tears.”
When someone is feigning sadness, they’re sometimes said to be “crying crocodile tears.” The phrase linking crocodiles to teary-eyed displays has appeared in literature for centuries. One of its earliest mentions appears in The Voyage and Travels of Sir John Mandeville, published in the 14th century, which says, “these serpents slay men, and they eat them weeping.” Even William Shakespeare makes note of crocodile tears in Othello. Crocodiles do “cry,” but it’s mainly to keep their eyes lubricated if they’ve been out of water for long periods. In 2007, a zoologist from the University of Florida also documented crocodiles weeping while eating, and theorized that the tears come from forced airflow (from a croc’s copious hissing and huffing), which in turn affects the reptile’s tear glands.
Darren Orf
Writer
Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.
The history of technology is filled with happy accidents. Penicillin, Popsicles, and Velcro? All accidents. But perhaps the scientific stroke of luck that most influences our day-to-day domestic life is the invention of the microwave oven. Today, 90% of American homes have a microwave, according to the U.S. Bureau of Labor Statistics, but before World War II, no such device — or even an inkling of one — existed.
Chocolate originated in what is now modern-day Belgium.
Although Belgium is now known for chocolate, the story of the sweet stuff begins in Mexico. Around 1900 BCE, the Mokaya, a pre-Olmec civilization, was likely the first to turn the cacao plant into chocolate. However, they consumed it as a drink — chocolate bars arrived much later.
During the war, Allied forces gained a significant tactical advantage by deploying the world’s first true radar system. The success of this system increased research into microwaves and the magnetrons (a type of electron tube) that generate them. One day circa 1946, Percy Spencer, an engineer and all-around magnetron expert, was working at the aerospace and defense company Raytheon when he stepped in front of an active radar set. To his surprise, microwaves produced from the radar melted a chocolate bar (or by some accounts, a peanut cluster bar) in his pocket. After getting over his shock — and presumably cleaning up — and then conducting a few more experiments using eggs and popcorn kernels, Spencer realized that microwaves could be used to cook a variety of foods. Raytheon patented the invention a short time later, and by 1947, the company had released its first microwave. It took decades for the technology to improve, and prices to drop, before microwaves were affordable for the average consumer, but soon enough they grew into one of the most ubiquitous appliances in today’s kitchens.
The first microwave oven was called the RadaRange and weighed 750 pounds.
The discovery of evidence for the Big Bang was also an accident.
In 1964 at Bell Labs outside Holmdel, New Jersey, radio astronomers Arno Penzias and Robert Wilson were frustrated with their antenna. The sensitive equipment was picking up a persistent buzzing noise that the pair first thought might be coming from the machine itself, nearby New York City, or even pigeons nesting in the antenna. However, once every explanation appeared to be accounted for, the two astronomers still detected the hum no matter where they pointed the antenna in the sky. After speaking with astronomers at Princeton University, the duo realized that they had actually detected the cosmic microwave background, which is leftover radiation from the Big Bang and evidence for the very beginning of the universe some 13.8 billion years ago. Fourteen years later, Penzias and Wilson were awarded the Nobel Prize in Physics for their groundbreaking — and serendipitous — discovery.
Darren Orf
Writer
Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.
Run your fingers along the edge of a dime and you’ll notice the coin has a crimped edge. If you were to count those tiny grooves and valleys, you’d find 118 ridges, one fewer than on a quarter. But not all American coins have ridges, and here’s why. Rippled edges on larger-denomination coins have been a part of American currency since the U.S. Mint’s early days, and they were a clever solution to a massive currency conundrum: counterfeiting and fraud. In the 1700s, coins were an easy target for money-generating schemes, including coin clipping: People would clip or shave off slim portions of a coin’s outer edge, cashing in the scraps of precious silver and gold. In Great Britain, coin clipping was so common that the crown deemed it a form of treason. In early North American settlements, “coining” — the actual production of fake coins — was equally problematic.
Quarters have always been the largest American coin.
Pennies first began circulating in March 1793, though many Americans were disgruntled with their initial design. The supersized coins were larger than today’s quarters, and some believed the Lady Liberty depicted looked inelegant. The coins were soon discontinued.
Knowing this, American coin makers added grooved bands — called reeded edges — to the thin sides of dimes, quarters, and larger coins then made from silver, in order to prevent shaving and make fraud more difficult. (Pennies and nickels remained smooth-sided since they were pressed from less-valuable copper and nickel.) But reeded edges lost some of their utility when the Coinage Act of 1965 changed the composition of dimes and quarters from silver to a copper-nickel blend. Reeded edges remain today as a design choice, and because they help people with visual impairments differentiate among coins.
Collecting and studying coins and other forms of money is called numismatics.
Coins were once commonly carved into “love tokens.”
Not all coins were saved or spent — in centuries past, some became tiny testaments to love. Love tokens, an idea that likely originated in Great Britain around the 13th century, were crafted by sanding away the faces of coins and using the precious metal as a blank canvas, which was then often engraved with memorable dates, a loved one’s initials, or romantic sentiments. After spreading to North America, love tokens reached peak popularity in the late 1800s, in part because they were used to memorialize family members lost during the Civil War. While they were initially etched by hand, professional carvers were sought out for more intricate designs, and the trend became so popular that crafters even set up booths at the 1893 world’s fair in Chicago. Many surviving love tokens are difficult to appraise or date because of their lack of detail and highly personal meaning, yet their greatest value may be the reminder that love can withstand the test of time.
Nicole Garner Meeker
Writer
Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.
You know your head, shoulders, knees, and toes (knees and toes), but has anyone ever introduced you to the glabella? This isn’t some hidden-away body part like the back of the elbow or something spleen-adjacent — it’s smack dab in the middle of your face. Latin for “smooth, hairless, bald,” the glabella is the small patch of skull nestled between your two superciliary arches (also known as your eyebrow ridges). Many people know of the glabella because of the wrinkles, or “glabellar lines,” that can appear in the area.
Eyebrows are a uniquely human feature. As the forehead ridge of Homo sapiens receded to make room for our brain, eyebrows developed to keep moisture out of our eyes while also blocking sunlight.
Although smooth and hairless today, the glabella wasn’t always so. Our human ancestors, including Neanderthals, instead sported formidable brow ridges that likely evolved to display social dominance. As the brain of Homo sapiens grew, this brow receded until only the smallest of ridges survived — along with the smooth bit of bone in between. But the fortunes of this little piece of anatomical real estate weren’t just tied to evolution. Women in ancient Greece saw the unibrow as a beautiful feature, so much so that they’d paint soot on their glabellas to form a faux unibrow. Throughout the following centuries, fashion’s notion of the ideal eyebrow changed, but the glabella remained more or less true to its smooth, hairless name. Unless Frida Kahlo’s famous unibrow becomes a modern fashion trend, it’ll likely stay that way.
In Japan’s Heian era (794 to 1185 CE), some women removed their eyebrows and repainted them with ink.
Eyebrows played a crucial role in the rise of Homo sapiens.
Eyebrows are an often-overlooked asset of human beauty. Folks write poetry about gorgeous eyes and ballads about beautiful smiles, but eyebrows, while certainly an obsessed-over feature in modern beauty trends, rarely receive as much adoration. Yet according to anthropologists, the fuzzy caterpillars on our foreheads are vital to the survival of our species. A study in 2018 found that eyebrows figure prominently in human social interaction and helped early humans form large, complex social groups. One of these interactions occurs when people see each other at a distance — in that situation, people unconsciously raise their eyebrows in a way that apparently shows they’re not a threat. Eyebrows are similarly raised toward the middle to signal sympathy, and their micro-movements can also play a key role in expressing trustworthiness or deception. With this ability to convey subtle emotions in only an “eyebrow flash,” humans formed larger and more diverse social groups on our journey toward becoming the dominant animal on the planet.
Darren Orf
Writer
Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.
The wheel is credited as one of humankind’s most important inventions: It allowed people to travel farther on land than ever before, irrigate crops, and spin fibers, among other key benefits. Today, we often consider the wheel to be the ultimate civilization game-changer, but it turns out that creating the multipurpose apparatus wasn’t really on humanity’s immediate to-do list. Our ancient ancestors worked on other ideas first: boats, musical instruments, glue, and even alcohol. The oldest evidence of booze comes from China, where archaeologists have unearthed 9,000-year-old pottery coated with beer residue; in contrast, early wheels didn’t appear until around 3500 BCE (some 3,500 years later), in what is now Iraq. But even when humans began using wheels, they had a different application — rudimentary versions were commonly used as potter’s wheels, a necessity for mass-producing vessels that could store batches of brew (among other things).
Vodka, the potato-based alcohol, is popular, but baijiu, a traditional Chinese liquor, accounts for nearly 25% of all spirits sold globally. Brewed from sorghum or rice, baijiu (meaning “clear liquor”) has an alcohol content of 50% to 65% and is meant to be imbibed as a shot.
Some researchers believe our long-standing relationship with alcohol began 10 million years ago thanks to a genetic mutation that allowed our bipedal ancestors to consume overly ripe fruit. Alcohol consumption eventually transitioned from a snack-time byproduct to a purposefully crafted, fermented beverage, and different cultures began to create their own brews independently. After China’s beer and wine appeared around 7000 BCE, early vintners in the Caucasus Mountains followed 1,000 years later. Sumerian brewers crafted beer around 3000 BCE, while Indigenous communities in the Americas, such as the Aztecs and Incas, later made their own alcoholic drinks from agave and corn. It may seem surprising that ancient humans were so fermentation-focused, but early alcohols played a major role in prehistoric communities: Booze was often the center of religious and social celebrations, and could serve as a go-to cure for illness and pain. In some cases, it even acted as a nutritious, probiotic boost during times of food scarcity. With their many uses, both lifesaving and life-enhancing, brewed beverages have withstood the test of time.
The world’s oldest wooden wheel was uncovered from a marsh in Slovenia.
It takes eight years to grow agave plants for tequila.
When European colonists first encountered Mexico’s native agave plants, they were intrigued by the succulents the Aztecs had been using to make clothing, rope, and intoxicating drinks. The spike-tipped plants, which grow as tall as 20 feet, were dug up and transplanted to greenhouses and botanical gardens throughout Spain, Portugal, and other parts of Europe starting in the 16th century. But most agave plants struggled to flourish in areas lacking their natural arid climate; in cooler countries, they were dubbed “century plants,” because those that survived the overseas journey didn’t bloom for nearly 100 years. Agave plants mature much faster when left in their natural habitats, but growing the crop for today’s tequila production is still a time investment. It traditionally takes about eight years before the plants are ready to harvest, though some agave crops are left to grow even longer.
Nicole Garner Meeker
Writer
Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.
Sandwiched along the border of Zambia and Zimbabwe in Africa is one of the greatest natural wonders of the world (literally). Nearly twice as tall as Niagara Falls, Victoria Falls plunges some 350 feet down into the basin of a vast gorge below. The steady stream of water is so powerful that it’s created a rainforestlike microclimate: Its voluminous spray continuously blankets the surrounding area. Although the world largely knows this wonder as Victoria Falls (after Queen Victoria, naturally), locals have traditionally called it Mosi-oa-Tunya, meaning “The Smoke that Thunders.” The name is arguably a better fit, as the “thunder” of this massive waterfall can be heard from 25 miles away, and its “smoke” (aka water plume) can be seen even farther.
Victoria Falls is the widest waterfall in the world.
Victoria Falls isn’t the highest waterfall (that’s Venezuela’s Angel Falls) or the widest, an accolade that belongs to the Khone Phapheng waterfall in Laos. Khone Phapheng stretches more than 6 miles wide, whereas Victoria Falls is only a little more than a mile wide.
Victoria Falls is actually several waterfalls in one. On the Zimbabwe side, there’s Devil’s Cataract, Main Falls, Rainbow Falls, and Horseshoe Falls, and on the Zambia side lies the Eastern Cataract. If you want to see the waterfall at its most dramatic, visit between February and May, when the summer rains — and by extension Victoria Falls itself — are at their highest volume. However, other times of year have plenty to offer, too. Between mid-August and mid-December, daring adventurers can take a dip in Devil’s Pool, a swimming hole that brings those unburdened by acrophobia — fear of heights — to the very edge of the falls’ dizzying plunge.
Victoria Falls is part of the Zambezi River, the fourth-largest river in Africa.
Another amazing water feature lies on the border of Zimbabwe and Zambia.
Some 150 miles northeast of Victoria Falls is yet another impressive body of water on the border of Zimbabwe and Zambia — an especially notable feat for two landlocked countries. Filled between 1958 and 1963, Lake Kariba is the largest reservoir by volume in the world, stretching a staggering 139 miles long and 25 miles wide. The reservoir was created by the Kariba Dam, the largest dam in Africa, whose power plants provide hydroelectric energy for both Zimbabwe and Zambia; Zimbabwe, for example, receives 57% of its annual electricity from the dam. In addition to providing much-needed power, the lake is also a tourist destination for both countries.
Interesting Facts
Editorial
Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.