Tiny, hidden survival tools packed into the waistband of your pants may sound like something fantastical from a spy movie, but in the case of British wartime pilots, they were a reality. During World War II, the Royal Air Force sent its aviators skyward with all the tools they’d need to complete a mission, along with a few that could help them find their way home if they crash-landed behind enemy lines. One of the smallest pieces of survival gear pilots carried was a compass built into the button of their trousers.
Britain was the first country to create an air force.
Established in April 1918, seven months before the end of World War I, Great Britain’s Royal Air Force was the first of its kind. The sky-patrolling force launched with 300,000 service members nearly three decades before the U.S. designated its own Air Force branch in 1947.
Three months after entering World War II, the British military launched its MI9 division, a secret intelligence department tasked with helping service members evade enemy forces or escape capture. Between 1939 and 1945, masterminds at MI9 created a variety of intelligence-gathering and survival tools for troops, such as uniform-camouflaging dye shaped like candy almonds, ultra-compressed medications packed inside pens, and button compasses. The discreet navigational tools were typically made from two buttons, the bottom one featuring a tiny needlelike spike. When balanced on that spike, the magnetized top button acted as a compass, swinging to align with Earth’s magnetic field; two dots of luminous paint on the metal indicated north, and one indicated south.
MI9 distributed more than 2.3 million of its button compasses during the war. They could be paired with secretive maps that were smuggled to captured service members inside care packages delivered to prisoner-of-war camps. Often printed on silk for durability and waterproofing, the 44 different maps (sent to different camps based on location) were tucked discreetly into boot heels and board games. The ingenuity worked — by the war’s end, MI9 was credited with helping more than 35,000 Allied soldiers escape and make their way home.
Historians believe the first magnetic compasses were invented in China.
Some American colonists were banned from wearing fancy buttons.
Buttons can be an innocuous way to add panache to a piece of clothing … unless you were a colonist living in Massachusetts during the 17th century, that is. Choosing the wrong type of buttons for your garment during that time could have landed you in court and required paying a fine. Puritans in Massachusetts during the 1600s were ruled by a series of sumptuary laws, aka legal codes that restricted how people dressed and interacted with society based on moral or religious grounds. Massachusetts passed its first sumptuary law in 1634, prohibiting long hair and “new fashions” (aka overly swanky clothes), and five years later even banned short-sleeved garments. By 1652, the colony further restricted lower-wage earners from wearing “extravagant” accessories such as silks, fine laces, and gold or silver buttons, unless they had an estate valued at more than 200 British pounds — more than $38,000 in today’s dollars. However, the law did include some loopholes: Magistrates, public officers, and militia members (and their families) were free to choose buttons and other adornments without fear of penalty, as were the formerly wealthy and those who had advanced education.
Nicole Garner Meeker
Writer
Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.
top picks from the Inbox Studio network
Interesting Facts is part of Inbox Studio, which publishes content that uplifts, informs, and inspires.
Uniforms convey a sense of competency across professions ranging from delivery person and airline staff to chef and firefighter. The psychological implications may be even stronger when it comes to matters of health: According to one study published in the medical journal BMJ Open, doctors who don the traditional white coat are perceived as more trustworthy, knowledgeable, and approachable than those who minister to patients in scrubs or casual business wear.
Physicians at the acclaimed Mayo Clinic do not wear white coats.
The Mayo Clinic's founders felt that the white coat would create an unnecessary separation between doctor and patient, and thus established a tradition in which physicians wear business attire to demonstrate respect for the people they serve.
The 2015-16 study drew from a questionnaire presented to more than 4,000 patients across 10 U.S. academic medical centers. Asked to rate their impressions of doctors pictured in various modes of dress, participants delivered answers that varied depending on their age and the context of proposed medical care. For example, patients preferred their doctors to wear a white coat atop formal attire in a physician's office, but favored scrubs in an emergency or surgical setting. Additionally, younger respondents were generally more accepting of scrubs in a hospital environment. Regardless, the presence of the white coat rated highly across the board — seemingly a clear signal to medical professionals on how to inspire maximum comfort and confidence from their patients.
Yet the issue of appropriate dress for doctors isn't as cut and dried as it seems, as decades of research have shown that those empowering white coats are more likely to harbor microbes that could be problematic in a health care setting. In part that’s because the garments are long-sleeved, which offers more surface area for microbes to gather — a problem that’s compounded because the coats are generally washed less often than other types of clothing. Although no definitive link between the long-sleeved coats and actual higher rates of pathogen transmission has been established, some programs, including the VCU School of Medicine in Virginia, have embraced a bare-below-the-elbows (BBE) dress code to minimize such problems. Clothes may make the man (or woman), but when it comes to patient safety, the general public may want to reassess their idea of how our health care saviors should appear.
The term for anxiety-induced high blood-pressure readings in a doctor's office is “white coat syndrome.”
Western doctors dressed in black until the late 1800s.
If the idea of a physician or surgeon wearing black seems a little morbid, well, that may have been part of the point in the 19th century. After all, the medical field had more than its share of undertrained practitioners who relied on sketchy procedures such as bloodletting, and even the work of a competent doctor could lead to lethal complications. However, Joseph Lister’s introduction of antisepsis in the 1860s dramatically cut the mortality rate for surgical patients, and with it, the perception of the possibilities of medicine underwent a major shift. While black had once been worn to denote seriousness, doctors began wearing white lab coats like scientists to demonstrate their devotion to science-based methodology, a sartorial presentation that also reflected an association with cleanliness and purity. By the turn of the century, the image of the black-clad physician was largely consigned to the remnants of an unenlightened age.
Interesting Facts
Editorial
Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.
April Fools’ Day is a time when dubious pranksters run rampant, so it’s wise to be a little more skeptical of everything you see and hear on April 1. But when the clock strikes midnight and the calendar flips over to April 2, we celebrate the polar opposite of trickery and deception: International Fact-Checking Day. This global initiative is spearheaded by the Poynter Institute’s International Fact-Checking Network (IFCN) — a community established in 2015 to fight misinformation and promote factual integrity. IFCN works with more than 170 fact-checking organizations to encourage truth-based public discourse, debunk myths, and ensure journalistic accountability.
Taco Bell once claimed to have purchased the Liberty Bell.
On April 1, 1996, Taco Bell pulled off a legendary April Fools’ prank, claiming it had purchased the Liberty Bell and renamed it the “Taco Liberty Bell.” Full-page ads ran in seven major newspapers, leading to hundreds of Americans calling the National Park Service to express outrage.
The organization held the first official International Fact-Checking Day in 2016 as part of an effort to highlight the importance of fact-checkers’ role in a well-informed society. The occasion has been celebrated on April 2 every year since, standing in stark contrast to the prior day’s foolishness. Celebrants may choose to honor the holiday however they please, be it by promoting trusted sources on social media or by taking advantage of the events and information produced by IFCN. In past years, the organization has held a public webinar to discuss the “State of the Fact-Checkers” report, and published lists of relevant essays written by fact-checkers around the world. Be sure to capitalize on this opportunity to get your facts straight, as April 4 ushers in yet another day of duplicity: National Tell a Lie Day.
A 1957 BBC April Fools’ broadcast pranked viewers into thinking spaghetti grows on trees.
The Spanish-American War was fueled by “yellow journalism.”
In the late 19th century, the American press found itself in the grip of a phenomenon known as “yellow journalism” — a sensationalized form of reporting that prioritized eye-catching headlines ahead of the cold, hard facts. These unverified claims sometimes had serious consequences, most notably in the case of the Spanish-American War.
On February 15, 1898, the USS Maine battleship exploded and sank in Havana Harbor in Cuba (a country controlled by Spain at the time). Within days, major newspapers including William Randolph Hearst’s New York Journal and Joseph Pulitzer’s New York World published accusations that Spain was responsible for the sinking, despite a lack of evidence. But the exaggerated headlines still swayed public opinion, fueling a desire to go to war.
Tensions escalated to the point that on April 20, the U.S. Congress issued an ultimatum for Spain to withdraw from Cuba, which Spain declined to do, opting to sever diplomatic ties with the U.S. instead. Spain declared war on the U.S. on April 23, with Congress following suit two days later.
Bennett Kleinman
Staff Writer
Bennett Kleinman is a New York City-based staff writer for Inbox Studio, and previously contributed to television programs such as "Late Show With David Letterman" and "Impractical Jokers." Bennett is also a devoted New York Yankees and New Jersey Devils fan, and thinks plain seltzer is the best drink ever invented.
What do you call someone who’s fallen for a prank? There’s no punchline here — in most English-speaking places, you’d probably just call them gullible. But in France, you might use the term poisson d’avril, meaning “April fish.” The centuries-old name is linked to a 1508 poem by Renaissance composer and writer Eloy d’Amerval, who used the phrase to describe the springtime spawn of fish as the easiest to catch; young and hungry April fish were considered more susceptible to hooks than older fish swimming around at other times of year. Today, celebrating “April fish” in France — as well as Belgium, Canada, and Italy — is akin to April Fools’ Day elsewhere, complete with pranks; one popular form of foolery includes taping paper fish on the backs of the unsuspecting.
April Fools’ celebrations last two days in Scotland.
In Scotland, April Fools’ is also called “Huntigowk Day,” when “gowks” (the Scots word for cuckoo birds, and slang for fools) are sent on phony errands. The second day of celebrations, called Tailie Day, is a bit more mischievous: Tails or “kick me” notes are placed on people’s backsides.
While the first reference to poisson d’avril comes from d’Amerval’s poem, historians aren’t sure just how old the April Fools’ holiday is. It’s often linked to Hilaria, a festival celebrated by the ancient Romans and held at the end of March to commemorate the resurrection of the god Attis. However, many historians believe that while Hilaria participants would disguise themselves and imitate others, there’s little evidence that it’s the predecessor of April Fools’. Other theories suggest that April 1 trickery stems from switching to the Gregorian calendar. One such explanation dates to 1564, the year French King Charles IX moved to standardize January 1 as the start of the new year, which had often been celebrated on Christmas, Easter, or during Holy Week (the seven days before Easter). Despite the royal edict, some French people kept with the Holy Week tradition and celebrated the new year in late March to early April, becoming the first “April fools.”
An 1878 newspaper hoax reported Thomas Edison’s newest invention could turn dirt into food and wine.
The BBC once claimed spaghetti noodles grew on trees.
The most convincing April Fools’ pranks often come from the most unexpected sources, which could be why the BBC has a history of successful hoaxes. This includes a 1957 joke, considered to be one of the first April Fools’ TV pranks, wherein the British broadcaster aired a two-and-a-half-minute segment claiming spaghetti noodles grew on trees in Switzerland. Footage showed Swiss noodle harvesters on ladders collecting noodles and drying them in the sun before dining on a large pasta dinner. While the prank likely would have fallen flat today, spaghetti wasn’t commonly eaten in the U.K. during the 1950s, which meant the dish was entirely unfamiliar to most viewers. But the hoax didn’t just prank viewers. Many BBC staffers were also fooled after being purposefully kept in the dark about the fictitious story — the production brainchild of cameraman Charles de Jaeger and a small crew — and were taken aback by a deluge of callers looking to acquire their own spaghetti trees.
Nicole Garner Meeker
Writer
Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.
If you think meeting new people nowadays is difficult, imagine doing it without a phone, a car, or even a bike. This was the case for much of human history, of course, and it had such a profound effect on social interactions that the average distance between the birthplaces of spouses in England was just 1 mile prior to the invention of bicycles. Those calculations come from Yale’s Stephen Stearns, a professor of evolutionary biology, who studied parish registries of birth, marriage, and death. He found that most couples weren’t just from the same village — they were often from neighboring farms.
Karl von Drais’ original invention was known as a Laufmaschine (“running machine”), and early bikes were also known as velocipedes and dandy horses.
The bicycle was invented in 1817, which geneticist Steve Jones called “the most important event in recent human evolution” in his book The Language of the Genes. In addition to increasing genetic diversity by making it easier for couples who lived farther away from each other to meet and reproduce, bicycles were also intrinsically linked to women’s rights because they allowed women greater autonomy in the early 20th century. No less an authority than Susan B. Anthony proclaimed in an 1896 interview with journalist Nellie Bly that bicycling “has done more to emancipate women than anything else in the world. I stand and rejoice every time I see a woman ride by on a wheel.”
The world’s largest bicycle manufacturer is Giant.
In the Netherlands, there are more bicycles than people.
With a population of 18.3 million, the Netherlands is the 69th-most-populous country in the world. Yet with 23.9 million bicycles, the country contains quite a few more bikes than people. It’s widely considered the most bicycle-friendly country on Earth, as 43% of the population rides at least once per day. Amsterdam in particular has been deemed the bike capital of the world, with one expert estimating that more than 60% of all trips in the city center take place on two wheels.
Michael Nordine
Staff Writer
Michael Nordine is a writer and editor living in Denver. A native Angeleno, he has two cats and wishes he had more.
If asked what a dinosaur sounded like, many people would likely recall the roaring T. rex of Jurassic Park. However, that earth-shaking bellow seems to be a case of Hollywood exercising some creative liberty. While we don't know what these reptiles really sounded like, since they mostly died out some 66 million years ago, scientists at least have some reasonable ideas based on the anatomical structures of well-preserved fossils, combined with studies of the dinosaurs and their close relatives that exist today.
Although it was long believed that non-avian dinosaurs lacked the flying capabilities of early birds and pterosaurs, newer research has shown that other small, feathered dinosaurs evolved powered flight multiple times.
Yes, dinosaurs do still exist, in the form of birds, which branched off from non-avian dinosaurs around 160 million years ago. Although birds mainly produce noises via a soft-tissue organ called the syrinx, which has yet to be uncovered from a non-avian dinosaur fossil, many of our feathered friends also engage in closed-mouth vocalization, in which sounds are pushed out from a pouch in the neck area. Another modern animal that utilizes closed-mouth vocalization is the crocodile, which just so happens to share a common ancestor with dinosaurs. Given the family ties, it's logical to conclude that some dinosaurs emitted something resembling the cooing of a dove, the booming of an ostrich, or the rumbling of a croc. Since larger animals with longer vocal cords produce lower-frequency sounds, it's also likely that enormous sauropods like Brachiosaurus delivered noises that, to our ears, would dip into an octave of infrasound — felt and not heard. On the other hand, the ear structures of the dinosaur-crocodile predecessor indicate a sensitivity to high-pitched noises, possibly the chirping of babies.
The field continues to evolve as new information comes to light; the recent discovery of the first known fossilized dinosaur larynx, from an ankylosaur, suggests these creatures were able to modify noises in a birdlike way despite the lack of a syrinx. And none of this even touches on the sound capabilities of hadrosaurs like Parasaurolophus, which almost certainly delivered a distinct call from the air passages that funneled through a conspicuous head crest. All in all, while a roar from a Jurassic-bred beast may have been the work of a Hollywood studio, there’s no movie magic needed to recognize that Earth’s prehistoric hills were alive with all sorts of reptilian sounds of music.
Some musicians are using dinosaur skull replicas as instruments.
Given the ever-increasing understanding of dinosaur sounds, it’s appropriate that a few experimental musicians have found ways to make music out of these potentially noisy monsters from yesteryear. One such artist is Southern Methodist University assistant professor Courtney Brown, who initially attached a mouthpiece and a synthetic larynx to the replica of a hadrosaur skull for an interactive exhibit, before recording the sounds produced by this creation alongside other instruments and vocals for her Dinosaur Choir project. Similarly, composers Anže Rozman and Kara Talve devised replica skull-based instruments as part of their assignment to score the Apple TV+ series Prehistoric Planet. Thanks to their atmospheric arrangements on innovations including the Triceratone, which fuses an electric double bass with a triceratops skull, and the Fat Rex, a combined frame drum and cello fingerboard topped with a 3D-printed T. rex skull, the Rozman-Talve-led team claimed Best Original Score for a Documentary Series honors at the 2023 Hollywood Music in Media Awards.
Tim Ott
Writer
Tim Ott has written for sites including Biography.com, History.com, and MLB.com, and is known to delude himself into thinking he can craft a marketable screenplay.
For at least 180 years, an electric bell at Oxford University has been ringing continuously — and no one knows exactly how it works. The university has covered the bell with a double-paned glass dome (who could study with such noise?), but the mystery of its battery remains unsolved. Built by a London instrument manufacturing firm in 1825 and acquired by Oxford in 1840, the device consists of two brass bells set below two batteries that look a bit like big wax candles. Between these bells is a small lead sphere, or clapper, that shuttles back and forth, creating a near-constant ring. Estimates suggest the bell has likely rung more than 10 billion times.
In the mid-1700s, Benjamin Franklin experimented with devices for storing static electricity known as Leyden jars. Franklin called groupings of these jars a “battery,” referencing a military term for artillery working in unison.
This strange bell is powered by what are called dry pile batteries, which use an incredibly small amount of electrostatic energy to move the clapper back and forth — so small that the two batteries have yet to run out of charge. Although no one’s sure exactly what’s inside the batteries (dissecting them would disrupt the bell’s historic run), the best guess is that they’re full of several thousand quarter-sized discs, made with metal foil and paper that has zinc sulfate and manganese dioxide added to it, all coated in sulfur. Oxford believes the bell has another five to 10 years of life left, as the ringing has slowed considerably in the past 40 years. (These days, it’s inaudible.) That hasn’t kept the Oxford Electric Bell — also known as the Clarendon Dry Pile — from being recognized as the “world’s most durable battery” by Guinness World Records. After more than 180 years in service, it’s an accolade that’s well deserved.
The earliest evidence of metal bells comes from China.
One of the world’s largest batteries is inside a mountain.
A problem like climate change requires inventive solutions — including completely reimagining the battery. At its most basic, a battery is just a bundle of stored energy. As the world looks toward solar and wind for the renewable energy sources of the future, it helps to have a backup plan for providing power when the sun isn’t shining or the wind isn’t blowing. One of the most impressive energy-storage schemes has already existed for nearly 40 years: Wales’ Dinorwig Power Station. Nestled inside Elidir Fawr, a mountain in the north of the country, Dinorwig uses giant turbines to capture energy from water flowing from a lake at the top of the mountain to a lower lake, effectively creating a giant battery. When energy demand is at its lowest, the water is then funneled back up the mountain, ready to supply power at a moment’s notice.
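The "giant battery" idea above boils down to basic physics: water held at height stores gravitational potential energy, E = m × g × h, which the turbines recover on the way down. The sketch below illustrates the arithmetic with hypothetical round numbers for the reservoir volume and height difference — they are not Dinorwig's actual specifications.

```python
# Rough sketch of pumped-storage energy capacity via E = m * g * h.
# The head and volume below are assumed round numbers for illustration,
# not the real figures for Dinorwig Power Station.

g = 9.81                 # gravitational acceleration, m/s^2
head_m = 500.0           # assumed height drop between upper and lower lakes, meters
volume_m3 = 7_000_000.0  # assumed volume of water moved per full cycle, cubic meters
density = 1000.0         # density of water, kg per cubic meter

mass_kg = density * volume_m3
energy_joules = mass_kg * g * head_m
energy_gwh = energy_joules / 3.6e12   # 1 GWh = 3.6 trillion joules

print(f"Stored energy for these assumptions: ~{energy_gwh:.1f} GWh")
```

Even with modest assumptions, the result lands in the gigawatt-hour range — enough to explain why a mountain full of water can stand in for a conventional battery at grid scale.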
Darren Orf
Writer
Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.
Alligators and crocodiles have a lot in common. They’re both beefy reptiles with a serious set of teeth and strong Triassic vibes. However, there are some big differences between them: Alligators usually have a more U-shaped snout, whereas crocodiles sport a more V-shaped schnoz; alligators stick to fresh water, while crocodiles live in salty environments; alligators are darker, nearly black, while crocs tend toward lighter earth tones such as brown. However, the biggest difference is usually in the locations these two gargantuan reptiles call home. American crocodiles (Crocodylus acutus) can be found in Cuba, Jamaica, southern Mexico, Central America, Ecuador, and elsewhere. Meanwhile, the American alligator (Alligator mississippiensis) mostly sticks to the southeastern U.S. You’ll likely never be tasked with differentiating the two creatures in the wild — that is, unless you find yourself in south Florida.
The saltwater crocodile is the world’s longest reptile species.
The saltwater croc (Crocodylus porosus) is the world’s heaviest reptile, but the title for the longest belongs to the reticulated python (Python reticulatus), found in Southeast Asia. At 32 feet long, this sizable snake far exceeds the saltwater crocodile’s 20-foot-long stature.
On the tip of the Florida peninsula lies the contiguous U.S.’s third-largest national park — the Everglades. It’s here that the southern extreme of the American alligator’s range overlaps with the northern extreme of the American crocodile’s range. The 7,800-square-mile expanse of wetlands has both brackish and saltwater environments that create a perfect home for crocs, while fresh water supplied by lakes, rivers, and rainfall provides the preferred habitat for alligators. Alligators vastly outnumber crocodiles in the U.S., with about 200,000 alligators in the park alone. And while crocodiles are considered more aggressive, the two rarely fight with each other or with humans. Still, it’s probably a good idea to keep a minimum safe distance between you and their frighteningly numerous teeth.
South America’s Pantanal, the world’s largest wetland, is 20 times larger than the Everglades.
The crocodile is the closest living relative of birds.
Although your typical croc and warbler appear to have nothing in common, the duo share a common ancestor that roamed the Earth some 240 million years ago. Both birds and crocodiles (as well as alligators and gharials) descend from a group of reptiles known as “archosaurs,” which literally means “ruling reptiles.” From this group came dinosaurs — the ancestors of birds — and crocodilians. While the common ancestor of these two disparate animals existed a long time ago, birds underwent a drastic evolution, whereas crocodiles remained relatively similar over time. In fact, in 2014 scientists discovered that crocodiles have the slowest molecular change of any known vertebrate genome, meaning crocodiles have remained the same — more or less — for millions of years.
Darren Orf
Writer
Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.
The Ottoman Empire feels like an entity of a time long past, while the name Nintendo conjures up images of modernity — electronics, video games, arcades, and mustachioed plumbers. However, Nintendo was actually founded before the Ottoman Empire ended, and this period of overlap isn’t measured in a matter of months or even a few years. When the Ottoman sultanate was abolished in 1922 after the widespread geographic shuffle that followed World War I, Nintendo had already been in business for 33 years.
The Ottoman Empire was a tapestry of ethnicities that were not exclusively Turkish — in fact, the entire empire wasn’t even exclusively Muslim. As the BBC explains, calling the Ottoman Empire “Turkish” is similar to calling everything that made up the British Empire “English.”
Of course, this wasn’t the Nintendo that many of us know today — Nintendo didn’t make its first electronic video game until 1975. Founded on September 23, 1889, Nintendo began with a humble mission: selling playing cards, specifically Japanese-style cards called Hanafuda. The company did pretty well, but decided to expand further in later decades. Nintendo struck a deal with Disney in 1959 to create playing cards with Disney characters on them, and in the 1960s, Nintendo sold a series of successful children’s toys, including Ultra Hand and Home Bowling, before becoming the official Japanese distributor of the Magnavox Odyssey — the first commercial home video game console. Seeing the promise of such a machine, Nintendo threw its weight behind this emerging entertainment category. The rest, as they say, is history.
The Ottoman Empire is named for the Turkish Muslim prince Osman I (1259–1326).
If Mario were a real person, his jump would be 25 feet high.
On February 7, 2021, an American man named Christopher Spell jumped 1.70 meters (roughly 5.5 feet) from a standstill, clinching the title of highest standing jump in Guinness World Records. Although an impressive feat, it’s nothing compared to Mario’s jumping prowess. According to physics calculations conducted by the website TechRadar, Nintendo’s overalled mascot could miraculously jump 25 feet into the air if he were a real person — nearly five times his overall height of 5 feet, 1 inch. However, Mario’s impressive strength isn’t just in his legs. In the original Super Mario Bros., the titular character can rip through a brick block, which is about four bricks high, with ease. It would take an estimated 16,681 newtons of force to achieve such a feat; for comparison, when martial artists break through a single brick, they produce roughly 3,000 newtons of force. These calculations suggest that Mario isn’t just a plumber — he’s a superhuman.
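The figures above can be sanity-checked with a few lines of arithmetic. The 25-foot jump and Mario's 5-foot-1 height come from the article; the takeoff-speed formula (v² = 2gh) is standard kinematics, used here purely for illustration.

```python
# Back-of-the-envelope check of the Mario jump figures.

MARIO_HEIGHT_FT = 5 + 1 / 12   # 5 feet, 1 inch
JUMP_HEIGHT_FT = 25.0          # TechRadar's estimate

# How many times his own height is the jump?
ratio = JUMP_HEIGHT_FT / MARIO_HEIGHT_FT
print(f"Jump is about {ratio:.1f}x Mario's height")   # roughly five times

# Takeoff speed needed for a 25-foot vertical leap: v = sqrt(2 * g * h)
g = 9.81                            # gravitational acceleration, m/s^2
h_m = JUMP_HEIGHT_FT * 0.3048      # convert feet to meters
v = (2 * g * h_m) ** 0.5
print(f"Required takeoff speed: ~{v:.1f} m/s")
```

For comparison, an elite sprinter tops out near 12 m/s horizontally, so launching straight up at that speed is comfortably superhuman.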
Darren Orf
Writer
Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.
True ambidexterity is the ability to use one’s left and right hands equally well when it comes to tasks such as writing or throwing a ball. This rare trait naturally occurs in roughly 1% of the global population (around 82 million people). In a 2019 study, 1.7% of respondents said they could use both hands interchangeably, far fewer than the 89% who were right-handed and the 9.3% who were lefties.
The study also found a higher predisposition for ambidexterity among males than females (2.1% and 1.4%, respectively), which some theorize may have to do with the effect higher testosterone levels have on brain development (though this has yet to be conclusively proved). This is just one of several studies on handedness — the tendency to use one hand over another — and while the exact percentages of right-handed, left-handed, and ambidextrous people vary, the results are largely consistent across the board.
Thomas Jefferson learned to write with his left hand at the age of 43.
Though he was a natural righty, Jefferson learned to write with his left hand after dislocating his right wrist in 1786. The founding father’s left-handed letters were surprisingly legible, though the handwriting differed slightly from that of letters written with his right hand.
The root cause of ambidexterity — or any handedness, for that matter — remains tough to pin down. One 2009 study suggests it may be determined by a combination of genetics and environmental influences (for instance, being taught to write with a certain hand in school). It’s also believed that ambidextrous people possess atypical brain laterality compared to right-handed individuals, which forms during development.
This cerebral asymmetry is arguably why ambidextrous people have a higher propensity toward conditions including ADHD, and also why they generally are less proficient than right-handed people in topics such as arithmetic and logical reasoning. On the other hand, ambidextrous people possess a unique versatility when it comes to sports, playing music, or performing everyday physical activities.
Despite being left-handed, Jimi Hendrix often played right-handed guitars flipped upside down.
Leonardo da Vinci wrote from right to left.
Italian, like most languages, is traditionally written from left to right, yet the famed Italian polymath Leonardo da Vinci wrote his notes from right to left — a unique style known as mirror writing. Not only was the direction reversed, but each letter was also flipped horizontally as if viewed through a mirror.
Some theorize that Leonardo practiced mirror writing to make it more difficult for people to read his notes and steal his ideas. Interestingly, he only used mirror writing when composing personal notes; if the text was intended to be read by anyone else, he wrote in the standard direction.
Others believe that Leonardo used mirror writing to avoid ink smudges on his left hand, which he used to write. The Renaissance man was a lefty (though some argue he was actually ambidextrous) and was known among his contemporaries as “mancino” — Italian slang for a left-handed person.
Bennett Kleinman
Staff Writer
Bennett Kleinman is a New York City-based staff writer for Inbox Studio, and previously contributed to television programs such as "Late Show With David Letterman" and "Impractical Jokers." Bennett is also a devoted New York Yankees and New Jersey Devils fan, and thinks plain seltzer is the best drink ever invented.