Original photo by dpa picture alliance/ Alamy Stock Photo
If you’ve ever wondered what appears on Cookie Monster’s birth certificate, the answer isn’t “Cookie Monster” — it’s “Sid.” Sesame Street’s resident cookie-lover joins the likes of Cap’n Crunch (Horatio Magellan Crunch), Yoshi (T. Yoshisaur Munchakoopas), and other fictional characters who have “real” names you might not know. (Speaking of Sesame Street, Snuffleupagus’ first name is actually Aloysius.) Cookie Monster revealed his actual name in the 2004 segment “The First Time Me Eat Cookie,” which includes the line, “In fact, back then, me think me name was Sid.” Despite this, many were still shocked when, in October 2022, the character tweeted, “Did you know me name is Sid? But me still like to be called Cookie Monster.”
In his original drawings of the character, Jim Henson designed Oscar the Grouch as a “spiky, grumpy-looking magenta monster.” But because early color TVs didn’t handle magenta well, Oscar’s color was changed to orange. He didn’t debut his now-familiar green look until the second season.
Though he made waves with the 2004 song “A Cookie Is a Sometimes Food,” Cookie Monster’s love of his namesake treat remains undiminished — and no, he isn’t going to change his name to Veggie Monster. Sid was designed by Jim Henson for an unaired General Foods Canada commercial in 1966, made his Sesame Street debut on the beloved show’s first episode three years later, and has remained a fan favorite in the more than half-century since.
While still in the development stage, the show was going to be called 123 Avenue B. There were two overlapping problems with that title, however: It was a real address in New York City, and the creators feared that viewers outside the city wouldn’t be intrigued by it. Sesame Street was chosen in part to evoke the “Open, sesame” command from “Ali Baba and the Forty Thieves,” a story in One Thousand and One Nights. Traces of the original title can be found in the show’s most famous address — 123 Sesame Street — the brownstone apartment building where Bert and Ernie live.
Interesting Facts
Editorial
Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.
Original photo by Sueddeutsche Zeitung Photo/ Alamy Stock Photo
Pigeons tend to get a bad rap among urban dwellers, but the birds have a distinguished history of service. Bred for their instinctive ability to find their way home from long distances, homing pigeons were trained as message-bearers as far back as ancient Egypt. With their deployment by besieged Parisians during the Franco-Prussian War of 1870, the era of the military pigeon was underway.
Pigeons can recognize letters of the alphabet and learn words.
A 2016 study demonstrated that pigeons could be trained to pick out words from a group of nonwords, marking the first time a nonprimate species was shown to process words orthographically.
By the time the United States entered World War I, homing pigeons were being used on both sides of the fighting for their ability to reliably deliver progress updates from planes, tanks, and mobile lofts on the front lines. While telephone and radio communications were more advanced heading into World War II, there were still times when conditions rendered such technologies useless, and the only solution was to strap a message to a pigeon and send it airborne through a hail of gunfire. Sometimes, a lone bird’s efforts saved the lives of hundreds of soldiers: One such instance occurred in Italy in 1943, when an American pigeon named G.I. Joe was dispatched to an Allied air base in the nick of time to call off the planned bombing of a village that had just been liberated by British troops.
That year, White Vision, Winkie, and Tyke became the first three of the 32 pigeons to receive the People’s Dispensary for Sick Animals (PDSA) Dickin Medal for exceptional wartime accomplishments. Although the award came into being too late to honor pigeon predecessors like Cher Ami and President Wilson, the creation of the Honorary PDSA Dickin Medal in 2014 belatedly recognized the winged warriors and other service animals that served during World War I. And although the PDSA is based in the U.K., the Dickin Medal is awarded to animals in theaters of war around the world.
The sense that enables pigeons to perceive direction via Earth’s magnetic field is magnetoreception.
One cat has won the Dickin Medal.
That would be Simon, a tomcat who had the misfortune of getting caught in the strife of the Chinese Civil War in 1949. A crew mascot aboard the Royal Navy ship HMS Amethyst, Simon sustained shrapnel injuries when the ship was attacked and cornered by communist forces on the Yangtze River. Not only did Simon get back on his feet and provide comfort to his rattled shipmates, but he also fought off the rats that attempted to raid the dwindling food supply as the crew waited for weeks for safe passage to freedom. Simon then became something of a celebrity after the Amethyst made news with its escape to Hong Kong, with a designated “cat officer” assigned to handle his fan mail. Sadly, the battle-scarred feline died shortly before he was scheduled to receive his Dickin Medal late in 1949, although TIME magazine provided an additional salute by featuring his picture on its obituary page.
Tim Ott
Writer
Tim Ott has written for sites including Biography.com, History.com, and MLB.com, and is known to delude himself into thinking he can craft a marketable screenplay.
The ancient Egyptians are known for many firsts. Hieroglyphics, papyrus, the calendar, and even bowling all come from the minds of the ancient people along the Nile. Egyptians were also some of the first to pay particular attention to oral care. They invented the first breath mint, toothpicks have been found alongside mummies, and they created the oldest known formula for toothpaste.
Teeth and bones are the hardest materials in the human body, but they’re very different. Bones are made of living tissue, so they’re constantly growing or regenerating throughout your life. Tooth enamel, by contrast, isn’t living tissue, making teeth the only part of the human body that can’t regenerate.
One of the earliest medicinal texts, the Ebers Papyrus contains an astoundingly accurate understanding of the human circulatory system as well as an assortment of medicinal remedies. Written around 1550 BCE, this ancient text also describes an ancient form of toothpaste. This early dentifrice was likely made from ingredients such as ox hooves, ashes, burnt eggshells, and pumice (a type of volcanic rock), but by the fourth century CE, when Egypt was under Roman rule, the recipe evolved to include salt, pepper, mint, and dried iris flower, based on descriptions in another papyrus. Egyptians may have applied the paste with toothbrushes made from frayed twigs.
Although Egyptian toothpaste may seem unrecognizable compared to the science-y ingredients found in modern tubes of Colgate or Crest, those ancient recipes did essentially the same thing. Modern toothpaste uses materials known as abrasives to remove gunk from teeth, lessening the potential for decay and cavities. While Egyptians used salt and pepper (or pumice) for this task, today we use hydrated silica for the same abrasive purpose — though thankfully it’s gentler on our gums. And the mint the Egyptians used helped freshen their breath, just as today’s mint-flavored toothpaste does. So while our tubes of toothpaste are a relatively recent creation, cleaning our teeth is a habit that goes back at least as far as the ancients.
Invented in China in 1498, the first modern toothbrush used coarse hair from a hog for bristles.
Agriculture is why so many people need braces.
Every year, millions of people get braces or have their wisdom teeth pulled. That’s because there isn’t enough dental real estate in the average human mouth. Although a boon for the dental profession, humanity’s mass malocclusion (or misalignment of teeth) wasn’t always this way. In our distant past, before we put down the hunting bow for the dirt-churning plow, human jaws comfortably accommodated all the teeth in our mouths. Yes, even our wisdom teeth. A 2015 study analyzed the lower jaws of 292 skeletons ranging from 28,000 to 6,000 years old — an age range that straddles our adoption of agriculture some 12,000 years ago. Scientists noticed that early farmers had smaller jaws than their hunter-gatherer forebears. This is likely because before agriculture, Homo sapiens chomped hard-shelled nuts, uncooked vegetables, and tough meats, which required larger, stronger jaws. With the advent of farming, diets shifted to softer foods like beans and cereals, and the human jaw shrank over generations because it no longer faced the same chewing demands. Although our jaws are smaller, our number of teeth has remained the same, leading to the dental traffic jam experienced by millions today.
Darren Orf
Writer
Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.
There are a few ways to avoid the itch-inducing bites of summer’s biggest pest: the mosquito. Wearing long-sleeved apparel and dousing yourself in insect repellent can help, but avoiding some beverages — particularly alcohol — might further protect you. According to a 2010 study of mosquito biting preferences, beer makes humans more attractive to the pesky pests.
Researchers found that mosquitoes of the species Anopheles gambiae, part of the genus responsible for transmitting malaria, were more attracted to humans who had consumed beer (compared to those who consumed only water), and the effect was evident as soon as 15 minutes after the humans began drinking. Other studies have produced similar findings; one examination of alcohol’s role in mosquito meal choices found that those who imbibed just one 12-ounce beer were more likely to be pestered by the insects. It’s unclear why beer primes humans to become bite victims, though some scientists believe it could be partly linked to body temperature; alcohol expands the blood vessels, a process that slightly increases skin temperature and also makes us sweat, two factors that may attract more hungry mosquitoes.
Not every mosquito you see is out for blood. That’s because only female mosquitoes bite, in search of blood that provides them with enough protein to develop eggs and successfully reproduce, while males feed on nectar.
For such tiny insects, mosquitoes are incredibly effective at feasting on larger prey. Their proboscises — aka their mouths — comprise a complex system of six needlelike mouthparts called stylets; when a mosquito bites, the stylets hunt for nearby blood vessels. That makes a mosquito’s job of finding food quick and easy work — a necessity when dinner comes with a risk of being swatted.
London’s subway system has a type of mosquito named after it.
There are thousands of mosquito species throughout the world, but London has one subspecies informally named for its subway system. Scientists believe Culex pipiens molestus, often called the London Underground mosquito, is a variant of Culex pipiens, the most widespread mosquito in the world. The London Underground mosquito is thought to have lived beneath the city’s streets for around 150 years. While the pests were acknowledged during World War II, when Brits sheltering below ground were bitten by the hungry insects, it wasn’t until decades later that researchers began to study them in earnest. In 1999, English researcher Katharine Byrne determined that the mosquitoes living in London’s subway tunnels had morphed into their own subspecies, seemingly unable to interbreed with their aboveground relatives. However, more recent research suggests the pests didn’t evolve inside the Underground, but possibly in Egypt and nearby areas centuries ago. Today, Culex pipiens molestus is found in underground locations in many parts of the world.
Nicole Garner Meeker
Writer
Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.
Original photo by MediaPunch Inc/ Alamy Stock Photo
Is there a word for the opposite of writer’s block? If there isn’t, Dolly Parton should get to coin it, since the country music legend says she penned “I Will Always Love You” and “Jolene” in one day. “That was a good writing day” is how the ever-humble fan favorite described the process of writing the two eventual Billboard Country Music No. 1 hits in 1972. They remain two of her best-known songs a full half-century later, with “I Will Always Love You” taking on a second life when Whitney Houston covered it for the 1992 blockbuster The Bodyguard. Parton, who used some of her royalties from the cover to invest in a Black neighborhood in Nashville, is a fan of Houston’s version and has said she “would’ve loved” to perform a duet with Houston even though “she’d have outsung me on that one for sure.”
Though she’s never revealed them publicly, Parton has “a few little tattoos here and there.” The singer apparently scars easily, and has used her ink — including beehives, butterflies, and ribbons — to cover them up.
“Jolene” and “I Will Always Love You” aren’t the only megahits in history that were written quickly, of course. It took Mariah Carey and songwriter Walter Afanasieff just 15 minutes to co-write “All I Want for Christmas Is You,” while the Beatles’ “Yesterday,” Lady Gaga’s “Just Dance,” the Guess Who’s “American Woman,” and several other famous tunes were all put together in around 10 minutes. Sometimes when inspiration strikes, it really strikes.
Parton’s father paid the doctor who delivered her with a sack of oatmeal.
Dolly Parton is Miley Cyrus’ godmother.
By the time Miley Cyrus was born in 1992, Dolly Parton had been a country music icon for more than two decades. Thanks to Parton’s close friendship with Miley’s dad, “Achy Breaky Heart” singer Billy Ray Cyrus, she was chosen as Miley’s godmother. “When Miley came along, I said, ‘She’s got to be my fairy goddaughter,’” Parton recalled in an interview. Parton has also said that the “Wrecking Ball” singer “just had a light about her” from a young age. The relationship is both personal and professional, and Parton appeared on Hannah Montana with her goddaughter several times. And though Cyrus has elicited occasional controversy throughout her career, Parton has vowed to “never, ever bad-mouth Miley, no matter what she does. I just always hope she comes out the other end alright.”
Michael Nordine
Staff Writer
Michael Nordine is a writer and editor living in Denver. A native Angeleno, he has two cats and wishes he had more.
There are impressive filmographies, and then there’s John Cazale’s. The actor appeared in only five films during his lifetime, all of which were nominated for Best Picture at the Academy Awards: The Godfather (1972), The Conversation (1974), The Godfather Part II (1974), Dog Day Afternoon (1975), and The Deer Hunter (1978). Even more remarkably, three of them — both Godfathers and The Deer Hunter — won the top prize. The last of these was released after Cazale’s untimely death from lung cancer in March 1978, at which time the 42-year-old thespian was the romantic partner of fellow great Meryl Streep. (He also appeared in 1990’s The Godfather Part III via archival footage, which didn’t break his streak — that sequel was also up for Best Picture.)
Despite receiving widespread praise for his performances, Cazale was never nominated for an Academy Award himself. He fared better during his stage career, winning an Obie for his performance in 1968’s “The Indian Wants the Bronx” — as did Pacino, with whom he shared the stage.
Described by no less an authority than his Godfather co-star Al Pacino as “one of the great actors of our time — that time, any time,” Cazale remains best known for playing the tragic Fredo Corleone in Francis Ford Coppola’s mafioso saga. Revered by everyone from contemporaries Gene Hackman and Robert De Niro to more recent admirers such as Michael Fassbender and Steve Buscemi, he was the subject of the 2009 documentary I Knew It Was You: Rediscovering John Cazale. The film was well received upon its premiere at the Sundance Film Festival, and further cemented Cazale’s status as one of the most respected performers of his generation.
John Cazale and Al Pacino met while working for Standard Oil Company.
Two actors won Oscars for playing the same "Godfather" character.
Marlon Brando won the Academy Award for Best Actor twice during his legendary career, the first time for 1954’s On the Waterfront (“I coulda been a contender”) and the second time for The Godfather (“I’m gonna make him an offer he can’t refuse”). Few silver-screen characters are as iconic as “Don” Vito Corleone, not least because Brando wasn’t the only all-time great to play him. The Godfather Part II is both a sequel and a prequel, with Robert De Niro playing a young Vito Corleone as he emigrates from Sicily to New York and ascends to power. Like Brando before him, De Niro won an Oscar for his performance — this time as Best Supporting Actor. It was the only time two actors had earned Oscars for playing the same character until Joaquin Phoenix was named Best Actor for playing the title character in 2019’s Joker, 11 years after Heath Ledger’s performance as Batman’s nemesis in The Dark Knight. The feat was repeated in 2022, when Ariana DeBose got a trophy for playing West Side Story’s Anita 60 years after Rita Moreno did likewise.
Michael Nordine
Staff Writer
Michael Nordine is a writer and editor living in Denver. A native Angeleno, he has two cats and wishes he had more.
When it comes to the Amazon River, there’s no such thing as water under the bridge. The idiom simply doesn’t apply there, as no bridges cross the Amazon despite it being at least 4,000 miles long. This isn’t because the idea has never occurred to anyone — it would just be extremely difficult to build any. The Amazon has both a dry season and a rainy season, and during the latter its waters rise 30 feet, causing 3-mile-wide crossings to grow by a factor of 10 as previously dry areas are submerged. The riverbank itself is also in a near-constant state of erosion because the sediment it’s made of is so soft, and there’s no shortage of debris floating in the water.
The longest river in the world is actually the Nile, which is 4,132 miles long — about 132 miles longer than the Amazon, though counts vary. Third on the list is the Yangtze, at 3,915 miles.
Beyond all those logistical hurdles, there simply isn’t much use for bridges across the massive river. For one thing, there are few roads on either side of the Amazon that need to be connected. The river is, of course, in the middle of a dense rainforest, the vast majority of which is sparsely populated. Other long rivers have numerous crossings, however: The Nile has nine bridges in Cairo alone, for instance, and more than 100 bridges have been built across China’s Yangtze River in the last three decades. For now, boats and ferries are the preferred method of crossing the Amazon, and are likely to remain so for the foreseeable future.
The Amazon used to flow in the opposite direction.
These days, the river flows east into the Atlantic. That wasn’t always the case: It used to flow west into the Pacific, and at one point it even flowed in both directions simultaneously. This was during the Cretaceous Period, between 65 million and 145 million years ago, and was the result of a highland (mountainous area) that formed along the east coast of South America when that landmass and Africa broke apart. The Andes eventually formed on the western half of the continent, which forced the river into its current eastward flow.
Michael Nordine
Staff Writer
Michael Nordine is a writer and editor living in Denver. A native Angeleno, he has two cats and wishes he had more.
What is the oldest continuous culture in the world? Some might say it’s the Egyptians, since they’ve been kicking around for several thousand years, or perhaps the people of the Indus Valley, home to one of ancient history’s greatest (and least-known) civilizations. However, the real answer lies far away from these centers of ancient wonder, in the Land Down Under, among that continent’s first peoples — the Aboriginal Australians. A 2016 study by an international team of researchers gathered genomic data showing this group first arrived on the continent some 50,000 years ago, after leaving Africa about 70,000 years ago.
The British weren’t the first Europeans to land on the Australian continent.
Although British explorer James Cook’s arrival in the Land Down Under in 1770 is well known, it was actually Dutch explorer Willem Janszoon who, in 1606, landed at what is now called Cape York Peninsula in northern Australia.
However, it’s worth noting that Aboriginal peoples are far from a homogenous unit. After the first peoples arrived on the continent, they quickly spread across Australia, forming isolated pockets that developed independently of one another. By the time Europeans arrived en masse in the late 18th century, some 200 nations of Aboriginal Australians — each with their own language — lived throughout the continent. But that diversity goes beyond just tribes or nations; a study in December 2023 concluded that Aboriginal peoples have high levels of genetic diversity compared to European or Asian populations.
Unfortunately, Aboriginal Australians continue to struggle compared to non-Indigenous Australians, experiencing a life expectancy roughly eight years shorter, poorer health and educational outcomes, and other ill effects stemming from colonialism and mistreatment. But if the past 50,000 years have taught us anything, it’s that Aboriginal Australians are a resilient culture, and they aren’t going anywhere.
First elected in 1972, Neville Bonner was Australia’s first Indigenous parliamentarian.
Aboriginal peoples are not the only Indigenous group living in Australia today.
Although Aboriginal Australians make up the lion’s share of the country’s Indigenous peoples, another important group, called Torres Strait Islander Australians, lives on an archipelago of some 274 small islands between mainland Australia and Papua New Guinea. According to the 2021 census, Torres Strait Islanders constitute roughly 8% of Australia’s Indigenous population. These peoples first migrated to the islands nearly 70,000 years ago, when the land was still connected to New Guinea, and while James Cook claimed ownership of the Torres Strait Islands in 1770, the islands weren’t annexed by Queensland (then a British colony and now an Australian state) until 1879. While the cause of Indigenous rights in Australia often pairs Aboriginal peoples and Torres Strait Islanders together, the two groups possess languages and cultures that are wholly separate. In 2013, the Aboriginal and Torres Strait Islander Peoples Recognition Act formally acknowledged these two peoples as the first inhabitants of Australia.
Interesting Facts
Editorial
Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.
You may love Disney, but you probably don’t love it as much as Jeff Reitz. The 49-year-old brought new meaning to the term “Disney adult” by visiting the Happiest Place on Earth 2,995 days in a row — a streak that only ended when Disneyland shut down during the pandemic. It began as “a joke and a fun thing to do” between him and a friend when the two were in between jobs on New Year’s Eve 2011, and it continued for eight years, three months, and 13 days. The original plan was to spend every day of 2012 at the park, in part because it was a leap year and Reitz liked the idea of going 366 days in a single year, but he didn’t feel inclined to stop once 2013 rolled around. He became the unofficial record-holder at the 1,000-day mark and was oh so close to reaching 3,000 days before COVID-19 prevented that particular milestone when Disneyland shut down on March 14, 2020.
Mickey was actually preceded by one Oswald the Lucky Rabbit, the star of 26 short cartoons beginning in 1927. After losing the rights to Oswald, Disney came up with Mickey — and the two creatures bear a striking resemblance to one another.
Reitz, who worked in nearby Long Beach, would usually arrive at Disneyland between 4:30 and 5 p.m. and log some 10,000 steps during his three-to-five-hour visits. Though he initially struggled with the park’s closure, he eventually made peace with it: “A lot has changed over the eight years that I started it,” he said after his streak ended. “I’m good with it. I went more than eight years. I got to see a lot of changes at the park. Now, I’m not worried about going every day like I was.” After more than a year of being closed, Disneyland reopened in April 2021. It’s not clear if Reitz has been back, but he has enough memories to last him a while.
Disneyland employees aren’t allowed to use the phrase “I don’t know.”
Walt Disney received a custom-made Oscar statuette for “Snow White and the Seven Dwarfs.”
Walt Disney won 32 Academy Awards, a record for the most individual Oscars, and one that’s unlikely to be broken anytime soon (if ever). Because there was no award for Best Animated Feature until 2001, when Shrek won the inaugural prize, he received an Honorary Oscar for Snow White and the Seven Dwarfs in 1939 that included a unique custom design — one regular Oscar statuette and seven miniature ones placed along a stepped base. The Oscar was awarded for Snow White’s “significant screen innovation which has charmed millions and pioneered a great new entertainment field for the motion picture cartoon.” It was presented to him by Shirley Temple, who was a bit confused as to why the star of the film wasn’t being honored as well: “I thought that the big statue was for Walt and that the Seven Dwarfs were the little ones going down the side and that Snow White herself hadn’t gotten anything,” she later said.
Interesting Facts
Editorial
Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.
Human eyes are entirely unique; just like fingerprints, no two sets are alike. But some genetic anomalies create especially unlikely “windows” to the world — like gray eyes. Eye experts once believed that human eyes appeared in just a few colors: brown, blue, and green, with hazel or amber sometimes added to the list. More recently, the ashy hue that was once lumped into the blue category has been reclassified as its own, albeit rarely seen, color.
Brown-eyed folks are in good company, with up to 80% of the global population sporting the shade, while blue eyes are the second-most common hue. Traditionally, green was considered the least common eye color, though researchers now say gray is the rarest, with less than 1% of the population seeing through steel-colored eyes.
Sun-kissed skin is often dotted with freckles — which can also appear on our eyes. These eye freckles are common and generally harmless; some form before birth as molelike spots called nevi, while others appear on the iris thanks to sun exposure and aging.
Eye color is an inherited trait, meaning it’s likely members of the same family have similar eye colors. However, geneticists now believe determining a child’s eye color isn’t as simple as looking at their parents. That’s because as many as 16 genes work together to determine the final hue. Intriguingly, the eye color we have at birth isn’t necessarily the one we’ll have as adults. Most babies are born with lighter eyes that often look gray, light blue, or light brown until the melanocytes — the cells that produce pigment — create enough melanin to color the iris. People with less active melanocytes typically have lighter eyes (like blue or green), while people who produce more melanin usually end up with brown eyes. In most cases, our final eye color begins to emerge around 3 to 6 months old, though it can continue changing until a baby’s third birthday.
Having two different colored eyes is called heterochromia.
The letters on an eye exam chart are called “optotypes.”
Picking out which letters you can (and can’t) see from a chart is now a routine part of an eye exam, thanks in part to Dutch ophthalmologist Herman Snellen. For hundreds of years, eye doctors used a variety of methods to test their patients’ visual acuity (aka how far and clearly a person can see), including vision charts of their own design featuring seeds and common symbols, though no one test was widely used. In the 1860s, Snellen designed his first vision chart using squares and circles, but ultimately decided to use letters. The sizable E at the top of the chart, along with the C, D, F, L, O, P, T, and Z, were dubbed “optotypes” — letters drawn in a consistently sized, geometrically balanced style. Snellen’s test became popular after the British Army began using it around 1863, and it eventually became the standard acuity test. While other charts have since emerged (along with tweaks to Snellen’s design), it remains the most widespread eye exam tool, in part because it’s easy and inexpensive to reproduce.
Nicole Garner Meeker
Writer
Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.