Original photo by phloxii/ Shutterstock

Everything about human vision is a biological marvel. Eyes, which first evolved some 540 million years ago (the oldest known examples belonged to now-extinct animals called trilobites), provide the primary sense organ through which most humans construct reality. Yet our incredibly complex visual system involves much more than the eyes alone, requiring amazing coordination among many parts of the brain. These six facts explore the ways our eyes and brain make sense of the world, how certain optical illusions can short-circuit those processes, and how the limits of human vision stretch far beyond what scientists once thought possible.

Woman lying upside down with eyes pointed at the camera.
Credit: laartist/ iStock

Eyes Actually Perceive Things Upside Down

Much of the work of perceiving the world around us actually takes place in the brain. In a way, our eyes act as a camera, and our brain as a kind of “darkroom” that develops that information into what we call vision. One of the most perplexing aspects of this relationship is that the images projected onto our retinas are actually upside down. Because the cornea — the transparent part of the eye covering the iris and pupil — acts as a convex lens, light is inverted as it enters the eye. It’s the brain’s job to flip this inverted information right-side up, and to merge the two 2D images, one from each eye, into a single cohesive 3D image.

The human eye up close.
Credit: Liukov/ Shutterstock

Human Vision Has a Major Blind Spot

When glancing around the world, human vision appears nearly flawless. Although our roughly 180-degree field of view is relatively narrow compared to that of animals such as chameleons (which can see nearly 360 degrees), the image appears complete, with a fidelity sometimes estimated at the equivalent of 576 megapixels. Despite these strengths, every human eye has a sizable blind spot: an area of the visual field that would go missing if not for some clever tricks performed by the brain. The blind spot occurs where the optic nerve, which carries signals from the retina to the brain, meets the retina. Because this part of the eye has no photoreceptor cells, it registers no visual information at all. Thankfully, humans are born with two eyes, and the brain fills in each eye’s gap with information from the other, so you never actually notice the blind spot.

Optical illusion created by clay columns forming shapes of two ladies talking.
Credit: Juriah Mosin/ Shutterstock

Optical Illusions Are Important for Understanding the Human Brain

The human brain is notoriously difficult to study due to its immense complexity, but optical illusions play a vital role in helping scientists discover how our brains construct our reality. One measure of that complexity: an estimated 30 areas of the human brain are involved in vision. Studies have shown that a single illusion can affect these areas in different ways — sometimes parts of the visual system correctly identify the visual information while other parts are tricked (for instance, the visual cortex at the back of the brain might not be fooled by an illusion while the frontal lobe is). Figuring out how this happens is valuable to scientists studying how the brain processes what we see.

Some illusions also take advantage of the fact that humans don’t perceive reality instantaneously, but rather with a roughly 100-millisecond lag — the time it takes for light hitting the retina to be converted into electrical impulses and interpreted by the brain. To keep us from being a complete, uncoordinated mess, the visual system predicts where a moving object is headed, but errors in this prediction can sometimes cause an optical illusion. That’s just one of many causes of optical illusions; how they arise is nearly as complex as the human brain itself.

Close up of a man winking.
Credit: EHStock/ iStock

Blinking Provides a “Neurological Reset” for Our Brain

Blinking performs a vital role in the human visual system by cleaning and lubricating the eye’s surface with tears, but there’s a catch — humans blink far more than simple lubrication requires. In fact, we blink so much that we’re estimated to spend 10% of our waking hours with our eyes closed. So why do humans blink approximately 12 times a minute? Around 2010, scientists at Osaka University observed study participants as they watched clips of the comedy series Mr. Bean. The subjects tended to blink during scene changes, or when the main actor left a scene. This blinking activated the brain’s “default mode network,” briefly quieting other areas of the brain related to attention. The theory is that these momentary pauses act as a neurological reset, allowing us to refocus our attention on something new.

A magnifying glass looking at the red shape.
Credit: showcake/ Shutterstock

Humans Perceive the Color Red First

After birth, a baby mostly sees in black and white — and that’s only the beginning of its visual challenges. A newborn’s vision is also incredibly fuzzy, and limited to around 8 to 12 inches from its face during the first few weeks of life. Whereas average human sight is considered 20/20, it’s estimated that a newborn’s vision lies somewhere between 20/200 and 20/400. Because red has the longest wavelength of any color in the visible spectrum (around 700 nanometers), it doesn’t scatter easily, making it the first hue a baby’s limited vision can detect. Within a year, most babies have attained most normal human visual faculties, although the ability to accurately judge distances is one of the last skills to develop.

A computer generated tetrachromatic imaginative background.
Credit: sakkmesterke/ Alamy Stock Photo

Some People, Known as “Tetrachromats,” Can See 100 Million Different Colors

The human eye is a trichromatic system, meaning it has three types of cone cells, each most sensitive to a different part of the spectrum: red, green, or blue. However, some people have an unusual gene that creates a fourth cone type that’s particularly sensitive to the yellow-green part of the visual spectrum. As a result, instead of the roughly 1 million colors most humans can distinguish, people with the condition, known as “tetrachromacy” (tetra is a Greek prefix meaning “four”), may see 100 times that number. Strangely, scientists believe that only women can inherit this superhuman vision, because of their two X chromosomes. Since the genes for the red and green cone pigments are located on the X chromosome, a woman can carry a different version of one of those genes on each of her X chromosomes, resulting in four distinct cone types.

Darren Orf
Writer

Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.

Original photo by H.S. Photos/ Alamy Stock Photo

When asked to rank the best Presidents to ever sit in the Oval Office, historians often place the 32nd President of the United States, Franklin Delano Roosevelt, near the top.

Serving as commander in chief for an unprecedented 12 years, Roosevelt oversaw the transformation of the U.S. from an isolationist nation suffering through the worst economic downturn in its history into an emerging postwar superpower. Roosevelt’s social programs, including the New Deal and the Social Security Act, completely redefined the contract between the U.S. government and its citizens, and his wartime leadership helped pave the way for a more peaceful world.

Roosevelt’s time as President isn’t without its blemishes and moral failings, however — chief among them the internment of Japanese Americans during World War II. These seven facts illuminate what made him one of the greatest, and most controversial, Presidents who ever lived.

A red tailed hawk rests upon a pillar of the White House.
Credit: Tom Brenner/ Getty Images News via Getty Images

FDR Was a Birder

Growing up in Dutchess County, New York, a young Franklin Delano Roosevelt became obsessed with birds. The future President collected eggs and kept them in drawers in his childhood bedroom, shot and classified over 300 birds in his county, and frequently gave impromptu lectures to family and friends about birds and their migratory patterns. FDR’s passion for ornithology, and natural history more generally, likely contributed to his impressive environmentalism record when he became President many decades later.

U.S. President Franklin Delano Roosevelt sitting at his desk in the Oval Office.
Credit: Bettmann via Getty Images

FDR Was Related to 11 Other U.S. Presidents

Genealogists have determined that Roosevelt was distantly related to 11 other U.S. Presidents — five Presidents by blood and another six by marriage. The most obvious relative is former President Theodore Roosevelt, who was FDR’s fifth cousin (and the uncle of FDR’s wife, Eleanor). The other U.S. Presidents include John Adams, James Madison, John Quincy Adams, Martin Van Buren, William Henry Harrison, Zachary Taylor, Andrew Johnson, Ulysses S. Grant, Benjamin Harrison, and William Howard Taft. Roosevelt was also distantly related to major World War II figures, including Winston Churchill and Douglas MacArthur.

Election poster of James M. Cox for President and Franklin Delano Roosevelt for Vice President.
Credit: MPI/ Archive Photos via Getty Images

FDR’s First Appearance on a Presidential Ticket Was for VP in 1920

Roosevelt became a member of the New York state Senate in 1910. He helped propel Woodrow Wilson to the presidency in 1912, and Wilson returned the favor by appointing Roosevelt assistant secretary of the Navy. In 1920, Roosevelt reached the national stage as the vice presidential nominee on the Democratic ticket, alongside presidential hopeful and Ohio Governor James M. Cox.

But the Democrats didn’t prevail that year: The Republican ticket, headed by Warren G. Harding, campaigned on a promise to “return to normalcy” following the end of World War I. The message resonated with voters, and Harding, along with VP Calvin Coolidge, won a landslide victory.

Yet FDR’s star rose after he was elected governor of New York in 1928. He took on the corrupt Tammany Hall political machine, hastening its eventual downfall. By 1932, he was a leading candidate for the Democratic presidential nomination.

Portrait of Franklin D. Roosevelt seated at a desk.
Credit: Bettmann via Getty Images

He Served Two More Terms Than Any President Ever Will

Franklin Roosevelt is remembered for many things, but one reason his impact looms so large in American history is that he was elected President four times — twice as many as any other U.S. President. George Washington set a precedent by serving only two terms in the late 18th century, and subsequent Presidents more or less followed this tradition (though FDR’s cousin Theodore Roosevelt ran for a third term). After Roosevelt’s historic 12 years in office (he died early in his fourth term), Congress passed the 22nd Amendment, officially limiting any future President’s time in office to two elected terms.

President Roosevelt signing a war declaration.
Credit: Bettmann via Getty Images

FDR Considered Japanese Internment Years Before WWII

Years before the attack on Pearl Harbor, the U.S. Navy watched Japan’s growing militarism in the Pacific, and some officials grew worried about the large Japanese American population in Hawaii. In 1936, Roosevelt called for some Japanese citizens and non-citizens to be “secretly but definitely identified” and “placed on a special list of those who would be the first to be placed in a concentration camp in the event of trouble.”

Six years later, FDR’s Executive Order 9066 authorized the removal of people deemed a threat to national security from the West Coast to relocation centers — which resulted in moving more than 100,000 people of Japanese descent into internment camps. It remains a dark chapter in U.S. history. Four decades later, Ronald Reagan signed the Civil Liberties Act of 1988, which paid reparations to wronged Japanese Americans while hoping to “discourage the occurrence of similar injustices and violations of civil liberties in the future.”

Roosevelt poses in front of cameras.
Credit: Bettmann via Getty Images

He Was the First President To Be Seen on TV

Although Roosevelt is famous for his fireside chats broadcast via radio, the nation’s 32nd President was also the first to appear on television. He did so during the 1939 World’s Fair in New York, although only a handful of TV sets in the area could actually receive the broadcast. As World War II exploded across Europe and Asia, and TVs became more commonplace in American homes, FDR became the first President to really use the emerging medium. FDR was also the first sitting President to fly on an airplane for official presidential business (though his fifth cousin Theodore was the first President to ever fly in a plane at all).

Franklin Roosevelt holds his Scotch terrier, Fala, on his lap as he talks to Ruthie Bie.
Credit: Historical/ Corbis Historical via Getty Images

FDR Is One of the Greatest Figures in History With a Disability

In 1921, when he was 39 years old, FDR was stricken with polio (or perhaps Guillain-Barré syndrome), a disease that at the time had no known cure. For the rest of his life, he was paralyzed from the waist down. In the 1920s, it was common to send disabled people to asylums or banish them from public life altogether. But after years of rehabilitation, Roosevelt learned to cope with his disability and continued his political career. During his presidency, the media largely (though not always) avoided discussing his paralysis, and the Secret Service was known to confiscate or destroy film that showed it. However, the American public largely sympathized with FDR and saw him as a man who triumphed over his disability rather than as a victim of it.

Darren Orf
Writer

Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.

Original photo by JodiJacobson/ iStock

Autumn has a lot of fans, and it’s easy to see why, with its abundance of warm spices, colorful trees, crunchy leaves underfoot, and delicious seasonal produce. But fall also means the nights are getting longer, and many cultural traditions, from All Hallows’ Eve to Día de los Muertos, speak of a thinning veil between our world and what lies beyond. It’s an environment ripe for superstition.

For example, how does the annual harvest help predict how rough the winter ahead will be? Where does the wishbone tradition come from, and what’s up with bobbing for apples? How do you ward off evil spirits while keeping your home smelling delightful? These six superstitions will help keep your fall festive — and just a little spooky.

Autumn falling leaves on blue sky.
Credit: vnlit/ Shutterstock

Catching a Falling Leaf Is Auspicious

Next time an autumn leaf blows past your face, try to catch it — legend has it that it will bring good luck, or that you get to make a wish on it. The superstition can get more complex depending on the color and type of leaf you catch: A red leaf (or maple leaves in general) could mean good luck in love, and an orange leaf could mean inner transformation. Ginkgo leaves symbolize enlightenment, as well as hope and resilience.

How long does the good luck last? It could be a week or a month or the whole season, depending on who you’re talking to. After you catch it, you can release it back into the wind or bring it inside to incorporate into your seasonal decor.

Two opposite hands hold a chicken bone and try to break it in half.
Credit: Dmitri Disterheft/ Shutterstock

Wishbones Bring Good Luck

Americans who celebrate Thanksgiving with a whole turkey may still observe this custom. Birds have a forked bone called the furcula, which helps support them in flight and is commonly known as the wishbone. In some traditions, two people each pull one side of the wishbone, and the person who breaks off the bigger piece gets to make a wish or enjoy good luck, depending on the version of the custom. The practice goes back to medieval England, where meals celebrating St. Martin’s Day, also known as Martinmas (a feast day each November commemorating the fourth-century St. Martin of Tours, traditionally marking the onset of winter), typically included geese.

Woman carving big orange pumpkin into jack-o-lantern.
Credit: kobeza/ Shutterstock

Carved Gourds Can Ward Off Evil Spirits

Ever wonder why we carve jack-o’-lanterns on Halloween? The tradition dates back to Celtic observances of Samhain, celebrated from October 31 to November 1. Because of the holiday’s association with the supernatural, warding off dark forces was a must. Early traditions employed big bonfires to get the job done, but as towns developed, people needed something safer. Hollowed-out turnips and gourds were cheap and easy to come by, so they became makeshift lanterns. Initially they were simply pierced to let the light out, but gradually they took on the faces of the very spirits they were meant to scare off.

When European immigrants came to North America, there weren’t as many turnips or gourds — but there were plenty of pumpkins, so the tradition evolved into the modern jack-o’-lantern.

Apple bobbing at an outside party.
Credit: Monkey Business Images/ Shutterstock

Bobbing for Apples Tells Your Romantic Future

Although it’s not as much of a fall staple as it once was, bobbing for apples — the act of trying to pick an apple out of a bucket of water with your teeth — is a classic way to celebrate autumn. It’s become more of an all-ages activity, but it has its roots in superstitious matchmaking.

In one tradition, women secretly marked apples before their potential mates went a-bobbing, and future matches were foretold depending on whose apple ended up in whose mouth.

In another practice, women were the ones doing the biting: If a woman managed to bite the apple on her first try, it indicated true love, with the prognosis worsening with each subsequent attempt. In yet another version, the first woman to bite an apple would be the first to get married.

Close-up of Wooden hand broom handle.
Credit: Marcy Schrum/ Shutterstock

Cinnamon Brooms Can Ward Off Bad Energy

The coming of longer nights inspired a lot of folklore about malevolent spirits — and with it, myriad ways to send them packing. Cinnamon brooms (strips of cinnamon bark tied into a broom shape) do double duty as an air freshener and a paranormal protector. To ward off evil, you’re supposed to hang one above your front door or literally sweep your porch with one. Nowadays they’re available in many grocery stores, so no crafting is required.

Even if you don’t believe in spooky stuff, cinnamon brooms make for festive and sweet-smelling fall decor, so they can’t hurt.

Peeled onion skin, close-up.
Credit: andria01/ Shutterstock

Onions Can Predict Winter Weather

When harvest season rolls around, the chilly winter months are just around the corner, which makes weather divination and produce a natural fit. Keeping an eye on your onion skins was once a popular way to predict winter conditions. The Farmers’ Almanac has a handy rhyme telling you how to read them:

Onion’s skin very thin,

Mild winter coming in;

Onion’s skin thick and tough,

Coming winter cold and rough.

Thick apple skins, thick corn husks, and flowers blooming late into autumn are other traditional signs that a rough winter is on its way. Fowl bones were sometimes used, too: After their Martinmas celebrations, 15th-century Bavarians would let their geese’s breastbones dry out and use them to predict the weather, although interpreting the bones took special skill.

Sarah Anne Lloyd
Writer

Sarah Anne Lloyd is a freelance writer whose work covers a bit of everything, including politics, design, the environment, and yoga. Her work has appeared in Curbed, the Seattle Times, the Stranger, the Verge, and others.

Original photo by GUDKOV ANDREY/ Shutterstock

The Arctic Circle is an imaginary line of latitude that rings the globe around the North Pole. This parallel separates the Northern Temperate Zone from the Arctic zone above it — the latter an extreme geographic region that covers approximately 5.5 million square miles and features a landscape of glaciers, icebergs, sea ice, and permafrost. Interested in finding out more about the northernmost of the world’s five major circles of latitude? Here are eight fascinating facts you might not know about the Arctic Circle.

Arctic Circle sign along the Dalton Hwy in Alaska.
Credit: Senthil Raman/ Shutterstock

The Position of the Arctic Circle Changes Every Year

The Arctic Circle is located at approximately 66.5 degrees (66°33′) north of the equator; however, its exact location shifts slightly every year. This is due to fluctuations in Earth’s axial tilt, which is influenced by the orbit of the moon and the tidal changes it causes. (The same axial tilt is responsible for the seasons we experience on Earth.) Currently, the circle is moving north at a rate of around 49 feet per year. In 2017, an art installation called Orbis et Globus was inaugurated on Iceland’s Grímsey Island to mark the circle’s shifting position.

Polar bear with cubs in the tundra.
Credit: GUDKOV ANDREY/ Shutterstock

Earth’s Largest Land Predators Are Unique to the Region

Polar bears are the largest land carnivores in the world, and they’re found only in the Arctic region. They reside around ice-covered waters and depend on sea ice to hunt, rest, and breed. Fully grown male polar bears measure around eight to nine feet from nose to tail, while females measure approximately six to seven feet. Despite their enormous size, polar bears are only about the size of a guinea pig at birth. These cuddly-looking bears feed mainly on seals, are talented swimmers, and possess fur that appears white (the individual hairs are actually transparent), camouflaging them in their snowy habitats.

Ursa major Constellation stars in outer space with shape of a bear in lines.
Credit: Alexandr Yurtchenko/ Alamy Stock Photo

The Name “Arctic” Is a Reference to the Greek Word for Bear

Appropriately, the word “arctic” is derived from the Greek word arktos, meaning “bear.” The bear in question isn’t the polar variety, however, but the celestial kind: specifically, the Ursa Major (Great Bear) and Ursa Minor (Little Bear) constellations. Both constellations are visible from the Northern Hemisphere, and the latter contains the North Star. At the opposite end of the world, Antarctica takes its name from being “opposite the bear,” as those constellations aren’t visible from that region. Fittingly, there are also no actual bears in Antarctica.

Murmansk city in winter.
Credit: nickbeam/ Shutterstock

Over 4 Million People Live Within the Arctic Circle

The Arctic Circle incorporates portions of eight countries and territories: Canada, Greenland (a territory of Denmark), Iceland, Norway, Finland, Russia, Sweden, and the United States. And despite a harsh climate and often inhospitable living conditions, an estimated 4 million people live and work there year-round. Murmansk, in northwestern Russia, is the largest and one of the oldest settlements in the Arctic Circle; the city, on the Barents Sea, is home to around 300,000 residents and is known for its seaports and naval bases. In fact, eight of the 10 largest Arctic settlements are located in Russia.

Inuit Woman on the Tundra.
Credit: RyersonClark/ iStock

Dozens of Indigenous Groups Thrive in the Region

Residing within the Arctic Circle’s population are over 40 different ethnic groups, such as the Inuit, Saami, and Yupik peoples, who together account for about 10% of the regional population. While they vary greatly in culture, language, and history, these Indigenous groups share a strong connection to the Arctic lands they’ve inhabited for thousands of years. Many maintain traditional fishing, reindeer-herding, and hunting practices, but their livelihoods are under threat from dramatic weather changes and disappearing sea ice.

The Svalbard Global Seed Vault.
Credit: TT News Agency/ Alamy Stock Photo

It’s Home to the Largest Seed Storage Facility in the World

Set amid the frigid waters between Greenland and Norway is the Norwegian archipelago of Svalbard. Here, the Norwegian government opened the Svalbard Global Seed Vault, the world’s largest secure seed storage facility, in 2008. The 10,764-square-foot vault is buried almost entirely in permafrost; only the concrete entrance is visible from the outside, and only scientists and staff are allowed in. The structure has the capacity to store 4.5 million seed samples, kept at a constant temperature of about minus 0.4 degrees Fahrenheit (minus 18 degrees Celsius). The collection is stashed here for safekeeping in case of crop failures or natural disasters: The surrounding permafrost provides a naturally cold and stable environment, and Svalbard is one of the least likely places on Earth to experience a flood or a heat wave, either of which could damage the seeds.

Aerial view of the Baird Mountains in Kobuk Valley National Park.
Credit: NPS Photo/ Alamy Stock Photo

Four U.S. National Park Sites Lie Within the Arctic Circle

Alaska is home to 54 million acres of land protected by the U.S. National Park Service — about two-thirds of the land in the entire system. Four of the state’s national park units sit inside the Arctic Circle: Cape Krusenstern National Monument, Gates of the Arctic National Park and Preserve, Kobuk Valley National Park, and Noatak National Preserve. Visitors to these parks and preserves can discover untamed wildernesses of glaciated valleys, rivers, and lagoons framed by soaring mountain ranges. There are also opportunities to spot caribou and grizzly bears, and to experience days of extreme daylight and darkness.

Walrus and her pup floating on ice in a fjord , Eastern Greenland.
Credit: wildestanimal/ Shutterstock

There Are Actually Four North Poles Located in the Arctic Circle

The North Pole is often associated with Santa Claus, flying reindeer, and toy-making elves, but what most people don’t know is that there are actually four recognized North Poles. The Geographical North Pole (aka true north) is the northernmost point on the planet, where all of Earth’s lines of longitude meet. The Magnetic North Pole is the spot at which the planet’s magnetic field lines point vertically downward — the point that attracts the needle of a compass. The Geomagnetic North Pole is the northern end of the axis of the magnetosphere, the magnetic field that surrounds Earth and extends into space. Finally, the North Pole of Inaccessibility is the point in the Arctic Ocean farthest from any coastline.

Circa 1885: Santa Claus.
Credit: Hulton Archive/ Archive Photos via Getty Images

A Cartoonist Invented Santa’s North Pole Home

In the mid-1800s, German-born American artist Thomas Nast made a name for himself with his caricatures and political cartoons. He’s also credited with creating the popular image of Santa Claus (or Father Christmas). In 1863, Harper’s Weekly magazine published two of his illustrations depicting Santa as a larger-than-life character with a long beard and stocking cap. One of his images was inscribed with the words “Santa Clausville, N.P.” The “N.P.” was an abbreviation of “North Pole,” and so began the myth of Santa residing in a far-off and remote northern land.

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.

Original photo by Allstar Picture Library Limited/ Alamy Stock Photo

Few performers in history have matched Liza Minnelli’s achievements across such a wide range of artistic disciplines. With her powerful alto singing voice and acting chops worthy of both the stage and screen, the actress, singer, dancer, and choreographer has delighted countless audiences en route to becoming one of the most prolific entertainers of her generation. From her showbiz childhood to the surprising role she originally lost out on, here are six facts about the unforgettable Liza Minnelli.

Judy Garland with her husband, Vincente Minnelli, and their daughter Liza Minnelli.
Credit: Bettmann via Getty Images

Minnelli Was Born to Famous Parents

Stardom runs in Liza Minnelli’s family — she was born in Los Angeles on March 12, 1946, to two Academy Award-winning parents. Liza’s mother was acclaimed actress and singer Judy Garland, star of notable films such as The Wizard of Oz and A Star Is Born. Liza’s father was film director Vincente Minnelli, who brought many musicals to the silver screen and even directed Garland in Meet Me in St. Louis.

Minnelli has described her relationship with her parents (who later divorced) as a loving one. Her relationship with her mother, however, had its fair share of difficulties, as Garland struggled with addiction. Through it all, Minnelli provided Garland with unconditional love and support, and Garland treated her daughter as an artistic equal, inviting a then-18-year-old Minnelli to share the stage in a memorable November 8, 1964, performance at the London Palladium. Minnelli also drew inspiration from her father, whom she viewed as a visionary. She once said, “I got my drive from my mama and my dreams from my father.” Her father even helped conceive of the black pixie cut that Liza wore in the 1972 movie Cabaret, a style she would sport for most of the rest of her life.

Judy Garland with daughter Liza Minnelli, on the set of "In The Good Old Summertime".
Credit: PictureLux / The Hollywood Archive/ Alamy Stock Photo

She Made Her On-Screen Debut at Age 3

Coming from such a show business-oriented family, it’s no surprise that Minnelli began acting at a young age. Some would argue her first on-screen appearance came while she was still in the womb, as her mother, Judy Garland, was pregnant while filming 1946’s Till the Clouds Roll By. Just a few years later, Minnelli made her actual on-screen debut — albeit an uncredited one — alongside her mother in the final scene of 1949’s In the Good Old Summertime, playing a young child held by Garland and actor Van Johnson.

Minnelli went on to make her professional stage debut before ever earning her first official on-screen credit. In 1963, she portrayed Ethel Hofflinger in the Off-Broadway revival of Best Foot Forward, paving the way for her later success on stage. Minnelli’s first credited on-screen acting role came in a 1964 episode of the television show Mr. Broadway, and she made her official film debut in 1968’s Charlie Bubbles as the character Eliza.

American actress Liza Minnelli as Sally Bowles in 'Cabaret', directed by Bob Fosse, 1972.
Credit: Silver Screen Collection/ Moviepix via Getty Images

Minnelli Lost Out on the “Cabaret” Stage Role That Later Won Her an Oscar

Liza Minnelli is perhaps best known for the role of Sally Bowles in Cabaret, but when the musical first opened on Broadway in 1966, she didn’t have the part. Though she auditioned for the role, Minnelli was rejected in favor of actress Jill Haworth due to a perceived lack of experience. While it may be natural to feel pessimistic in the wake of a rejection, Minnelli claims her optimism remained undeterred, stating that she “knew [she’d] get the movie for some reason.”

When producers were casting for the 1972 film version, Minnelli was working in Paris and invited one of them to a show at which she performed the titular song, “Cabaret.” Though she initially struggled to convince the producers to hire her, that performance ultimately won her the part. The film version of Cabaret, directed by Bob Fosse, earned Minnelli her first and only Academy Award. She was 27 at the time.

Minnelli & Fosse at the Tony Awards.
Credit: Sonia Moskowitz/ Archive Photos via Getty Images

She’s One of a Select Few Entertainers to Hold EGOT Status

As of 2023, fewer than two dozen entertainers have achieved rare EGOT status, which entails winning an Emmy, Grammy, Oscar, and Tony Award in one’s lifetime. Liza Minnelli is one of those select few, though only if you include honorary awards. While the majority of her award victories were competitive, her Grammy came in a non-competitive category.

As an accomplished stage actor, Minnelli found most of her award success at the Tonys, where she took home the awards for Best Leading Actress in a Musical in 1965 for Flora the Red Menace, a Special Tony Award in 1974, Best Leading Actress in a Musical in 1978 for The Act, and Best Special Theatrical Event in 2009 for Liza’s at the Palace…! In 1973, Minnelli won an Emmy for her special Liza with a “Z”: A Concert for Television. That same year, she took home the Oscar for Best Actress in a Leading Role for her portrayal of Sally Bowles in Cabaret. In 1990, Liza finally became an EGOT winner by being honored with the Grammy Legend Award, recognizing her lifetime contributions to the field of music.

NEW YORK NEW YORK (1977) LIZA MINNELLI.
Credit: Moviestore Collection Ltd/ Alamy Stock Photo

Minnelli Popularized “New York, New York” Before Frank Sinatra

“New York, New York” is one of crooner Frank Sinatra’s signature hits, but it was Liza Minnelli who debuted the track before Ol’ Blue Eyes covered it in 1979. John Kander and Fred Ebb originally wrote “New York, New York” as the theme for a 1977 Martin Scorsese film of the same name. The two songwriters had previously written many of the legendary songs Minnelli sang in the film Cabaret. Her rendition of the “Theme from ‘New York, New York’” was a modest success on the charts.

A year later, Sinatra sang the song at a charity event at the Waldorf-Astoria in New York City; he also recorded it for his album Trilogy: Past Present Future, released in 1980. Sinatra’s version quickly superseded Minnelli’s in popularity, cracking the Top 40, and has since been adopted by the New York Yankees to celebrate victories.

American emo rock band My Chemical Romance.
Credit: Richard Ecclestone/ Redferns via Getty Images

Minnelli Sang on My Chemical Romance’s 2006 Album

Emo rock group My Chemical Romance thrived in a genre of music not commonly associated with Liza Minnelli, but in 2006, the band recruited her to sing on their album The Black Parade. The band’s frontman, Gerard Way, explained in an interview that the LP was “very theatrical” and that they wanted someone “kind of motherly but who was also a survivor, had been through a lot, but was rooted in theater.” Minnelli was a natural choice; she accepted the band’s offer and recorded her part remotely in what proved to be a memorable musical collaboration. Her voice can be heard on the song “Mama” as she dramatically sings the lines, “And if you would call me a sweetheart / I’d maybe then sing you a song.”

Bennett Kleinman
Staff Writer

Bennett Kleinman is a New York City-based staff writer for Optimism Media, and previously contributed to television programs such as "Late Show With David Letterman" and "Impractical Jokers." Bennett is also a devoted New York Yankees and New Jersey Devils fan, and thinks plain seltzer is the best drink ever invented.

Original photo by New Africa/ Shutterstock

Names aren’t usually pulled out of thin air, and often their stories are far more interesting than you’d expect. Shakespeare is responsible for the popularity of a couple of names, including one he made up entirely. One list-topper from the 2000s came from a 1980s rom-com. A couple of names that are complete stand-alones today started out as nicknames. One rising star even started as a last name, was coined as a separate word, and then trickled its way back into being a first name. Do you know the stories behind these seven popular names?

Scene from Shakespeare's Merchant of Venice with Lorenzo and Jessica.
Credit: Culture Club/ Hulton Archive via Getty Images

Shakespeare Made Up the Name Jessica

The first use of the name Jessica was in William Shakespeare’s play The Merchant of Venice, for the daughter of the Jewish moneylender Shylock. The Bard likely used Anglicized versions of names taken from the biblical Book of Genesis for all of his Jewish characters. In the case of Jessica, the root name was probably Iscah, a very minor biblical figure who was a niece of Abraham and sister of Lot. Centuries later, Jessica had an incredible 21-year run, from 1977 to 1997, on the baby name top-five list in the United States, spending most of that time in the No. 1 or No. 2 slot.

Twelfth Night by William Shakespeare, featuring Olivia.
Credit: Culture Club/ Hulton Archive via Getty Images

Shakespeare Also Popularized Olivia

The masculine Oliver’s popularity far predates Shakespeare. According to legend — particularly in the 11th-century text Song of Roland — it was the name of one of Charlemagne’s warriors. The feminine version, Olivia, wasn’t popularized until the early 17th-century Shakespeare comedy Twelfth Night. In the play, Countess Olivia is a smart, beautiful noblewoman, so of course her name took root.

The meaning of the name Oliver is somewhat debated. It depends on whether or not it has the same origins as Olaf. If it does, it means “ancestor,” but if not, it could mean “olive tree.”

Her Royal Highness, the Princess Amelia.
Credit: Sepia Times/ Universal Images Group via Getty Images

Emily Was a Nickname for Amelia

The name Emily rose to prominence in the 18th century, and although it evolved independently from the same root as Amelia and Emile (all from Aemilia, the name of a Roman patrician family), it entered the popular English-speaking imagination as a nickname. The German House of Hanover rose to the English throne in 1714, and they brought the name Amelia with them, first with Princess Amelia Sophia Eleanor of Great Britain (1711-1786), and then later with Princess Amelia of the United Kingdom (1783-1810). Both princesses were nicknamed “Emily.”

Robin Hood archetypal figure in English folklore.
Credit: Culture Club/ Hulton Archive via Getty Images

Robin Was a Nickname for Robert

The earliest Robins weren’t named after the bird. In medieval England, the name Robin was a diminutive for Robert — essentially, an ancient version of Bobby or Robby. One of the earliest prominent examples is the medieval legend of Robin Hood, whose full name is, in some Elizabethan retellings, Robert Fitzooth.

Robin gained traction as a feminine name in the 1940s, possibly with avian origins in those cases. Soon, the feminine Robin far outpaced the masculine version, peaking as the 25th most popular name in the U.S. in 1962 and 1963. Robert, for the record, means “bright” or “famous”; it was in the top five masculine names in the U.S. from 1906 until 1971.

Large ancient ash tree in a summer meadow.
Credit: Francesca Leslie/ Shutterstock

Ashleigh Was a Feminized Version of the Masculine Ashley

Until about the 1960s, Ashley (which means “meadow of ash trees”) was seen as a masculine name, and Ashleigh was considered a feminized spelling. All spellings of Ashley are largely considered feminine today, although the traditional “-ey” ending is still the most popular. Ashley was the 73rd most popular name of the 2010s in the U.S., but had a long run in the top five from 1983 to 2001, peaking at No. 1 in 1991 and 1992. Despite the perception that “-eigh” names are especially popular, Ashleigh peaked in 1991 as only the 176th most popular name in the U.S., roughly following the trajectory of its more traditional counterpart.

Close-up of road sign for Madison Avenue, on the Upper East Side of Manhattan.
Credit: Smith Collection/Gado/ Archive Photos via Getty Images

Madison Was a Joke From a 1980s Romantic Comedy

As a first name, Madison wasn’t popular until very recently. For most of the 20th century it was an unpopular masculine name, but it cracked the top five baby names in the United States as a feminine name from 2000 to 2007. The story of how it got to be there is, however, a little fishy.

Madison has a long and storied history as a last name — and that’s relevant to its recent spike. President James Madison is one of the most famous examples; he’s the namesake of Madison Square Park in New York City, which, in turn, lends its name to Madison Avenue. In the 1984 movie Splash, Daryl Hannah plays a mermaid trying to pass as human in the Big Apple. Her character’s real name can’t be pronounced by humans, so when she’s asked, she adopts the name Madison from the street.

Her co-lead, played by Tom Hanks, responds, “That’s not a name.” But it soon would be: The next year, Madison broke the top 1,000 names in the United States, a small but important step toward hitting the top 10 in 1997.

Portrait of Samuel Augustus Maverick.
Credit: History and Art Collection/ Alamy Stock Photo

Maverick Has Its Origins in Cattle Farming

Maverick took a meandering route to become the 40th most popular name of 2022. In modern parlance, a maverick is a free-minded individualist, making it a popular nickname (as with Tom Cruise’s character in Top Gun). Eventually, it was normalized as a given name, too. What’s curious about this is that the word “maverick” already comes from a person’s name — Samuel Augustus Maverick, a politician, land baron, and cattle rancher. He had a large herd of calves without brands that wandered freely. “Maverick” was coined to refer to an unbranded calf, but its meaning evolved pretty quickly to apply to humans, too. The name has been on the rise for more than a decade, and is around 13 times more popular now than it was in 2010.

Sarah Anne Lloyd
Writer

Sarah Anne Lloyd is a freelance writer whose work covers a bit of everything, including politics, design, the environment, and yoga. Her work has appeared in Curbed, the Seattle Times, the Stranger, the Verge, and others.

Original photo by John Smit/ Unsplash

As modern technology progressed from vacuum-tube computers that fill entire rooms to ultra-fast smartphones that fit in our pockets, the industry simultaneously developed a robust iconographic lexicon. Today, these digital-age hieroglyphics adorn most of the computers, laptops, gaming consoles, smartwatches, and smartphones in our homes — but have you ever wondered where they came from? Here are the stories behind eight electronic icons and symbols you probably see every day.

The push button switch on washing machine.
Credit: winnond/ Shutterstock

Power Button

The electronic world runs on zeros and ones, so it makes sense that the power button — arguably the most important toggle on our gadgets and gizmos — is adorned with both. During World War II, engineers used “1” to mean “on” and “0” to mean “off,” and you can still see both digits on the glowing red rocker switches of some power strips. In 1973, the International Electrotechnical Commission codified a broken circle with a line through it as the symbol for a “standby power state.” Today, that design is simply an emblematic stand-in for “power,” and devices still use it half a century later.

The Command key on an Apple computer.
Credit: hannah joshua/ Unsplash

Command Key

Glance at the keyboard of an Apple computer, and you’ll see a strange symbol denoting the command key. The original idea for the key — back when it was called the Apple key — was for it to allow users to navigate an Apple computer without a mouse, something that was much more common in the early 1980s before the era of trackpads. Annoyed by how many Apple logos appeared on the command list of the application MacDraw, Apple co-founder Steve Jobs allegedly proclaimed, “There are too many Apples on the screen! It’s ridiculous! We’re taking the Apple logo in vain!” and asked Apple’s bitmap artist Susan Kare to come up with a solution. While searching an international symbol directory, Kare came across the floral design then used in Scandinavian maps for attractions or places of cultural heritage. This simple yet effective design was just the thing Kare needed to symbolize the concept of “command.”

404 ERROR Internet Page not Found.
Credit: Savvapanf Photo/ Shutterstock

404 Error

Much as zero was a major breakthrough in the world of mathematics, the modern web would be hard to imagine without the 404 error. Early hypertext systems kept a centralized database of all links and where each link sent a user. If a link was updated, the database was updated too, so links always led to their intended destination. But as the internet ballooned in size, keeping track of every link proved impossible. So Tim Berners-Lee, the inventor of the World Wide Web, came up with a solution: just don’t validate links. Instead, a broken link would display the dreaded 404 error. Codes starting with “4” refer to a user-side error, and “04” simply means the user requested a nonexistent (or no longer existent) address. While the 404 error allowed the early internet to flourish, it also created a few problems, chief among them “link rot” — the general tendency for links to break over time. But like it or not, “404” is here to stay.
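The grouping described above — the first digit of an HTTP status code names the general category, the last two digits the specific problem — can be sketched in a few lines of Python. This is an illustrative helper, not part of any standard library; the function name `status_class` is our own.

```python
def status_class(code: int) -> str:
    """Return the general category an HTTP status code belongs to,
    based on its first digit (e.g., the "4" in 404)."""
    classes = {
        1: "informational",
        2: "success",
        3: "redirection",
        4: "client error",   # 404 Not Found lives here
        5: "server error",
    }
    return classes.get(code // 100, "unknown")

print(status_class(404))  # client error
print(status_class(200))  # success
```

So when a browser shows a 404, it is reporting a client-side error (the "4") of the "not found" variety (the "04").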

Woman accessing bluetooth her mobile phone.
Credit: Westend61 via Getty Images

Bluetooth Icon

King Harald “Bluetooth” Gormsson ruled over Denmark and Norway in the 10th century, but he’d likely be surprised to learn that his name is now best known for a short-range wireless technology invented more than a millennium later. Developed in 1994 by Dutch inventor Jaap Haartsen, Bluetooth gets its name from Gormsson’s famous nickname — a nod to his legendary dead “blue” tooth — but honoring Gormsson wasn’t a random hat tip to a historical king. When Intel, Ericsson, and Nokia met to create a wireless communication standard, Intel’s Jim Kardach suggested Bluetooth as a temporary code name, saying that “King Harald Bluetooth… was famous for uniting Scandinavia just as we intended to unite the PC and cellular industries with a short-range wireless link.” The temporary code name stuck, and the world-renowned symbol for the technology became a combination of the Scandinavian runes Hagall (ᚼ) and Bjarkan (ᛒ), which stand for the famous Viking king’s initials.

USB charging via a USB power supply.
Credit: lhluo8/ Shutterstock

USB Symbol

Universal Serial Bus, or USB, has been through some changes since its debut in the mid-’90s. There’s USB 2.0, 3.0, and now Type-C (which has finally made its way to the iPhone). Such a powerful technology capable of connecting a variety of peripherals deserves an equally powerful symbol, so why not use Neptune’s trident? This divine pitchfork-inspired symbol has been around since the technology’s inception, and the different symbols at its three tips — a circle, square, and triangle — represent all the disparate technologies that can now be connected via the Universal Serial Bus.

View of the WiFi button on a modem.
Credit: Stephen Phillips/ Unsplash

Wi-Fi Sign

This ubiquitous wireless technology got its name when the Institute of Electrical and Electronics Engineers (IEEE) created the 802.11 standard that we now simply call Wi-Fi (a big improvement on the name “IEEE 802.11b Direct Sequence”). The Wi-Fi Alliance, which owns the Wi-Fi trademark, created a logo inspired by the yin-yang symbol from Chinese philosophy — a nod to the technology’s universal compatibility, but also to its literal function: It existed, yet it was invisible. A practical symbol also had to convey the strength of a connection, so the familiar arcs of the Wi-Fi signal icon were created to represent the invisible radio waves that deliver wireless internet to your devices.

Play and rewind buttons on an electronic device.
Credit: Steve Mann/ Shutterstock

Play Button

Although many of these symbols have definitive beginnings, the play button’s exact origin story is a bit of a mystery. What is known is that the symbol first appeared on reel-to-reel tape decks in the mid-1960s — the grandfather of the cassette tapes that ruled the ’80s (and are now making an unexpected comeback). Luckily, the reason for the play button’s design is much clearer: It simply points in the direction in which the tape moves.

Selective focus on the @ of a laptop keyboard.
Credit: TopMicrobialStock/ iStock

@ Symbol

While the rest of these icons have their foundations in the digital age, the @ symbol is an emblem from the medieval era that almost fell into obscurity. Although scholars debate where the @ symbol originated — some say medieval monks invented it while looking for handwriting shortcuts — its first definitive use arrived in the mid-16th century, when merchants used the symbol to communicate buying rates (bananas @ $1, for example). However, the information age eschewed the @ symbol, and it almost disappeared entirely until 1971, when Bolt, Beranek and Newman (BBN) programmer Raymond Tomlinson decided to use the neglected symbol to separate a user’s name from the name of the user’s machine in computer network addresses — the format we still use for email today.

“I was mostly looking for a symbol that wasn’t used much,” he once told the Smithsonian. “And there weren’t a lot of options.” With the more recent addition of the symbol in social media handles, the reign of “@” is strong.
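Tomlinson’s convention — everything before the “@” names the user, everything after names the machine — can be sketched in a couple of lines of Python. The address below is a made-up example, and `split_address` is our own illustrative helper, not a standard function.

```python
def split_address(address: str) -> tuple[str, str]:
    """Split an email-style address at the "@" into (user, host)."""
    user, _, host = address.partition("@")
    return user, host

print(split_address("ray@example-machine"))  # ('ray', 'example-machine')
```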

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.

Original photo by PictureLux / The Hollywood Archive/ Alamy Stock Photo

It wasn’t socially groundbreaking like All in the Family, nor a long-running audience favorite like M*A*S*H. Nevertheless, Mork & Mindy carved out its own place in the hearts of viewers with its endearing spin on the fish-out-of-water concept — and, of course, by introducing the world to the unparalleled talents of Robin Williams. Here are six fun facts about this standout sitcom from the late 1970s and early ’80s.

A still from the television series, 'Mork and Mindy'.
Credit: Hulton Archive/ Archive Photos via Getty Images

Mork Originated on “Happy Days”

Fans may remember that Mork from Ork initially appeared in Richie Cunningham’s dream during a February 1978 episode of Happy Days, a premise apparently conceived of by the 8-year-old son of series creator Garry Marshall. Although this seemed like a terrible idea to the writers, they quickly realized the potential of the situation when the little-known actor Robin Williams wowed during his audition and rehearsals. Mork then proved a hit after going toe-to-toe with the Fonz on screen, prompting Marshall and his cohorts to devise a spinoff series about the character in time for the fall 1978 TV season. Meanwhile, the “My Favorite Orkan” Happy Days episode was reedited for syndication to show that the alien encounter was real.

Actress Pam Dawber posing for a still.
Credit: Bettmann via Getty Images

Co-Star Pam Dawber Never Auditioned for the Role of Mindy

While executives seized the chance to build a show around the comedic abilities of Williams, Pam Dawber had no idea she’d been tapped for the role of Mindy McConnell. With a resume that mainly consisted of modeling and commercial work up to that point, she was chosen for her “honest, all-American girl” performance in a rejected pilot called Sister Terri. Dawber initially wasn’t happy to learn she was starring in a “stupid” TV show about an alien, but she warmed to the idea after meeting with Marshall, and especially after watching Williams’ footage from Happy Days. “I was laughing out loud watching that show, and I remember going, ‘Oh my God, oh my God, oh my God, I am so lucky,'” she later revealed. “It was just like, ‘Where do I sign?'”

Robin Williams, US actor and comedian, in costume, holding a helmet under his arm.
Credit: Silver Screen Collection/ Moviepix via Getty Images

Mork’s Spacesuit Was Recycled From an Episode of “Star Trek”

Since Mork was first meant to be a one-off character, there wasn’t a whole lot of thought put into his appearance; someone simply grabbed a red spacesuit from the Paramount wardrobe collection, added a silver triangle, and the Ork uniform was born. It’s unknown whether anyone at the time caught the uncanny resemblance between Mork’s suit and the one worn by Colonel Green in the 1969 Star Trek episode “The Savage Curtain,” but we do know that Mork & Mindy dipped into the Star Trek archives at least one more time: The spaceman costume worn by Mindy’s father (Conrad Janis) in the “Mork Goes Public” episode of season 1 was composed of a helmet and suit from two separate episodes of the sci-fi predecessor.

American actress Morgan Fairchild posing.
Credit: Steve Wood/ Hulton Archive via Getty Images

Numerous Guest Stars Appeared on the Series

As with Williams’ early showing on Happy Days, several up-and-coming talents used Mork & Mindy as a springboard to greater fame. This included Morgan Fairchild, who enjoyed a recurring role as Mindy’s old rival on season 1, and young comedians David Letterman and Paul “Pee-Wee Herman” Reubens, who appeared in seasons 1 and 4, respectively. On the flip side, Raquel Welch was already an international movie star by the time of her two-episode run as Captain Nirvana in season 2, while William Shatner provided another connection to Star Trek with his appearance late in season 4.

American actor and comedian Robin Williams as alien visitor Mork.
Credit: Silver Screen Collection/ Moviepix via Getty Images

Williams Became Disenchanted With Playing Mork

Although it was the role that skyrocketed him to stardom, Williams soon grew tired of the constraints of playing a naive space alien. Series director Howard Storm later recalled the difficulty of convincing Williams to deliver the audience-pleasing catchphrase “nanu nanu,” while one of the show’s writers remembered the star derisively calling his character “Morko the Pin-Headed Boy.” Fortunately, Williams received a jolt when his hero Jonathan Winters, who appeared as Mindy’s uncle in season 3, joined the regular cast in season 4 as baby Mearth. “Having him on the show was one of the main reasons I stayed with it,” noted Williams, per Dave Itzkoff’s biography, Robin. “For me, it was like the chance to play alongside Babe Ruth.”

Williams and Dawber in the TV sit-com Mork And Mindy.
Credit: Frank Edwards/ Archive Photos via Getty Images

The Ending Was Shuffled to Accommodate the Series Cancellation

With the novelty of the series long gone, Mork & Mindy was canceled near the end of its fourth season. This wound up catching producers off guard, as plans were already in place for a time-traveling fifth season that paired Mork and Mindy with historical characters like Abraham Lincoln. This would explain the somewhat confusing end to the show, which sees the leads transported to a prehistoric cave at the end of a three-episode arc. In an attempt to compensate, producers shifted “The Mork Report,” a flashback-infused episode about the Mork-Mindy relationship, from an earlier scheduled broadcast date to the series finale.


Original photo by BARBARA LOPEZ/ Shutterstock

For the last 5,000 years or so, the vast majority of human knowledge has been passed down through writing, from clay tablets to papyrus scrolls to today’s e-readers. Here are eight fascinating facts about books that you may not have learned at the library. They prove that reading really is fundamental.

Carved wood standard of Ur and Sumerian civilization.
Credit: DEA PICTURE LIBRARY/ De Agostini via Getty Images

The Sumerians Started It All

The first known example of writing developed around 3500 BCE in the Persian Gulf region of Mesopotamia (now south-central Iraq). The Sumerian civilization there used pointed reeds to inscribe characters onto clay tablets, a form of writing now known as cuneiform. The Gilgamesh tablet, thought to be the oldest surviving work of human literature, was created by the Sumerians; it was looted during the first Gulf War, but is now back in Baghdad.

View of Egyptian hieroglyphics.
Credit: Leemage/ Corbis Historical via Getty Images

The Egyptians Came After  

The earliest pharaonic civilization of ancient Egypt developed its own system of writing a few hundred years after the Sumerians, around 3100 BCE. Hieroglyphics (meaning “sacred carvings” in ancient Greek) combined pictographs with symbols designating sound and syllables to celebrate the lives of the gods and the deeds of Egyptian royalty, who were worshiped as gods themselves. Hieroglyphic writing was indecipherable for 1,500 years, until French scholar Jean-François Champollion deciphered the Rosetta Stone (which included hieroglyphs side by side with ancient Greek) in 1822.

The codex text on a scroll.
Credit: Vladimir Zapletin/ iStock

The Codex Conquers the Scroll

Papyrus reeds grew plentifully (and almost exclusively) along the Nile, and the enormously profitable art of papermaking was a closely guarded Egyptian secret for centuries. Eventually, the Egyptians’ preferred writing material spread throughout the Mediterranean. It was the Romans who popularized the switch from papyrus scrolls (which could exceed 100 feet in length and required two hands) to the codex, in which sheets of papyrus or parchment were stacked and bound between two wooden covers.

Buddhist text called the Diamond Sutra.
Credit: Universal History Archive/ Universal Images Group via Getty Images

Asia Printed Long Before Gutenberg

While scribes and illuminators in European monasteries were laboriously copying and decorating manuscripts by hand, the Chinese were making books via the art of woodblock printing, which was developed during the Tang Dynasty, around 700 CE. Japan’s Empress Shōtoku commissioned the Hyakumanto Darani (“The One Million Pagodas and Dharani Prayers”) in 764 CE. A Buddhist text called the Diamond Sutra, printed in 868 CE, is the earliest surviving example of a dated, printed book. Woodblock printing was laborious, as each page was carved by hand.

Johannes Gutenberg in his workshop.
Credit: Heritage Images/ Hulton Archive via Getty Images

Gutenberg Sped Things Up

Movable metal type wasn’t his invention, but Johannes Gutenberg’s improvements around 1448 commercialized the process of printing, bringing books within the reach of common people. (Prior to this, books were almost solely possessions of the very wealthy or the church.) The German goldsmith printed 180 copies of the Bible, and sparked a revolution. The popularization of the printing press made books much cheaper to produce, allowing ideas (like the Protestant Reformation) to spread quickly.

Pages flipping through a bible.

The Bible May Be the Bestselling Book of All Time

Fewer than 50 copies of the Bible Gutenberg printed are still in existence, and only 16 of those are complete. If you found one in the attic today, it would probably fetch at least $35 million. And even non-Gutenberg Bibles are big business: The Christian Bible is said to be the bestselling book of all time, with at least 5 billion copies having been printed. The Book of Mormon and Quotations From Chairman Mao Zedong are up there as well, and Cervantes’ Don Quixote tops the fiction chart, with more than 500 million copies sold since it was written in 1605.

View of the oldest surviving Hebrew Bible or Tanakh.

But the Bible Is Not the Most Expensive Book

The oldest extant copy of the Hebrew Bible, the Codex Sassoon (created some time between 880 and 960 CE), sold at auction in 2023 for $38 million, making it the most expensive Jewish manuscript in the world. But it’s a science book, not a religious text, that currently holds the title of most expensive book in the world. In 1994, Microsoft founder Bill Gates paid more than $30 million for the Codex Leicester, the handwritten and illustrated notebook of Renaissance legend Leonardo da Vinci. In today’s dollars, that makes the codex worth around $60 million. (But you can buy a copy today for $35.) Meanwhile, a rare first printing of the U.S. Constitution sold for $43 million in 2021, but you can read a copy online absolutely free.

Robert Downey Jr. in a Sherlock Holmes movie.
Credit: Album/ Alamy Stock Photo

Great Books Make Great Movies

Harry Potter was far from the first: Moviemakers have been adapting books to the screen since the beginning of the motion picture industry. In 2012, Guinness World Records crowned Sir Arthur Conan Doyle’s Sherlock Holmes as “the most-portrayed literary human character in film and TV,” with 254 on-screen depictions. Dracula (not a human) is the most-portrayed character overall, with 272 film adaptations and counting. Mary Shelley’s Frankenstein, meanwhile, has been adapted at least 80 times. Jane Austen’s novels could sustain their own motion picture studio, and Stephen King might as well pass on the printing and skip straight to the screenplays, since his books are almost immediately adapted to the big screen.

Cynthia Barnes
Writer

Cynthia Barnes has written for the Boston Globe, National Geographic, the Toronto Star and the Discoverer. After loving life in Bangkok, she happily calls Colorado home.

Original photo by Anna_Pustynnikova/ Shutterstock

Autumn officially begins on the fall equinox, but increasingly, the cultural start of fall is whenever pumpkin spice becomes available. The warming spice mix — typically including cinnamon, ginger, nutmeg, cloves, and/or allspice — has been associated with the harvest since long before Starbucks debuted their pumpkin spice latte in 2003, but the beverage certainly ushered in the mania for all things pumpkin spice.

That mania has grown far beyond pie or even coffee. Each August, the shelves of supermarkets and menus of coffee shops and bars begin to fill up with pumpkin spice beer, candy, and dog treats. Pumpkin spice Spam even came along in 2019, and Hefty now makes pumpkin spice trash bags.

Think you’re a pumpkin spice aficionado? Test your knowledge on fall’s most iconic flavor with these six autumnal facts.

Fresh homemade pumpkin pie made.
Credit: Brent Hofacker/ Shutterstock

The Flavor Dates Back Hundreds of Years

People have been mixing together warming spices like cinnamon and ginger for thousands of years — hello, chai! — and this kind of spice set has been associated with pumpkin for centuries. The 1798 edition of American Cookery has two pumpkin pie recipes with slightly different spice variations, both featuring nutmeg, allspice, and ginger. Branding got involved in the 1930s, when mass-market spice companies such as McCormick started selling premade spice blends as “pumpkin pie spice” (and, adjacently, “apple pie spice,” although there’s a wide breadth of opinion on what separates the two).

Starbucks was far from the first to put this autumnal spice blend together with coffee; people have been putting cinnamon, cloves, and other such spices in their java for centuries. But when employees formulated the famous PSL in the company’s Liquid Lab in 2003, they brought in actual pumpkin pie to develop the flavor (more on that below).

Starbucks logo sign.
Credit: Athar Khan/ Unsplash

Starbucks Pumpkin Spice Didn’t Originally Contain Pumpkin

Pumpkin pie spice blends are made to spice pumpkin, not mimic pumpkin, and, similarly, the pumpkin spice latte only referred to the seasoning. (After all, it’s not a pumpkin pie latte.) Many pumpkin spice-flavored things don’t contain pumpkin — just cinnamon, nutmeg, ginger, cloves, and sometimes allspice. But after a 2015 refresh, the pumpkin spice latte does contain squash (specifically kabocha). The recipe revamp also removed artificial flavors and caramel coloring and added fruit and vegetable juices. Pumpkin puree is currently the third ingredient in the chain’s Pumpkin Spice Sauce, right behind sugar and condensed skim milk.

Close-up of pumpkin flavored drinks from Starbucks.
Credit: ZUMA Press Inc/ Alamy Stock Photo

The PSL Was Almost the FHL (Fall Harvest Latte)

Before 2003, Starbucks didn’t have an autumnal beverage; its first seasonal drink was the wintertime favorite peppermint mocha. While developing what would ultimately become pumpkin spice, the team in Starbucks’ Liquid Lab brainstormed a bunch of flavors, including chocolate and caramel, which polled the best among customers. Pumpkin wasn’t a runaway favorite, but because customers had indicated it was unique, the team kept it as an option. The team members brought in fall decorations and pumpkin pies, alternating slices with sips of espresso to figure out which pie flavors complemented the coffee best.

Once they tried the finished product next to the chocolate and caramel options that were also in the running, pumpkin spice was the clear winner. When it came time to name it, they considered a less obvious name; Peter Dukes, who led the team that made pumpkin spice, said that “fall harvest latte” was in the running.

A Starbucks barista prepares a drink.
Credit: Ramin Talaie/ Corbis Historical via Getty Images

Most PSL Drinkers Only Get One Per Season

Seasonal pumpkin products bring in a cool $800 million every year, and with all the hullabaloo, you’d think PSL fans would have a Starbucks cup in their hand every day. While loyal customers do come back every year for the autumn ritual, most customers are only getting their fix once or twice a year. According to data from 2015, the vast majority of pumpkin spice purchasers (72%) have just one per year; 20% have two. It’s only the die-hard fans, the remaining 8%, who have a PSL three or more times in a year.

A Starbucks coffee cup and coffee beans.
Credit: PAUL J. RICHARDS/ AFP via Getty Images

Pumpkin Spice Season Starts Earlier Almost Every Year

No, you’re not imagining it: Festive fall beverages are creeping into your life a little earlier every year. PSL’s original limited rollout in 2003 started on October 10 — practically Christmas, by marketing standards.

The latest release date in the past decade was September 8, in 2015 — the year the recipe changed to include real pumpkin. Most years since then, it has darkened our door a little earlier. In 2024, Starbucks’ fall menu returned on August 22.

Starbucks no longer controls the start of pumpkin spice season, though. Competitor Dunkin’ launched its 2023 (pumpkin-less) pumpkin spice latte on August 16. 7-Eleven got an even bigger jump on autumn in 2023, making their take on the pumpkin spice latte available on August 1.

Close-up of the words “fire and rescue” on a red fire truck.
Credit: charnsitr/ Shutterstock

Pumpkin Spice Shut Down an Entire Baltimore High School

One Thursday afternoon in October 2017, students and teachers on the third floor of Cristo Rey Jesuit High School in Baltimore noticed a weird smell — and it was getting stronger. After some people reported difficulty breathing, the school president evacuated the building and called the fire department. The hazmat team arrived and tested for dangerous materials. Finding none, they opened the third-floor windows and took a closer look.

The source of the strange odor turned out to be a plug-in pumpkin spice-scented air freshener in a third-floor classroom — perfectly safe, although perhaps overly air “freshening.”

Interesting Facts
Editorial

Interesting Facts writers have been featured in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.