Nearly 25 years after Ulysses S. Grant’s death, a peculiar story hit the pages of the Washington Evening Star. Within the paper’s Sunday edition one day in 1908, retired police officer William H. West recounted how he had caught the 18th president speeding through the streets of Washington, D.C. — and decided the only appropriate course of action was to proceed with an arrest.
The U.S. government has at times set the maximum speed limit on highways.
President Nixon signed the Emergency Highway Energy Conservation Act in 1974, capping highway speed limits at 55 mph. At the time, OPEC’s 1973 oil embargo had created an energy crisis, and the lower limit was intended to cut fuel consumption. The national cap was repealed in 1995.
West’s tale hearkened back to 1872, during a particularly bad bout of traffic issues, when complaints of speeding carriages were on the rise. West had been out investigating a collision when he witnessed Grant — then the sitting president — careening down the road in his horse-drawn carriage. The officer flagged down the carriage, issued a warning, and sent Grant on his way. But Grant, who had a reputation for breakneck horse rides, couldn’t resist the need to speed. West caught him the very next day, once again tearing through the city. Feeling he had no other option, the officer placed the president under arrest. At the police station, Grant was required to put $20 (about $490 in today’s money) toward his bond before being released.
Historian John F. Marszalek, who oversaw Grant’s presidential collection at Mississippi State University, says the situation blew over pretty quickly. Grant’s arrest wasn’t the first time he had been cited for speeding. It also wasn’t a political quagmire for either party. At the time, West — a formerly enslaved Civil War veteran who became one of just two Black police officers in Washington, D.C., immediately after the war — was commended for his actions in trying to make the city streets safer. And Grant owned up to his mistake — though he did choose to skip his court appearance scheduled for the following day, which meant he forfeited his $20. He didn’t face any further consequences, however.
U.S. presidents aren’t allowed to drive on public roads after leaving office.
After their term is up, U.S. presidents don’t just leave behind the keys to the White House — they effectively hand over their car keys, too. Turns out, the highest government officials in the land are soft-banned from driving in public; that duty falls to their Secret Service detail at the beginning of their presidency and, because of security risks, stays there for the rest of their lives. The Former Presidents Act of 1958 sets out retirement parameters for presidents (including staffing and pay), and while it doesn’t explicitly say former leaders can’t drive themselves around, several presidents have alluded to the unwritten rule’s enforcement by the Secret Service. Lyndon B. Johnson (1963–1969) is credited as the last president to routinely drive himself, and was known for sporting a convertible Lincoln Continental. His car collection also included a German Amphicar — the first mass-produced amphibious automobile made for civilians — and a Jolly 500 Ghia, a gift from Fiat so rare that it couldn’t be restored due to a lack of existing parts.
Nicole Garner Meeker
Writer
Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.
Less is more in the Hawaiian alphabet, which consists of just 13 letters: A, E, I, O, U, H, K, L, M, N, P, W, and the ‘okina, which represents the glottal stop consonant — a sound produced by the abrupt obstruction of airflow in the vocal tract. Known as ka pīʻāpā Hawaiʻi in Hawaiian, the alphabet traditionally lists the five vowels first and also includes the kahakō, a bar above vowels that indicates an elongated vowel sound.
When British explorer James Cook made the first known European expedition to the Hawaiian islands in 1778, he spelled the islands’ name as both “Owhyhee” and “Owhyee.” Hawaiian was purely an oral language at the time; its written form wasn’t formalized until American missionary Elisha Loomis printed a primer titled simply “The Alphabet” in 1822. This written alphabet initially consisted of 21 letters before being standardized in 1826, although four of the original letters (F, G, S, and Y) were included only for the purpose of spelling foreign words. Other letters — B, R, T, and V — were excised because they were considered interchangeable with existing letters.
Hawaii is the only U.S. state with two official languages.
English and Hawaiian are both official languages of the Aloha State, but it isn’t the only state with multiple tongues. South Dakota also recognizes Sioux, and Alaska recognizes more than 20 Indigenous languages.
By 1834, Hawaii's literacy rate was estimated to be between 90% and 95%, one of the highest in the world at the time. But the Hawaiian language declined in usage after 1896, when Act 57 of the Laws of the Republic of Hawaii made English the “medium and basis of instruction” for all schools, after which schoolchildren were sometimes even punished for speaking Hawaiian. The language has seen a resurgence since the 1970s, with several groups working toward preserving it.
Hawaii was an independent kingdom for nearly a century.
Six years after George Washington became the first president of the United States, another ruler came into power on the other side of the Pacific: Kamehameha I, who established the Hawaiian Kingdom in 1795 by conquering the islands of Maui, Moloka‘i, O‘ahu, and Lāna‘i. Kauaʻi and Niʻihau joined willingly 15 years later, making every inhabited island part of the kingdom.
The House of Kamehameha reigned until 1874, when the House of Kalākaua came into power. The kingdom was overthrown in 1893 with the support of U.S. forces — an act the United States officially acknowledged a century later with 1993’s Apology Resolution. The joint resolution stated that “the Indigenous Hawaiian people never directly relinquished their claims to their inherent sovereignty as a people or over their national lands to the United States.” The Hawaiian sovereignty movement continues to this day.
Michael Nordine
Staff Writer
Michael Nordine is a writer and editor living in Denver. A native Angeleno, he has two cats and wishes he had more.
There are nearly 2,000 known species of cacti, all of which are native to the Americas alone — except for one. That would be Rhipsalis baccifera, also known as the spaghetti cactus or the mistletoe cactus, which grows wild in India, Sri Lanka, and Africa as well as parts of the Americas. Even stranger than the idea of a single cactus species thousands of miles away from its prickly relatives is the fact that scientists aren’t exactly sure how R. baccifera ended up in the Eastern Hemisphere to begin with. The epiphytes (also called air plants) are remarkably resilient, able to survive without soil by drawing moisture from the air, and the many theories attempting to explain their broad distribution fit their strange nature.
Should you ever find yourself lost in the desert, resist the urge to drink from the nearest cactus. Cacti protect themselves with alkaloids and acids, both of which are present in their water — and neither of which are a good idea for humans to drink.
One explanation is that birds snacked on the white berries containing R. baccifera’s seeds somewhere in South America before flying all the way to Africa, where they passed those seeds and essentially planted the cactus on the other side of the world. Problem is, scientists don’t know of any berry-snacking birds that could have actually made that journey. Another theory suggests that sailors used the cactus, with its fetching long green branches, to decorate their living quarters while journeying across the Atlantic from Brazil, then left them behind upon arriving in Africa. Yet another theory posits that the plant could have existed way back when Africa and the Americas were part of one supercontinent called Gondwana — which then broke up around 184 million years ago, leaving a little cactus on both sides. However, it’s unlikely the plant existed all those years ago. The truth is, we’ll probably just have to embrace the mystery of it all.
The largest cactus in the United States is the saguaro.
Prickly pears are a common ingredient in Mexican cuisine.
Known as prickly pears in English, these Opuntia cacti are more commonly called nopales in Spanish. Rich in antioxidants and fiber, the cactus paddles — with needles removed, of course — are a key ingredient in a number of Mexican dishes. Nopales con huevos, or scrambled eggs with nopales, are a favorite, as are salads and tacos featuring them; sometimes nopales are simply eaten as a side vegetable. With a taste that’s been described as a cross between green beans and asparagus (pears and watermelons have also been mentioned as points of reference), they’re enjoyed for their versatility and nutritional value alike. They can even be added to margaritas.
Michael Nordine
Staff Writer
Michael Nordine is a writer and editor living in Denver. A native Angeleno, he has two cats and wishes he had more.
We humans have somewhere between 20,000 and 25,000 genes — a sizable number to be sure, but still considerably fewer than the 31,760 in everyone’s favorite nightshade. Though scientists still aren’t sure why tomatoes have such a complex genome, an emerging theory relates to the extinction of the dinosaurs. Around the time those giant creatures disappeared from Earth, the nightshade family (Solanaceae) tripled its number of genes. Eventually the superfluous copies of genes that served no biological purpose disappeared, but that still left a lot of functional ones; some believe the extra DNA helped tomatoes survive an especially perilous time, when the planet was likely still recovering from the aftereffects of a devastating asteroid impact.
Despite being less sweet than the likes of apples and peaches and sometimes being used as vegetables in cooking, tomatoes are botanically considered fruits because they form from a flower and contain seeds.
Humans, meanwhile, have two copies of every gene: one from their mother and one from their father. The number of genes doesn’t necessarily reflect biological sophistication; what matters more is how an organism “manages its cells’ affairs” — simply put, humans make more efficient use of the genes they have. The Human Genome Project, which launched in 1990 and took 13 years to complete, successfully mapped and sequenced every single gene found in Homo sapiens. With thousands of scientists involved, it remains the largest international collaboration ever undertaken in the field of biology — until the Tomato Genome Project is launched, that is.
Lycopersicon lycopersicum, one scientific name for the tomato, means “wolf peach.”
People used to think tomatoes were poisonous.
The humble tomato used to have a far more sinister moniker: “poison apple.” In the 18th century, many believed that European aristocrats were falling ill and even dying after eating tomatoes — a misconception stemming from the use of pewter plates, which had a high lead content. Tomatoes, which are highly acidic, would leach lead from the plates and poison the unlucky eater. The fear of tomatoes was just as prevalent across the pond, where some American farmers believed that the green tomato worm was “poisonous as a rattlesnake.” An entomologist eventually set the record straight, and by the late 1800s more people began to appreciate tomatoes for the nutritious treat they are.
Michael Nordine
Staff Writer
Michael Nordine is a writer and editor living in Denver. A native Angeleno, he has two cats and wishes he had more.
Not many Americans know the name Charles G. Dawes today, but they should. As one of only three U.S. vice presidents to receive the Nobel Peace Prize (his was awarded for his work to preserve peace in Europe), he has earned a place in the history books alongside Theodore Roosevelt and Al Gore. But perhaps even more notably, he’s also the only veep with a No. 1 hit pop song. Dawes was a self-trained pianist and flautist as well as a banker, and in 1911, 14 years before he would become Calvin Coolidge’s vice president, he wrote a short instrumental piece titled “Melody in A Major.” The song received some attention during Dawes’ lifetime, but it wasn't until 1951 — the year he died — that American songwriter Carl Sigman put lyrics to Dawes’ creation and called it “It’s All in the Game.” Seven years later, Tommy Edwards became the first Black artist to top the Billboard Hot 100 with his doo-wop-influenced rendition of Sigman’s song.
Bob Dylan, the only U.S. songwriter to win the Nobel Prize, has no No. 1 hits.
Not true, but just barely. Surprisingly, folk legend Bob Dylan has only one No. 1 hit under his own name, and it arrived in 2020 with the 17-minute-long song “Murder Most Foul,” which debuted atop Billboard’s Rock Digital Song Sales chart (his first time topping any Billboard chart).
But that wasn’t the end of Dawes’ posthumous music stardom. The song soon transformed into a pop standard, and was covered by a variety of artists across several genres. There’s Nat King Cole’s big band affair (1957), Elton John's upbeat cover (1970), Van Morrison’s sorrowful take (1979), Isaac Hayes’ soulful remix (1980), and Merle Haggard’s country creation (1984), just to name a few. To this day (and for likely many days to come), Dawes remains the only president or vice president to score a hit on the Billboard Hot 100.
The song “Hail, Columbia” now honors the vice president, though it was once an unofficial national anthem.
President Calvin Coolidge had a lot of pets, including a pygmy hippopotamus.
While Charles Dawes had a No. 1 pop single, his boss also had a few quirks. Calvin Coolidge, the 30th U.S. president, had arguably the most exotic collection of pets of any American chief executive (though Theodore Roosevelt gave him a run for his money). During his presidency, Coolidge had six dogs, a bobcat, two raccoons, a goose, a donkey, a cat, a bear, two lion cubs, an antelope, a wallaby, and more. But the strangest of Coolidge’s pets was probably Billy, a pygmy hippopotamus, who was given to Coolidge as a gift from businessman Harvey Firestone (as in Firestone tires). Perhaps because of his size (even a pygmy hippo can weigh up to 600 pounds), or because he was one of only a few pygmy hippos in the U.S., Billy was donated to the Smithsonian National Zoological Park, where he became the proud father of many hippo calves. In fact, most of the pygmy hippos in the U.S. today can be traced back to his lineage.
Darren Orf
Writer
Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.
Although many of the Grand Canyon's visitors make a point of packing into the tourist stop known as Grand Canyon Village, far fewer realize there's a bona fide village nestled into Havasu Canyon some 3,000 feet below. There, amid the towering limestone cliffs that surround the Havasupai Indian Reservation, live the 200 or so Native Americans who populate the remote hamlet of Supai.
The Havasupai are the only Native American tribe with territorial ties to the Grand Canyon.
The Hualapai Tribe, who live on land that borders the Havasupai Reservation, oversee tourist operations at Grand Canyon West. Altogether, 11 federally recognized tribes have historically used the land that comprises Grand Canyon National Park.
For those who don't feel like splurging on a helicopter ride, simply reaching Supai is a feat unto itself. From the nearest town of Peach Springs, travelers embark on a 67-mile drive to Hualapai Hilltop, at which point they descend the 8-mile trail on foot or by mule. That's the route taken daily by the USPS, which brings in vital supplies like food while also carrying mail stamped with a unique "Mule Train" postmark. Those who complete the journey can refresh themselves at the general store and tribal cafe, or rest up for the return trip with an overnight stay at the local lodge. Many others continue along the trail to the reservation's campgrounds and famous waterfalls.
Although the isolation brings unparalleled views of ancient landscapes and turquoise pools, it also involves an element of danger. Severe rains damaged buildings, bridges, and even parts of the lone path in and out of the village in 2010. The pandemic also forced the Havasupai to close off tourist access to their grounds from 2020 to 2022, but they reopened for business in 2023.
The Grand Canyon was established as a national park by President Woodrow Wilson.
The turquoise waters of Supai are caused by a combination of geology and chemistry.
A life force sacred to the Havasupai, whose traditional name means “people of the blue-green waters,” the water that flows through Supai via Havasu Creek gets its start as rain and snowmelt that accumulate in the porous limestone above. Saturated with calcium carbonate from dissolved limestone, this groundwater is eventually funneled through rock layers to a spring. When the water gushes out from underground, the dissolved calcium carbonate begins to come out of solution, and the fine particles it forms reflect sunlight as a brilliant turquoise — a breathtaking sight for both first-time observers and longtime residents.
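For the chemically curious, the reaction at work is the same one that builds the creek’s travertine formations: as the emerging water loses dissolved carbon dioxide, calcium carbonate precipitates out. A simplified version of that reaction — our addition here, not spelled out in the article — looks like this:

\[
\mathrm{Ca^{2+}(aq) + 2\,HCO_3^{-}(aq) \longrightarrow CaCO_3(s) + CO_2(g) + H_2O(l)}
\]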
Tim Ott
Writer
Tim Ott has written for sites including Biography.com, History.com, and MLB.com, and is known to delude himself into thinking he can craft a marketable screenplay.
It’s easy to think of major cities such as London and New York as sprawling grids of concrete, brick, and metal — but it may surprise you to learn that both cities also maintain roughly 20% tree canopy cover. That’s an urban forest of more than 8.4 million trees across Greater London and approximately 7 million trees in New York City. While other major cities have even higher tree canopy coverage — including Oslo at ~72%, Atlanta at ~50%, and Singapore at ~29% — what sets London and New York apart is the amount of greenery they contain given their massive size, population density, and global urban footprint.
The London plane, a hybrid tree that received its name from its ability to withstand the city’s air pollution during the Industrial Revolution, is the most common street tree in both London and New York City. Other widely planted species include sycamore, oak, and silver birch in London and littleleaf linden, Norway maple, and green ash in NYC.
Trees use underground mycorrhizal fungi networks to exchange nutrients and send chemical signals — even warning nearby trees about pests or disease. The phenomenon is known as the “wood wide web.”
These urban trees aren’t just decorative. They reduce air pollution, lower urban temperatures, and absorb stormwater that would otherwise flood drains. In New York alone, trees remove around 1,100 tons of air pollution each year and intercept more than 890 million gallons of stormwater, easing pressure on the city’s infrastructure.
Greater London’s trees provide around £132.7 million worth of ecosystem services annually, which include air purification, temperature regulation, building energy savings, and rainwater interception. And of course, tree cover also supports and shelters city wildlife. So while their skylines may steal the spotlight, it’s the tree canopy that quietly keeps these cities cooler, cleaner, and more inviting.
The London plane is a hybrid of the American sycamore and the Oriental plane.
Tree Preservation Orders have been protecting U.K. trees since 1947.
Introduced under the Town and Country Planning Act 1947, Tree Preservation Orders (TPOs) give local authorities in the U.K. the power to legally protect specific trees, groups of trees, or woodlands if their removal would negatively impact the local environment or public enjoyment.
TPOs make it an offense to cut down, top, lop, uproot, or willfully damage a protected tree without permission. Today, thousands of trees across London are safeguarded by TPOs, ensuring the city’s urban forest continues to thrive for generations to come.
Kristina Wright
Writer
Kristina is a coffee-fueled writer living happily ever after with her family in the suburbs of Richmond, Virginia.
Much human innovation is a collective effort — scientists, innovators, and artisans building off the work of predecessors to develop some groundbreaking technology over the course of decades, or even centuries. But in the case of writing systems, scholars believe humans may have independently invented them four separate times. That’s because none of these writing systems shows significant influence from previously existing systems, or similarities to one another. Experts generally agree that the first writing system appeared in the Mesopotamian society of Sumer in what is now Iraq. Early pictorial signs appeared some 5,500 years ago, and slowly evolved into complex characters representing the sounds of the Sumerian language. Today, this ancient writing system is known as cuneiform.
Experts estimate that at least half of the human population can speak two or more languages. The U.S., however, lags well behind many other countries: According to the U.S. Census Bureau, only 20% of Americans speak another language, compared with 56% of Europeans.
However, cuneiform wasn’t a one-off innovation. Writing systems then evolved in ancient Egypt, in the form of hieroglyphs, around 3200 BCE — only an estimated 250 years after the first examples of cuneiform. Writing developed next in China, where the Shang dynasty set up shop along the Yellow River and wrote early Chinese characters on animal bones during divination rituals around 1300 BCE. Finally, in Mesoamerica, writing began to take shape around 900 BCE and was used by ancient civilizations such as the Zapotec, Olmec, Maya, and, much later, the Aztecs. Sadly, little is known about the history of many Mesoamerican writing systems, as Catholic priests and Spanish conquistadors destroyed much of the surviving documentation.
“Hieroglyph” comes from ancient Greek words meaning “sacred carvings.”
There are many ancient languages that have yet to be deciphered.
Discovered in July 1799, the Rosetta Stone is perhaps the most famous linguistic discovery in human history; it turned out to be a 1,600-pound key that unlocked the ancient mysteries of the Egyptian language. However, many other lost scripts haven’t been so lucky, including Meroitic from Sudan, Linear A from Crete, and Proto-Elamite from Iran. But the most famous undeciphered script is the Indus script, the oldest known writing on the Indian subcontinent, dating back to around 2600 BCE. Because this script has no bilingual text like the Rosetta Stone (at least not so far), it has remained incomprehensible — a major reason why the Indus Valley Civilization is one of the least-known major civilizations in ancient history. It’s even possible that the Indus script is a fifth example of independently invented writing, though it’s impossible to know for sure without deciphering it.
Darren Orf
Writer
Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.
Horsepower, a common unit of power typically referring to the sustained output of an engine, was developed in the late 18th century by Scottish engineer James Watt (after whom the watt is named) as a way to demonstrate the power of steam engines. Watt calculated that in an average day’s work, a horse could turn a 24-foot mill wheel roughly 2.5 times per minute. That rate of work came out to 33,000 foot-pounds per minute (approximately 746 watts), which Watt deemed a new unit of measurement called horsepower.
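For anyone inclined to check the conversion, the arithmetic is straightforward; the only figure not given above is the standard factor of roughly 1.3558 joules per foot-pound:

\[
\frac{33{,}000\ \text{ft·lbf}}{1\ \text{min}} \times \frac{1.3558\ \text{J}}{1\ \text{ft·lbf}} \times \frac{1\ \text{min}}{60\ \text{s}} \approx 745.7\ \text{W}
\]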
Logic would suggest the power of a solitary horse should equal one horsepower, but the measurement is meant to represent a horse’s continuous output over a full workday, not what horses are capable of in short bursts of extreme effort. In 1993, biologists R.D. Stevenson and R.J. Wassersug used data from the 1925 Iowa State Fair horse-pulling contest to calculate the maximum output of a horse over a short period of time, ultimately finding that one horse can exert up to 14.9 horsepower. Humans, by comparison, have a maximum output of slightly more than a single horsepower.
Due to the anatomy of their respiratory systems, horses are only able to breathe through their noses.
It's sometimes suggested that Watt deliberately underestimated the power output of a horse to help promote his new steam engine. But Watt’s calculations weren’t technically incorrect; he just presented them in a way that made his engines seem more attractive. He emphasized sustained rather than peak performance, underlining the fact that, unlike horses, his engines could work all day long without tiring. That framing is why a single horse can actually be capable of nearly 15 horsepower — at least over short periods of time.
The offspring of a male horse and a female donkey is called a hinny.
The earliest ancestor of the horse is estimated to have lived 55 million years ago.
Around 55 million years ago, the first members of the horse family were scurrying through the forests that covered much of North America. These hoofed mammals, known as Hyracotherium, were each only about as big as a medium-sized dog.
With its arched back, raised hindquarters, four functional hooves on each front foot, and three on each hind foot — unlike the unpadded, single-hoofed feet of modern equines — this early ancestor was quite unlike modern horses as we know them. Paleontologists initially thought the species entirely unrelated to equines, until fossils were found that showed a link between Hyracotherium and later extinct horses.
For more than half their history, the majority of horse species were small, forest browsers, eating leaves and twigs from trees and shrubs. Then, about 20 million years ago, new horse species began rapidly evolving when changing climate conditions allowed grasslands to expand. Some of these new grazers grew to much larger sizes, becoming more like the horses we’re familiar with today.
Tony Dunnell
Writer
Tony is an English writer of nonfiction and fiction living on the edge of the Amazon jungle.
Difficult times can lead to great art. Case in point: the volcanic explosion that caused a “year without a summer” in 1816 — and inspired the novel Frankenstein. The eruption took place at Indonesia’s Mount Tambora, many thousands of miles away from author Mary Shelley’s home in England. In addition to a harrowing death toll, the April 1815 explosion sent massive amounts of sulphur dioxide, ash, and dust into the atmosphere, blocking sunlight and plunging the global temperature several degrees lower, resulting in the coldest year in well over two centuries. In part because of the volcano, Europe and North America were subjected to unusually cold, wet conditions — including heavy rainfall that may have contributed to Napoleon’s infamous defeat at Waterloo some two months after the eruption, and a “killing frost” in New England the following summer.
“Frankenstein” was originally published anonymously.
The literary world didn’t exactly welcome female authors with open arms in the early 19th century, and Shelley wasn’t alone in initially publishing her book anonymously. Many believed that Percy Shelley was the author of “Frankenstein,” a falsehood Mary later had to correct many times.
So what does that have to do with Shelley’s masterpiece? Then 18 and still going by her maiden name of Godwin, she and her lover and future husband, Percy Bysshe Shelley, traveled to Lake Geneva in the spring of 1816, a time of extremely gloomy weather. One fateful night that summer, the two were with their friend Lord Byron, the infamous poet, when he suggested, “We will each write a ghost story.” Shelley completed hers in just a few days, writing in the introduction to Frankenstein; or, the Modern Prometheus that “a wet, ungenial summer, and incessant rain often confined us for days to the house.” Who knows: If it had been bright and sunny that week, we might never have gotten the endlessly influential 1818 book, which later spawned an assortment of movies, TV shows, plays — and of course, iconic Halloween costumes.
Mary Shelley’s second novel was called “Valperga.”
Shelley claimed the idea for Frankenstein came to her in a waking dream.
After agreeing to Lord Byron’s ghostly prompt, Shelley initially struggled to come up with an idea for her tale. “I busied myself to think of a story,” she later wrote. “‘Have you thought of a story?’ I was asked each morning, and each morning I was forced to reply with a mortifying negative.” The idea eventually came to her one sleepless night, when her “imagination, unbidden, possessed and guided [her].” She then saw “the pale student of unhallowed arts kneeling beside the thing he had put together. I saw the hideous phantasm of a man stretched out, and then, on the working of some powerful engine, show signs of life, and stir with an uneasy, half vital motion. Frightful must it be; for supremely frightful would be the effect of any human endeavor to mock the stupendous mechanism of the Creator of the world.” Two years later, her book was published, leading to Mary Shelley eventually being hailed as the foremother of science fiction.
Michael Nordine
Staff Writer
Michael Nordine is a writer and editor living in Denver. A native Angeleno, he has two cats and wishes he had more.