Every fingerprint is unique, but that doesn’t mean they’re easy to tell apart — especially since humans aren’t the only species that’s developed them. Chimpanzees and gorillas have fingerprints too, but it’s actually koalas — far more distant on the evolutionary tree from humans — whose prints are most similar to our own. This was first discovered by researchers at the University of Adelaide in Australia in 1996, one of whom went so far as to joke that “although it’s extremely unlikely that koala prints would be found at the scene of a crime, police should at least be aware of the possibility.”
Koalas are actually marsupials, which makes them more closely related to opossums and kangaroos than to grizzlies or pandas. A defining trait of marsupials is that they carry their young in a pouch called a marsupium — which is where they get their name.
That discovery lent support to one of the primary theories in the centuries-long debate over the purpose of fingerprints and their swirly microscopic grooves: They improve grip. Koalas’ survival depends on their ability to climb the thin branches of eucalyptus trees and grab the leaves they eat, so the fact that they developed fingerprints — which assist in exactly that — independently of primates millions of years ago is likely no coincidence.
“Koala” is believed to mean “no drink” in the Dharug Aboriginal language.
Before fingerprints, body measurements were used to identify criminals.
In 1879, several decades before the use of fingerprints became widespread, a French criminologist named Alphonse Bertillon developed a system based on body dimensions to identify and catalogue criminals and suspects. The five main measurements were head length, head width, length of the middle finger, length of the left foot, and length from the elbow to the end of the middle finger. Each of these was classified as being either small, medium, or large. Despite his insistence that “every measurement slowly reveals the workings of the criminal,” the system was imprecise, and eventually law enforcement agencies turned to fingerprinting instead.
Michael Nordine
Staff Writer
Michael Nordine is a writer and editor living in Denver. A native Angeleno, he has two cats and wishes he had more.
Today carrots are practically synonymous with the color orange, but their signature hue is a relatively recent development. When the carrot was first cultivated 5,000 years ago in Central Asia, it was often a bright purple. Two distinct groups soon emerged: Asiatic carrots and Western carrots. Yellow carrots in the Western group (which may have arisen as mutants of the purple variety) eventually took on their recognizable orange color around the 16th century, helped along by the master agricultural traders of the time — the Dutch.
Eating too many carrots can turn your skin orange.
Carrots contain a red-orange pigment called beta-carotene. Carotenemia occurs when eating too many beta-carotene-rich foods turns human skin a yellowish orange. If you were to eat 10 carrots a day for weeks, you could develop it — but doctors don’t recommend trying it.
A common myth says the Dutch grew these carrots to honor William of Orange, the founding father of the Dutch Republic, but there’s no evidence of this. What’s more likely is that the Dutch took to the vegetable because it thrived in the country’s mild, wet climate. (Although the orange color may have first appeared naturally, Dutch farmers made it the predominant hue by selectively growing orange roots — scholars say these carrots likely performed more reliably, tasted better, and were less likely to stain than the purple versions.) The modern orange carrot evolved from this period of Dutch cultivation, and soon spread throughout Europe before making its way to the New World. Today, there are more than 40 varieties of carrots of various shapes, sizes, and colors — including several hues of purple.
The Greek name for wild carrot was “philtron,” which means “loving.”
Purple is associated with royalty thanks to a rare mollusk.
For most of European history, creating a rich, resilient purple dye was an extremely expensive process. The dye could only be made from the dried mucus glands of murex shellfish found near the ancient Phoenician town of Tyre on the Mediterranean (now part of Lebanon). Making just 1 gram of this pigment, known as Tyrian purple, required nearly 9,000 of these mollusks, so only the very wealthy — emperors and royals — could afford to use the color. In ancient Rome, purple became associated with the power of the emperor, and the idea continued after the empire’s fall. In medieval and Elizabethan England, a series of sumptuary laws ensured that the color purple was reserved only for the most elite members of society “upon payne to forfett the seid apparel.” Luckily, in 1856, chemist William Henry Perkin accidentally created a synthetic purple dye, later called mauve, while trying to synthesize a drug for malaria. Purple’s imperial reign was over.
Darren Orf
Writer
Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.
Animal-based names are surprisingly common when it comes to units of measurement. In addition to horsepower (which usually measures the output of engines or motors) and hogsheads (today mostly used for alcohol), there’s also the mickey — a semi-official means of measuring the speed of a computer mouse. Named after a certain Disney character who’s probably the world’s most famous rodent, it’s specifically used to describe the smallest measurable movement the device can take. In real terms, that equals 1/200 of an inch, or roughly 0.1 millimeter. Both the sensitivity (mickeys per inch) and speed (mickeys per second) of a computer mouse are measured this way by computer scientists.
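For the numerically inclined, here is a minimal sketch in Python of how the unit translates into physical distance and speed, assuming the conventional 200-mickeys-per-inch resolution cited above; the function names are purely illustrative and not part of any real mouse-driver API.

# Convert mouse movement reported in mickeys into physical distance and speed.
# Assumes the conventional resolution of 200 mickeys per inch (about 0.127 mm per mickey).

MICKEYS_PER_INCH = 200
MM_PER_INCH = 25.4

def mickeys_to_mm(mickeys: int) -> float:
    """Distance in millimeters covered by a given number of mickeys."""
    return mickeys * MM_PER_INCH / MICKEYS_PER_INCH

def speed_mm_per_s(mickeys: int, seconds: float) -> float:
    """Average pointer speed, given a mickey count observed over an interval in seconds."""
    return mickeys_to_mm(mickeys) / seconds

# 400 mickeys in a tenth of a second is 2 inches of travel (50.8 mm) at 508 mm per second.
print(round(mickeys_to_mm(400), 1))        # 50.8
print(round(speed_mm_per_s(400, 0.1), 1))  # 508.0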
Mickey Mouse wasn’t the character’s original name.
The character was originally called Mortimer Mouse, a name that was changed at the behest of Lillian Disney, wife of Walt Disney. The discarded moniker was later reused for Mickey’s longtime rival, who first appeared in 1936.
Had the original name for the device stuck, it’s unlikely this measurement system would have come about. The mouse was briefly known as a “bug” when it was invented at the Stanford Research Institute to make computers more user-friendly, though that seems to have been a working title that no one was especially fond of. (That version of the device was also extremely primitive compared to the mice of today — it even had a wooden shell.) As for how the mouse got its current name, no one involved can quite remember — only that the device simply looked like one.
The computer mouse was invented by Douglas Engelbart.
A lot of people didn’t think the mouse would take off.
In perhaps one of the most infamous articles ever published about computers, the San Francisco Examiner’s John C. Dvorak wrote in 1984, “The Macintosh uses an experimental pointing device called a ‘mouse.’ There is no evidence that people want to use these things.” Written as a review of Apple’s landmark personal computer, which had launched earlier that year, Dvorak’s not-so-prescient article wasn’t exactly a hot take at the time. The relatively small number of people who used computers regularly back then were just fine using the keyboard for everything, and Dvorak was hardly alone in asserting that he didn’t want to use a mouse. His predictive abilities didn’t seem to improve with time, alas, as he also wrote that Apple should “pull the plug” on the iPhone prior to its 2007 release.
Michael Nordine
Staff Writer
Michael Nordine is a writer and editor living in Denver. A native Angeleno, he has two cats and wishes he had more.
Radium is, quite famously, not good for you. Its effects on the body are deleterious, not that anyone realized this when Marie Curie discovered the alkaline earth metal in 1898 — a scientific breakthrough that led to her winning the 1911 Nobel Prize in chemistry. Before long, the dangerously false belief that radium had health benefits began to spread: It was added to everything from toothpaste and hair gel to food and drinks, with glow-in-the-dark paints made from radium still sold into the 1970s. It was marketed as being good for any “common ailment,” with radioactive water sold in small jars that shops claimed would “aid nature” and act as a natural “vitalizer.”
The five other alkaline earth metals — beryllium (Be), magnesium (Mg), calcium (Ca), strontium (Sr), and barium (Ba) — are all lighter than radium.
Of course, none of this was true — exposure to even a small amount of radium can eventually prove fatal. Curie had no way of knowing this at the time, just as she didn’t have the slightest inkling that her notebooks would remain radioactive for more than 1,500 years after her death. She was known to store such elements out in the open and even walk around her lab with them in her pockets, as she enjoyed how they “looked like faint, fairy lights.”
Radium’s color changes from silvery white to black when exposed to air.
Marie Curie also won a second Nobel Prize.
Marie Curie wasn’t just the first woman to win a Nobel Prize — she was also the first person to win two and remains the only person to be awarded the Nobel Prize in two different scientific fields. Her first award came eight years before her Nobel Prize in chemistry, when she and her husband Pierre Curie won the 1903 Nobel Prize in physics for their work in radioactivity. More than two decades later, their daughter Irène Joliot-Curie won the 1935 Nobel Prize in chemistry along with her husband Frédéric Joliot for synthesizing new radioactive elements.
Michael Nordine
Staff Writer
Michael Nordine is a writer and editor living in Denver. A native Angeleno, he has two cats and wishes he had more.
On January 9, 2007, Apple CEO Steve Jobs revealed the iPhone to the world. Since then, Apple’s pricey slab of glass stuffed with technology has become synonymous with the word “smartphone” (sorry, Android fans). But smartphones predate the iPhone by more than a decade. To pinpoint the smartphone’s true birthdate, look back to November 23, 1992, and the introduction of IBM’s Simon at a trade show in Las Vegas. Today, IBM is best known for supercomputers, IT solutions, and enterprise software, but in the ’80s and early ’90s, the company was a leader in consumer electronics — a position it hoped to solidify with Simon.
The Finnish telecommunications company Nokia was originally a paper mill.
Nokia is a fixture of cell technology — you’ve almost certainly heard their flagship ringtone, an adaptation of Francisco Tárrega’s “Gran Vals.” But the company actually began as a paper mill in 1865. In fact, it’s only been primarily a telecommunications company for about 30 years.
Simon was a smartphone in every sense of the word. It was completely wireless and had a digital assistant, touchscreen, built-in programs (calculator, to-do list, calendar, sketch pad, and more), and third-party apps, something even the original iPhone didn’t have. The idea was so far ahead of its time that there wasn’t even a word for it yet — “smartphone” wasn’t coined for another three years. Instead, its full name when it debuted to the larger public in 1993 was the Simon Personal Communicator, or IBM Simon for short. But there’s a reason there isn’t a Simon in everyone’s pocket today. For one thing, the phone had only one hour of battery life. Once it died, it was just a $900 brick (technology had a long way to go before smartphones became pocket-sized; Simon was 8 inches long by 2.5 inches wide). Cell networks were still in their infancy, so reception was spotty at best, which is why the Simon came with a port for plugging into standard phone jacks. In the mid-aughts, increases in carrier capacity and the shrinking of electronic components created the perfect conditions for the smartphones most of us know today. Unfortunately for Simon, it was too late.
The popular electronic game Simon was launched in 1978 at New York City’s famous Studio 54.
Nikola Tesla predicted the smartphone 66 years before IBM’s Simon.
Famed scientist Nikola Tesla — best known for developing the modern AC electrical system — was also something of a technological soothsayer, accurately describing future tech such as Wi-Fi, self-driving cars, and MRIs several decades before their creation. But in 1926, during an interview with Collier’s magazine, Tesla really channeled his inner Nostradamus when he foresaw a world in which “through television and telephony we shall see and hear one another as perfectly as though we were face to face, despite intervening distances of thousands of miles … a man will be able to carry one in his vest pocket.” Tesla arguably not only foresaw the convergence of different types of technology (i.e., television and telephones) into one device, but also predicted the eventual miniaturization of these technologies into something pocket-sized. In fact, the only thing slightly inaccurate in this prediction is Tesla’s belief that vests would still be in fashion.
Darren Orf
Writer
Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.
Those who travel to Holy Trinity Church in Stratford-upon-Avon, England, to see the final resting place of the world’s greatest playwright are greeted with an ominous warning befitting the legendary wordsmith: “Good friend for Jesus sake forbear, To dig the dust enclosed here. Blessed be the man that spares these stones, And cursed be he that moves my bones.” Although historians aren’t exactly sure how Shakespeare died at the age of 52 in 1616 (fever is a leading theory), they do believe these words likely belong to the Bard himself. And in the 17th century, Shakespeare had cause to worry — grave-robbing was common at the time, and graves were also often moved to make room for more burials.
The most-visited cemetery in the world is in Paris.
Housing the remains of such luminaries as Frédéric Chopin, Jim Morrison, Edith Piaf, and Oscar Wilde (among many more), Père-Lachaise Cemetery in Paris, France, is the most-visited cemetery in the world, and sees roughly 3 million visitors annually.
However, Shakespeare’s curse appears to have done the trick, as the church kept his grave intact — mostly, at least. In 2016, a (noninvasive) radar scan of Shakespeare’s grave revealed, in an almost Shakespearean twist, that the playwright’s skull seemed to be missing. For clues to the skull’s whereabouts, experts reexamined an 1879 article from The Argosy magazine that told a tale about a trophy-hunter taking Shakespeare’s skull. While the story was originally dismissed as fantasy, its details appear to closely line up with the results of the radar study. The story relates that the panicked grave-robber deposited the skull in another church some 15 miles away, but an analysis of a skull at the church in question showed that it appeared to belong to a 70-year-old woman. We’ll likely never know for sure who stole Shakespeare’s skull — or whether the Bard’s curse delivered on its ominous promises.
Some of the moons of the planet Uranus are named after Shakespeare characters.
The oldest known burial site was possibly not made by Homo sapiens.
Scientists know that Homo sapiens have been burying their dead for at least 78,000 years, but 2023 research argued that this funerary practice may not be unique to our species. Lee Berger, a paleoanthropologist at the University of the Witwatersrand in Johannesburg, recently explored the Rising Star cave in South Africa. Although incredibly difficult to access, the cave is well known because it contains remains of Homo naledi, a hominin whose brain was around one-third the size of a modern human’s. In a non-peer-reviewed study published in the journal eLife, Berger argued that this species practiced a kind of funerary rite — they “dug holes that disrupted the subsurface stratigraphy and interred the remains of H. naledi individuals.” However, other experts have found Berger’s work unconvincing, and doubt that such a primitive species would exhibit such a complex culture. If Berger is right, this would be the first piece of evidence that a species other than Homo sapiens buried its dead.
Interesting Facts
Editorial
Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.
The Addams Family was filmed in black and white, and it’s difficult to imagine it any other way — not only because it premiered in 1964, when color television was still something of a novelty, but because the aesthetic perfectly suits the show’s gothic vibes. It was hardly dour on set, however, as the iconic living room where most of the action takes place was actually pink. A resurfaced photo of the set shows just how garish many of the colors were — including bright pink walls and rugs — which in hindsight makes perfect sense: As long as nothing looked out of place in the final black-and-white rendering, its real-life hue didn’t make much of a difference.
“The Addams Family” premiered the same week as “The Munsters.”
The macabre sitcoms debuted within six days of each other in September 1964 and ended their two-season runs a little over a month apart, in April and May of 1966.
Several of the set’s props were repurposed from MGM’s The Unsinkable Molly Brown, which was released a few short months prior to The Addams Family. The characters of the latter made their first appearances in a series of single-panel New Yorker comics by cartoonist Charles Addams, the first of which debuted in 1938. None of the characters had names in the original comics, however. Most of them, including Morticia and Wednesday, received their monikers when Addams licensed a doll collection based on the cartoon in 1962. And speaking of names, Wednesday’s middle name is — naturally — Friday.
“The Addams Family” theme song was composed and sung by Vic Mizzy.
Lurch and Thing were played by the same actor.
Though he also had roles in Star Trek and I Dream of Jeannie, Ted Cassidy is best known for his performance as Lurch in The Addams Family. He reprised his role as the hulking butler in several iterations of the franchise, including the 1973 animated series and the 1977 television movie Halloween With the New Addams Family, as well as in episodes of the 1960s Batman TV series and The New Scooby-Doo Movies.
But Lurch wasn’t his only contribution to the show: The disembodied hand known as Thing belonged to Cassidy as well — something many fans didn’t realize at the time, as the character was billed simply as “Itself” in the credits. Cassidy had a separate contract for playing Thing and usually portrayed the character with his right hand, though he occasionally switched to his left to see if anyone would notice. Audiences probably didn’t, just as they likely couldn’t tell when assistant director Jack Voglin portrayed Thing in scenes featuring both of Cassidy’s characters.
Michael Nordine
Staff Writer
Michael Nordine is a writer and editor living in Denver. A native Angeleno, he has two cats and wishes he had more.
On Christmas Day 1741, Anders Celsius, a professor at Uppsala University in Sweden, took the world’s first temperature measurement using degrees Celsius — well, kind of. His scale had one big difference from the system we use today: It was backward. Zero degrees marked the boiling point of water rather than the freezing point, while 100 degrees marked the freezing point. The arrangement may have been intended, at least in part, to avoid negative numbers in everyday temperature readings. After all, it’s pretty cold in Sweden for much of the year, and air temperature never gets hot enough to boil water (thank goodness).
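To make the reversal concrete, here is a tiny illustrative Python sketch (not anything Celsius himself wrote down, of course) showing how a reading on the original 1741 scale maps onto the modern one: the two simply mirror each other around 50 degrees.

# Celsius's original scale ran backward: 0 marked boiling water, 100 marked freezing.
# A reading on that scale converts to the modern scale by subtracting it from 100.

def original_to_modern(original_deg: float) -> float:
    """Convert a reading on Celsius's reversed 1741 scale to modern degrees Celsius."""
    return 100.0 - original_deg

# A cold Swedish day registering 110 on the original scale is -10 degrees Celsius today,
# and boiling water (0 on the original scale) comes out to 100 degrees Celsius.
print(original_to_modern(110))  # -10.0
print(original_to_modern(0))    # 100.0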
The U.S. was the first English-speaking country to adopt a decimal system for money.
The U.S. populace doesn’t generally use the metric system, but the country was the first English-speaking nation to create a decimal monetary system (100 pennies equal a dollar) with the Coinage Act of 1792. Three years later, revolutionary France adopted the decimal franc.
Celsius’ scale, then known as the centigrade (or “100-step”) scale, remained this way for the rest of his life, but in 1745 — one year after his death — scientist Carl Linnaeus (of taxonomy fame) ordered a thermometer with the scale adjusted to our modern orientation. Several other scientists also independently reversed the scale. Yet it wasn’t until some two centuries later, in 1948, that the International Bureau of Weights and Measures decided to rename “centigrade” to Celsius, in part to fall in line with the other major temperature scales named after their creators, such as Daniel Fahrenheit and William Thomson, 1st Baron Kelvin.
Although the Swedish scientist didn’t invent, or even use, the precise scale that now bears his name, his groundbreaking work is still worthy of the accolade. Before Celsius, a couple dozen different temperature scales were in use around the world, and many of them were frustratingly inaccurate and inconsistent (some were based on the melting point of butter, or the internal temperature of certain animals). Celsius’ greatest contribution was devising a system that could accurately capture temperature under a variety of conditions, and his name now graces weather maps around the world (excluding the U.S., of course).
Anders Celsius was the first to suggest that the aurora borealis was caused by magnetic fields.
There was once such a thing as decimal time.
Today’s second is derived from a sexagesimal system created by the ancient Babylonians, who defined the time unit as one-sixtieth of a minute. Fast-forward to the tail end of the 18th century, and the French Revolution was in a metric frenzy. In 1795, France adopted the gram for weight, the meter for distance, and centigrade (later renamed Celsius) for temperature. However, some of France’s decimal ideas didn’t quite stand the test of, er, time. By national decree in 1793, the French First Republic attempted to create a decimal system for time. This split the day into 10 hours, with each hour lasting 100 minutes, and each minute lasting 100 seconds (and so on). Because there are 86,400 normal seconds in a day, the decimal second was around 13% shorter. Although it was easy to convert among seconds, minutes, and hours, France’s decimal time proved unpopular — after all, many people had perfectly good clocks with 24 hours on them — and the idea was abolished two years later. Since then, a couple of other temporal decimal proposals have been put forward, including watchmaker Swatch’s attempt to redefine the day as 1,000 “.beats” (yes, the period was included) in 1998 in response to the internet’s growing popularity. However, ancient Babylon’s perception of time is likely too ingrained in human culture to change any time soon.
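As a rough illustration of the arithmetic (a sketch only, not any official standard), the Python snippet below converts an ordinary time of day into French Revolutionary decimal time and, for comparison, into Swatch’s later “.beats”; for simplicity it ignores the offset to Biel Mean Time that the real Swatch system uses.

# French Revolutionary decimal time: 10 hours x 100 minutes x 100 seconds per day,
# so a day holds 100,000 decimal seconds instead of 86,400 standard ones.

def to_decimal_time(hours: int, minutes: int, seconds: int) -> tuple[int, int, int]:
    """Convert an ordinary time of day into decimal hours, minutes, and seconds."""
    day_fraction = (hours * 3600 + minutes * 60 + seconds) / 86_400
    total = round(day_fraction * 100_000)   # decimal seconds elapsed since midnight
    return total // 10_000, (total // 100) % 100, total % 100

def to_swatch_beats(hours: int, minutes: int, seconds: int) -> float:
    """Swatch's 1998 scheme: the day split into 1,000 '.beats' of 86.4 seconds each
    (the real system counts from Biel Mean Time; time zones are ignored here)."""
    return (hours * 3600 + minutes * 60 + seconds) / 86.4

# Noon is exactly halfway through the day on either scale.
print(to_decimal_time(12, 0, 0))   # (5, 0, 0)
print(to_swatch_beats(12, 0, 0))   # 500.0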
Darren Orf
Writer
Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.
Going out to a fancy restaurant is often tied to a special occasion — and there’s no special occasion quite like one for moms. Research from the National Restaurant Association has found that in an average year, about a third of U.S. adults visit a restaurant to celebrate on Mother’s Day. That often makes Mother’s Day the busiest day of the year for restaurants. This deluge of diners even surpasses Valentine’s Day, perhaps because the mid-February holiday really only takes up tables for two, whereas a Mother’s Day celebration often takes up a four-top or more.
The word for “mother” in nearly every language can be attributed to babies.
In almost every language, the short form for “mother” is an “m” sound followed by “ahh.” This is because babies first form “ahh” sounds and interrupt them by closing their mouths, creating an “m.” Babies are just playing, but it sounds like they’re addressing “mama.”
Restaurants aren’t the only businesses that get a boost on Mother’s Day. The National Retail Federation estimates that consumers will spend $34 billion on Mother’s Day celebrations in 2025, with a big chunk of that change being forked over for jewelry, while other industries, such as spa services, also see a noticeable uptick. However, the two biggest winners of the day besides restaurants are florists (it’s their third-most-lucrative day of the year) and greeting card companies (which deliver heartfelt salutations to three-quarters of the moms in America). Overall, Americans spend much more on gifts for mom than for dad — people spent an average of $246 on Mother’s Day in 2022, compared to $171 for Father’s Day.
Ancient Greeks worshipped the goddess Rhea as the mother of the gods during a spring fertility festival.
Mother’s Day was originally an anti-war protest.
Following four bloody years of the U.S. Civil War, two women called for a “mother’s day” to push for peace. In the summer of 1865, Ann Jarvis created Mothers’ Friendship Days in West Virginia that aimed to bring together Americans from all political backgrounds, and she continued the annual tradition for years. Inspired by Jarvis, Julia Ward Howe — who famously penned the lyrics to “The Battle Hymn of the Republic” — also wrote an “Appeal to Womanhood Throughout the World” in 1870, highlighting men’s role in war and calling on women to resist being “made a party to proceedings which fill the world with grief and horror.” She also tried to establish June 2 as “Mother’s Day for Peace.” However, it wasn’t until 1908 that Anna Jarvis (the daughter of the West Virginia peace activist) celebrated a “Mother’s Day” in May in honor of her deceased mother. Within a decade, the observance became a nationally recognized holiday.
Darren Orf
Writer
Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.
Questions abound when it comes to Stonehenge, but not everything about the monument is shrouded in mystery. We know, for instance, that around 100 stones make up the site — and that some of them came from nearly 150 miles away. Given that Stonehenge is 5,000 years old, that’s quite the feat. This raises two crucial questions: Who transported said stones, and how? That’s where the mystery begins. No one is sure who built England’s world-famous monument; everyone from Merlin to aliens has received credit from various factions, and the Danes, Celts, and Druids have all been proposed over the centuries, even though the monument predates them.
Building Stonehenge actually took closer to 1,500 years, with Neolithic builders completing different portions of it at different times. There were several centuries-long gaps in this process, which is thought to have taken place in three main phases.
The stones at Stonehenge are grouped into two types: larger blocks known as “sarsen stones,” and smaller stones in the central area known as “bluestones.” Over the last decade or so, researchers have confirmed that the bluestones came from the Preseli Hills of western Wales, about 150 miles from Stonehenge. (The sarsen stones, meanwhile, were likely found 20 to 30 miles away from the monument.) As for how the bluestones made that long journey, we only have theories: Some scholars believe they were dragged on wooden rafts, although others have suggested that a glacier carried the stones at least part of the way. Most archaeologists scoff at the glacier theory, however, and research in 2019 at outcroppings in the Preseli Hills both conclusively linked them to Stonehenge and confirmed evidence of quarrying work around 3000 BCE — the same era when Neolithic builders were first constructing the mysterious stone circles. That means human hands took the rock from the locations in Wales, but as for exactly how, we simply don’t know — and possibly never will.
Stonehenge was bought at an auction in 1915 — then given away.
We’ve all made impulse buys from time to time, but most of them are fairly minor. The same can’t be said of Cecil Chubb, a 39-year-old lawyer who reportedly arrived at a 1915 auction to buy a set of dining chairs at the behest of his wife and ended up buying Stonehenge instead. Just as surprising as the fact that the monument was being sold at an estate sale in the first place — it was privately owned for centuries — is that no one was especially keen on buying it. “Gentlemen, it is impossible to value Stonehenge,” said the auctioneer when the bids had increased only £1,000 over the starting price of £5,000. “Surely £6,000 is poor bidding, but if no one bids me any more, I shall set it at this price. Will no one give me any more than £6,000 for Stonehenge?” he added. Chubb would, but not much more — he won with a bid of £6,600, just a little more than $1 million in today’s money. He donated it to the British people three years later, writing that “the nation would like to have it for its own, and would prize it most highly.”
Michael Nordine
Staff Writer
Michael Nordine is a writer and editor living in Denver. A native Angeleno, he has two cats and wishes he had more.