Original photo by Pressmaster/ Shutterstock

They’ve been described as the windows to the soul by William Shakespeare and the jewel of the body by Henry David Thoreau, and featured in song titles by musicians ranging from Van Morrison and The Who to Billy Idol and Billie Eilish. Needless to say, eyes hold a prominent place in our lives, both for our dependence on their functionality and for the aesthetic qualities that have inspired so many artists. Here are six eye-opening facts about these amazing organs.

A fossilized trilobite found in the Green River Formation in Wyoming.
Credit: Layne Kennedy/ Corbis Documentary via Getty Images

The First Eyes Appeared at Least 540 Million Years Ago

The first known organism to demonstrate the leap from simple light-sensitive receptors to complex eyes was the trilobite, which left fossil records dating back approximately 540 million years. The orbs of these early arthropods more closely resembled the compound, multi-lensed eyes of modern insects than the single-lens, camera-style eye of humans. Because working eyes offered trilobites a clear advantage in hunting prey (and thus pressured other species to evolve in response), their emergence in these and subsequent life forms may have helped drive the so-called “Cambrian Explosion,” which gave rise to most of the creatures that now populate the animal kingdom.

Blue eyed woman's face.
Credit: IoanaB/ Shutterstock

The Human Eye Can See Objects Millions of Light-Years Away

While most of us wouldn’t consider our vision extraordinary, the human eye can see much farther than we tend to realize. That’s because our ability to perceive an object depends not only on its size and proximity, but also on its brightness. In practice, our sight is hindered by factors such as atmospheric conditions and the Earth’s curvature, which places the horizon just 3 miles away for an observer at sea level. However, a trip outside on a clear night reveals the true power of our vision, as most of us can make out the faint haze of the Andromeda Galaxy, some 2.6 million light-years away.

Eyeglasses lying on colored papers, with different colors visible through the lenses.
Credit: kmatija/ iStock

Some People Can Distinguish Between 100 Million Colors

Most people are trichromatic, meaning they possess three types of cone cells in their retinas to detect variations of red, green, and blue light. Dichromatic, or colorblind, people have missing or defective cone cells; typically this means they have trouble differentiating between two colors, with red and green being the most common pairing. At the extremes of the spectrum, those with achromatopsia cannot see any colors at all, while those born with a fourth type of cone cell, known as tetrachromats, are said to be extraordinarily sensitive to light wavelengths and capable of distinguishing between 100 million colors.

Woman with her eyes shut.
Credit: PhotoTalk/ iStock

There’s More Than One Reason We Blink

There are a few established reasons for why we blink: This rapid closure triggers secretions that flush away foreign particles, while also providing lubrication that keeps our precious eyes functioning smoothly. However, this action, which can be voluntary or involuntary, is also affected by a raft of psychological factors. We blink less when concentrating, for example, and more when we’re nervous. Recent studies also indicate that blinking may be a way of giving the brain a brief moment of rest. Regardless of the reasons, we all blink a lot. Most people average at least 15 blinks per minute, which translates to 14,400 over a 16-hour waking day, or more than 5 million blinks per year.

Huge squid with an eye popping out
Credit: sunara/ iStock

The Colossal Squid Boasts the Largest Animal Eyes

The human eye measures about two-thirds of an inch across at birth, growing to its full size of about 1 inch by adulthood. By comparison, the eye of the 45-foot-long colossal squid has been measured at 11 inches in diameter, making it the largest such organ in the animal kingdom, and possibly the largest of any animal that has ever lived. Among land-dwelling creatures, the ostrich tops the pack with an eye that measures around 2 inches from the cornea to the retina — dimensions that also happen to be bigger than its walnut-sized brain.

Close-up of an eye showing the details in the iris and sclera.
Credit: Peter Finch/ Stone via Getty Images

All Humans Had Brown Eyes at One Point

Eye color (along with skin and hair color) is determined by the amount of melanin our bodies produce; those with blue or green eyes simply possess a lower density of this pigment in the iris than those with dark brown peepers. According to research published by a University of Copenhagen team in 2008, all humans had brown eyes until sometime between 6,000 and 10,000 years ago, when a genetic mutation produced the first blue-eyed individual. Nowadays, 70% to 79% of the world’s population has brown eyes, with 8% to 10% sporting baby blues, approximately 5% featuring hazel or amber, and just 2% showing green. Less than 1% of people possess two differently colored eyes — a condition known as heterochromia.

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.

Original photo by lucky-photographer/ iStock

Presidents are much more than the policies they back or the speeches they give — sometimes they put out fires in the Library of Congress, popularized ice cream Stateside, or were nationally ranked wrestlers. Below are 20 strange and fascinating facts about America’s first 20 Presidents.

George Washington, an American politician, soldier and the first President of the United States.
Credit: Universal History Archive/ Universal Images Group via Getty Images

George Washington

While every sitting President serves as commander-in-chief of the U.S. military, nobody will ever outrank George Washington. He was posthumously given the rank of General of the Armies of the United States, sometimes likened to a six-star general. General John J. Pershing also held the title for his service during World War I, but when President Ford approved Washington’s appointment in 1976 — 200 years after Washington’s heyday — he specified that our first President would always rank first among all Army officers past and present.

John Adams

The Boston Massacre, in which British soldiers shot and killed five colonists, was a major flashpoint on the road to the Revolutionary War. When murder charges were brought against the soldiers, John Adams acted as their defense lawyer. He later called the defense “one of the most gallant, generous, manly and disinterested actions of my whole life.”

Portrait of Thomas Jefferson, a past American president.
Credit: Ann Ronan Pictures/ Print Collector via Getty Images

Thomas Jefferson

When Jefferson sent Meriwether Lewis and William Clark out west to explore the Louisiana Territory in 1804, the President told the explorers to watch out for mammoths. Jefferson was apparently obsessed with mammoths and was convinced they were still alive, gallivanting in America’s wild west.

James Madison

James Madison and his wife, Dolley, helped popularize ice cream in America. Tastes in the treat, however, would be considered questionable today: chestnut, asparagus, and parmesan were all on the menu. Dolley’s favorite flavor was … oyster.

James Monroe, the fifth president of the United States 1817-25.
Credit: Universal History Archive via Getty Images

James Monroe

While George Washington gets all the credit for bravely crossing the icy Delaware River in 1776, Monroe deserves attention too. Then-Lieutenant Monroe was part of an advance unit and crossed the river before Washington’s crew. On the other side, Monroe was wounded at the Battle of Trenton, as Hessian soldiers shot him in the shoulder.

John Quincy Adams

The election of 1824 saw four viable candidates, none of whom won an outright majority of electoral votes. Andrew Jackson nabbed 99, John Quincy Adams won 84, William H. Crawford earned 41, and Henry Clay claimed 37. Despite having neither the most electoral votes nor the most popular votes, Adams was chosen as President by the U.S. House of Representatives.

Andrew Jackson, the 7th President of the United States.
Credit: Universal History Archive via Getty Images

Andrew Jackson

In 1835, Jackson received a present at the White House: a 1,400-pound wheel of cheese. It sat around idle for two years until Jackson, sick of the cheese, invited 10,000 visitors into the White House to get rid of it. As one resident recalled: “The air was redolent with cheese, the carpet was slippery with cheese, and nothing else was talked about at Washington that day.” Another called the event “an evil-smelling horror.”

Martin Van Buren

Having grown up in a Dutch-speaking community in New York, Van Buren was the only President whose first language was not English. Although Van Buren worked hard to mask his original tongue, observers claimed his accent would surface whenever he became visibly excited.

William Henry Harrison, the 9th President of the United States.
Credit: Universal History Archive via Getty Images

William Henry Harrison

When a pesky newspaper claimed that Harrison sat in a log cabin and swilled hard cider all day, Harrison didn’t fight the libel. He embraced it. Supporters started calling him the “Log Cabin and Hard Cider candidate” and handed out cider and specialty bottles of booze shaped like log cabins. (They even composed a song for him called “Good Hard Cider.”)

John Tyler

According to Smithsonian, Tyler was “the only president to commit a public act of treason against the U.S. government.” Sixteen years after leaving office, Tyler embraced Virginia’s decision to secede from the Union and was elected to the Confederate Congress. He died before taking his seat, but the damage was done: Upon Tyler’s death, President Lincoln reportedly refused to lower the flag to half-staff.

James K Polk, the 11th President of the United States.
Credit: MPI via Getty Images

James K. Polk

Two days a week, any Joe Schmoe could walk in off the street and visit President Polk in the White House for an evening reception. These “office hours” weren’t very productive, though: To Polk’s annoyance, many visitors merely came asking for a job.

Zachary Taylor

The second President to die in office did not smoke or drink, but he did chew tobacco — lots of it. In fact, he was an impressive spittle-shooter. According to the book Presidents: A Biographical Dictionary, Taylor had “the reputation of never missing the sand-filled box that served as a spittoon in his office.”

Millard Fillmore, the 13th president of the United States.
Credit: Universal History Archive via Getty Images

Millard Fillmore

Millard Fillmore was an avid reader who would do nearly anything for a book. On Christmas Eve in 1851, when the Library of Congress caught on fire, Fillmore ran to the scene with a group of Congressmen and “rendered all the service in their power” to stop the fire. Fillmore led the bucket brigade early into Christmas morning.

Franklin Pierce

Franklin Pierce was one of many Presidents who came from a law background, but he was also known for being especially personable and having an excellent memory. He reportedly could give half-hour-long speeches without the help of notes and had “a knack for recalling the names of every juror who ever sat on one of his cases.”

James Buchanan, the 15th President of the United States.
Credit: Universal History Archive via Getty Images

James Buchanan

Buchanan is considered one of America’s worst Presidents; his fence-sitting on the topic of slavery helped set the stage for the Civil War. It was just one of many controversial positions: On the campaign trail, Buchanan argued that a wage of 10 cents was fair for a day’s labor. (At the time, most Americans supposedly needed around $6 a week — or 86 cents a day — to live.) The gaffe earned Buchanan the nickname “10 Cent Jimmy.”

Abraham Lincoln

Honest Abe was an accomplished wrestler. It’s said that, as a young man in Illinois, Lincoln competed in about 300 wrestling contests and lost just one match. In 1830, after he was crowned his county’s wrestling champion, Lincoln wasn’t afraid to trash-talk his opponents: “I’m the big buck of this lick,” he reportedly said. “If any of you want to try it, come on and whet your horns.”

Andrew Johnson, the 17th president of the United States.
Credit: Universal History Archive via Getty Images

Andrew Johnson

Lovers of Alaska’s mountains and streams have Andrew Johnson and his Secretary of State, William H. Seward, to thank. During Johnson’s presidency, Russia sold Alaska to the United States for the price of $7.2 million … in gold. Although most Americans supported the purchase, critics called the land “Walrussia,” “Icebergia,” and Johnson’s “polar bear garden.”

Ulysses S. Grant

The “S” in Ulysses S. Grant’s name doesn’t mean anything. It’s a clerical error. Grant received the erroneous middle initial when a friend of his father, Thomas Hamer, nominated Ulysses for enrollment at West Point. The initial stuck. Writing to his future wife, Grant said: “I have an ‘S’ in my name and don’t know what it stands for.”

Portrait of Major General Rutherford B. Hayes.
Credit: Library of Congress/ Corbis/ VCG via Getty Images

Rutherford B. Hayes

Several Presidents served during the Civil War, but only Hayes was wounded in combat. He had four horses shot from under him. He suffered a wounded knee, a gunshot wound to the left arm, a hit from a spent musket ball to the head, an ankle injury, and a final gunshot wound at the Battle of Cedar Creek.

James A. Garfield

In 1880, Garfield attended the Republican National Convention with no intention of running for president. But when the convention stalled, a delegate nominated Garfield as a compromise, and a stream of unexpected votes flooded in. “This honor comes to me unsought,” Garfield said. “I have never had the presidential fever, not even for a day … I have no feeling of elation in view of the position I am called upon to fill.”

Original photo by Cottonbro/ Pexels

Desserts have long reigned as the pièce de résistance of suppertime, encouraging picky eaters to clear their plates in anticipation of a bubbling pie, warm cookies, or decadently sweet puddings. While the dessert course seems like an ages-old staple, finishing off a meal with dazzlingly decorated cakes and sugary sweets is a relatively new concept attributed to French royalty. Turns out, the word “dessert” — from the French desservir, meaning “to clear the table” after a meal — didn’t even hit the modern lexicon until the 17th century, a time when French nobles feasted on a post-dinner selection of preserves, jams, and cookies, calling the course “le fruit.”

The new name gained popularity as chefs crafted painstakingly designed sugar sculptures, cakes, and treats — though the term wouldn’t take off until the French Revolution, when commoners gained easier access to sugar. By the late 1700s, the idea of dessert had spread through Europe and to young America, where home cooks were dishing up cakes, cookies, and tarts once seen only in aristocratic dining rooms. In the centuries since, some sugary dishes have come and gone (like tomato soup spice cake and mock apple pie), while others have maintained their place in dessert history. Here’s the sweet story behind several classic and beloved desserts.

Homemade cheesecake with fresh berries and mint for dessert.
Credit: Mizina/ iStock

Cheesecake

New York City may be the world’s cheesecake capital, but the classic delight was actually created and shared thousands of years before the city’s founding. Originating with the ancient Greeks, early cheesecake was constructed from cheese that had been beaten until smooth and then blended with honey and flour. The crustless delight was served at special events, weddings, and even the first Olympic Games in 776 BCE.

But like any food that has transcended the centuries, cheesecake morphed only gradually into the dessert we know today. By 160 BCE a newer version had emerged with a separately baked crust, and an English recipe from 1390 blended sugar and dried elderflowers with cheese curds before baking the entire dish in a pie shell. When mass-produced cream cheese emerged in the 1870s, cheesecake recipes underwent another change. By the 1930s, New York bakeries had fully adopted a cream cheese and sour cream base paired with a graham cracker crust — creating the cheesecake we enjoy today.

Variety of ice cream scoops in cones with chocolate, vanilla and strawberry.
Credit: VeselovaElena/ iStock

Ice Cream

Ice cream may seem like a modern culinary invention considering its need for refrigeration, but food historians believe it appeared sometime within China’s Tang Dynasty, between 618 and 907 CE. Early ice cream blended goat, buffalo, or cow milk with flour and camphor harvested from evergreen trees, and it reached its creamy state after being plunged into ice pools. Cultures throughout Asia and Europe made similar frozen treats; Arab chefs created fruit sherbets during the Middle Ages, and sugary, milk-infused ice drinks became known as the first Italian sorbettos during the mid-1600s.

By the time ice cream reached North America in the 1700s, similar treats such as gelato had been popular in Italy and France for decades. But upgrades to icehouse technology in the 1800s allowed the American masses to finally enjoy ice cream more regularly. That same century saw a deluge of related innovations. Paper companies produced foldable “ice cream satchels” complete with wire handles for easily transporting the cold confection, and in 1897, Black inventor Alfred L. Cralle patented the Ice Cream Mold and Disher — the original ice cream scoop (from which he never made any money). And while waffle-style ice cream cones likely existed before the 20th century, the 1904 World’s Fair in St. Louis was so successful at popularizing the edible containers that it’s often credited as the site of their invention.

Half a dozen frosted sprinkle doughnuts in an opened box.
Credit: Cottonbro/ Pexels

Doughnuts

Whether you enjoy them for breakfast or as a workday snack, doughnuts have become an iconic treat. World War I catapulted doughnuts to the front of our taste buds, but the circular sweets are actually much older. Archaeologists have unearthed fossilized remnants of prehistoric doughnuts from Indigenous communities in North America, while ancient Romans and Greeks paired fried dough with sweet and savory sauces. A plethora of fritters and fried pastries spread throughout Europe in the Middle Ages before early Dutch settlers brought them to the American colonies under the name “olykoeks,” or “oily cakes,” a nod to how they were cooked in oil.

By 1750, the first American doughnut recipe had been published, allowing cooks to create the treats at home. But it wasn’t until World War I that doughnut popularity skyrocketed, as the Salvation Army’s volunteer “doughnut girls” cooked and distributed the fried rings to homesick soldiers along the trenches and front lines. Shortly after the war’s end, the first doughnut machine popped up in New York City, cementing the dessert as a culinary mainstay.

A plate of red fancy red jello.
Credit: Bildagentur Zoonar GmbH/ Shutterstock

Jell-O

Gelatin-based foods have a long, unsavory history. First emerging in medieval Europe, wiggly foods were hard to come by unless you were nobility. That’s because early gelatin dishes were based on collagen from livestock bones, and boiling the bones to extract it took days of labor-intensive work. All that boiling, straining, and mixing with other ingredients required a large kitchen staff, making gelatin dishes a status symbol for upper-class diners.

Fast-forward hundreds of years to 1845, when inventor Peter Cooper (who also created America’s first steam locomotive) crafted the first gelatin powder, which needed only hot water and a little time to set. Cooper’s “portable gelatin” wasn’t a big hit, and the patent was sold to Pearle and May Wait, owners of a cough syrup and laxative company. The Waits added fruit syrups to the powdered gelatin, launching Jell-O as a jiggly dessert product before selling off the brand in 1899. With the help of magazine ads and radio jingles, the confection’s popularity rose in the 1920s, and it remained a household name for decades to follow.

A high angle view looking down on a freshly baked pumpkin pie.
Credit: RyanJLane/ iStock

Pumpkin Pie

While apple pie’s origins are often misattributed (the first recipe appeared in England around 1381, not the U.S.), pumpkin pie deserves more credit as a purely American dessert. The spiced autumnal pie that now inspires countless fall desserts and drinks was concocted by early English colonists who encountered native pumpkins for the first time. Accounts from the mid-1600s suggest that newcomers to young America relied heavily on pumpkins, brewing them in ale and baking them into pies. Because they were so easy to grow, pumpkins also became popular throughout Europe, where countless recipes for the baked squash pies directed chefs to boil pumpkin flesh in milk or mix pumpkin puree with baked apples.

Modern pumpkin pie construction became significantly less laborious around the 1920s, when the Libby’s brand launched its first canned pumpkin puree. Most cooks today continue to reach for the store-bought ingredient, though pie purists may opt to roast their own pumpkins, considering commercial purees actually consist of a sweeter, butternut-like squash. Pumpkin pie may be seasonal, but the tradition of adding your own flair is what keeps it around from year to year — just like every other popular dessert.

Original photo by Mercury Green/ Shutterstock

For nearly 200 million years, Earth was the domain of the dinosaurs. Although many people picture giant, green-skinned reptiles roaming the hothouse jungles of the Mesozoic, dinosaurs were incredibly varied creatures — large and small, warm- and cold-blooded — and roamed every continent (yes, including Antarctica). But with some 66 million years of separation between humans and dinosaurs, and with many of these wondrous creatures’ secrets hidden away under layers of rock, paleontologists are still trying to understand these amazing beings. Here are six fascinating facts about dinosaurs that debunk long-lasting myths and explain why paleontology is one of the most exciting scientific fields today.

Ornitholestes Dinosaur in the act of catching the Jurassic Bird.
Credit: Universal History Archive/ Universal Images Group via Getty Images

An Asteroid Didn’t Kill All the Dinosaurs

According to the prevailing theory among scientists, some 66 million years ago an asteroid slammed into the coast of the Yucatán Peninsula, gouging out what is now called the Chicxulub crater and triggering Earth’s fifth mass extinction in its more than 4-billion-year history. Debris ejected by the impact streaked back through the sky, and the resulting friction superheated the atmosphere, sparking forest fires around the globe. After a prolonged winter caused by a thick haze of ash blotting out the sun, some 75% of all living species on Earth went extinct. Although many of those species were land-dwelling dinosaurs, one group largely survived the devastation — beaked avian dinosaurs known today as birds.

The first known avian dinosaur, archaeopteryx, popped up around 150 million years ago. This proto-bird had teeth, though over time a subset of these flying dinos evolved beaks in place of teeth. Some scientists theorize that these beaks gave birds a post-apocalyptic advantage, because they could more easily dine on the hardy nuts and seeds found throughout the world’s destroyed forests.

Skeleton of the brontosaurus, the largest land animal of all time.
Credit: Bettmann via Getty Images

Science Is Still Debating the Existence of the Brontosaurus

Paleontologists have been debating the existence of the giant sauropod named brontosaurus for nearly 150 years. The story starts during the fast-and-loose “Bone Wars” period of paleontology in the late 19th century. During that time, a bitter rivalry developed between American paleontologists Edward Drinker Cope and Othniel Charles Marsh. It was Marsh who discovered the skeleton of a long-necked apatosaurus in 1877, but the fossil was missing its skull. Marsh incorrectly paired the body with the skull of another dinosaur (likely a camarasaurus). Two years later, when a more complete apatosaurus skeleton wound up in his possession, the specimen was unrecognizable compared to Marsh’s Frankenstein dino, so he instead created a whole new species — brontosaurus, meaning “thunder lizard.” Scientists spotted the mistake in 1903, but the name stuck in the public’s mind.

A century later, though, scientists examining more fossils determined that a close cousin of apatosaurus with a thinner, less robust neck did exist, and they resurrected the name brontosaurus to describe it. Even so, not all paleontologists accept the revived genus — as beloved as the name is.

Painting from a series by Ernest Untermann in the museum at Dinosaur National Monument.
Credit: Bettmann via Getty Images

Dinosaurs Didn’t Live in Water

Although many aquatic reptiles existed during the Age of the Dinosaurs, they were not dinosaurs. The most famous of these water-dwelling creatures was ichthyosaurus, which was actually a distinct marine vertebrate — not a dino. The term “dinosaur” instead mostly refers to terrestrial reptiles that walked with their legs under them (not splayed to the side like crocodilians). Other factors, such as foot and neck size, also help define what is and isn’t a dinosaur.

Despite the fact that nearly all dinosaurs were terrestrial, a few lived a semi-aquatic existence. The spinosaurus, which lived 99 million to 93 million years ago, shows evidence of having eaten fish, and ankylosaurus lived near coastlines.

Similarly, species like the flying pterodactyls (also known as pterosaurs) — which could be as large as a fighter jet or as small as a paper airplane — are distant cousins of dinosaurs, not dinosaurs themselves, although media coverage frequently refers to them that way.

 Illustration of Megazostrodon.
Credit: DE AGOSTINI PICTURE LIBRARY via Getty Images

Dinosaurs and Mammals Coexisted

Mammals and dinosaurs coexisted during most of the Mesozoic Era (252 million to 66 million years ago). The first known mammals, the morganucodontids, appeared around 200 million years ago and were about the size of a shrew. During the Age of the Dinosaurs, mammals remained small, never really exceeding the size of a badger, and were a go-to food source for carnivorous dinos (though sometimes the opposite was also true).

Things changed when a giant asteroid smacked into Earth at the end of the Cretaceous period. Mammals’ small size meant they could burrow underground and escape scorching surface temperatures. As for food, mammals were perfectly content with eating insects and aquatic plant life (which also survived the asteroid’s impact), while large herbivorous dinosaurs went hungry. Over the next 25 million years, mammals underwent a drastic growth spurt as the Age of Mammals began to take shape.

Laura Dern and Sam Neill come to the aid of a triceratops in a scene from the film 'Jurassic Park'.
Credit: Universal Pictures/ Moviepix via Getty Images

The Film “Jurassic Park” Is a Bit of a Misnomer

The entry point for many into the world of dinosaurs is Steven Spielberg’s 1993 film Jurassic Park, which inspired an entire generation of paleontologists. Despite its outsized impact on the field, the film does get a few things wrong about dinosaurs. For one, many dinosaurs are now thought to have sported feathers, whereas Jurassic Park’s dinos reflect the lizard-esque depiction popular in times past. Also, the film’s very name is a misnomer, as the dinosaurs that get the most screen time — such as Tyrannosaurus rex, velociraptor, and triceratops — all lived during the Cretaceous period (145 million to 66 million years ago).

This may seem like a small difference, but the Age of the Dinosaurs is surprisingly long. In fact, the T. rex lived closer to humans, separated by more than 60 million years, than to the stegosaurus, which lived in the Jurassic period some 80 million years before the “king of the tyrant lizards.”

 A paleontologist at the Dinopolis theme-park lab in Teruel.
Credit: PIERRE-PHILIPPE MARCOU/ AFP via Getty Images

We’re Living in a Golden Age of Dinosaur Discovery

Paleontology is far from a static field. Every year, an estimated 50 new dinosaur species are discovered — that’s basically a new dinosaur every week. Roughly half of those species are being discovered in China, a country that only recently opened up to paleontological pursuits. Technology has also upended the field: CT scans can examine the interiors of dino skulls, while other tomographic imaging techniques can render 3D re-creations of bones. Dinosaurs may be buried in Earth’s geological past, but uncovering that past has a bright and exciting future.

Darren Orf
Writer

Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.

Original photo by Steve Cukrov/ Shutterstock

The earliest pies were valued by anybody who needed to store food for the long haul. A well-baked pie, made with a thick crust called a “coffin,” could last in your pantry for up to a year. Pies were especially beloved by sailors, who required stockpiles of well-preserved food that would take up little space in a ship. As the BBC notes, “having a hold stacked with pies was a far more sensible use of precious square metres than bringing a cook and dozens of livestock along for the journey.”

Before the 16th century, most of these pies featured savory fillings. The sweet pies we enjoy today were rare and pricey, reserved for royalty and anybody willing to pay top dollar for sweeteners. Dessert pies wouldn’t become common among regular folk until the height of the slave trade, which saw millions of sacks of sugar imported from the West Indies.

Like the traveling pies of the Middle Ages, the word “pie” itself has taken a fascinating journey. According to the Oxford English Dictionary, the word may be a nod to the magpie, a black-and-white bird common to Europe. It’s believed that early pies, with their light crusts and dark fillings, resembled the bird’s plumage.

Another theory is that the word refers to the magpie’s nest, which is famous for being stuffed with anything the bird can get its claws on. (Early pies, after all, were a motley mix of whatever the cook could find in the kitchen: meat, offal, fruits, spices, and more.) Support for this etymology lies in Scotland’s national dish of haggis, which — like early pies — is famed for containing a slew of ingredients. According to Alison Richards at NPR, “the word haggis or haggesse turns out to be an alternative name for magpie.”

In any case, pie as we know and define it now was in common rotation by the 19th century. Today it’s a staple of American cuisine, in particular, and the preferred dessert for many holidays. Home cooks and professional chefs alike invent new recipes all the time, sometimes competing in national pie competitions in an attempt to create a new favorite flavor. Nothing beats the classics, though. Here’s a closer look at the origins of five of the world’s most popular pies.

Mince Pies on a cooling rack.
Credit: monkeybusinessimages/ iStock

Mincemeat Pie: Cuisine From the Crusades

In the 13th century, European crusaders returned home with stories of war — and, if legends are true, a few good pie recipes inspired by Middle Eastern cuisine, which fearlessly combined sweet and savory flavors. Clearly impressed, the crusaders told those back home about delicacies containing an array of meats, fruits, and spices available only in distant lands. (A 1390 recipe for “tartes of flesh,” for example, suggests adding saffron to a pastry of sugar, pork, cheese, and eggs.) Expensive to bake, the pie recipes influenced by the crusaders were initially reserved for the wealthy or presented at feasts and holidays. By the 16th century, though, these “mincemeat” treats were a Christmastime mainstay. Today’s mincemeat pies are actually just mince pies; meat was dropped from the recipe sometime before the Victorian era.

Sweet homemade blueberry pie, ready to eat.
Credit: Brent Hofacker/ Shutterstock

Blueberry Pie: A Wartime Treat

Berry- and drupe-based pies have existed since the 16th century, when Queen Elizabeth I famously took a bite of the world’s first cherry pie. But when pies came to the New World, non-native fruits took precedence over blueberries. That changed during the Civil War. As brother fought brother, sardine canneries in New England lost most of their business in the Deep South. Thankfully, Maine was (and is) the largest producer of wild blueberries in the world, so the factories pivoted to canning local fruits instead. Soon, the struggling canneries captured a new market: Soldiers who had never tasted Maine blueberries were downing the canned fruit by the dozen and enjoying it baked into pies. An American classic was born.

Slice of apple pie with a scoop of ice cream on top.
Credit: Charles Brutlag/ Shutterstock

Apple Pie: Britain’s Gift to America

The phrase “as American as apple pie” is a misnomer: The dish is decidedly British. Unlike blueberries, apple trees are not native to North America. (Rather, America’s first apple seeds and cuttings were brought over by Jamestown colonists for the purpose of making cider.) Britain’s first apple pie recipe, often attributed to Canterbury Tales author Geoffrey Chaucer, was recorded back in 1381 and called for figs, berries, saffron, and more. Here it is:

Tak gode Applys and gode Spyeis and Figys and Reysons and Perys and wan they re wel ybrayed colourd wyth Safron wel and do yt in a cosyn and do yt forth to bake wel.

As with blueberry pie, America’s love affair with apple pie may be traced back to the United States military. By the early 20th century, America had become one of the world’s largest apple producers. During World War II, it was common for soldiers abroad to say they were fighting “for mom and apple pie.”

Fresh homemade pumpkin pie.
Credit: Brent Hofacker/ Shutterstock

Pumpkin Pie: Star of America’s First Cookbook

When you think about it, it’s odd to transform a gourd into a sweet dessert. But Americans have been doing it since the mid-17th century. In 1655 in New Netherland — now New York state — a Dutch lawyer named Adriaen van der Donck observed that “the English, who are fond of tasty food, like pumpkins very much and use them also in pies.” These early pastries, however, did not resemble modern pumpkin pies. “They contained layers of sliced (sometimes fried) pumpkin, combined with sugar, spices, and apple slices,” Ellen Terrell writes for the Library of Congress blog. The first modern custard-style pumpkin pie recipe wouldn’t be recorded until 141 years later, when Amelia Simmons wrote the first American cookbook.

Aerial view of a slice of key lime pie.
Credit: PamelaJoeMcFarlane/ iStock

Key Lime Pie: The Pride of Florida

Floridians are defensive about their state pie — and for good reason. Key limes, with their uniquely pleasant pucker, are named for their association with the Florida Keys, where they first thrived in the United States. But the pie itself may not be a Sunshine State creation. According to some sources, the dairy-loving masterminds at the Borden Company concocted the recipe that would become key lime pie in a New York City test kitchen in 1931. (The recipe was a ploy to sell sweetened condensed milk.) Floridians, however, still insist that the original key lime pie was invented by a cook with the mysterious name of “Aunt Sally,” who allegedly adapted the recipe after acquiring it from a sponge fisherman working off the Florida Keys.

Original photo by AF archive/ Alamy Stock Photo

Four decades ago, an unlikely character wove his way into film history with his glowing heart and desire to phone home. E.T. the Extra-Terrestrial opened in theaters on June 11, 1982, and won over audiences with his penchant for Speak & Spells and Reese’s Pieces.

The film was another breakthrough for Steven Spielberg, who had been on a roll with Jaws (1975), Close Encounters of the Third Kind (1977), and Raiders of the Lost Ark (1981). It also marked a star-making turn for Drew Barrymore, who was just 7 years old when she played Gertie, as well as for Henry Thomas, who at the age of 9 won the title (human) role of Elliott, the boy who bridges worlds by forming a tight friendship with an alien. Here are 10 facts you may not know about the Academy Award-winning film, which grossed almost $800 million worldwide (roughly $2.3 billion today).

E.T. and Steven Spielberg pose for a portrait in Los Angeles, California.
Credit: Aaron Rapoport/ Corbis Historical via Getty Images

Spielberg Came up With the Idea While Directing Another Movie

While working on his 1977 sci-fi classic Close Encounters of the Third Kind, the director wondered about another alien concept, playing out what could happen if the creature didn’t go back to the mothership. Around the same time, he had been thinking of making a film exploring the impact of divorce on teens, since his own parents had gotten divorced when he was 15. Combining the two, he created what he called “the most personal thing I’d done as a director.”

One of the famous shots from the E.T. film.
Credit: AF archive/ Alamy Stock Photo

Everything Was Filmed Under a Code Name for Fear of Plagiarism

Spielberg was worried that his innovative plot might be ripped off, so the production went to great lengths to keep everything under wraps during filming from September to December 1981. Actors had to read the script behind closed doors, and everyone on set had to wear an ID card to ensure no unauthorized people snuck in for a peek. The entire project was even filmed under the code name “A Boy’s Life.”

Close-up of the E.T. movie poster.
Credit: Blueee/ Alamy Stock Photo

One of the Movie’s Posters Was Inspired by Michelangelo

If the movie’s poster of the universe with a human hand reaching out looked familiar, it’s because the late artist John Alvin was inspired by Michelangelo’s “The Creation of Adam,” the centerpiece of his Sistine Chapel fresco masterpiece. Alvin’s daughter was the hand model for the image that was used to promote the film. The original artwork hung on writer and producer Bob Bendetson’s office wall until it was auctioned off for $394,000 in 2016.

Child looking at men with flashlights in a scene from the film 'E.T. The Extra-Terrestrial'.
Credit: Archive Photos/ Moviepix via Getty Images

Another Actor Was Almost Cast as Elliot

The on-screen chemistry between the child actors was crucial to the film. So before casting director Marci Liroff finalized her choices, she invited the finalists — including a boy she had homed in on to play Elliott — over to screenwriter Melissa Mathison’s home to play the role-playing game Dungeons & Dragons. “In about three minutes, it became very clear that nobody liked this little boy,” Liroff said. “I just think when you play a game sometimes, your true character comes out … He became very bossy. It just showed that he was not our kid. So I basically had to start over.”

Peter Coyote leaning over to talk to Henry Thomas in a scene from the film.
Credit: Archive Photos/ Moviepix via Getty Images

Thomas Nailed the Role With a Teary Audition

Soon, they called in Thomas, who had just appeared in a film called Raggedy Man, and flew him in from Texas for the audition. Liroff said they set up an improv-like scenario about the NASA officials coming to take E.T. away. The young Thomas stepped into the character so deeply that he had tears in his eyes — which, in turn, led all the others in the room to bawl as well. “He just became this little boy. He used, I think, his fear and anxiety, to really push further in the role and he moved us so deeply and so fully,” she said, calling it one of the most moving auditions she’d ever experienced.

A young Drew Barrymore in a scene of the E.T. film.
Credit: ScreenProd / Photononstop/ Alamy Stock Photo

Barrymore Was Cast After Being Turned Down for “Poltergeist”

Although Barrymore and Spielberg ended up having such a close relationship that he later became her godfather, she had first auditioned for the role of clairvoyant Carol Anne (“they’re heeeere!”) in his 1982 horror classic Poltergeist. Heather O’Rourke got the part, but the director turned to Barrymore for his next project, E.T. Barrymore still remembers her time on set fondly, thanks to a souvenir she took home: Gertie’s red cowboy hat. “It is in [my daughters’] room somewhere and reminds me that I was 6 years old wearing that hat,” she told Domino. “I’m so glad I still have it.”

ET looking around door in a scene from the film.
Credit: Archive Photos/ Moviepix via Getty Images

Eighteen People Contributed to E.T.’s Voice

The primary voice behind the alien was an older woman named Pat Welsh, whose two-packs-a-day cigarette habit gave her voice its distinctive timbre. But E.T.’s other sounds, like burping and snorting, were sourced from all over, including from the wife of the film’s sound effects creator and from Spielberg himself. Ultimately, a total of 18 people took part in giving the fictional friend a voice, and at some points, even sea otters, raccoons, and horses were used.

Close-up shot of Thomas during a scene on set.
Credit: TCD/Prod.DB/ Alamy Stock Photo

Thomas Ate a Lot of Candy on Set

E.T.’s favorite treat, Reese’s Pieces — which became the snack of choice after Mars, Inc. passed on the use of M&M’s — also became Thomas’ obsession. “I made myself sick from eating them because we always had those two-pound bags lying around,” Thomas told CNN. “They were set dressing in Elliott’s room, so in between takes, I was constantly eating those things.”

A look at a scene from the E.T. film.
Credit: Allstar Picture Library Ltd./ Alamy Stock Photo

The Movie Was Shot From a Kids’ Eye Level

To emphasize the story from Elliott’s point of view, the entire movie up until the final act was shot from the eye level of a child. In fact, no adult’s face is fully shown before then, with one big exception: Elliott’s mom, Mary. “She was like one of the kids,” Spielberg told Entertainment Weekly.

Director Steven Spielberg and actor Harrison Ford on set of the E.T. film.
Credit: Pool GARCIA/URLI/ Gamma-Rapho via Getty Images

Harrison Ford Had a Cameo That Was Cut

Among the grown-ups who appeared without their faces shown was Harrison Ford — then at the peak of his Indiana Jones fame — playing the part of the school principal who scolds Elliott after the frog rescue scene. In the cut scene, Elliott’s chair starts to levitate until he hits the ceiling, then crashes back down to a perfect landing. Ford’s character is oblivious to it all, too busy reprimanding the child to notice.

Original photo by Trinity Mirror / Mirrorpix/ Alamy Stock Photo

Diana, Princess of Wales, was — and arguably, still is — one of the most famous women in the world. From the time she began dating Prince Charles in 1980 to her tragic death in 1997 at age 36, she was constantly photographed by paparazzi, surrounded by crowds, and the subject of daily headlines, whether truthful or not. Detractors and fans alike scrutinized almost every facet of her life. “It took a long time to understand why people were so interested in me,” Diana once said.

While it feels like every detail about Diana’s short but famous life is well-known — thanks to a constant stream of books, articles, TV, and film projects — some stories haven’t grabbed as much attention. Here are eight lesser-known facts about the People’s Princess.

The wife of Prince Charles, on her first birthday at Park House, Sandringham.
Credit: Hulton Archive/ Hulton Royals Collection via Getty Images

Baby Diana Waited a Week for Her Name

When Diana was born on July 1, 1961, her parents had been hoping for a son. They already had two girls and had lost a baby boy who died shortly after his birth in January 1960. Her father, due to inherit an earldom, desperately wanted a male heir. Diana’s parents were so focused on having a boy that they hadn’t picked out any names in case their newborn turned out to be a girl. A week passed before the baby was named Diana Frances Spencer: Frances honored her mother, while Diana was a nod to the Spencer family tree.

Prince Charles and Princess Diana on board the Royal yacht Britannia.
Credit: Mirrorpix via Getty Images

Diana Had Her Own Royal Heritage

Before Diana married into the British royal family, she had her own royal connections via her ancestors; illegitimate offspring of Kings Charles II and James II had joined the aristocratic Spencer line. Thanks to her lineage, Diana actually had more English royal blood than Prince Charles, as the Windsors have strong Germanic ties. Charles’ great-grandfather, King George V, changed the family name from Saxe-Coburg-Gotha to Windsor in 1917, due to tensions with Germany during World War I.

Lady Diana Spencer, aged 19, at the Young England Kindergarten.
Credit: Tim Graham/ Tim Graham Photo Library via Getty Images

Diana Left School at 16

When Diana was a 15-year-old student in June 1977, she took her O level (ordinary level) exams. These standardized tests are supposed to demonstrate mastery of different subjects; in Diana’s case, English literature, English language, history, art, and geography. Unfortunately, she failed all these exams, perhaps due to anxiety or lack of studying. She then failed a second attempt at her O levels later that year.

After her O level failures, Diana had to leave school when she was 16. Even after becoming a princess, she remembered this setback with a degree of shame. A 1985 documentary recorded her telling a boy at a children’s home, “I never got any O levels: brain the size of a pea, I’ve got.”

Prince Charles & Lady Diana on their wedding day.
Credit: Express Newspapers/ Hulton Royals Collection via Getty Images

Diana Didn’t Say “Obey” in Her Marriage Vows

Diana was only 20 when she wed Prince Charles, who was 12 years her senior. Despite being so young, she was willing to buck royal tradition when it came to her 1981 wedding vows. Other royal brides, even Queen Elizabeth II, had stuck to traditional Church of England wording from 1662 and promised to “obey” their husbands (men were not required to say they would obey their wives). Diana instead opted for the church’s updated marriage service. At the altar, she told Charles she would “love him, comfort him, honor and keep him, in sickness and in health.”

Though Diana never met future daughters-in-law Kate Middleton and Meghan Markle, they followed in her footsteps by omitting “obey” from their wedding ceremonies.

Princess Diana and Prince Charles dancing together in Melbourne Australia.
Credit: Mirrorpix via Getty Images

Diana Loved To Dance

Dance was a longtime passion of Diana’s. After years of ballet, tap, and ballroom lessons, she won a school dance competition in 1976. And she didn’t abandon dancing when she became a princess. She even asked ballet dancer Wayne Sleep for lessons in the early 1980s; his schedule couldn’t accommodate her, but he found a colleague to teach her.

After seeing a performance of the musical Cats, Diana and Charles visited Andrew Lloyd Webber backstage. According to Webber’s memoir, Charles remarked on the dancing and Diana herself demonstrated the splits. At the White House in November 1985, First Lady Nancy Reagan prompted John Travolta to ask Diana to dance; they impressed onlookers as they shared the floor in one of the most famous photo ops of Diana’s life. In December 1985, Diana stunned Charles at the Royal Opera House — though not in a good way — with an onstage choreographed number with Sleep, set to Billy Joel’s “Uptown Girl” (an incident depicted on The Crown). Sleep later said, “She loved the freedom dancing gave her.”

Freddie Mercury of Queen performs on stage at Live Aid at Wembley Stadium.
Credit: Phil Dent/ Redferns via Getty Images

Diana Went Clubbing With Freddie Mercury

According to actress Cleo Rocos, in the late 1980s she, Diana, comedian Kenny Everett, and rock star Freddie Mercury once got together to watch reruns of The Golden Girls, the sound muted so they could spice up the dialogue themselves. Diana then wanted to join the group on their outing to a gay bar that night. Some were hesitant, but Mercury said, “Go on, let the girl have some fun.” Hidden by sunglasses and a cap, Diana was able to sneak into the bar. She remained unrecognized and, per Rocos, “She loved it.”

That wasn’t the only time Diana donned a disguise for a night out on the town. Shortly before her future sister-in-law Sarah Ferguson (aka Fergie) wed Prince Andrew on July 23, 1986, Diana, Fergie, and others dressed in police outfits and staged a fake arrest in front of Buckingham Palace as a bachelorette party prank. They were picked up by a police van, but released once the officers realized who their passengers were. After that, Diana and the gang, still in disguise, headed to a nightclub. They left only when they were recognized.

Princess Diana at Balcony of Royal Enclosure.
Credit: Trinity Mirror / Mirrorpix/ Alamy Stock Photo

Diana Considered Starring in a Sequel to “The Bodyguard”

After the success of 1992’s The Bodyguard with Whitney Houston, Kevin Costner wanted to replicate the successful formula in a sequel that would feature his bodyguard character watching over a post-divorce Diana instead of a popular singer. And, of course, the pair would fall in love. With help from Fergie, Costner was able to speak to Diana about the project. She was interested enough to discuss her lack of acting experience, and also asked if there would be a “kissing scene.” However, Diana passed away before anything came to fruition.

Diana, Princess of Wales wearing protective body armour & a visor, visiting a landmine minefield.
Credit: Tim Graham/ Tim Graham Photo Library via Getty Images

Diana Walked Through a Cleared Minefield … Twice

Following her divorce from Prince Charles, Diana decided to bring attention to the dangers and devastation of landmines. In January 1997, she traveled to Angola to meet with victims of these mines. She famously walked along a cleared path in an active minefield; though cleared, the path would still have been dangerous had any explosives been missed or improperly deactivated.

But what some may not know is that when some photographers said they needed a second take, Diana didn’t object — she walked through the field once more because she realized how important those images would be. Pictures of Diana made it to the front pages of papers around the world. Mike Whitlam of the British Red Cross said, “It was Diana’s involvement in the anti-personnel landmines that made this appalling weapon of war a global issue and persuaded many countries to sign the Ottawa Convention. Her involvement made a real difference, not just to those people running the charities, but to those people who were helped by them.” In 1997, after Diana’s death, the Nobel Peace Prize was awarded to the International Campaign to Ban Landmines and its coordinator, Jody Williams.

Original photo by Allstar Picture Library Ltd/ Alamy Stock Photo

Few first ladies are as recognizable as Jacqueline Kennedy Onassis, who rose to prominence alongside her husband President John F. Kennedy, and then became a preservationist, patron of the arts, and fashion icon. From her time in the White House — cut short by the tragic assassination of JFK as he sat next to her in a motorcade in Dallas — to her later life in New York City, she remained ever-present in the public eye, establishing a legacy as one of 20th-century America’s most admired figures. Here are six facts about Jackie Kennedy that highlight her contributions to the culture and history of the United States.

Jackie Kennedy posing with a flash camera.
Credit: ullstein bild Dtl via Getty Images

Her First Role in D.C. Was as a Journalist

In 1951, shortly after finishing her studies at George Washington University, Jackie (then known as Jacqueline Lee Bouvier) embarked on a journalism career. Working as the “Inquiring Camera Girl” and producing a daily column of the same name for the Washington Times-Herald, Bouvier roved the streets of D.C. with her camera in hand, taking pictures of people she encountered and interviewing them about pressing current affairs. She also covered major events of the time, including President Dwight Eisenhower’s first inauguration in January 1953 and the coronation of Queen Elizabeth II in June 1953, the latter of which was one of her final assignments.

While many of her columns featured everyday Americans, she asked questions of high-profile figures as well. One example was her brief interview with then-Vice President Richard Nixon for the April 21, 1953, edition of her column, in which she asked about his views regarding Senate pages. The column also included an answer on the same topic from then-Senator John F. Kennedy, whom she had met at a dinner party the year prior and would go on to marry five months later. Nixon, of course, went on to lose to JFK in the 1960 U.S. presidential election.

Jacqueline Kennedy on CBS White House Tour.
Credit: Bettmann via Getty Images

She Earned an Emmy for a Televised Tour of the White House

In 1941, long before she became a resident of the White House, Jackie toured the building with her mother and sister, and was dismayed by the lack of historical furnishings and informative pamphlets. Shortly after moving in with her husband in 1961, she made it her mission to overhaul the White House experience. As a young, attractive couple, John and Jackie did away with the archaic conventions of administrations past, and began cultivating a more comfortable environment. But it was Jackie’s physical renovation of the building that really stood out.

Enlisting the help of Americana collector Henry Francis du Pont, French designer Stéphane Boudin, and decorator Dorothy Parish, the first lady began work on a massive restoration project. Her goal was not merely to redecorate but to showcase the history of the mansion and the country itself. “It must be restored, and that has nothing to do with decoration,” she told Life magazine of her plans. “That is a question of scholarship.”

Within a mere two weeks, she had used all of the initial $50,000 budget to refurbish the private living quarters — and that was just the beginning. From outfitting the Blue Room with French furniture that President James Monroe had ordered back in 1818, to redesigning the Treaty Room in a Victorian style, Jackie left no corner of the White House untouched. Life featured the project in a September 1961 issue, but it found its biggest spotlight on February 14, 1962, when Mrs. Kennedy unveiled her stunning work on television. Accompanied by CBS News correspondent Charles Collingwood, Jackie led a guided tour of the building on CBS and NBC, drawing an estimated 80 million viewers and earning an honorary Emmy Award for the production.

Jacqueline Kennedy and Prime Minister Nehru of New Delhi.
Credit: Bettmann via Getty Images

She Spoke Multiple Languages

John F. Kennedy may be known for the line “Ich bin ein Berliner,” but Jackie was the true polyglot of the family. A lover of languages from a young age, Jackie helped John translate French research books into English when he needed to study up on politics in Southeast Asia, where the French had a heavy presence. But her linguistic prowess really shone through on the campaign trail.

When JFK campaigned for reelection to the U.S. Senate in 1958, Jackie gave her first campaign speech in French to a group of French-speaking constituents in Massachusetts. As she continued to tour the country, she also showcased her familiarity with Italian, Polish, and Spanish. In fact, during the lead-up to the 1960 presidential election, Jackie starred in a minute-long campaign ad conducted entirely in Spanish.

Full length shot of the President and Mrs. John F. Kennedy.
Credit: Bettmann via Getty Images

She Coined the Term “Camelot” for the Kennedy Administration

Shortly after her husband’s funeral, Jackie welcomed Life magazine reporter Theodore H. White to the family compound in Hyannis Port, Massachusetts, in an effort to ensure JFK’s lasting legacy. During the interview, she coined a term that’s now synonymous with her husband’s administration: “Camelot,” a reference to both Arthurian legend and JFK’s favorite Broadway musical. In likening his presidency to the storied court, Jackie sought to establish her husband as an almost mythical figure. Quoting the musical, she stated, “Don’t let it be forgot, that once there was a spot, for one brief, shining moment that was known as Camelot.” She went on to add that while there would be other Presidents, there would “never be another Camelot again.” Editors at Life reportedly objected to the Camelot theme running throughout the interview, but Jackie was insistent on keeping it and even added her own edits to White’s notes.

Jacqueline Onassis in a department of Viking Press in NYC.
Credit: Bettmann via Getty Images

She Became a Successful Book Editor in New York City

Working with words proved to be one of Jackie’s strong suits, and she spent the final decades of her life in publishing. Though she hadn’t held a paying job since 1953, she returned to the workforce as a book editor in 1975, after the death of her second husband, Aristotle Onassis. Tommy Guinzburg, the president of Viking Press, brought her in as a consulting editor, working primarily on titles that aligned with her interests in history and art. The first title she edited was Remember the Ladies, a work about the role of 18th-century American women.

In 1977, Viking controversially published a novel about a fictional plot to assassinate a President modeled on JFK’s brother Ted Kennedy, and the ensuing furor led to Jackie’s resignation. The next year, she became an associate editor at Doubleday, where she worked with pop singer Michael Jackson on his 1988 memoir, Moonwalk, among other titles. She continued to work in publishing until her death in 1994.

Jacqueline Onassis attending a rally to save Grand Central Terminal.
Credit: WWD/ Penske Media via Getty Images

She Helped Save Grand Central Terminal From Being Demolished

Much like she did in preserving the history of the White House, Jackie played a key role in maintaining one of New York City’s most prominent landmarks. In the mid-1970s, developers hatched a plan to demolish part of Grand Central Terminal to build an office tower. The former first lady was among a group of notable New Yorkers who objected to the plan, and in 1975, she spoke at a press conference at Grand Central’s famed Oyster Bar restaurant to protest the destruction of the Beaux-Arts structure. She and other preservationists worked to ensure the building’s protection, which was ultimately assured by the 1978 U.S. Supreme Court decision Penn Central Transportation Co. v. New York City. A plaque dedicated in 2014 at the entrance on 42nd Street and Park Avenue honors Jacqueline Kennedy Onassis for her role in saving the beloved Manhattan icon.

And Grand Central Terminal isn’t the only NYC landmark to commemorate her legacy. Located in the northern half of Central Park, where Jackie was known to jog, the Jacqueline Kennedy Onassis Reservoir pays homage to the former first lady’s contributions to the city. The artificial body of water, constructed between 1858 and 1862, spans 106 acres and was the largest human-made body of water in the world at the time of its creation.

Bennett Kleinman
Staff Writer

Bennett Kleinman is a New York City-based staff writer for Optimism Media, and previously contributed to television programs such as "Late Show With David Letterman" and "Impractical Jokers." Bennett is also a devoted New York Yankees and New Jersey Devils fan, and thinks plain seltzer is the best drink ever invented.

Original photo by IHervas/ iStock

The Atacama Desert in Chile, one of the world’s oldest deserts, is also one of the driest places on Earth. While parts of Antarctica have never recorded any precipitation, the Atacama’s rainfall statistics are still quite impressive. Until the early 1970s, some portions of the desert hadn’t seen rainfall for around 400 years. It’s rare to see heavy rainfall even now (though occasionally flash flooding can occur), and when it does, it’s a spectacular sight. The desert blooms, transforming into a beautiful carpet of wildflowers. But even when there isn’t any rain, the vivid colors of its mineral-rich rock and intense hues of its lagoons and salt flats make this a truly breathtaking place. Here are five things you might not know about the Atacama Desert.

View of NASA robot called Zoe, at the Atacama desert near Domeyko range.
Credit: AFP via Getty Images

NASA Uses the Desert to Mimic Mars

When NASA decided to look for life on Mars, it started right here on Earth. In fact, one of the Atacama’s most famous valleys, Valle de Marte, translates to “Mars Valley,” owing to its resemblance to the red planet. The rough, rocky surface, characterized by bumpy nodules of rock salt, or halite, is as close as you’ll get to Mars without setting off for space. The Atacama Rover Astrobiology Drilling Studies project, or ARADS for short, has conducted a series of experiments in the region, from growing trees to testing vehicles.

Unsurprisingly, the Atacama Desert’s otherworldly landscape has made it a popular filming location, too, appearing in the British series Space Odyssey: Voyage to the Planets and the 2008 James Bond film Quantum of Solace (though not as Mars).

Stars of the Atacama desert.
Credit: donwogdo/ iStock

It’s One of the Best Places in the World for Stargazing

This remote, high-altitude locale — reaching elevations of 13,000 feet — is also one of the best places on the planet to observe the night sky. On average, the Atacama Desert experiences 330 cloud-free nights every year, a fact not overlooked by the world’s top astronomers. If you’re used to stargazing from a town or city, the sight of so many stars glittering against a pitch-black sky is sure to be jaw-dropping. Stargazing tours depart from the main tourist town of San Pedro de Atacama to an array of nearby telescopes, where an astronomer guide will help you spot constellations, nebulae, and even the rings of Saturn.

Scientists flock here too. On the Chajnantor Plateau, the Atacama Large Millimeter Array, or ALMA for short, boasts a collection of 66 radio antennas — making it the largest radio telescope in the world. Collectively, those antennas are capable of identifying an object the size of a golf ball from a distance of 9 miles.

The European Southern Observatory operates two more sites in Chile’s Atacama Desert, at La Silla and Paranal. It’s also building what’s known as the Extremely Large Telescope (ELT), which should be able to collect 100 million times more light than the human eye, enabling it to search for planets orbiting other stars and deepen our understanding of black holes and galaxies.

Snow-covered volcanoes Pomerape and Parinacota, llamas (Lama glama).
Credit: imageBROKER/ Shutterstock

Though It’s One of the Driest Places on Earth, the Fauna Is Surprisingly Diverse

Visitors to the Atacama Desert are often taken aback at just how much wildlife can exist in what appears to be such an inhospitable place. But there are places where rainfall is sufficient to support vegetation and animals, including wild Andean foxes, which live off lizards and small rodents. The viscacha, a rodent closely related to the chinchilla, can also be spotted. Herders tend flocks of llamas, bringing them down to mountain lakes to graze. Their wild cousins, vicuñas and guanacos, are harder to find, but they too migrate toward water sources.

Birdlife is also abundant. Some of those dazzling high-altitude lakes and salt flats boast colorful flocks of flamingos. Where the desert meets the coast, Humboldt penguins nest on cliffs overlooking the ocean. Hummingbirds visit seasonally, drawn by nectar, seeds, and insects. When there’s sufficient water to bring out the blooms on the region’s flowers, you might even spot birds of prey, such as burrowing owls.

A panoramic view over the Atacama Desert valleys with humidity called "Camanchaca".
Credit: abriendomundo/ iStock

Water Is Harvested From Fog to Grow Crops — And Even Brew Beer

Around a million people live in the Atacama Desert, many of them making a living from copper or lithium mining or from tourism. And while the desert receives less than 1 millimeter of rainfall per year, some residents manage to grow crops thanks to an ingenious method of harvesting water from the “camanchaca,” the thick fog that rolls in off the Pacific Ocean and blankets parts of the coast. In the 1950s, a scientist named Carlos Espinosa Arancibia came up with the idea of the fog catcher — basically, a fine mesh net that captures droplets of fog, which collect and drip down the netting into a channel underneath. From there, the moisture could be piped to where it was needed and used to irrigate crops. Research has continued since then, and at the Atrapaniebla (Fog Catcher) Brewery in Peña Blanca, this precious water has even been used to make beer. The owners claim it is the only beer in the world produced this way.

Skull with a coca leaf on a cemetery of mummies, Chauchilla, near Nasca, Atacama Desert.
Credit: imageBROKER/ Alamy Stock Photo

It’s Home to Mummies That Are Older Than Egypt’s

If you thought the mummies in Egypt’s ancient pyramids were the oldest on the planet, think again. The Chinchorro people, who lived along the coast of what is now northern Chile and southern Peru, were preserving their dead millennia earlier: The oldest known Chinchorro mummy, the Acha man, dates back to approximately 7020 BCE, several thousand years before the first of the Egyptian mummies.

Around a third of these mummies, like the Acha man, were mummified naturally, with the dry desert climate helping to preserve the bodies. Later, embalmers replaced internal organs with animal hair and created a clay mask in place of what would have been skin and flesh. Unusually, the Chinchorro people didn’t reserve mummification for royalty, nor did they favor one sex over the other. Archaeologists have recovered 282 Chinchorro mummies from the Atacama since the first discovery just over a century ago.

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.

Original photo by PA Images /Alamy Stock Photo

Gone are the soda jerk and the milkman. The telegraphist and the bowling alley pinsetter are relics of the past. The haberdasher, town crier, and lowly VCR repairman have all gone the way of the lamplighter. And that’s just during the 20th century! Due to advances in technology and the evolution of society, many occupations that were once considered essential no longer exist in today’s job market. Here’s a look at 13 common professions that have disappeared over the past few centuries.

Close-up of a royal toilet.
Credit: Goss Images/ Alamy Stock Photo

Groom of the Stool

Starting back in Tudor times, the Groom of the Stool handled all of the English king’s toilet-related needs. Whenever the monarch had to evacuate his bowels, the groom would accompany him to the toilet — a bowl of water and towel in tow. While the job might sound (literally) crappy, it was a powerful position. Because of the intimacy involved, it was common for the Groom of the Stool to become one of the king’s closest confidants. The role wasn’t abolished until 1901.

Female "human computers" perform mathmatics calculations for NASA.
Credit: Smith Collection/Gado/ Archive Photos via Getty Images

Computer

For centuries, “computer” was a job description for flesh-and-blood humans. (The word literally means “a person who computes.”) During the Enlightenment, freelance computers helped scientists double-check their math. Through the 1800s, human computers analyzed data and did calculations at astronomical observatories, helping publish nautical almanacs and predicting the passing of comets. At the turn of the 20th century, human computers in the U.S. government — usually women — did calculations that eventually helped launch people into space. Their story is now immortalized in the 2016 film Hidden Figures.

A hurrier transporting coal in a corf.
Credit: Universal History Archive/ Universal Images Group via Getty Images

Hurrier

Coal mines were never fun places to work. But in the early 1800s, some coal chutes and tunnels were only 2 feet high. Coal was transported up these tunnels in baskets called “corfs,” which were pushed and pulled along a system of rails. The people doing the pushing and pulling? They were hurriers — small boys and girls, sometimes as young as 4 years old. Because the tunnels were so tight, many hurriers crawled on all fours, pushing the corf with their heads (which caused some children to develop bald spots).

View of a Women's Health Clinic.
Credit: Mark McMahon/ Corbis Historical via Getty Images

Uinyo

During the early days of Korea’s Joseon dynasty, it was taboo for women to visit male doctors. (Confucian principles — and a spoonful of social shame — demanded strict segregation of the sexes.) Predictably, many women died. So, in the early 1400s, the Korean government allowed some women to practice as female-only doctors, calling them uinyo. Uinyo specialized in giving medical care to other women at state-sponsored health clinics.

Coffee sniffers disturb a coffee party.
Credit: Bildagentur-online/ Universal Images Group via Getty Images

Coffee Sniffer

Back in the 1780s, Frederick II of Prussia didn’t like coffee — he considered it a foreign good that didn’t help the local economy. So he tried curbing coffee consumption by imposing an exorbitant 150% tax on it. Citizens revolted. A black market for unroasted, smuggled beans boomed. When Frederick realized this underground market for coffee was hurting his bottom line, he hired a league of 400 “coffee sniffers” to, quite literally, sniff out the smuggled beans.

A view of the interior of St Paul's Cathedral, 18th century.
Credit: Hulton Archive via Getty Images

Sluggard Waker

Churchgoers know that it can sometimes be difficult to keep your eyelids open during a dull sermon. Back in 18th-century England, this problem was remedied by the sluggard waker, a man whose sole job was to prowl the pews and wake sleepy parishioners — sometimes by hitting them over the head with a brass-tipped stick.

Vacuum pump for removing night soil from cesspools.
Credit: Universal History Archive/ Universal Images Group

Night Soil Men

In the 19th century, most cities did not have municipal sewer systems. Instead, people relied on outhouses and privies. These, however, were not bottomless pits — they had to be routinely emptied. The person responsible for this unpleasant task was the night soil man. So named because he usually worked under the cover of darkness, the night soil man emptied privies with long-handled buckets, loading the waste onto carts and hauling the fertilizer to local farms (though, more often, it was simply dumped into the nearest waterway).

Aerial view of dried herbs blend and a pile of scattered flowers.
Credit: Anna Ok/ Shutterstock

Herb Strewers

Before the invention of the flush toilet in the 18th century, cities often smelled less than desirable. But if you were wealthy enough in the 17th century, you could hire an herb strewer to keep the aroma fresh. King George III, for instance, employed an herb strewer named Mary Rayner, a woman who spent more than 40 years scattering flowers, herbs, and other natural fragrances throughout the royal residence to make it smell welcoming; popular plants included lavender, roses, chamomile, sweet yarrow, basil, marjoram, and violets.

A crew of two soldiers operate an acoustic listening device.
Credit: PhotoQuest/ Archive Photos via Getty Images

Aircraft Listeners

The first practical demonstration of using radar for aircraft detection occurred in the 1930s. By then, airplanes had already been taking flight for more than three decades. Consequently, during times of war, soldiers had to deploy clever methods to find enemy aircraft. During World War I, aircraft listeners used war tubas, which, according to CNN, were “essentially large horns connected to a stethoscope.” Other aircraft listeners used acoustic mirrors, large concrete dishes that amplified sound coming from above.

A man re-creates the role of a Knocker Upper, one of the most notably extinct professions.
Credit: PA Images /Alamy Stock Photo

Knocker-Uppers

Before the advent of the alarm clock, industrial-era workers who needed help waking up in time for work would hire knocker-uppers. These hardy souls would rise in the early hours of the day and patrol the streets with sticks, tapping on their clients’ bedroom windows each morning. Some knocker-uppers, like Mary Smith, were not fans of the stick method: She roused the local sleepyheads by shooting peas at their window panes.

A linkboy in the 18th century.
Credit: Chronicle/ Alamy Stock Photo

Linkboys

In William Shakespeare’s Henry IV, Falstaff says, “Thou hast saved me a thousand marks in links and torches, walking with thee in the night betwixt tavern and tavern.” Turns out, that’s a pretty accurate description of a linkboy. Typically a young, low-class male, the linkboy escorted pedestrians through dark city streets with a torch. The job eventually became obsolete after cities installed streetlamps. (Incidentally, the phrase “can’t hold a candle to…” was likely a reference to linkboys; anybody who couldn’t “hold a candle” better than a low-class linkboy was viewed as extremely inferior.)

A man operates a linotype machine.
Credit: Bettmann via Getty Images

Linotype Operators

Starting in the late 19th century, lines of text in newspapers and magazines were often created with a linotype machine. The linotype was revolutionary for its time. Before the machine, each letter of an article was individually set by hand into a mold for print. The linotype eliminated this process by having operators type each line on a special 90-key keyboard, casting a whole “line o’ type” in lead; that metal slug was then used to print the text. This technology was used for almost 100 years, eventually tapering off in the 1960s and ’70s.

View of water carriers in Paris.
Credit: Heritage Images/ Hulton Archive via Getty Images

Water Carriers

Water carriers still exist, but they’re an endangered profession. These workers have been around for millennia, hauling water from rivers and wells to people’s homes. Some used buckets hanging from a yoke or leather sacks, while others lugged large tankards over their shoulders. And while the water carrier has mostly been replaced by modern plumbing, some places still commemorate the once-vital profession. In Hamburg, Germany, a water carrier named Hans Hummel is celebrated as the local mascot, with more than 100 statues of his likeness sprinkled throughout the city.

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.