Original photo by Mercury Green/ Shutterstock

For nearly 200 million years, Earth was the domain of the dinosaurs. Although many people picture giant, green-skinned reptiles roaming the hothouse jungles of the Mesozoic, dinosaurs were incredibly varied creatures — large and small, warm- and cold-blooded — and they roamed every continent (yes, including Antarctica). But with some 66 million years of separation between humans and dinosaurs, and with many of these wondrous creatures’ secrets hidden away under layers of rock, paleontologists are still working to understand these amazing beings. Here are six fascinating facts about dinosaurs that debunk long-standing myths and explain why paleontology is one of the most exciting scientific fields today.

Ornitholestes dinosaur in the act of catching a Jurassic bird.
Credit: Universal History Archive/ Universal Images Group via Getty Images

An Asteroid Didn’t Kill All the Dinosaurs

According to the prevailing theory among scientists, some 66 million years ago, an asteroid now known as the Chicxulub impactor slammed into the shallow sea just off the Yucatán Peninsula, triggering Earth’s fifth mass extinction in its more than 4-billion-year history. The debris ejected by the impact streaked back through the sky, and the resulting friction superheated the atmosphere, sparking forest fires around the globe. After a prolonged winter caused by a thick haze of ash blotting out the sun, some 75% of all living species on Earth went extinct. Although many of those species were land-dwelling dinosaurs, one group largely survived the devastation — beaked avian dinosaurs known today as birds.

The first known avian dinosaur, archaeopteryx, popped up around 150 million years ago. This proto-bird had teeth, though over time a subset of these flying dinos traded teeth for beaks. Some scientists theorize that these beaks gave birds a post-apocalyptic advantage, because they could more easily dine on the hardy nuts and seeds found throughout the world’s destroyed forests.

Skeleton of the brontosaurus, the largest land animal of all time.
Credit: Bettmann via Getty Images

Science Is Still Debating the Existence of the Brontosaurus

Paleontologists have been debating the existence of the giant sauropod named brontosaurus for nearly 150 years. The story starts during the fast-and-loose “Bone Wars” period of paleontology in the late 19th century. During that time, a bitter rivalry developed between American paleontologists Edward Drinker Cope and Othniel Charles Marsh. It was Marsh who discovered the skeleton of a long-necked apatosaurus in 1877, but the fossil was missing its skull. Marsh incorrectly paired the body with the skull of another dinosaur (likely a camarasaurus). Two years later, when a more complete apatosaurus skeleton wound up in his possession, the specimen was unrecognizable compared to Marsh’s Frankenstein dino, so he instead created a whole new species — brontosaurus, meaning “thunder lizard.” Scientists spotted the mistake in 1903, but the name stuck in the public’s mind.

A century later, however, scientists examining more fossils determined that a close cousin of apatosaurus with a thinner, less robust neck did exist, and they resurrected the name brontosaurus to describe it. Still, not all paleontologists accept the revived genus — as beloved as it is.

Painting from a series by Ernest Untermann in the museum at Dinosaur National Monument.
Credit: Bettmann via Getty Images

Dinosaurs Didn’t Live in Water

Although many aquatic reptiles existed during the Age of the Dinosaurs, they were not dinosaurs. The most famous of these water-dwelling creatures was ichthyosaurus, a marine reptile — not a dino. The term “dinosaur” instead mostly refers to terrestrial reptiles that walked with their legs directly under their bodies (not sprawled out to the side like crocodilians). Other factors, such as foot and neck size, also help define what is and isn’t a dinosaur.

Despite the fact that nearly all dinosaurs were terrestrial, a few lived a semi-aquatic existence. The spinosaurus, which lived 99 million to 93 million years ago, shows evidence of eating fish, and ankylosaurus lived near coastlines.

Similarly, species like the flying pterodactyls (also known as pterosaurs) — which could be as large as a fighter jet or as small as a paper airplane — are distant cousins of dinosaurs, not dinosaurs themselves, although media coverage frequently refers to them that way.

 Illustration of Megazostrodon.
Credit: DE AGOSTINI PICTURE LIBRARY via Getty Images

Dinosaurs and Mammals Coexisted

Mammals and dinosaurs coexisted during most of the Mesozoic Era (252 million to 66 million years ago). The first known mammals, the shrew-sized morganucodontids, appeared around 200 million years ago. During the Age of the Dinosaurs, mammals remained small, never really exceeding the size of a badger, and were a go-to food source for carnivorous dinos (though sometimes the opposite was also true).

Things changed when a giant asteroid smacked into Earth at the end of the Cretaceous period. Mammals’ small size meant they could burrow underground and escape scorching surface temperatures. As for food, mammals were perfectly content with eating insects and aquatic plant life (which also survived the asteroid’s impact), while large herbivorous dinosaurs went hungry. Over the next 25 million years, mammals underwent a drastic growth spurt as the Age of Mammals began to take shape.

Laura Dern and Sam Neill come to the aid of a triceratops in a scene from the film 'Jurassic Park'.
Credit: Universal Pictures/ Moviepix via Getty Images

The Film “Jurassic Park” Is a Bit of a Misnomer

The entry point for many into the world of dinosaurs is Steven Spielberg’s 1993 film Jurassic Park, which inspired an entire generation of paleontologists. Despite its outsized impact on the field, the film does get a few things wrong about dinosaurs. For one, many dinosaurs are now thought to have sported feathers, whereas Jurassic Park’s dinos reflect the lizard-like depiction popular in decades past. Also, the film’s very name is a misnomer, as the dinosaurs that get the most screen time — such as Tyrannosaurus rex, velociraptor, and triceratops — all lived during the Cretaceous period (145 million to 66 million years ago).

This may seem like a small distinction, but the Age of the Dinosaurs was surprisingly long. In fact, T. rex lived closer in time to humans, separated from us by more than 60 million years, than to stegosaurus, which lived in the Jurassic period some 80 million years before the “king of the tyrant lizards.”

 A paleontologist at the Dinopolis theme-park lab in Teruel.
Credit: PIERRE-PHILIPPE MARCOU/ AFP via Getty Images

We’re Living in a Golden Age of Dinosaur Discovery

Paleontology is far from a static field. Every year, an estimated 50 new dinosaur species are discovered — that’s roughly a new dinosaur every week. About half of those species are being found in China, a country that only recently opened up to paleontological pursuits. Technology has also upended the field: CT scans can examine the interiors of dino skulls, while other tomographic imaging techniques can render 3D re-creations of bones. Dinosaurs may be buried deep in Earth’s geological past, but uncovering that past has a bright and exciting future.

Darren Orf
Writer

Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.

Original photo by Steve Cukrov/ Shutterstock

The earliest pies were valued by anybody who needed to store food for the long haul. A well-baked pie, made with a thick crust called a “coffin,” could last in your pantry for up to a year. Pies were especially beloved by sailors, who required stockpiles of well-preserved food that would take up little space in a ship. As the BBC notes, “having a hold stacked with pies was a far more sensible use of precious square metres than bringing a cook and dozens of livestock along for the journey.”

Before the 16th century, most of these pies featured savory fillings. The sweet pies we enjoy today were rare and pricey, reserved for royalty and anybody willing to pay top-dollar for sweeteners. Dessert pies wouldn’t become common among regular folk until the height of the slave trade, which saw millions of sacks of sugar imported from the West Indies.

Like the traveling pies of the Middle Ages, the word “pie” itself has taken a fascinating journey. According to the Oxford English Dictionary, the word may be a nod to the magpie, a black-and-white bird common to Europe. It’s believed that early pies, with their light crusts and dark fillings, resembled the bird’s plumage.

Another theory is that the word refers to the magpie’s nest, which is famous for being stuffed with anything the bird can get its claws on. (Early pies, after all, were a motley mix of whatever the cook could find in the kitchen: meat, offal, fruits, spices, and more.) Support for this etymology lies in Scotland’s national dish of haggis, which — like early pies — is famed for containing a slew of ingredients. According to Alison Richards at NPR, “the word haggis or haggesse turns out to be an alternative name for magpie.”

In any case, pie as we know and define it now was in common rotation by the 19th century. Today it’s a staple of American cuisine, in particular, and the preferred dessert for many holidays. Home cooks and professional chefs alike invent new recipes all the time, sometimes competing in national pie competitions in an attempt to create a new favorite flavor. Nothing beats the classics, though. Here’s a closer look at the origins of five of the world’s most popular pies.

Mince Pies on a cooling rack.
Credit: monkeybusinessimages/ iStock

Mincemeat Pie: Cuisine From the Crusades

In the 13th century, European crusaders returned home with stories of war — and, if legends are true, a few good pie recipes inspired by Middle Eastern cuisine, which fearlessly combined sweet and savory flavors. Clearly impressed, the crusaders told those back home about delicacies containing an array of meats, fruits, and spices available only in distant lands. (A 1390 recipe for “tartes of flesh,” for example, suggests adding saffron to a pastry of sugar, pork, cheese, and eggs.) Expensive to bake, the pie recipes influenced by the crusaders were initially reserved for the wealthy or presented at feasts and holidays. By the 16th century, though, these “mincemeat” treats were a Christmastime mainstay. Today’s mincemeat pies are actually just mince pies; meat was dropped from the recipe sometime before the Victorian era.

Sweet homemade blueberry pie, ready to eat.
Credit: Brent Hofacker/ Shutterstock

Blueberry Pie: A Wartime Treat

Berry- and drupe-based pies have existed since the 16th century, when Queen Elizabeth I famously took a bite of the world’s first cherry pie. But when pies came to the New World, non-native fruits took precedence over blueberries. That changed during the Civil War. As brother fought brother, sardine canneries in New England lost most of their business in the Deep South. Thankfully, Maine was (and is) the largest producer of wild blueberries in the world, so the factories pivoted to canning local fruits instead. Soon, the struggling canneries captured a new market: Soldiers who had never tasted Maine blueberries were downing the canned berries by the dozen, often baked into pies. An American classic was born.

Slice of apple pie with a scoop of ice cream on top.
Credit: Charles Brutlag/ Shutterstock

Apple Pie: Britain’s Gift to America

The phrase “as American as apple pie” is a misnomer: The dish is decidedly British. Unlike blueberries, apple trees are not native to North America. (Rather, America’s first apple seeds and cuttings were brought over by Jamestown colonists for the purpose of making cider.) Britain’s first apple pie recipe was recorded back in 1381 by Canterbury Tales author Geoffrey Chaucer, who called for figs, berries, saffron, and more. Here it is:

Tak gode Applys and gode Spyeis and Figys and Reysons and Perys and wan they re wel ybrayed colourd wyth Safron wel and do yt in a cosyn and do yt forth to bake wel.

As with blueberry pie, America’s love affair with apple pie may be traced back to the United States military. By the early 20th century, America had become one of the world’s largest apple producers. During World War II, it was common for soldiers abroad to say they were fighting “for mom and apple pie.”

Fresh homemade pumpkin pie.
Credit: Brent Hofacker/ Shutterstock

Pumpkin Pie: Star of America’s First Cookbook

When you think about it, it’s odd to transform a gourd into a sweet dessert. But Americans have been doing it since the mid-17th century. In 1655 in New Netherland — now New York state — a Dutch lawyer named Adriaen van der Donck observed that “the English, who are fond of tasty food, like pumpkins very much and use them also in pies.” These early pastries, however, did not resemble modern pumpkin pies. “They contained layers of sliced (sometimes fried) pumpkin, combined with sugar, spices, and apple slices,” Ellen Terrell writes for the Library of Congress blog. The first modern custard-style pumpkin pie recipe wouldn’t be recorded until 141 years later, when Amelia Simmons wrote the first American cookbook.

Aerial view of a slice of key lime pie.
Credit: PamelaJoeMcFarlane/ iStock

Key Lime Pie: The Pride of Florida

Floridians are defensive about their state pie — and for good reason. Key limes, with their uniquely pleasant pucker, are named for their association with the Florida Keys, where they first thrived in the United States. But the pie itself may not be a Sunshine State creation. According to some sources, the dairy-loving masterminds at the Borden Company concocted the recipe that would become key lime pie in a New York City test kitchen in 1931. (The recipe was a ploy to sell sweetened condensed milk.) Floridians, however, still insist that the original key lime pie was invented by a cook with the mysterious name of “Aunt Sally,” who allegedly adapted the recipe after acquiring it from a sponge fisherman working off the Florida Keys.

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.

Original photo by AF archive/ Alamy Stock Photo

Four decades ago, an unlikely character wove its way into film history with its glowing heart and desire to phone home. E.T. the Extra-Terrestrial opened in theaters on June 11, 1982, and won over audiences with the alien’s penchant for Speak & Spells and Reese’s Pieces.

The film was another breakthrough for Steven Spielberg, who had been on a roll with Jaws (1975), Close Encounters of the Third Kind (1977), and Raiders of the Lost Ark (1981). It also marked a star-making turn for Drew Barrymore, who was just 6 years old when she played Gertie, as well as Henry Thomas, who at the age of 9 won the title (human) role of Elliott, the boy who bridges worlds by forming a tight friendship with an alien. Here are 10 facts you may not know about the Academy Award-winning film, which grossed almost $800 million worldwide (roughly $2.3 billion today).

E.T. the Extra-Terrestrial and Steven Spielberg pose for a portrait in Los Angeles, California.
Credit: Aaron Rapoport/ Corbis Historical via Getty Images

Spielberg Came up With the Idea While Directing Another Movie

While working on his 1977 sci-fi classic Close Encounters of the Third Kind, the director wondered about another alien concept and played out what could happen if the creature didn’t go back to the mothership. Also around that time, he had been thinking of making a film exploring the impact of divorce on teens, since his own parents had gotten divorced when he was 15. Combining the two, he created “the most personal thing I’d done as a director,” he said.

One of the famous shots from the E.T. film.
Credit: AF archive/ Alamy Stock Photo

Everything Was Filmed Under a Code Name for Fear of Plagiarism

Spielberg was worried that his innovative plot might be ripped off quickly, so he had the production go to great lengths to keep everything under wraps while they filmed from September to December 1981. Actors had to read the script behind closed doors, and everyone on the set had to wear an ID card to ensure no unauthorized people snuck in for a peek. The entire project was filmed under the code name “A Boy’s Life.”

Close-up of the E.T. movie poster.
Credit: Blueee/ Alamy Stock Photo

One of the Movie’s Posters Was Inspired by Michelangelo

If the movie’s poster of the universe with a human hand reaching out looked familiar, it’s because the late artist John Alvin was inspired by Michelangelo’s “The Creation of Adam,” the centerpiece of his Sistine Chapel fresco masterpiece. Alvin’s daughter was the hand model for the image that was used to promote the film. The original artwork hung on writer and producer Bob Bendetson’s office wall until it was auctioned off for $394,000 in 2016.

Child looking at men with flashlights in a scene from the film 'E.T. The Extra-Terrestrial'.
Credit: Archive Photos/ Moviepix via Getty Images

Another Actor Was Almost Cast as Elliott

The on-screen chemistry between the child actors was crucial to the film. So before casting director Marci Liroff finalized her choices, she invited the finalists — including a boy she had homed in on to play Elliott — over to screenwriter Melissa Mathison’s home to play the role-playing game Dungeons & Dragons. “In about three minutes, it became very clear that nobody liked this little boy,” Liroff said. “I just think when you play a game sometimes, your true character comes out … He became very bossy. It just showed that he was not our kid. So I basically had to start over.”

Peter Coyote leaning over to talk to Henry Thomas in a scene from the film.
Credit: Archive Photos/ Moviepix via Getty Images

Thomas Nailed the Role With a Teary Audition

Soon after, the team called in Thomas, who had just appeared in a film called Raggedy Man, and flew him in from Texas for the audition. Liroff said they set up an improv-like scenario about NASA officials coming to take E.T. away. The young Thomas stepped into the character so deeply that he had tears in his eyes — which, in turn, led everyone else in the room to bawl as well. “He just became this little boy. He used, I think, his fear and anxiety, to really push further in the role and he moved us so deeply and so fully,” she said, calling it one of the most moving auditions she’d ever experienced.

A young Drew Barrymore in a scene of the E.T. film.
Credit: ScreenProd / Photononstop/ Alamy Stock Photo

Barrymore Was Cast After Being Turned Down for “Poltergeist”

Although Barrymore and Spielberg ended up having such a close relationship that he later became her godfather, she had first auditioned for the role of clairvoyant Carol Anne (“they’re heeeere!”) in his 1982 horror classic Poltergeist. Heather O’Rourke got the part, but the director turned to Barrymore for his next project, E.T. Barrymore now remembers her time on set fondly, thanks to a souvenir she took home: the red cowboy hat. “It is in [my daughters’] room somewhere and reminds me that I was 6 years old wearing that hat,” she told Domino. “I’m so glad I still have it.”

ET looking around door in a scene from the film.
Credit: Archive Photos/ Moviepix via Getty Images

Eighteen People Contributed to E.T.’s Voice

The primary voice behind the alien was an older woman named Pat Welsh, whose two-pack-a-day cigarette habit gave her voice its distinctive timbre. E.T.’s other sounds, like burping and snorting, were sourced from all over, including from the wife of the film’s sound effects creator and from Spielberg himself. Ultimately, 18 people took part in giving the fictional friend a voice, and at some points, even sea otters, raccoons, and horses were used.

Close-up shot of Thomas during a scene on set.
Credit: TCD/Prod.DB/ Alamy Stock Photo

Thomas Ate a Lot of Candy on Set

E.T.’s favorite treat, Reese’s Pieces — which became the snack of choice after Mars, Inc. passed on the use of M&M’s — also became Thomas’ obsession. “I made myself sick from eating them because we always had those two-pound bags lying around,” Thomas told CNN. “They were set dressing in Elliott’s room, so in between takes, I was constantly eating those things.”

A look at a scene from the E.T. film.
Credit: Allstar Picture Library Ltd./ Alamy Stock Photo

The Movie Was Shot From a Kid’s Eye Level

To emphasize the story from Elliott’s point of view, the entire movie up until the final act was shot from the eye level of a child. In fact, no adult’s face is shown before then, with one big exception: Elliott’s mom, Mary. “She was like one of the kids,” Spielberg told Entertainment Weekly.

Director Steven Spielberg and actor Harrison Ford on set of the E.T. film.
Credit: Pool GARCIA/URLI/ Gamma-Rapho via Getty Images

Harrison Ford Had a Cameo That Was Cut

Among the grown-ups who appeared without their faces shown was Harrison Ford — then at the peak of his Indiana Jones fame — playing the part of the school principal who scolds Elliott after the frog rescue scene. In the cut scene, Elliott’s chair starts to levitate until he hits the ceiling and crashes back down with a perfect landing. Ford’s character is oblivious to it all, too busy reprimanding the child to notice.

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.

Original photo by Trinity Mirror / Mirrorpix/ Alamy Stock Photo

Diana, Princess of Wales, was — and arguably, still is — one of the most famous women in the world. From the time she began dating Prince Charles in 1980 to her tragic death in 1997 at age 36, she was constantly photographed by paparazzi, surrounded by crowds, and the subject of daily headlines, whether truthful or not. Detractors and fans alike scrutinized almost every facet of her life. “It took a long time to understand why people were so interested in me,” Diana once said.

While it feels like every detail about Diana’s short but famous life is well-known — thanks to a constant stream of books, articles, TV, and film projects — some stories haven’t grabbed as much attention. Here are eight lesser-known facts about the People’s Princess.

Diana, the future wife of Prince Charles, on her first birthday at Park House, Sandringham.
Credit: Hulton Archive/ Hulton Royals Collection via Getty Images

Baby Diana Waited a Week for Her Name

When Diana was born on July 1, 1961, her parents had been hoping for a son. They already had two girls and had lost a baby boy who died shortly after his birth in January 1960. Her father, due to inherit an earldom, desperately wanted a male heir. Diana’s parents were so focused on having a boy that they hadn’t come up with any names in case their newborn turned out to be a girl. A week passed before the baby was named Diana Frances Spencer. Frances honored her mother, while Diana was a nod to the Spencer family tree.

Prince Charles and Princess Diana on board the Royal yacht Britannia.
Credit: Mirrorpix via Getty Images

Diana Had Her Own Royal Heritage

Before Diana married into the British royal family, she had her own royal connections via her ancestors; illegitimate offspring of Kings Charles II and James II had joined the aristocratic Spencer line. Thanks to her lineage, Diana actually had more English royal blood than Prince Charles, as the Windsors have strong Germanic ties. Charles’ great-grandfather, King George V, changed the family name from Saxe-Coburg-Gotha to Windsor in 1917, due to tensions with Germany during World War I.

Lady Diana Spencer, aged 19, at the Young England Kindergarten.
Credit: Tim Graham/ Tim Graham Photo Library via Getty Images

Diana Left School at 16

When Diana was a 15-year-old student in June 1977, she took her O level (ordinary level) exams. These standardized tests are supposed to demonstrate mastery of different subjects; in Diana’s case, English literature, English language, history, art, and geography. Unfortunately, she failed all these exams, perhaps due to anxiety or lack of studying. She then failed a second attempt at her O levels later that year.

After her O level failures, Diana had to leave school when she was 16. Even after becoming a princess, she remembered this setback with a degree of shame. A 1985 documentary recorded her telling a boy at a children’s home, “I never got any O levels: brain the size of a pea, I’ve got.”

Prince Charles & Lady Diana on their wedding day.
Credit: Express Newspapers/ Hulton Royals Collection via Getty Images

Diana Didn’t Say “Obey” in Her Marriage Vows

Diana was only 20 when she wed Prince Charles, who was 12 years her senior. Despite being so young, she was willing to buck royal tradition when it came to her 1981 wedding vows. Other royal brides, even Queen Elizabeth II, had stuck to traditional Church of England wording from 1662 and promised to “obey” their husbands (men were not required to say they would obey their wives). Diana instead opted for the church’s updated marriage service. At the altar, she told Charles she would “love him, comfort him, honor, and keep him, in sickness and in health.”

Though Diana never met future daughters-in-law Kate Middleton and Meghan Markle, they followed in her footsteps by omitting “obey” from their wedding ceremonies.

Princess Diana and Prince Charles dancing together in Melbourne Australia.
Credit: Mirrorpix via Getty Images

Diana Loved To Dance

Dance was a longtime passion of Diana’s. After years of ballet, tap, and ballroom lessons, she won a school dance competition in 1976. And she didn’t abandon dancing when she became a princess. She even asked ballet dancer Wayne Sleep for lessons in the early 1980s; his schedule couldn’t accommodate her, but he found a colleague to teach her.

After seeing a performance of the musical Cats, Diana and Charles visited Andrew Lloyd Webber backstage. According to Lloyd Webber’s memoir, Charles remarked on the dancing and Diana demonstrated some splits herself. At the White House in November 1985, First Lady Nancy Reagan prompted John Travolta to ask Diana to dance; they impressed onlookers as they shared the floor in one of the most famous photo ops of Diana’s life. In December 1985, Diana stunned Charles at the Royal Opera House — though not in a good way — with an onstage choreographed number with Sleep, set to Billy Joel’s “Uptown Girl” (an incident depicted on The Crown). Sleep later said, “She loved the freedom dancing gave her.”

Freddie Mercury of Queen performs on stage at Live Aid at Wembley Stadium.
Credit: Phil Dent/ Redferns via Getty Images

Diana Went Clubbing With Freddie Mercury

According to actress Cleo Rocos, in the late 1980s she, Diana, comedian Kenny Everett, and rock star Freddie Mercury once got together to watch reruns of The Golden Girls, the sound muted so they could spice up the dialogue themselves. Diana then wanted to join the group on their outing to a gay bar that night. Some were hesitant, but Mercury said, “Go on, let the girl have some fun.” Hidden by sunglasses and a cap, Diana was able to sneak into the bar. She remained unrecognized and, per Rocos, “She loved it.”

That wasn’t the only time Diana went out in disguise for a night on the town. Shortly before her future sister-in-law Sarah Ferguson (aka Fergie) wed Prince Andrew on July 23, 1986, Diana, Fergie, and others donned police outfits and staged a fake arrest in front of Buckingham Palace as a bachelorette party prank. They were picked up by a police van, but released once the officers realized who their passengers were. After this, Diana and the gang, still in disguise, headed to a nightclub. They left only when they were recognized.

Princess Diana at Balcony of Royal Enclosure.
Credit: Trinity Mirror / Mirrorpix/ Alamy Stock Photo

Diana Considered Starring in a Sequel to “The Bodyguard”

After the success of 1992’s The Bodyguard with Whitney Houston, Kevin Costner wanted to replicate the successful formula in a sequel that would feature his bodyguard character watching over a post-divorce Diana instead of a popular singer. And, of course, the pair would fall in love. With help from Fergie, Costner was able to speak to Diana about the project. She was interested enough to discuss her lack of acting experience, and also asked if there would be a “kissing scene.” However, Diana passed away before anything came to fruition.

Diana, Princess of Wales wearing protective body armour & a visor, visiting a landmine minefield.
Credit: Tim Graham/ Tim Graham Photo Library via Getty Images

Diana Walked Through a Cleared Minefield … Twice

Following her divorce from Prince Charles, Diana decided to bring attention to the dangers and devastation of landmines. In January 1997, she traveled to Angola to meet with victims of these mines. She famously walked through a cleared — but still dangerous, should any explosives have been missed or improperly deactivated — path in an active minefield.

But what some may not know is that when some photographers said they needed a second take, Diana didn’t object — she walked through the field once more because she realized how important those images would be. Pictures of Diana made it to the front pages of papers around the world. Mike Whitlam of the British Red Cross said, “It was Diana’s involvement in the anti-personnel landmines that made this appalling weapon of war a global issue and persuaded many countries to sign the Ottawa Convention. Her involvement made a real difference, not just to those people running the charities, but to those people who were helped by them.” In 1997, after Diana’s death, the Nobel Peace Prize was awarded to the International Campaign to Ban Landmines and its coordinator, Jody Williams.

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.

Original photo by Allstar Picture Library Ltd/ Alamy Stock Photo

Few first ladies are as recognizable as Jacqueline Kennedy Onassis, who rose to prominence alongside her husband President John F. Kennedy, and then became a preservationist, patron of the arts, and fashion icon. From her time in the White House — cut short by the tragic assassination of JFK as he sat next to her in a motorcade in Dallas — to her later life in New York City, she remained ever-present in the public eye, establishing a legacy as one of 20th-century America’s most admired figures. Here are six facts about Jackie Kennedy that highlight her contributions to the culture and history of the United States.

Jackie Kennedy posing with a flash camera.
Credit: ullstein bild Dtl via Getty Images

Her First Role in D.C. Was as a Journalist

In 1951, shortly after finishing her studies at George Washington University, Jackie (then known as Jacqueline Lee Bouvier) embarked on a journalism career. Working as the “Inquiring Camera Girl” and producing a daily column of the same name for the Washington Times-Herald, Bouvier roved the streets of D.C. with her camera in hand, taking pictures of people she encountered and interviewing them about pressing current affairs. She also covered major events of the time, including President Dwight Eisenhower’s first inauguration in January 1953 and the coronation of Queen Elizabeth II in June 1953, the latter of which was one of her final assignments.

While many of her columns featured everyday Americans, she asked questions of high-profile figures as well. One example was her brief interview with then-Vice President Richard Nixon for the April 21, 1953, edition of her column, in which she asked about his views regarding Senate pages. The column also included an answer on the same topic from then-Senator John F. Kennedy, whom she had met at a dinner party the year prior and would go on to marry five months later. Nixon, of course, went on to lose to JFK in the 1960 U.S. presidential election.

Jacqueline Kennedy on CBS White House Tour.
Credit: Bettmann via Getty Images

She Earned an Emmy for a Televised Tour of the White House

In 1941, long before she became a resident of the White House, Jackie toured the building with her mother and sister, and was dismayed by the lack of historical furnishings and informative pamphlets. Shortly after moving in with her husband in 1961, she made it her mission to overhaul the White House experience. As a young, attractive couple, John and Jackie did away with the archaic conventions of administrations past, and began cultivating a more comfortable environment. But it was Jackie’s physical renovation of the building that really stood out.

Enlisting the help of Americana collector Henry Francis du Pont, French designer Stéphane Boudin, and decorator Dorothy Parish, the first lady began work on a massive restoration project. Her goal was not merely to redecorate but to showcase the history of the mansion and the country itself. “It must be restored, and that has nothing to do with decoration,” she told Life magazine of her plans. “That is a question of scholarship.”

Within a mere two weeks, she had used all of the initial $50,000 budget to refurbish the private living quarters — and that was just the beginning. From outfitting the Blue Room with French furniture that President James Monroe had ordered back in 1818, to redesigning the Treaty Room in a Victorian style, Jackie left no corner of the White House untouched. Life featured the project in a September 1961 issue, but it found its biggest spotlight on February 14, 1962, when Mrs. Kennedy unveiled her stunning work on television. Accompanied by CBS News correspondent Charles Collingwood, Jackie led a guided tour of the building on CBS and NBC, drawing an estimated 80 million viewers and earning an honorary Emmy Award for the production.

Jacqueline Kennedy and Prime Minister Nehru of India in New Delhi.
Credit: Bettmann via Getty Images

She Spoke Multiple Languages

John F. Kennedy may be known for the line “Ich bin ein Berliner,” but Jackie was the true polyglot of the family. A lover of languages from a young age, Jackie helped John translate French research books into English when he needed to study up on politics in Southeast Asia, where the French had a heavy presence. But her linguistic prowess really shone through on the campaign trail.

When JFK campaigned for reelection to the U.S. Senate in 1958, Jackie gave her first campaign speech in French to a French-speaking audience in Massachusetts. As she continued to tour the country, she also showcased her familiarity with Italian, Polish, and Spanish. In fact, during the lead-up to the 1960 presidential election, Jackie starred in a minute-long campaign ad delivered entirely in Spanish.

Full length shot of the President and Mrs. John F. Kennedy.
Credit: Bettmann via Getty Images

She Coined the Term “Camelot” About the Kennedy Administration

Shortly after her husband’s funeral, Jackie welcomed Life magazine reporter Theodore H. White to the family compound in Hyannis Port, Massachusetts, in an effort to ensure JFK’s lasting legacy. During the interview, she coined a term that’s now synonymous with her husband’s administration: “Camelot,” a reference to both Arthurian legend and JFK’s favorite Broadway musical. In likening his presidency to the storied court, Jackie sought to establish her husband as an almost mythical figure. Quoting the musical, she stated, “Don’t let it be forgot, that once there was a spot, for one brief, shining moment that was known as Camelot.” She went on to add that while there would be other Presidents, there would “never be another Camelot again.” Editors at Life reportedly objected to the Camelot theme running throughout the interview, but Jackie was insistent on keeping it and even added her own edits to White’s notes.

Jacqueline Onassis in a department of Viking Press in NYC.
Credit: Bettmann via Getty Images

She Became a Successful Book Editor in New York City

Working with words proved to be one of Jackie’s strong suits, and she spent the final decades of her life in publishing. Having not had a paying job since 1953, she returned to the workforce as a book editor in 1975, after the death of her second husband, Aristotle Onassis. Tommy Guinzburg, the president of Viking Press, brought her in as a consulting editor working primarily on titles that aligned with her interests in history and art. The first title she edited was a work called Remember the Ladies, about the role of 18th-century American women.

In 1977, Viking controversially published a novel depicting a plot to assassinate a President based on JFK’s brother Ted Kennedy, which led to Jackie’s resignation. The next year, she became an associate editor at Doubleday, where she worked with pop singer Michael Jackson on his 1988 memoir, Moonwalk, among other titles. She continued to work in publishing until her passing in 1994.

Jacqueline Onassis attending a rally to save Grand Central Terminal.
Credit: WWD/ Penske Media via Getty Images

She Helped Save Grand Central Terminal From Being Demolished

Much like she did in preserving the history of the White House, Jackie played a key role in maintaining one of New York City’s most prominent landmarks. In the mid-1970s, developers hatched a plan to demolish part of Grand Central Terminal to build an office tower. The former first lady was among a group of notable New Yorkers who objected to the plan, and in 1975, she spoke at a press conference at Grand Central’s famed Oyster Bar restaurant to protest the destruction of the Beaux Arts-style structure. She and other preservationists worked to ensure the building’s protection, which was ultimately assured by the U.S. Supreme Court decision Penn Central Transportation Co. v. New York City. A plaque dedicated in 2014 at the entrance on 42nd Street and Park Avenue honors Jacqueline Kennedy Onassis for her role in saving the indelible Manhattan icon.

And Grand Central Terminal isn’t the only NYC landmark to commemorate her legacy. Located at the northern end of Central Park, where Jackie was known to jog, the Jacqueline Kennedy Onassis Reservoir pays homage to the former first lady’s contributions to the city. The artificial body of water, constructed between 1858 and 1862, spans 106 acres and was the largest human-made body of water in the world at the time of its creation.

Bennett Kleinman
Staff Writer

Bennett Kleinman is a New York City-based staff writer for Optimism Media, and previously contributed to television programs such as "Late Show With David Letterman" and "Impractical Jokers." Bennett is also a devoted New York Yankees and New Jersey Devils fan, and thinks plain seltzer is the best drink ever invented.

Original photo by IHervas/ iStock

The Atacama Desert in Chile, one of the world’s oldest deserts, is also one of the driest places on Earth. While parts of Antarctica have never recorded any precipitation, the Atacama’s rainfall statistics are still quite impressive. Until the early 1970s, some portions of the desert hadn’t seen rainfall for around 400 years. It’s rare to see heavy rainfall even now (though occasionally flash flooding can occur), and when it does, it’s a spectacular sight. The desert blooms, transforming into a beautiful carpet of wildflowers. But even when there isn’t any rain, the vivid colors of its mineral-rich rock and intense hues of its lagoons and salt flats make this a truly breathtaking place. Here are five things you might not know about the Atacama Desert.

View of NASA robot called Zoe, at the Atacama desert near Domeyko range.
Credit: AFP via Getty Images

NASA Uses the Desert to Mimic Mars

When NASA decided to look for life on Mars, it started right here on Earth. In fact, one of the Atacama’s most famous valleys, Valle de Marte, translates to Mars Valley due to its resemblance to the red planet. The rough rocky surface, characterized by bumpy nodules of rock salt or halite, is as close as you’ll get without setting off for space. The Atacama Rover Astrobiology Drilling Studies project, or ARADS for short, has conducted a series of experiments in the region, from growing trees to testing vehicles.

Unsurprisingly, the Atacama Desert’s otherworldly landscape has made it the choice of several filmmakers, too, including in the British series Space Odyssey: Voyage to the Planets and the 2008 James Bond film Quantum of Solace (though not as Mars).

Stars of the Atacama desert.
Credit: donwogdo/ iStock

It’s One of the Best Places in the World for Stargazing

This remote, high-altitude locale — reaching elevations of 13,000 feet — also happens to be one of the best on the planet to observe the night sky. On average, the Atacama Desert experiences 330 cloud-free nights every year, a fact not overlooked by the world’s top astronomers. If you’re used to stargazing from a town or city, the sight of so many stars glittering against a pitch-black sky is sure to be jaw-dropping. Stargazing tours depart from the main tourist town of San Pedro de Atacama to an array of nearby telescopes, where an astronomer guide will help you spot constellations, nebulae and even the rings around Saturn.

Scientists flock here too. On the Chajnantor Plateau, the Atacama Large Millimeter Array, or ALMA for short, boasts a collection of 66 radio antennas — making it the largest radio telescope in the world. Collectively, those antennas are capable of identifying an object the size of a golf ball from a distance of 9 miles.

The European Southern Observatory operates another two sites in Chile’s Atacama Desert, at La Silla and Paranal. It’s also building what’s known as the Extremely Large Telescope (ELT), which should be able to collect 100 million times more light than the human eye, enabling it to search for planets circling other stars and to deepen our understanding of black holes and galaxies.

Snow-covered volcanoes Pomerape and Parinacota, llamas (Lama glama).
Credit: imageBROKER/ Shutterstock

Though It’s One of the Driest Places on Earth, the Fauna Is Surprisingly Diverse

Visitors to the Atacama Desert are often taken aback at just how much wildlife can exist in what appears to be such an inhospitable place. But there are places where rainfall is sufficient to support vegetation and animals, including wild Andean foxes, which live off lizards and small rodents. The viscacha, a rodent related to the chinchilla, can also be seen. Herders tend flocks of llamas, bringing them down to mountain lakes to graze. Their wild cousins, vicuñas and guanacos, are harder to locate, but migrate towards water sources.

Birdlife is also abundant. Some of those dazzling high-altitude lakes and salt flats boast colorful flocks of flamingos. Where the desert meets the coast, Humboldt penguins nest in cliffs overlooking the ocean. Hummingbirds visit seasonally, drawn by nectar, seeds, and insects. When there’s sufficient water to bring out the blooms on the region’s flowers, you might even spot birds of prey, such as burrowing owls.

A panoramic view over the Atacama Desert valleys with humidity called "Camanchaca".
Credit: abriendomundo/ iStock

Water Is Harvested From Fog to Grow Crops — And Even Brew Beer

Around a million people live in the Atacama Desert, many of them making a living from copper or lithium mining or from tourism. But while annual rainfall averages less than 1 millimeter, some residents manage to grow crops by harvesting water from fog. Near the coast, parts of the Atacama Desert are blanketed by a thick fog, known locally as the “camanchaca,” which rolls in off the Pacific Ocean. In the 1950s, a scientist named Carlos Espinosa Arancibia came up with the idea of a fog catcher: essentially a mesh net that captures droplets from the fog, which collect and drip down the netting into a channel underneath. From there, the moisture could be piped to where it was needed and used to irrigate crops. Since then, research has continued, and at the Atrapaniebla (Fog Catcher) Brewery in Peña Blanca, this precious water has even been used to make beer. The owners claim it is the only beer in the world produced this way.

Skull with a coca leaf on a cemetery of mummies, Chauchilla, near Nasca, Atacama Desert.
Credit: imageBROKER/ Alamy Stock Photo

It’s Home to Mummies That Are Older Than Egypt’s

If you thought the mummies in Egypt’s ancient pyramids were the oldest on the planet, think again. The oldest Chinchorro mummy, the Acha man, dates back to approximately 7020 BCE, several thousand years before the first of the Egyptian mummies.

Around a third of these mummies, like Acha man, were mummified naturally, with the dry desert climate helping to preserve the bodies. Later, embalmers replaced internal organs with animal hair and created a clay mask in place of what would have been skin and flesh. Unusually, the Chinchorro people didn’t reserve mummification for royalty, nor did they favor one sex over the other. Archaeologists have recovered 282 Chinchorro mummies from the Atacama since the first discovery just over a century ago.

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.

Original photo by PA Images /Alamy Stock Photo

Gone are the soda jerk and the milkman. The telegraphists and the bowling alley pinsetters are relics of the past. The haberdasher, town crier, and lowly VCR repairman have all gone the way of the lamplighter. And that’s just during the 20th century! Due to advances in technology and the evolution of society, many occupations that were once considered essential no longer exist in today’s job market. Here’s a look at 13 common professions that have disappeared over the past few centuries.

Close-up of a royal toilet.
Credit: Goss Images/ Alamy Stock Photo

Groom of the Stool

Starting back in Tudor times, the Groom of the Stool handled all of the English king’s toilet-related needs. Whenever the monarch had to evacuate his bowels, the groom would accompany him to the toilet — a bowl of water and towel in tow. While the job might sound (literally) crappy, it was a powerful position. Because of the intimacy involved, it was common for the Groom of the Stool to become one of the king’s closest confidantes. The role wasn’t abolished until 1901.

Female "human computers" perform mathmatics calculations for NASA.
Credit: Smith Collection/Gado/ Archive Photos via Getty Images

Computer

For centuries, “computer” was a job description for flesh-and-blood humans. (The word literally means “a person who computes.”) During the Enlightenment, freelance computers helped scientists double-check their math. Through the 1800s, human computers analyzed data and did calculations at astronomical observatories, helping publish nautical almanacs and predicting the passing of comets. At the turn of the 20th century, human computers in the U.S. government — usually women — did calculations that eventually helped launch people into space. Their story is now immortalized in the 2016 film Hidden Figures.

A hurrier transporting coal in a corf.
Credit: Universal History Archive/ Universal Images Group via Getty Images

Hurrier

Coal mines were never fun places to work. But in the early 1800s, some coal chutes and tunnels were only 2 feet high. Coal was transported up these tunnels in baskets called “corfs,” which were pushed and pulled along a system of rails. The people doing the pushing and pulling? They were hurriers — small boys or girls, sometimes as young as 4 years old. Because the tunnels were so tight, many hurriers moved the coal on all fours, pushing the corf with their heads (causing some children to develop bald spots).

View of a Women's Health Clinic.
Credit: Mark McMahon/ Corbis Historical via Getty Images

Uinyo

During the early days of Korea’s Joseon dynasty, it was taboo for women to visit male doctors. (Confucian principles — and a spoonful of social shame — demanded strict segregation of the sexes.) Predictably, many women died. So, in the early 1400s, the Korean government allowed some women to practice as female-only doctors, calling them uinyo. Uinyo specialized in giving medical care to other women at state-sponsored health clinics.

Coffee sniffers disturb a coffee party.
Credit: Bildagentur-online/ Universal Images Group via Getty Images

Coffee Sniffer

Back in the 1780s, Frederick II of Prussia didn’t like coffee — he considered it a foreign good that didn’t help the local economy. So he tried curbing coffee consumption by imposing an exorbitant 150% tax on it. Citizens revolted. A black market for unroasted, smuggled beans boomed. When Frederick realized this underground market for coffee was hurting his bottom line, he hired a league of 400 “coffee sniffers” to, quite literally, sniff out the smuggled beans.

A view of the interior of St Paul's Cathedral, 18th century.
Credit: Hulton Archive via Getty Images

Sluggard Waker

Churchgoers know that it can sometimes be difficult to keep your eyelids open during a dull sermon. Back in 18th-century England, this problem was remedied by the sluggard waker, a man whose sole job was to prowl the pews and wake sleepy parishioners — sometimes by hitting them over the head with a brass-tipped stick.

Vacuum pump for removing night soil from cesspools.
Credit: Universal History Archive/ Universal Images Group

Night Soil Men

In the 19th century, most cities did not have municipal sewer systems. Instead, people relied on outhouses and privies. These, however, were not bottomless pits — they had to be routinely emptied. The person responsible for this unpleasant task was the night soil man. So named because he usually worked under the cover of darkness, the night soil man emptied privies with long-handled buckets, loaded the waste onto carts, and hauled the makeshift fertilizer to local farms (though, more often, it was simply dumped into the nearest waterway).

Aerial view of dried herbs blend and a pile of scattered flowers.
Credit: Anna Ok/ Shutterstock

Herb Strewers

Before the invention of the flush toilet in the 18th century, cities often smelled less than desirable. But if you were wealthy enough in the 17th century, you could hire an herb strewer to keep the aroma fresh. King George III, for instance, employed an herb strewer named Mary Rayner, a woman who spent more than 40 years scattering flowers, herbs, and other natural fragrances throughout the royal residence to make it smell welcoming; popular plants included lavender, roses, chamomile, sweet yarrow, basil, marjoram, and violets.

A crew of two soldiers operate an acoustic listening device.
Credit: PhotoQuest/ Archive Photos via Getty Images

Aircraft Listeners

The first practical demonstration of using radar for aircraft detection occurred in the 1930s. By then, airplanes had already been taking flight for more than three decades. Consequently, during times of war, soldiers had to deploy clever methods to find enemy aircraft. During World War I, aircraft listeners used war tubas, which, according to CNN, were “essentially large horns connected to a stethoscope.” Other aircraft listeners used acoustic mirrors, large concrete dishes that amplified sound coming from above.

A man re-creates the role of a knocker-upper, one of the most notable extinct professions.
Credit: PA Images /Alamy Stock Photo

Knocker-Uppers

Before the advent of the alarm clock, industrial-era workers who needed help waking up in time for work would hire knocker-uppers. These hardy souls would rise in the early hours of the day and patrol the streets with sticks, tapping on their clients’ bedroom windows each morning. Some knocker-uppers, like Mary Smith, were not fans of the stick method: She roused the local sleepyheads by shooting peas at their window panes.

A linkboy in the 18th century.
Credit: Chronicle/ Alamy Stock Photo

Linkboys

In William Shakespeare’s Henry IV, Falstaff says, “Thou hast saved me a thousand marks in links and torches, walking with thee in the night betwixt tavern and tavern.” Turns out, that’s a pretty accurate description of a linkboy. Typically young, low-class males, linkboys escorted pedestrians through dark city streets with a torch. The job eventually became obsolete after cities installed streetlamps. (Incidentally, the phrase “can’t hold a candle to…” was likely a reference to linkboys; anybody who couldn’t “hold a candle” better than a low-class linkboy was viewed as extremely inferior.)

A man operates a linotype machine.
Credit: Bettmann via Getty Images

Linotype Operators

Starting in the late 19th century, lines of text in newspapers and magazines were often created with a linotype machine. The linotype machine was revolutionary for its time. Before the machine, each letter of an article was individually set by hand into a mold for print. The linotype eliminated this process by having operators type each line on a special 90-key keyboard, casting a “line o’ type” in lead; that lead slug was then used to print the text. This technology was used for almost 100 years, eventually tapering off in the 1960s and ’70s.

View of water carriers in Paris.
Credit: Heritage Images/ Hulton Archive via Getty Images

Water Carriers

Water carriers still exist, but theirs is an endangered profession. These workers have been around for millennia, hauling water from rivers and wells to people’s homes. Some used buckets hanging from a yoke or leather sacks, while others lugged large tankards over their shoulders. And while the water carrier has mostly been replaced by modern plumbing, some places still commemorate the once-vital profession. In Hamburg, Germany, a water carrier named Hans Hummel is celebrated as the local mascot, with more than 100 statues of his likeness sprinkled throughout the city.

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.

Original photo by Sensay/ Shutterstock

Think back to your school days: Are you nostalgic for flipping through a dusty library card catalog or clacking away on a typewriter? Some subjects you remember from those days are probably things of the past, although the finer points of how schools have changed might surprise you. These six subjects are either fading from U.S. high school curriculums or fundamentally changing.

Page of shorthand notes, with sharpened pencil.
Credit: Robyn Mackenzie/ Shutterstock

Shorthand

Shorthand alphabets help people write things down more quickly by hand, making them valuable for recording court testimony, legislative proceedings, or interviews — not to mention reading those notes after they were taken. By the early 20th century, shorthand was taught in public schools. Yet in the ensuing decades, more efficient ways to take notes dominated, like audio recording and typing. Shorthand was mostly phased out of schools by the 1990s.

Ancient Book from the 18th century written in Latin.
Credit: JCVStock/ Shutterstock

Latin

Today, around 8% of U.S. high schools have some sort of Latin language class, but it used to be standard practice, especially when many colleges required it for admission. High school Latin education took a hit during World War II, when liberal arts education became less popular. It continued to decline slowly in the 1960s and 1970s in favor of more immediately practical languages, such as French and Spanish.

Male students in a woodwork class.
Credit: Phovoir/ Shutterstock

Shop

Shop class usually refers to hands-on education in building and fixing — as in woodworking, metalworking, or automotive repair. Now, these classes would fall under the umbrella of career technical education, or CTE. CTE credits took a nosedive between 1990 and 2009, with manufacturing being among the hardest hit. Many blame the focus on standardized tests for the decline (since resources are directed to academic subjects like reading and math rather than vocational classes). There is now a renewed interest in CTE classes, but that includes vocational training in fields like health care and communications, not just traditional “shop” classes.

Students in aprons cooking during a home economics class.
Credit: Juice Flair/ Shutterstock

Home Economics

Home ec developed a reputation for taking in high school girls and making them into perfect homemakers, but it was originally designed to demonstrate the science behind domestic skills and elevate what was considered “women’s work.” Over the years, the topics were increasingly devalued, and some unfortunate teaching tools emerged — like using real human “practice babies.” In schools that still have classes on domestic skills, they’re usually rebranded as family and consumer sciences. Even those are on the decline; enrollment dropped 38% between 2002 and 2012.

Female nutritionist with food pyramid chart at table.
Credit: New Africa/ Shutterstock

The Food Pyramid and the Food Wheel

Depending on when you attended school — and what the USDA (United States Department of Agriculture) was recommending at the time — you may have learned about nutrition from a cleanly divided triangle or circle outlining several food groups. The food wheel, with differently sized wedges recommending how much to eat from each food group, came out in the 1980s. In the early 1990s, it was replaced by the food pyramid, which placed foods with a higher recommended intake at the bottom and those with a lower recommended intake at the top. That was briefly replaced in the mid-2000s by MyPyramid, a triangle with vertical bands and a staircase running up one side to represent physical activity. Since 2011, the go-to infographic has been MyPlate, which shows a place setting with simplified food categories.

Three boys writing cursive on a chalk board in school.
Credit: PeopleImages.com – Yuri A/ Shutterstock

Cursive (Kind Of)

Writing in cursive used to be a standard part of school curriculums, but it started fading from classrooms in the early 2010s, when states began adopting the Common Core State Standards, which didn’t require cursive. A pro-cursive backlash came soon after; by 2016, 14 states required that schools teach cursive, and that number is now more than 20. While you probably don’t see cursive in as many classrooms as you did a couple of decades ago, the reports of cursive’s death are greatly exaggerated.

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.

Original photo by DoraDalton/ iStock

The modern airplane is an airborne chariot of aluminum and composites. We’re so used to these impressive machines ferrying passengers, mail, and cargo that it can be difficult to fathom they’re barely more than a century old. And while airplanes need to operate in some of the most extreme temperatures on or near Earth (it’s around -70 degrees Fahrenheit at an airliner’s cruising altitude of roughly 40,000 feet), planes are by far the safest way to travel. Here are six facts that’ll make you marvel at airplanes, and explain why they’re one of the most impressive technological achievements in human history.

Serving a fish dish as a business class food on board a plane.
Credit: gerenme/ iStock

Flying on Planes Affects Your Sense of Taste

Humans aren’t designed to cruise comfortably at around 40,000 feet, and some strange biological phenomena can occur when airplanes take our bodies out of their terrestrial environment. Studies have found that our ability to taste sweet and salty foods diminishes significantly when we fly. That’s because at high altitudes, the air inside an airplane cabin hovers at around 12% humidity — drier than most deserts on Earth. And since taste is closely entwined with our sense of smell, the extremely dry air interferes with our odor receptors and makes food taste blander.

Surprisingly, even the constant drone of airplane engines can make some foods taste less intense, according to a separate study on the “effects of background noise on food perception.” Long story short, an airplane does not bring out the inner gourmand, so it may be best to stick to those small packets of (highly salted) peanuts.

Passenger airplane traveling through sky against a stormy bolt cloudscape.
Credit: photoncatcher/ iStock

Planes Are Struck by Lightning One or Two Times a Year

Lightning seems like one of the obvious hazards of pointing a plane nose-first into a storm cloud — and the fear is at least somewhat warranted. In an average year, a plane experiences one or two lightning strikes (though geographic location is an important factor). However, commercial aircraft are designed to handle this electrical load, so passengers experience little more than the bright flash of the lightning itself. Many airplanes are made of aluminum, an excellent conductor that directs the current of a lightning strike along the skin and toward the tail. Airplanes built with composite materials include conductive fibers that pull off the same trick. Other grounding and shielding technologies also help protect sensitive wiring and instruments on board. If a lightning strike does occur, the aircraft is thoroughly inspected on the ground. This can cause significant delays, but there hasn’t been a lightning-related accident on a commercial aircraft in decades.

Close-up of a plane turbine engine.
Credit: katueng/ iStock

Planes Can Fly With Just One Engine

Your average airliner can take a significant amount of punishment and keep on flying. For example, commercial airliners are designed so that if one engine fails entirely, whether from a bird strike or a technical snafu, the plane can continue flying on its other engine. This isn’t just a safety feature agreed upon by the world’s aircraft manufacturers; it’s the law. Although a plane likely wouldn’t have the thrust necessary to take off on a single engine, it can fly and land without a problem.

Vapor trails from airplanes in mid-air against a blue sky.
Credit: oliale72/ iStock

Contrails Come From Water Vapor

Glance at the sky on a clear day and you’ll likely spot a contrail. This ice cloud forms when water vapor from an aircraft’s engine exhaust, together with moisture in the surrounding air, condenses and freezes around tiny exhaust particles. Contrails were first observed during high-altitude flights in the 1920s, and they became a particular nuisance in World War II, when they effectively gave away a bomber’s position.

Contrails are categorized into three groups (short-lived, persistent, and persistent spreading), essentially defined by how long the contrail lingers in the air. Contrails have also been the subject of an erroneous yet persistent conspiracy theory around “chemtrails,” the belief that these clouds are actually some sort of chemical or biological agent deployed over the Earth for some undefined, nefarious purpose. Fortunately, water vapor is generally pretty benign.

Close-up of a flight recorder, known as a black box, used in aircraft.
Credit: narvikk/ iStock

Black Boxes Are Not Black

When an aircraft crashes for any reason, investigators and ground crews will hunt for what’s called a “black box.” The term refers to an airplane’s two flight recorders (the flight data recorder and the cockpit voice recorder), which together contain crucial data: radio transmissions, the pilots’ voices, aircraft sounds, and other important information. However, a “black box” isn’t black (and sometimes isn’t even shaped like a box). Instead, “black boxes” are usually bright orange so they’re easy to find after a crash.

So where does the misnomer come from? No one knows for sure, but one guess dates back to 1939, when aviation engineer François Hussenot developed a way to record flight information onto photographic film. Because the film was sensitive to light, the recording box needed to be pitch black inside — hence the name.

Passenger airplane flying above clouds during sunset.
Credit: spooh/ iStock

Flying at 33,000 Feet, Planes Are About 10% of the Way to Space

Cruising up to around 8 miles above sea level, planes easily clear even the most impressively high natural features on Earth. At a low-end cruising altitude of around 33,000 feet, an airliner still tops the roughly 29,000-foot summit of Mount Everest by a comfortable 4,000-foot margin.

Space begins at the Kármán line — the commonly used boundary between Earth’s atmosphere and outer space — at about 62 miles up, so an aircraft has a long way to go (not that it’d have any hope of getting there). The highest-altitude flight of any aircraft, not including rocket-powered spacecraft, belongs to a Soviet MiG E-266M that climbed to 123,523 feet above sea level in 1977. While roughly 23 miles up is an impressive feat, it’s not even halfway to the outer reaches of space.
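The headline figure checks out with a quick back-of-the-envelope conversion; here is a rough sketch using the approximate, rounded altitudes cited above, not precise flight data:

$$
33{,}000 \text{ ft} \div 5{,}280 \text{ ft/mi} \approx 6.3 \text{ mi}, \qquad 6.3 \text{ mi} \div 62 \text{ mi} \approx 10\%
$$

The same arithmetic puts the MiG’s 123,523-foot record at roughly 23.4 miles, or about 38% of the way to the Kármán line.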

Darren Orf
Writer

Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.

Original photo by De'Andre Bush/ Unsplash

For nearly a century, the Hollywood sign has been an emblem of the film industry — a glittering embodiment of the L.A. dream, emblazoned high atop Mount Lee in Griffith Park. An iconic photo-op spot in the City of Angels, the Hollywood sign represents the fame, fortune, and glamor many seek out in the entertainment and film industry. A symbol that looms this large over a town is bound to break a few expectations, so here are five things you probably didn’t know about one of California’s most famous landmarks.

Sign advertises the opening of the Hollywoodland housing development in the hills on Mulholland.
Credit: Underwood Archives/ Archive Photos via Getty Images

The Sign Was Supposed to Be Temporary

The Hollywood sign wasn’t always the legendary landmark it is today. When it was first built, it was nothing more than a real estate marketing ploy. In 1923, Harry Chandler, publisher of the Los Angeles Times, was a major investor in a housing development called Hollywoodland. To advertise it, Chandler spent $21,000 erecting a “Hollywoodland” sign on Mount Lee. The sign was meant to be akin to a billboard, a newly popular form of advertising at the time, with one major exception — Chandler used around 4,000 20-watt light bulbs to illuminate the sign at night. The electric advertisement was considered a marvel, especially since the lights were timed, with the words “HOLLY,” “WOOD,” and “LAND” lit in consecutive order. In 1949, the L.A. Parks Department decided to remove the word “land” so the sign would better represent the district as a whole.

Aerial view of Griffith Observatory with the Hollywood Sign seen in the distance.
Credit: simonkr/ iStock

The Origin Behind the Name Is Murky

While Hollywood is known to be one of the most star-studded locations on the planet, it wasn’t always so. The town for which the Hollywood sign is named has surprisingly humble origins. In 1883, Kansas real estate developer Harvey Henry Wilcox and his wife, Deida, purchased 150 acres of land in California’s Cahuenga Valley. Since he preferred the quiet life, Wilcox intended to create a community of other like-minded people who practiced temperance. He and his wife dubbed their settlement “Hollywood” and the name stuck, although there is much speculation on how the moniker came to be. Some believe the name was inspired by Christmas holly, which grew in abundance in the nearby mountains. Others claim that Deida borrowed the idea from a friend who lived in Holly Canyon. A final postulation claims that Wilcox misheard the words “holly wood” when a Scottish immigrant told him he was “hauling wood.” Since there’s no official evidence behind any of these origin stories, we may never know the truth.

Hollywood sign changed to Hollyweed on January 01, 2017 in Los Angeles, California.
Credit: Axelle/Bauer-Griffin/ GC Images via Getty Images

It’s a Magnet for Mischief

The Hollywood sign has been the target of numerous pranks and acts of vandalism over the years. One of the most recent occurred on January 1, 2017, when an unknown man used large tarps to change the spelling from “Hollywood” to “Hollyweed.” But this wasn’t the first time the sign was given a pro-pot makeover. In fact, the stunt was a copycat — the original act was carried out exactly 41 years earlier. On January 1, 1976, a small group of young college activists changed the sign to “Hollyweed” to celebrate California’s recently relaxed marijuana laws. The same group also changed the sign to “Ollywood” in 1987, in reference to Lieutenant Colonel Oliver L. North’s testimony during the Iran-Contra scandal. Surveillance cameras have since been installed to protect the sign, although they failed to stop the 2017 prankster, who was never caught.

Portrait of Peg Entwistle.
Credit: ARCHIVIO GBB/ Alamy Stock Photo

It Has a Resident Ghost

Within 10 years of the Hollywood sign’s construction, it became the site of a tragic death. Peg Entwistle was an actress who began her career on the Broadway stage in the 1920s and was idolized by a young Bette Davis, who saw her perform in The Wild Duck. Despite her early success, Entwistle suffered many setbacks, including a failed marriage to a fellow actor and seeing much of her role cut from her first major picture, Thirteen Women. After failing to receive a contract renewal from RKO Pictures, Entwistle used a maintenance ladder to climb to the top of the sign’s “H” before jumping to her death in 1932. She was only 24 years old. Since then, numerous visitors have reported sightings of a blonde woman dressed in 1930s garb. The ghostly figure of the failed actress is also said to be accompanied by the overwhelming scent of gardenias, reportedly the young woman’s preferred fragrance.

Hugh Hefner poses for a photo on Nov 17 2005 in Los Angeles, California.
Credit: Dan Tuffs/ Getty Images Entertainment via Getty Images

It Was Saved by Hugh Hefner Twice

Although the Hollywood sign had been restored in the postwar years, it had badly deteriorated by the late 1970s. The Hollywood Chamber of Commerce estimated that $25,000 was needed to restore the sign to its former glory; otherwise, the “eyesore” would be removed. In an effort to save the iconic sign, Hugh Hefner hosted a charity event at the Playboy Mansion and auctioned off the sign’s individual letters to celebrities willing to pay a hefty price for a piece of L.A. history. In the end, enough money was raised to rebuild the sign from scratch. Years later, in 2010, the Hollywood sign was once again in danger, as real estate developers were attempting to purchase the surrounding land for development. To thwart the venture, Hefner, the original Playboy himself, donated $900,000 to a conservation group trying to save the land. Thanks to his last-ditch effort, the acreage surrounding the Hollywood sign is now protected parkland, filled with walking trails open to the public.

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.