French cuisine is often considered the epitome of fine dining, and that could be because French cooks are said to have launched the modern restaurant — and even invented the word “restaurant” itself. Many etymologists and historians attribute the origins of both to A. Boulanger, a Parisian soup vendor who set up shop in 1765. Boulanger peddled bouillons restaurants — so-called restorative meat and vegetable broths, from the French restaurer, meaning “to restore or refresh” — an act that wasn’t entirely revolutionary, since other cooks were selling healing soups from “health houses” around the same time. But Boulanger’s approach was different because he also offered a menu of other meals at a time when most taverns and vendors served just one option, dictated by the chef. Boulanger’s concept of seating guests and allowing them to choose their desired meal exploded in popularity after the French Revolution at the end of the 18th century, as kitchen workers who formerly served aristocratic households set up their own dining rooms or joined new eateries. By 1804, French diners could choose from more than 500 restaurants across the country.
Waffle House doesn’t just sling breakfast; the 24-hour diner has also pressed its own jukebox records since the mid-1980s. Restaurant-themed songs across genres (such as gospel, bluegrass, and R&B) are released under the Waffle Records label and exclusively played at the chain’s diners.
Some historians disagree with this long-told tale of the restaurant’s origin, suggesting there isn’t much evidence by way of historical documentation to prove Boulanger was a real person. And others believe attributing the public dining room to French ingenuity isn’t wholly accurate, since humans have been offering up their cooking talent to the hungry masses for millennia. Take, for example, how Chinese chefs in major cities such as Kaifeng and Hangzhou customized menus to appeal to traveling businessmen looking for familiar meals nearly 700 years before France’s iteration of the restaurant. Or the excavated ruins at Pompeii dating to 79 CE that include ornately decorated food stalls called thermopolia, where hungry Romans could choose from a variety of ready-to-eat dishes. Though the names have differed, smart humans have been selling snacks to each other for a long, long time.
Founded in 1921, White Castle is America’s oldest fast-food burger restaurant.
The first American diners were mobile.
Most of the diners Americans patronize today are stationary spots, but the country’s earliest greasy spoons were more like modern food trucks. First called “night lunch wagons” by Rhode Island inventor Walter Scott in 1872, the horse-drawn diners served hot meals to patrons who were often late-shift workers or partiers looking for food long after other restaurants had closed. Soon after, ingenious restaurateurs developed rolling eateries complete with seats, some providing both a meal and transportation to hungry diners looking to travel across town. By the 1890s, trains began incorporating the concept (ticket holders were previously responsible for supplying their own meals), debuting dining cars that fed patrons on long journeys across the growing West. The original horse-drawn lunch wagons, however, quickly fell out of style; maintenance costs, city bans, and competition from brick-and-mortar restaurants pushed many proprietors out of business by the early 1900s. Those that survived swapped their carts for permanent locations that often resembled their original carts or were made from modified railroad dining cars — an iconic look that remains today.
Nicole Garner Meeker
Writer
Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.
Whales are some of the most majestic creatures on the planet. The blue whale is the largest animal to ever exist, the bowhead whale can live for more than 200 years, and a few humpback whales saved the future of humanity in Star Trek IV: The Voyage Home. In fact, these creatures are so amazing that even their earwax is a vital tool — at least for helping scientists understand the mysterious mammals themselves. Take, for instance, the 10-inch-long earplug of an adult blue whale (Balaenoptera musculus). Cetologists — scientists who study whales — can cut into a plug of earwax and learn the whale’s age, much as dendrochronologists do with tree rings. Earwax from blue whales (and other large whales such as humpbacks) forms rings, known as “laminae,” every six months, which give scientists a snapshot of the creature’s entire life through cycles of summer feeding and winter migration.
Fifty million years ago, the early ancestor of all cetaceans walked on four legs. This goat-like mammal, dubbed Pakicetus, lived on riverbanks in India and Pakistan. Slowly, its descendants became more comfortable in water until they eventually evolved into today’s whales.
And these waxy earplugs can tell scientists more than just a whale’s age. Earplugs also capture a chronological “chemical biography” that shows what chemicals and pollutants were found in the animal’s body throughout its life, including levels of the stress hormone cortisol. Scientists have compared whale cortisol levels with whaling data, using records from 1870 to 2016, and found an unmistakable positive correlation. The only discrepancy was during World War II, when whale stress levels increased despite a decrease in whaling overall (scientists assume increased military activity was the likely culprit). Despite a near-international moratorium on whaling in the 1980s, whales still exhibit high cortisol levels thanks to increased ship noise, climate change, and other factors. But with the help of whale earwax, scientists can at least continue to examine the health of these majestic beasts and the oceans they inhabit.
The scientific name for earwax is actually cerumen.
Using Q-tips to clean your ears is a bad idea.
If you see or feel excess wax in your ear, you should grab a Q-tip, right? Not so fast. Earwax actually plays an important role in auditory health. Produced by the skin in the ear canal, earwax prevents dust and other debris from damaging deeper structures such as the eardrum. However, an excess of earwax can cause “impaction,” which produces symptoms including irritation, hearing loss, and even dizziness. But removing earwax buildup with a cotton swab is not recommended. Otolaryngologists (doctors who treat the ears, nose, throat, and related areas of the head and neck) warn that cotton swabs can actually exacerbate impaction by pushing wax toward the eardrum, where it can harden. If your ears do become impacted, see your local ENT or primary care physician — but don’t toss those Q-tips. You can still use them for cleaning your outer ear or other hard-to-reach spots like faucets, computer keyboards, or car interiors.
Darren Orf
Writer
Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.
Three Midwestern babies born in 1958 grew up to change the course of pop music and culture forever. Prince Rogers Nelson was born on June 7 in Minneapolis, Minnesota; Madonna Louise Ciccone was born on August 16 in Bay City, Michigan; and Michael Joseph Jackson was born on August 29 in Gary, Indiana. Each of the future genre-busting superstars was welcomed into a big family: Eventually, Prince had eight siblings, Madonna had seven, and Jackson had 10. Jackson was the first to begin his performing career, at age 5. Six years later, he made his television debut as the youngest member of the Jackson Five when the group sang “It’s Your Thing” at the Miss Black America Pageant. Prince debuted at 21 on American Bandstand, singing “I Wanna Be Your Lover” and “Why You Wanna Treat Me So Bad?”; Madonna made one of her earliest TV appearances on the same show four years later, performing “Holiday,” and telling Dick Clark of her wish to “rule the world.”
In addition to exhilarating music, era-defining music videos, and various business ventures, all three made films. The precocious Jackson was the first of the three to make it to the big screen, appearing in 1978’s The Wiz, a straight-from-Broadway reimagining of The Wizard of Oz featuring an all-Black cast (Jackson played the Scarecrow). In 1985 — the year Prince and Madonna briefly dated — Prince won an Oscar for Best Music, Original Song Score for his first film, the semi-autobiographical Purple Rain. Meanwhile, shortly after Madonna (in a Marilyn Monroe-inspired look) took Jackson as her plus-one to the 1991 Academy Awards, she shot A League of Their Own, a tribute to the all-female baseball teams that entertained fans during World War II. (The so-called Queen of Reinvention has also appeared in dozens of other films over the years.)
Madonna holds the world record for the most costume changes by a character in a film.
Costume designer Penny Rose dressed Madonna in 85 outfits when she portrayed Argentinian actress, activist, and First Lady Eva Perón in the 1996 film Evita. Those clothes were accessorized with 45 pairs of shoes, 56 pairs of earrings, and 39 hats. In a savvy branding tie-in, Bergdorf Goodman began selling Perón-inspired hairpieces, and Madonna — pregnant with her first child, Lourdes Leon, during filming — even channeled her alter-ego for the cover of Vogue. Madonna won a Golden Globe for her performance in director Alan Parker’s musical biopic (an adaptation of the Andrew Lloyd Webber musical), and Rose received a BAFTA nomination for her costumes.
Interesting Facts
Editorial
Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.
By the time most of us reach age 20 or so, the bones in our body are pretty much done growing. The growth plates that cause us to put on inches in our youth are now hardened bone, and in fact, adults tend to drop an inch or two in height as worn-out cartilage causes our spines to shrink over time. However, there are a few bones that buck this biological trend. Skulls, for example, never fully stop growing, and the bones also shift as we age. A 2008 study from Duke University determined that as we grow older, the forehead moves forward, while cheekbones tend to move backward. As the skull tilts forward, overlying skin droops and sags.
The largest skull ever discovered belongs to a T. rex.
The world’s largest skull does belong to a dinosaur — just not a T. rex. The record-breaking skull, standing some 10 feet, 6 inches tall, belongs to a Cretaceous Period Pentaceratops, currently on display at the Sam Noble Oklahoma Museum of Natural History.
The skull isn’t the only bone that has a positive correlation with age. Human hips also continue to widen as the decades pass, meaning those extra inches aren’t only due to a loss of youthful metabolism. In 2011, researchers from the University of North Carolina School of Medicine discovered that hips continue to grow well into our 70s, and found that an average 79-year-old’s pelvic width was 1 inch wider than an average 20-year-old’s. So while it’s mostly true that humans stop growing after the age of 20, nature always likes to throw in a few exceptions to the rule.
The pirate flag bearing a skull and crossbones is famously known as the Jolly Roger.
The idea that your ears and nose never stop growing is a myth.
It’s a common misconception that our ears and noses continue to grow throughout our lifetime — though they sometimes do appear to take on almost cartoonish proportions toward the end of our lives. So what’s going on here? Turns out, it’s not really the cartilage in our nose and ears that’s to blame. Instead, the culprit is decades of experiencing gravity. This constant gravitational pull — along with general degradation due to age — causes the collagen and elastic fibers in our ears and nose to droop and elongate. The surrounding skin supporting these structures also breaks down and droops over time while simultaneously losing volume. Studies have shown that the average human ear elongates approximately 0.22 millimeters a year. It’s a process that’s so reliable, ears can actually be used as a tool for determining a person’s age.
Darren Orf
Writer
Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.
If Tater Tots are your favorite fast-food side, you have the ingenuity of two brothers — Golden and Francis Nephi Grigg — to thank. However, when the pair invented the crispy potato composites in the 1950s, they didn’t set out to change snack food history. Instead, their potato creation came from a quest to reduce the amount of food waste produced at their frozen foods plant.
Before becoming successful spud salesmen, Golden and Francis sold frozen corn. Around 1949, they decided to diversify into other fruits and vegetables, and converted a factory in Ontario, Oregon (on the border with Idaho), into a potato-processing plant they were later able to purchase. In 1952, the Griggs launched the Ore-Ida brand, which became popular for its frozen french fries. The crispy potato spears were a hit among home cooks at a time when prepared meals and frozen foods were becoming more widely available thanks to postwar technology.
“Tater Tot” is not a generic term — Ore-Ida trademarked the name in 1957. In the years since, competitors have come up with creative alternatives such as “potato gems,” “potato crunchies,” and “spud puppies.”
The downside to booming french fry sales, however, was the waste left behind. Initially, the Griggs sold vegetable byproducts to farmers as livestock feed, but they soon looked for a way to nourish humans instead. They began experimenting with chopping up the potato scraps, mixing them with flour and spices, then shaping the result into a rectangle with the help of a simple, homemade plywood mold. The first Tater Tots — named, by one account, after an employee won a contest by suggesting “tater” for potato and “tot” for small — debuted in 1956. At first, shoppers seemed skeptical of the inexpensive scrap-based snack, but after prices were raised slightly to suggest an air of sophistication, Tater Tots quickly found a permanent home in frozen food aisles, where they continue to reign today.
Tater Tot producer Ore-Ida’s name is a nod to Oregon and Idaho, two potato-growing states.
Miners traded gold for potatoes during the Klondike gold rush.
Between 1848 and 1855, an estimated 300,000 people made their way to California, hoping to strike it rich by mining the supposedly plentiful gold just beneath the Earth’s surface. Unfortunately, many miners at the time faced a common foe: malnutrition. Food costs were often inflated in remote mining towns, and nutritious fresh food was generally hard to come by, meaning many miners had limited diets of shelf-stable goods like bread, salt pork, and beans. For gold hunters who trekked farther north to Alaska for the Klondike gold rush, which kicked off in 1896, that often meant an increasing risk of scurvy — a vitamin C deficiency that can cause fatigue and tooth loss, among other effects. Scurvy could be remedied with potatoes, a vegetable that Klondike miners could more easily source than many other fresh produce items. However, shortages and unscrupulous peddlers increased the price of potatoes, forcing many prospectors to trade their hard-won gold for spuds in an effort to ward off the effects of the illness.
Nicole Garner Meeker
Writer
Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.
While books are a fixture of today’s libraries, humans long constructed great centers of learning without them. That includes one of the oldest known significant libraries in history: the Library of Ashurbanipal. This library, established in modern-day Mosul, Iraq, by the Assyrian King Ashurbanipal in the seventh century BCE, contained nothing we would recognize today as a book. Instead, it was a repository of 30,000 clay tablets and writing boards covered in cuneiform — the oldest writing system in the world. Much like your local public library, this royal collection covered a variety of subjects, including legislation, financial statements, divination, hymns, medicine, literature, and astronomy.
The ancient Greeks can be thanked for a great many things, but paper isn’t one of them. On March 11, 105 CE, court official Cai Lun presented his papermaking process to Emperor He of the Han dynasty in China. It’d take many centuries, but the idea transformed the world.
While we know when this library flourished, defining the appearance of the earliest book is slightly murkier. The Egyptians, for example, are known to have written on papyrus scrolls; when the Library of Alexandria in Egypt burned in the first century BCE, 40,000 priceless scrolls were lost. By about the second century CE, Romans began using bound codexes, a kind of proto-book that consisted of papyrus or parchment sheets between wooden covers. The arrival of Christianity made the codex immensely popular, and it basically replaced the scroll by the sixth century CE. Block-printed books showed up in China around 700 CE, although Europe didn’t see anything similar until Johannes Gutenberg invented mechanical movable type around 1448. So while libraries haven’t always housed books, they have been repositories of human knowledge — in whatever form it might take.
The oldest surviving literary work is thought to be “The Epic of Gilgamesh,” composed 4,000 years ago.
The Library of Congress has burned twice.
The U.S. Library of Congress contains the largest collection of published books in world history — but it wasn’t an easy journey getting there. During the War of 1812, the British set fire to the Capitol building, which contained the 3,000-volume Library of Congress. It was completely consumed in the conflagration. To reestablish the library, Congress purchased former President Thomas Jefferson’s book collection — the largest in the U.S. at the time — for $23,950. Sadly, a second fire, on Christmas Eve 1851, consumed a large portion of that library, including many of Jefferson’s original volumes. Two years later, Congress constructed a new library, this time made from flame-resistant cast iron.
Darren Orf
Writer
Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.
The Library of Congress has one of the largest collections of books in human history — and that includes comic books. With more than 12,000 titles stretching across 140,000 issues in its collection, the library holds the largest public collection of comic books in the United States. There’s a little bit of everything — Golden Age superheroes like Superman and Captain America, translated reissues of Japanese manga, foreign translations of American comics, underground comics from the ’70s and ’80s, bande dessinées (French comics) from the ’60s, and much more. The library also has many older comics available on microfiche, including the always-elusive Action Comics #1 (the first appearance of Superman), and it regularly adds comics electronically as they become available.
The Library of Alexandria is one of the Seven Wonders of the Ancient World.
The Egyptian city of Alexandria does make the list, but it’s the city’s famous lighthouse that nabs the accolade — not the library. Both were built around 280 BCE. The lighthouse was a ruin by the 14th century, but its remains were rediscovered in the 20th century.
Of course, this isn’t a collection you can simply peruse at your leisure. Due to the fragility of the comics (the oldest issue is nearly 90 years old, after all), these specimens are only available to “researchers under special conditions” — but that includes people such as scriptwriters, pop culture historians, avid collectors, and graphic artists. The Library of Congress has also put on several public comic-centric exhibitions to showcase some of the most intriguing artifacts in its collection.
The first comic book is often said to be 1897’s “The Yellow Kid in McFadden’s Flats.”
The Library of Congress used to have a book-delivering conveyor belt.
In 1897, the Library of Congress moved out of the Capitol building and into new digs across the street. Congressmen grumbled at having to cross the street to get books, so the Army Corps of Engineers created a tunnel between the two buildings, and Boston’s Miles Pneumatic Tube Company designed a “book conveying apparatus.” Requests for books came to the library via pneumatic tube, after which the requested edition was whisked away in a tray that rode the rails at a speed of about 10 feet per second. Once at the Capitol, the book would travel up three stories until it arrived near the House floor. The whole trip covered about a quarter-mile and took only five minutes. The library built updated conveying systems in the following years to connect annex buildings constructed in 1939 and 1980, but the original tunnel conveyor system was dismantled in the 2000s to make way for a subterranean visitor center, and the other belts have since been abandoned due to high repair costs. Today, though most senators and representatives just use the internet to get the books they need, a Library of Congress staffer can still sometimes be seen hand-delivering books to the Capitol.
Darren Orf
Writer
Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.
More than 200 years after the 10th President of the United States was born, one of his grandsons is still alive. As impossible as that may seem, the math — and biology — checks out. John Tyler, who was born in 1790 and became President in 1841 after William Henry Harrison died in office (possibly of pneumonia), had a son named Lyon Gardiner Tyler in 1853. This son was born to the then-60-something Tyler and his second, much younger, wife, Julia Gardiner. Lyon then had two sons of his own in his 70s (also with a much younger second wife), one of whom — Harrison Ruffin Tyler, born in 1928 — is still gracing the Earth in his 90s.
George H.W. Bush and George W. Bush were the only father and son to both become President.
The Bushes were preceded in this exceedingly rare feat by John Adams (1797–1801) and John Quincy Adams (1825–1829), the second and sixth Presidents, respectively.
It may make this feat slightly less surprising to know that Tyler had 15 children, more than any other POTUS in U.S. history. Tyler’s actual presidency is less remarkable than this biographical oddity, alas — he was referred to as “His Accidency” upon assuming office and wasn’t renominated in the following election. (He was also an enslaver whose profitable plantation ran on the labor of 40 to 50 enslaved people.) Though his grandsons haven’t had major political aspirations, you might say it was in Tyler’s blood to seek office: His father, John Tyler Sr., was roommates with Thomas Jefferson at the College of William and Mary and later served as the 15th governor of Virginia.
Theodore and Franklin Roosevelt were fifth cousins.
It’s well known that the two commanders in chief were related, but how they were related is less-common knowledge. FDR assumed the presidency 24 years after his fifth cousin left the Oval Office, and their mutual political aspirations were no coincidence: The longest-serving President in U.S. history greatly admired his distant relative and intentionally followed in his footsteps. Teddy largely approved: Upon Franklin’s engagement to Eleanor — Franklin’s fifth cousin once removed — the younger Roosevelt received a note from the elder saying he was “as fond of Eleanor as if she were my daughter; and I like you, and trust you, and believe in you.”
Michael Nordine
Staff Writer
Michael Nordine is a writer and editor living in Denver. A native Angeleno, he has two cats and wishes he had more.
It’s likely that at some time in your life, a certain smell, whether the earthy aroma of freshly cut grass or the unmistakable fragrance of your grandparents’ house, triggered a powerful and strangely detailed memory. Well, there’s a biological reason for that. Unlike our four other best-known senses, whose electrical impulses are first sent to our thalamus before being sent to appropriate areas for memory, such as the hippocampus, our sense of smell takes a different route. Instead of being sent to the thalamus, scents go straight to the smell center known as the olfactory bulb. The fact that this bulb is directly connected to the hippocampus and the amygdala (which is responsible for emotional processing) is likely why smell evokes such powerful memories compared to our other senses. These memories can be extremely distinct, and they’re often linked to our childhood, likely because they were first stored when we experienced the scent at a young age.
Most of our sense of taste comes from our sense of smell.
Smell is by far the most important facet of taste. The gustatory nerve cells located in taste buds can only sense sweet, sour, bitter, salty, and umami (savory); the rest of what we perceive as flavor, by some estimates as much as 80%, comes from smell. Without it, humans would be limited to only those five basic tastes.
According to a 2017 study, a brain region known as the piriform cortex, part of the olfactory cortex, is what allows certain smells to be deposited in our long-term memory, though it requires other parts of the brain to pull this off. The piriform cortex essentially consults our orbitofrontal cortex, an area of the brain responsible for higher-level taste and smell functions, about whether a smell should be stored in long-term memory.
Companies are very aware that smell can be a powerful reminder of memory and emotion, which is why some of them have even trademarked certain scents (yes, you can do that). Verizon, for example, owns the rights to its “flowery musk scent” used in its stores, and olfactive branding companies work with clients like Nike to leverage the power of smell. When it comes to unlocking human emotions and memories, smell might be the strongest sense we have.
A heightened sense of smell, sometimes causing nausea, is called hyperosmia.
Humans can smell fear.
Human communication is usually associated with sight, hearing, and touch. But our sense of smell also plays a largely unrecognized role in how we interact, and several scientific studies have discovered a latent human ability to detect chemical signals in human sweat. One study in 2008 collected sweat from novice skydivers about to jump out of a plane, as well as sweat from the same participants when they were running on a treadmill. Then, another batch of participants sniffed the “fear” and “non-fear” sweat pads while inside a brain scanner. Although the subjects could not consciously differentiate between the smells of the two pads, the scanners detected increased activity in the amygdala, the region of the brain associated with emotional processing, when sniffing the “fear” sweat pad. Many other studies have since reached similar conclusions, but scientists remain stumped as to what chemical is triggering this response. While some people have what’s called a “vomeronasal organ,” which in other animals detects the scent of prey and sex pheromones, the organ is vestigial in humans (meaning it once played a role in our primate past but is no longer functional, sort of like the appendix). Despite this, some scientists argue that the ability to sense chemosignals provides significant advantages for animals that thrive in groups (like humans), and could also explain the largely nonverbal bond between a mother and newborn. In other words, there’s still some lingering mystery around what the nose knows.
Darren Orf
Writer
Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.
The Earth has been around for a while — about one-third as long as the universe itself. By comparison, Homo sapiens are the new kids on the block. Earth’s story began at the outset of the Hadean eon, about 4.6 billion years ago. It took 600 million years just for the Earth’s crust to take shape, another 300 million years for the first signs of microbial life to pop up, and about 3.2 billion years after that for life to really get going thanks to the evolutionary burst known as the Cambrian explosion. Several mass extinction events and some 465 million years later, mammals finally took center stage, but modern humans didn’t enter the biological limelight for another 65 million years. With the first Homo sapiens appearing around 300,000 years ago, humans have only been on planet Earth for roughly 0.0065% of its existence.
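That last percentage is simple division, a back-of-the-envelope check using the rounded figures above (roughly 300,000 years of human existence against roughly 4.6 billion years of Earth history):

$$\frac{3.0 \times 10^{5}\ \text{years}}{4.6 \times 10^{9}\ \text{years}} \approx 6.5 \times 10^{-5} \approx 0.0065\%$$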
Pangea is the only supercontinent in Earth’s history.
Before Pangea, there was Rodinia (which formed about 1 billion years ago), and geologic evidence suggests that another supercontinent called Nuna formed around 1.6 billion years ago. Scientists believe supercontinents occur in cycles stretched across hundreds of millions of years.
In those 300,000 years, humans have been pretty busy. For hundreds of thousands of years, we harnessed fire and lived a nomadic existence, until around the fourth millennium BCE, when the very first civilizations began to take shape. Since then, humans have been on a meteoric trajectory, going from hunter-gatherer to spacefarer in less than 6,000 years. Carl Sagan famously displayed the universe’s history on a 365-day calendar, with the Big Bang on January 1 and the present moment at the stroke of midnight on December 31. On that timeline, it’s only at 10:30 p.m. on December 31 that humans first appear, and all of recorded history is squeezed into just a few seconds — but what a few seconds it’s been.
Scientists think the moon formed when the Earth collided with a planet named Theia 4.5 billion years ago.
A growing number of scientists want to declare a new epoch because of humanity’s impact on the Earth.
Geologists divide the life of the Earth into various categories of time. First, there are eons, which stretch for millions and sometimes billions of years. Then come eras, followed by periods, epochs, and finally ages. For the past 11,700 years, since the end of the last ice age, the rise of humanity has coincided with the Holocene Epoch (which is part of the Quaternary Period of the Cenozoic Era). For most of this epoch, the world’s climate remained stable, but with the rise of modern society, the Earth has undergone rapid changes in a very short time. That’s why many scientists believe that a new epoch, called the Anthropocene (meaning “recent age of humans”), should be adopted, beginning around 1950 and the dawn of the nuclear age. For the Anthropocene to become official, both the International Commission on Stratigraphy and then the International Union of Geological Sciences need to sign off. Despite support within these groups, the epoch has yet to be officially recognized.
Darren Orf
Writer
Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.