In early 1901, English inventor Hubert Cecil Booth traveled to the Empire Music Hall in London to witness a strange invention — a mechanical aspirator designed to blow pressurized air to clean rail cars. Booth later asked the demonstrator why the machine (invented by an American in St. Louis) didn’t simply suck up the dust rather than blow it around. “He became heated,” Booth later wrote, “remarking that sucking out dust was impossible.” Unconvinced, Booth set about creating such a contraption, and later that same year he filed a patent for a vacuum machine he named the “Puffing Billy.”
The Hoover vacuum is named after President Herbert Hoover.
Herbert Hoover’s presidency (1929 to 1933) arrived decades after the debut of the Hoover vacuum company, named after Ohio businessman William H. Hoover. The vacuum mogul has no relation to the nation’s 31st president.
This machine wasn’t quite as fancy as modern Dust Busters, Dirt Devils, Hoovers, or Dysons. Instead, the Puffing Billy was red, gasoline-powered, extremely loud, and big — really big. So big, in fact, that the machine needed to be pulled by horses when Booth’s British Vacuum Cleaner Company made house calls. Once outside a residence, 82-foot-long hoses snaked from the machine through open windows. Because turn-of-the-century carpet cleaning wasn’t cheap, Booth’s customers were often members of British high society; one of his first jobs was to clean Westminster Abbey’s carpet ahead of Edward VII’s coronation in 1902. By 1906, Booth had created a more portable version of the Puffing Billy, and two years later, the Electric Suction Sweeper Company (later renamed Hoover) released the “Model O,” the first commercially successful vacuum in the United States.
The world’s largest vacuum chamber is located at a NASA facility in the U.S. state of Ohio.
Engineers in the 19th century used horses to power boats.
Although animal-powered boats trace their origins back to Roman times, team boats (also known as “horse boats” or “horse ferries”) became especially popular in the 19th-century United States. Horses walked either in a circle or in place to turn wheels that propelled the boat forward. The first commercially operated horse boat in the U.S. (and the first commercially operated animal-powered boat of any kind in the country) plied the waters of the Delaware River around 1791. Well suited for journeys of only a few miles, horse boats were soon working the waters of Lake Champlain and the Hudson River before eventually spreading to the Ohio and Mississippi rivers, and the Great Lakes. By the 1850s, these horse-powered creations had largely been replaced by paddle steamers — the beginning of the horse’s decades-long slide from supremacy to irrelevance, at least when it comes to transportation.
Darren Orf
Writer
Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.
When George Washington died in 1799, Congress could think of no better way to honor the first president than by laying him to rest in the U.S. Capitol. The building had been under construction since Washington himself laid the cornerstone in 1793, and plans were quickly approved to add a burial chamber two stories below the rotunda with a 10-foot marble statue of Washington above the tomb. Visitors would be able to view the grave via a circular opening in the center of the rotunda floor. There was just one problem: Washington had already designated his Mount Vernon estate to be his final resting place, meaning neither he nor anyone else is actually buried in what’s still called the Capitol Crypt.
George Washington won both of his presidential elections unanimously.
He ran essentially unopposed in both 1788 and 1792, thereby winning every available electoral vote — 69 the first time, 132 the second — in each election.
This crypt, which was finally completed in 1827, has gone by a few different names over the years. The 1797 plan by architect William Thornton labeled the space the “Grand Vestibule,” whereas architect Benjamin Henry Latrobe’s 1806 plan referred to it as the “General Vestibule to all the Offices” and an 1824 report of the Commissioners of Public Buildings simply called it the “lower rotundo.” For those who’d like to see the crypt today, it’s included in most tours of the Capitol.
The Capitol Crypt’s sandstone floor was sourced from a quarry in Seneca, Maryland.
The U.S. Capitol was burned down in the War of 1812.
The United States has engaged in many international conflicts, most of which haven’t been fought on the country’s own soil. One exception to this is the War of 1812, a kind of sequel to the Revolutionary War in which the U.S. once again went to battle against its British frenemies across the pond.
Mostly spurred by violations of maritime rights, the war reached a retaliatory pitch when, in response to American troops burning York (now Toronto), the capital of Upper Canada, British troops made their way to D.C. and burned everything they could — including the Capitol. This happened on August 24, 1814, a day that also saw the White House set ablaze. Restoration began immediately, and though the Library of Congress’ 3,000-volume collection was ultimately lost, the Capitol was rebuilt and a new library was begun with the purchase of Thomas Jefferson’s personal collection of 6,487 books.
Michael Nordine
Staff Writer
Michael Nordine is a writer and editor living in Denver. A native Angeleno, he has two cats and wishes he had more.
In 1851, German physician Carl Wunderlich conducted a thorough experiment to determine the average human body temperature. In the city of Leipzig, Wunderlich stuck a foot-long thermometer inside 25,000 different human armpits, and discovered temperatures ranging from 97.2 to 99.5 degrees Fahrenheit. The average of those temperatures was the well-known 98.6 degrees — aka the number you hoped to convincingly exceed when you were too “sick” to go to school as a kid. For more than a century, physicians as well as parents have stuck with that number, but in the past few decades, experts have started questioning if 98.6 degrees is really the benchmark for a healthy internal human temperature.
Primates have the highest known internal body temperature.
Although primates (such as Homo sapiens) are warm-blooded creatures, it’s actually birds that have some of the highest internal body temperatures in the animal kingdom. Some hummingbirds, for example, have body temperatures as high as 112 degrees Fahrenheit.
For one thing, many factors can affect a person’s temperature. The time of day, where the temperature was taken (skin, mouth, etc.), whether the person ate recently, and their age, height, and weight can all move the mercury. Furthermore, Wunderlich’s equipment and calibrations might not pass scientific scrutiny today. Plus, some experts think humans are getting a little colder, possibly because of our overall healthier lives. Access to anti-inflammatory medication, better care for infections, and even better dental care may help keep our body temperatures lower than those of our 19th-century ancestors.
In 1992, the first study to question Wunderlich’s findings found a baseline body temperature closer to 98.2 degrees. A 2023 study refined that further and arrived at around 97.9 degrees (though oral measurements were as low as 97.5). However, the truth is that body temperature is not a one-size-fits-all situation. For the best results, try to determine your own baseline body temperature and work with that. We’re sure Wunderlich won’t mind.
Many mammals — from the humble ground squirrel to the majestic grizzly — practice some form of hibernation, slowing down certain bodily functions to survive winters. Naturally, that raises a question: “Humans are mammals. Can we hibernate?” While the answer is slightly more complicated than it is for a pint-sized rodent, the answer is yes … with caveats. The main component of hibernation is lowering body temperature. When this occurs, the body kicks into a low metabolic rate that resembles a state of torpor, a kind of extreme sluggishness in which animals require little to no food. Because most of our calories are burned up trying to keep our bodies warm, de-prioritizing that requirement would essentially send humans into hibernation — but this is where it gets tricky for Homo sapiens. First, humans don’t store food in our bodies like bears do, so we’d still need to be fed intravenously, and second, sedatives would be needed to keep us from shivering (and burning energy). In other words, it would be a medically induced hibernation, but hibernation nonetheless. A NASA project from 2014 looked into the possibility of achieving this kind of hibernation for long-duration space travel, and while the findings weren’t put into practice, there were no red flags suggesting a biological impossibility. Today, NASA continues its deep sleep work by gathering data on the hibernating prowess of Arctic ground squirrels.
Darren Orf
Writer
Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.
Winter, spring, summer, fall — outside of the tropics and the planet’s poles, most temperate areas of the globe experience the four seasons to some extent, although how we choose to view those weather changes can differ from country to country. Take, for example, ancient Japan’s calendar, which broke the year into 72 microseasons, each lasting less than a week and poetically in tune with nature’s slow shifts throughout the year.
There are two ways to celebrate the start of a new season. Astronomical seasons are marked by the solstices and equinoxes, whose dates shift slightly from year to year. For consistent data collection, scientists prefer meteorological seasons (for example, meteorological fall starts on September 1).
Japan’s microseasons stem from ancient China’s lunisolar calendar, which tracked the sun’s position in the sky along with the moon’s phases for agricultural purposes. Adopted in Japan in the sixth century, the lunisolar calendar broke each season into six major divisions of 15 days, called sekki, which synced with astronomical events. Each sekki was further divided into three ko, the five-day microseasons named for natural changes experienced at that time. Descriptive and short, the 72 microseasons can be interpreted as profoundly poetic, with names like “last frost, rice seedlings grow” (April 25 to 29), “rotten grass becomes fireflies” (June 11 to 15), or “crickets chirp around the door” (October 18 to 22).
In 1685, Japanese court astronomer Shibukawa Shunkai revised an earlier version of the calendar with these names to more accurately and descriptively reflect Japan’s weather. And while climate change may affect the accuracy of each miniature season moving forward, many observers of the nature-oriented calendar find it remains one small way to slow down and notice shifts in the natural world, little by little.
Nearly 80% of Japan’s land is covered in mountains.
Japan recently added more than 7,200 islands to its official count.
Japan’s archipelago has four major islands and thousands of smaller ones, though a recount in 2023 found far more islets than previously known. The island nation had recognized 6,852 islands in its territory since 1987, but advancements in survey technology revealed many more — a staggering 14,125 isles in total. In the 1980s, Japan’s Coast Guard relied on paper maps to count islands at least 100 meters (328 feet) in circumference, and surveyors now realize that many small landmasses were mistakenly grouped together, artificially lowering that earlier total. Today, the country’s Geospatial Information Authority uses digital surveys to capture the chain’s smaller islands for a more accurate count, while also watching for new islands created by underwater volcanoes. However, only about 400 of Japan’s islands are inhabited; the rest remain undeveloped due to their size, rugged terrain, and intense weather conditions.
Nicole Garner Meeker
Writer
Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.
Tiny, hidden survival tools packed into the waistband of your pants may sound like something fantastical from a spy movie, but in the case of British wartime pilots, they were a reality. During World War II, the Royal Air Force sent its aviators skyward with all the tools they’d need to complete a mission, along with a few that could help them find their way home if they crash-landed behind enemy lines. One of the smallest pieces of survival gear pilots carried was a compass built into the button of their trousers.
Britain was the first country to create an air force.
Established in April 1918, seven months before the end of World War I, Great Britain’s Royal Air Force was the first of its kind. The sky-patrolling force launched with 300,000 service members nearly three decades before the U.S. designated its own Air Force branch in 1947.
Three months after entering World War II, the British military launched its MI9 division, a secret intelligence department tasked with helping service members evade enemy forces or escape capture. Between 1939 and 1945, masterminds at MI9 created a variety of intelligence-gathering and survival tools for troops, such as uniform-camouflaging dye shaped like candy almonds, ultra-compressed medications packed inside pens, and button compasses. The discreet navigational tools were typically made from two buttons, the bottom one featuring a tiny needle-like spike. When balanced on that spike, the top button acted as a compass, rotating freely to align with Earth’s magnetic field; two dots of luminous paint on the metal marked north, and one marked south.
MI9 distributed more than 2.3 million of its button compasses during the war. They could be paired with secret maps smuggled to captured service members inside care packages delivered to prisoner-of-war camps. Often printed on silk for durability and waterproofing, the 44 different maps (sent to different camps based on location) were tucked discreetly into boot heels and board games. The ingenuity worked — by the war’s end, MI9 was credited with helping more than 35,000 Allied soldiers escape and make their way home.
Historians believe the first magnetic compasses were invented in China.
Some American colonists were banned from wearing fancy buttons.
Buttons can be an innocuous way to add panache to a piece of clothing … unless you were a colonist living in Massachusetts during the 17th century, that is. Choosing the wrong type of buttons for your garment could have landed you in court and cost you a fine. Puritans in Massachusetts during the 1600s were governed by a series of sumptuary laws, aka legal codes that restricted how people dressed and interacted with society on moral or religious grounds. Massachusetts passed its first sumptuary law in 1634, prohibiting long hair and “new fashions” (aka overly swanky clothes), and five years later even banned short-sleeved garments. By 1652, the colony further restricted lower-wage earners from wearing “extravagant” accessories such as silks, fine laces, and gold or silver buttons, unless they had an estate valued at more than 200 British pounds — more than $38,000 in today’s dollars. However, the law did include some loopholes: Magistrates, public officers, and militia members (and their families) were free to choose buttons and other adornments without fear of penalty, as were the formerly wealthy and those with advanced education.
Nicole Garner Meeker
Writer
Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.
Uniforms convey a sense of competency across professions ranging from delivery person and airline staff to chef and firefighter. The psychological implications may be even stronger when it comes to matters of health: According to one study published in the medical journal BMJ Open, doctors who don the traditional white coat are perceived as more trustworthy, knowledgeable, and approachable than those who see patients in scrubs or casual business wear.
Physicians at the acclaimed Mayo Clinic do not wear white coats.
The Mayo Clinic's founders felt that the white coat would create an unnecessary separation between doctor and patient, and thus established a tradition in which physicians wear business attire to demonstrate respect for the people they serve.
The 2015-16 study drew from a questionnaire presented to more than 4,000 patients across 10 U.S. academic medical centers. Asked to rate their impressions of doctors pictured in various modes of dress, participants delivered answers that varied depending on their age and the context of proposed medical care. For example, patients preferred their doctors to wear a white coat atop formal attire in a physician's office, but favored scrubs in an emergency or surgical setting. Additionally, younger respondents were generally more accepting of scrubs in a hospital environment. Regardless, the presence of the white coat rated highly across the board — seemingly a clear signal to medical professionals on how to inspire maximum comfort and confidence from their patients.
Yet the issue of appropriate dress for doctors isn’t as cut and dried as it seems, as decades of research have shown that those empowering white coats are more likely to harbor microbes that could be problematic in a health care setting. In part that’s because the garments are long-sleeved, which offers more surface area for microbes to gather — a problem compounded by the fact that the coats are generally washed less often than other types of clothing. Although no definitive link between the long-sleeved coats and higher rates of pathogen transmission has been established, some programs, including the VCU School of Medicine in Virginia, have embraced a bare-below-the-elbows (BBE) dress code to minimize such risks. Clothes may make the man (or woman), but when it comes to patient safety, the general public may want to reassess their idea of how our health care saviors should appear.
The term for anxiety-induced high blood-pressure readings in a doctor's office is “white coat syndrome.”
Western doctors dressed in black until the late 1800s.
If the idea of a physician or surgeon wearing black seems a little morbid, well, that may have been part of the point in the 19th century. After all, the medical field had more than its share of undertrained practitioners who relied on sketchy procedures such as bloodletting, and even the work of a competent doctor could lead to lethal complications. However, Joseph Lister’s introduction of antisepsis in the 1860s dramatically cut the mortality rate for surgical patients, and with it, the perception of the possibilities of medicine underwent a major shift. While black had once been worn to denote seriousness, doctors began wearing white lab coats like scientists to demonstrate their devotion to science-based methodology, a sartorial presentation that also reflected an association with cleanliness and purity. By the turn of the century, the image of the black-clad physician was largely consigned to the remnants of an unenlightened age.
Interesting Facts
Editorial
Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.
April Fools’ Day is a time when dubious pranksters run rampant, so it’s wise to be a little more skeptical of everything you see and hear on April 1. But when the clock strikes midnight and the calendar flips over to April 2, we celebrate the polar opposite of trickery and deception: International Fact-Checking Day. This global initiative is spearheaded by the Poynter Institute’s International Fact-Checking Network (IFCN) — a community established in 2015 to fight misinformation and promote factual integrity. IFCN works with more than 170 fact-checking organizations to encourage truth-based public discourse, debunk myths, and ensure journalistic accountability.
Taco Bell once claimed to have purchased the Liberty Bell.
On April 1, 1996, Taco Bell pulled off a legendary April Fools’ prank, claiming it had purchased the Liberty Bell and renamed it the “Taco Liberty Bell.” Full-page ads ran in seven major newspapers, leading to hundreds of Americans calling the National Park Service to express outrage.
The organization held the first official International Fact-Checking Day in 2016 as part of an effort to highlight the importance of fact-checkers’ role in a well-informed society. The occasion has been celebrated on April 2 every year since, standing in stark contrast to the prior day’s foolishness. Celebrants may choose to honor the holiday however they please, be it by promoting trusted sources on social media or by taking advantage of the events and information produced by IFCN. In past years, the organization has held a public webinar to discuss the “State of the Fact-Checkers” report, and published lists of relevant essays written by fact-checkers around the world. Be sure to capitalize on this opportunity to get your facts straight, as April 4 ushers in yet another day of duplicity: National Tell a Lie Day.
A 1957 BBC April Fools’ broadcast pranked viewers into thinking spaghetti grows on trees.
The Spanish-American War was fueled by “yellow journalism.”
In the late 19th century, the American press found itself in the grip of a phenomenon known as “yellow journalism” — a sensationalized form of reporting that prioritized eye-catching headlines ahead of the cold, hard facts. These unverified claims sometimes had serious consequences, most notably in the case of the Spanish-American War.
On February 15, 1898, the battleship USS Maine exploded and sank in Havana Harbor in Cuba (then a Spanish colony). Within days, major newspapers including William Randolph Hearst’s New York Journal and Joseph Pulitzer’s New York World published accusations that Spain was responsible for the sinking, despite a lack of evidence. The exaggerated headlines still swayed public opinion, fueling a desire to go to war.
Tensions escalated to the point that on April 20, the U.S. Congress issued an ultimatum for Spain to withdraw from Cuba, which Spain declined to do, opting to sever diplomatic ties with the U.S. instead. Spain declared war on the U.S. on April 23, with Congress following suit two days later.
Bennett Kleinman
Staff Writer
Bennett Kleinman is a New York City-based staff writer for Optimism Media, and previously contributed to television programs such as "Late Show With David Letterman" and "Impractical Jokers." Bennett is also a devoted New York Yankees and New Jersey Devils fan, and thinks plain seltzer is the best drink ever invented.
What do you call someone who’s fallen for a prank? There’s no punchline here — in most English-speaking places, you’d probably just call them gullible. But in France, you might use the term poisson d’avril, meaning “April fish.” The centuries-old name is linked to a 1508 poem by Renaissance composer and writer Eloy d’Amerval, who used the phrase to describe the springtime spawn of fish as the easiest to catch; young and hungry April fish were considered more susceptible to hooks than older fish swimming around at other times of year. Today, celebrating “April fish” in France — as well as Belgium, Canada, and Italy — is akin to April Fools’ Day elsewhere, complete with pranks; one popular form of foolery includes taping paper fish on the backs of the unsuspecting.
April Fools’ celebrations last two days in Scotland.
In Scotland, April Fools’ is also called “Huntegowk,” a day when “gowks” (the Scottish word for cuckoo birds) are sent on phony errands. The second day of celebrations, called Tailie Day, is a bit more mischievous: Tails or “kick me” notes are placed on people’s backsides.
While the first reference to poisson d’avril comes from d’Amerval’s poem, historians aren’t sure just how old the April Fools’ holiday is. It’s often linked to Hilaria, a festival celebrated by the ancient Romans at the end of March to commemorate the resurrection of the god Attis. However, many historians note that while Hilaria participants would disguise themselves and imitate others, there’s little evidence that the festival is the predecessor of April Fools’. Other theories suggest that April 1 trickery stems from calendar reform. One such explanation dates to 1564, the year French King Charles IX moved to standardize January 1 as the start of the new year, which had often been celebrated on Christmas, Easter, or during Holy Week (the seven days before Easter). Despite the royal edict, some French people stuck with the Holy Week tradition and celebrated the new year in late March to early April, becoming the first “April fools.”
An 1878 newspaper hoax reported Thomas Edison’s newest invention could turn dirt into food and wine.
The BBC once claimed spaghetti noodles grew on trees.
The most convincing April Fools’ pranks often come from the most unexpected sources, which could be why the BBC has a history of successful hoaxes. Among them is a 1957 joke, considered one of the first April Fools’ TV pranks, in which the British broadcaster aired a two-and-a-half-minute segment claiming spaghetti noodles grew on trees in Switzerland. Footage showed Swiss noodle harvesters on ladders collecting noodles and drying them in the sun before dining on a large pasta dinner. The prank would likely fall flat today, but spaghetti wasn’t commonly eaten in the U.K. during the 1950s, so the dish was entirely unfamiliar to most viewers. And the hoax didn’t just fool viewers: Many BBC staffers, purposefully kept in the dark about the fictitious story — the brainchild of cameraman Charles de Jaeger and a small crew — were taken in as well, and were taken aback by a deluge of callers looking to acquire their own spaghetti trees.
Nicole Garner Meeker
Writer
Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.
If you think meeting new people nowadays is difficult, imagine doing it without a phone, a car, or even a bike. This was the case for much of human history, of course, and it had such a profound effect on social interactions that the average distance between the birthplaces of spouses in England was just 1 mile prior to the invention of bicycles. Those calculations come from Yale’s Stephen Stearns, a professor of evolutionary biology, who studied parish registries of birth, marriage, and death. He found that most couples weren’t just from the same village — they were often from neighboring farms.
Karl von Drais’ original invention was known as a Laufmaschine (“running machine”), and early bikes were also known as velocipedes and dandy horses.
The bicycle was invented in 1817, a development geneticist Steve Jones called “the most important event in recent human evolution” in his book The Language of the Genes. In addition to increasing genetic diversity by making it easier for couples who lived farther apart to meet and reproduce, bicycles were also intrinsically linked to women’s rights, as they allowed women greater autonomy in the late 19th and early 20th centuries. No less an authority than Susan B. Anthony proclaimed in an 1896 interview with journalist Nellie Bly that bicycling “has done more to emancipate women than anything else in the world. I stand and rejoice every time I see a woman ride by on a wheel.”
The world’s largest bicycle manufacturer is Giant.
In the Netherlands, there are more bicycles than people.
With a population of 18.3 million, the Netherlands is the 69th-most-populous country in the world. Yet with 23.9 million bicycles, the country contains quite a few more bikes than people. It’s widely considered the most bicycle-friendly country on Earth, with 43% of the population riding at least once per day. Amsterdam in particular has been deemed the bike capital of the world, with one expert estimating that more than 60% of all trips in the city center take place on two wheels.
Michael Nordine
Staff Writer
Michael Nordine is a writer and editor living in Denver. A native Angeleno, he has two cats and wishes he had more.
If asked what a dinosaur sounded like, many people would likely recall the roaring T. rex of Jurassic Park. However, that earth-shaking bellow seems to be a case of Hollywood exercising some creative liberty. While we don't know what these reptiles really sounded like, since they mostly died out some 66 million years ago, scientists at least have some reasonable ideas based on the anatomical structures of well-preserved fossils, combined with studies of the dinosaurs and their close relatives that exist today.
Although it was long believed that non-avian dinosaurs lacked the flying capabilities of early birds and pterosaurs, newer research has shown that small, feathered dinosaurs evolved powered flight multiple times.
Yes, dinosaurs do still exist, in the form of birds, which branched off from non-avian dinosaurs around 160 million years ago. Although birds mainly produce noises via a soft-tissue organ called the syrinx, which has yet to be uncovered from a non-avian dinosaur fossil, many of our feathered friends also engage in closed-mouth vocalization, in which sounds are pushed out from a pouch in the neck area. Another modern animal that utilizes closed-mouth vocalization is the crocodile, which just so happens to share a common ancestor with dinosaurs. Given the family ties, it's logical to conclude that some dinosaurs emitted something resembling the cooing of a dove, the booming of an ostrich, or the rumbling of a croc. Since larger animals with longer vocal cords produce lower-frequency sounds, it's also likely that enormous sauropods like Brachiosaurus delivered noises that, to our ears, would dip into an octave of infrasound — felt and not heard. On the other hand, the ear structures of the dinosaur-crocodile predecessor indicate a sensitivity to high-pitched noises, possibly the chirping of babies.
The field continues to evolve as new information comes to light; the recent discovery of the first known fossilized dinosaur larynx, from an ankylosaur, suggests these creatures were able to modify noises in a birdlike way despite the lack of a syrinx. And none of this even touches on the sound capabilities of hadrosaurs like Parasaurolophus, which almost certainly delivered a distinct call from the air passages that funneled through a conspicuous head crest. All in all, while a roar from a Jurassic-bred beast may have been the work of a Hollywood studio, there’s no movie magic needed to recognize that Earth’s prehistoric hills were alive with all sorts of reptilian sounds of music.
Some musicians are using dinosaur skull replicas as instruments.
Given the ever-increasing understanding of dinosaur sounds, it’s appropriate that a few experimental musicians have found ways to make music out of these potentially noisy monsters from yesteryear. One such artist is Southern Methodist University assistant professor Courtney Brown, who initially attached a mouthpiece and a synthetic larynx to the replica of a hadrosaur skull for an interactive exhibit, before recording the sounds produced by this creation alongside other instruments and vocals for her Dinosaur Choir project. Similarly, composers Anže Rozman and Kara Talve devised replica skull-based instruments as part of their assignment to score the Apple TV+ series Prehistoric Planet. Thanks to their atmospheric arrangements on innovations including the Triceratone, which fuses an electric double bass with a triceratops skull, and the Fat Rex, a combined frame drum and cello fingerboard topped with a 3D-printed T. rex skull, the Rozman-Talve-led team claimed Best Original Score for a Documentary Series honors at the 2023 Hollywood Music in Media Awards.
Tim Ott
Writer
Tim Ott has written for sites including Biography.com, History.com, and MLB.com, and is known to delude himself into thinking he can craft a marketable screenplay.