Juneteenth is a more recent addition to most American calendars, and the first new federal holiday in nearly four decades. So it might come as a surprise that the celebration already has a flag of its own, and one that’s been around for more than 25 years. Designed back in 1997, the flag came about at a time when advocates were rallying for the holiday — which celebrates the end of slavery in the U.S. — to gain federal recognition.

In the years since, the design (originally crafted by activist Ben Haith) has undergone minor changes but remains heavy with symbolism. The banner’s solid white star represents freedom and nods to Texas (aka the Lone Star State) for its role in the creation of Juneteenth; on June 19, 1865, enslaved people were officially emancipated in Texas more than two years after President Abraham Lincoln signed the Emancipation Proclamation to abolish slavery in Confederate states.

Also featured on the flag is a white sunburst, portraying new beginnings. Both stars sit atop an arching blue and red horizon that signifies optimism for the future. The three-toned flag purposely uses the same colors as the American flag as a reminder that formerly enslaved people and their descendants are Americans, too, despite the country’s history of unequal rights.
The last enslaved people in the U.S. were set free on Juneteenth.
Juneteenth honors a date when enslaved Americans in Texas were liberated, but it wasn’t the true ending of slavery. Despite Lincoln’s famous decree, some areas (such as Delaware and Kentucky) permitted the practice until the 13th Amendment ending slavery was ratified in December 1865.
Juneteenth’s official flag is raised alongside the Stars and Stripes above government buildings, on college campuses, and in front yards around the country, but it also sometimes appears next to the Pan-African flag at reunions, block parties, and other Juneteenth festivities. Introduced in 1920 by a group led by political activist Marcus Garvey, the three-striped banner of the Pan-African flag is identifiable by its horizontal red, black, and green bands, which represent the blood, people, and growth of the African diaspora. Both flags are meant as inspiring symbols of unity and remembrance — ideas worth celebrating on America’s second Independence Day.
Texas was the first state to make Juneteenth an official holiday, in 1980.
The Emancipation Proclamation is rarely on display.
The United States’ most famous founding documents are relatively easy to see in person. Take the Declaration of Independence and the Constitution, for example, which are on permanent display at the National Archives Museum in Washington, D.C. But despite its historical significance, the original Emancipation Proclamation is seldom showcased. That’s because the handwritten, double-sided document hasn’t aged well (physically). Archivists attribute its deterioration to the paper used back in 1863, which hasn’t withstood time as well as older documents such as the Constitution, which was penned on animal-skin parchment. In addition, the U.S. Department of State had custody of the original Emancipation Proclamation until transferring it to the National Archives in 1936, by which point it had sustained considerable damage from handling and light exposure. Today, the Emancipation Proclamation is stored in an environmentally controlled vault and viewable on rare occasions; the pages never leave storage all at once, and are always displayed under extremely low light for short periods of time to maximize their life span.
Nicole Garner Meeker
Writer
Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.
top picks from the Inbox Studio network
Interesting Facts is part of Inbox Studio, an email-first media company. *Indicates a third-party property.
The woody, warming spice we sprinkle with abandon on top of holiday cookies, baked goods, and seasonal coffees is native to Sri Lanka, Myanmar, and India. But very few people knew where cinnamon came from when merchants first began selling spices throughout Europe, Asia, and Africa as far back as 3,000 years ago — and spice traders capitalized on that lack of knowledge to charge high prices. Harvested from the inner bark of Cinnamomum trees, cinnamon has been used for thousands of years as medicine, for religious practices and funerals, and in cuisine, but with a big price tag: It was once considered more precious than gold.
Several tree species produce cinnamon, but only one is considered the real deal. Bark from the Cinnamomum verum tree (aka Ceylon cinnamon) has a lighter flavor and a heftier price tag than other kinds (aka cassia cinnamons). Yet many taste-testers say they can’t tell the difference.
In an effort to conceal cinnamon’s origins from competitors and explain the extravagant markup to wary customers, spice traders of the past provided elaborate backstories. By some fifth-century accounts, cinnamon traders asserted that collecting the spice was a dangerous task thanks to angry “winged creatures” that lived in the trees; cinnamon harvesters supposedly donned protective outerwear made of thick hides and risked their personal safety to collect a few measly pieces of cinnamon bark. Other vendors claimed cinnamon was transported from far-off lands by birds that used it as nesting material (in this tale, harvesting cinnamon sticks from nests required a cow sacrifice to provide the birds with a meaty distraction). Yet another story declared that cinnamon grew in dangerous, snake-infested valleys. Cinnamon’s origins remained an enigma for centuries, but luckily for chefs and bakers today, the secret eventually got out thanks to global exploration brought on by a surging interest in spices. Now, the flavoring is a low-cost mainstay in modern pantries.
Sticks of curled and dried cinnamon bark are called quills.
Scientists have recreated a cinnamon perfume Cleopatra may have worn.
What did our ancestors smell like? Archaeologists and historians have pieced together how numerous cultures ate, dressed, relaxed — in short, lived — but it’s generally been harder to tell how people once smelled. Thanks to one archaeological find, however, we have a clue as to how Egyptians may have perfumed themselves, perhaps even Cleopatra — a royal known for a cinnamon-laced scent so seductive, it’s credited with attracting Julius Caesar. In 2012, archaeologists unearthed ruins north of Cairo suspected to be an ancient Egyptian perfume factory; that dig inspired a team of historians and perfume experts to recreate fragrances that hadn’t been worn in nearly 2,000 years. Using recipes from ancient Greek texts that may have borrowed from Cleopatra’s own formulas — a book of recipes that no longer exists, but was often referenced by other perfumers — researchers blended cinnamon, myrrh, and other herbs with olive oil to create a viscous fragrance akin to what ancient Egyptians once used. While we’ll never know for sure if Cleopatra wore this specific scent, the experiment gives us an olfactory link with history.
Nicole Garner Meeker
Writer
While you might associate the development of modern universities with intellectual movements like the Renaissance or the Enlightenment, the first universities predate those major periods in history — not by years but by centuries. One of the oldest universities in the world is Oxford University, where teaching began back in 1096. That’s much older than Harvard (established in 1636) or Yale (1701), and it’s even older than some well-known Indigenous civilizations in the Americas, including the Incas, who lived in the Andean region of South America from around the 13th century CE to the mid-16th century. (Other groups and empires have occupied the Andes since at least 10,000 BCE.)
The University of al-Qarawiyyin in Fez, Morocco, was built by Fatima al-Fihri, the daughter of a rich merchant, in 859 CE. By the 10th century, it had evolved into the largest Arab university in North Africa. Today, it is the oldest continually operating university in the world.
The first universities were not like the sprawling campuses of today. Instead, they were more like guilds devoted to certain subjects or crafts. Slowly, the influence of these schools grew throughout the High Middle Ages (1000–1300), and many of them became hot spots during later intellectual movements. Meanwhile, as Europe was busy cementing the importance of its universities (and fighting in half a dozen Crusades), the Incas were building sprawling road networks and reliable postal systems — they even had highly skilled brain surgeons.
Machu Picchu means “old hill” in the Quechua language.
The Incas used string and knots to record information.
Although the Incas had no known written language, they weren’t without a means of recording important information. Quipu were Andean textiles that used a system of colored string and knots to record data. These textiles were created and read by officials known as “quipucamayocs.” Evidence suggests that quipu were first developed by the Wari civilization, which flourished in Peru between about 450 and 1000 CE. Scholars believe the Incas used quipu both to record hard data — such as census figures, inventory, and other administrative information — and as a way to encode Incan myths and histories. Because of the Andes’ arid climate, the quipu were well-preserved for centuries. Today, hundreds of quipu are displayed in museums around the world, with the biggest collection now residing at the Ethnological Museum of Berlin in Germany.
Darren Orf
Writer
Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.
In the 1950s, Americans were looking for ways to spend less time in the kitchen. Generations of home cooks, the overwhelming majority of them women, had made food preparation the focus of their day; historians estimate that in 1900, an average household spent 58 hours per week on housework. But a few decades later, postwar innovations such as affordable appliances created more free time — and so did a new wave of commercially prepared and processed foods, an emerging industry fueled by scientists such as William A. Mitchell. While Mitchell’s name isn’t widely known today, his most popular inventions are major name brands, including Cool Whip, Pop Rocks, and Tang.
The vitamin-infused powder is forever linked with space exploration, but General Foods originally planned Tang as a travel-friendly drink mix for consumers. NASA, looking for easier ways to transport beverages to space, took notice and first stocked it on the 1962 Friendship 7 mission.
Growing up in Minnesota, Mitchell spent his teenage years as a farmhand and carpenter, working to fund his college tuition. It took a few years for the future inventor to venture into food production after graduation, chemistry degree in hand; he worked at Eastman Kodak creating chemical developers for color film, as well as at an agricultural lab. He then went to work at General Foods in 1941, contributing to the war effort by creating a tapioca substitute for soldier rations. (Overseas, GIs renamed the gelatin and starch blend “Mitchell’s Mud.”) The postwar years saw Mitchell churn out a few flops, like carbonated ice, as well as now-iconic hits. In 1956, his quest to create a self-carbonating soda led to the accidental invention of Pop Rocks. A year later, he developed Tang Flavor Crystals, which skyrocketed to popularity after NASA used the powder in space to remedy astronauts’ metallic-tasting water. And by the time he’d retired from General Foods in 1976, Mitchell had developed a quick-set gelatin, powdered egg whites, and a whipped cream alternative — the beloved Cool Whip that now dominates grocery store freezers.
Pop Rocks were originally named Gasified Confection.
Pop Rocks were briefly discontinued because of safety concerns stemming from a notorious urban legend.
Pop Rocks are known as a totally rad treat of the 1980s, but the candy’s first release in the 1970s was a dud. General Foods initially released the candy in 1975, hoping to capitalize on its innovative appeal. But soon after the confection hit stores, rumors began to spread that it was dangerous, even deadly — supposedly, the carbon dioxide that caused the miniature explosions could mix with carbonated soda and cause children’s stomachs to explode. General Foods and inventor William Mitchell tried to combat the unfounded stories with newspaper ads, a telephone hotline, and by sending letters to 50,000 school principals around the U.S. But amid persistent rumors and slumping sales, General Foods stopped marketing the candy and sold the brand to Kraft in 1985, which marketed it as “Action Candy” — though today’s sweet tooths can once again find the candy under its original name.
Nicole Garner Meeker
Writer
Should you ever come across an Academy Award on eBay, there’s a good chance it shouldn’t be there. That’s because Oscar winners aren’t allowed to sell their statuettes without first offering them back to the Academy of Motion Picture Arts and Sciences for the nominal fee of $1, which is meant to maintain their prestige. As the Academy’s official regulations explain, honorees “have no rights whatsoever in the Academy copyright or goodwill in the Oscar statuette or in its trademark and service mark registrations” and “shall not sell or otherwise dispose of the Oscar statuette, nor permit it to be sold or disposed of by operation of law” before first giving the Academy the chance to buy it back. Presumably, the Academy always accepts that $1 offer in order to protect the brand, though it’s not clear how often, if ever, it’s actually happened.
Both “The Godfather Part II” (1974) and “The Lord of the Rings: The Return of the King” (2003) won Best Picture. Just as impressive, all three films in both trilogies were nominated for Best Picture — and the original “Godfather” won it as well.
The rule is strictly enforced, with winners having to sign a contract before taking possession of their statuette. It also applies to their family members and descendants. Not everyone has abided by it, however. To take just one example: The trophy awarded to art director Joseph C. Wright, who won for his work on 1942’s My Gal Sal, was sold to an auction house for $79,200 in 2015. This led to the Academy winning a lawsuit enforcing the rule — and likely discouraging any future honorees from trying to break it.
The first person presented with an Oscar was German actor Emil Jannings.
The Oscars weren’t televised until 1953.
For nearly 25 years, you had to be in the room to truly know what went down at the Oscars. That changed on March 19, 1953, when NBC aired the ceremony live from the RKO Pantages Theatre in Hollywood. Bob Hope hosted, something he went on to do a record-setting 19 times. Prior to that, the Academy Awards were broadcast on the radio — except for the first ceremony, a private affair that lasted just 15 minutes and was exceptionally undramatic, given the fact that the winners had been announced several months earlier.
Michael Nordine
Staff Writer
Michael Nordine is a writer and editor living in Denver. A native Angeleno, he has two cats and wishes he had more.
At 3:45 p.m. on September 9, 1947, a computer programmer working on the Mark II at Harvard recorded in a logbook that the team had discovered the “first actual case of [a] bug being found.” But the programmer wasn’t referring to some poorly written lines of code — this was an actual bug. A moth, to be exact, which had flown into a room where the Mark II, one of the world’s first computers, was housed at the university. Attracted by the warmth of the 25-ton machine, the winged creature met its end in one of the many electromagnetic relay contacts. The team removed the moth with tweezers.
Nikola Tesla was the first to coin the engineering term “bug.”
American inventor Thomas Edison made several references to “bugs” in his notebooks in the mid-1870s, defining them as “bug — as such little faults and difficulties are called.” By 1889, newspapers reported on how Edison was hard at work fixing a “bug” in his phonograph.
While this event is often mistakenly cited as the birth of the programming term “bug” to mean a flaw or imperfection, the word had actually been used in engineering circles for over half a century. But the 1947 moth misadventure was popularized by Grace Hopper, a mathematician and computer science pioneer who worked with the team as they “debugged” the Mark II. Early computers such as Harvard’s Mark series were responsible for other modern computer programming lingo, though: For example, a “patch” comes from the punched cards used in early machines that programmers physically “patched” with tape to fix errors. Today, the original Mark II logbook — with the original “bug” taped to it — is at the Smithsonian’s National Museum of American History.
The word “bug” likely first appeared in an early English translation of the Bible.
“Spam” took on the additional meaning of junk email thanks to a sketch by the British comedy troupe Monty Python.
The sketch begins with a simple request: A couple in a diner wants to order food. Unfortunately, the proprietor of the establishment serves a very Spam-heavy menu, including “Spam Spam Spam Spam Spam Spam Spam baked beans Spam Spam Spam and Spam.” Originally airing on Monty Python’s Flying Circus in 1970, the sketch later lent its name to annoying floods of data, ads, or massive amounts of useless text. The word likely first appeared online in late 1980s MUDs (multi-user dungeons), where users could “spam the database” by using a program to create lots of objects in the shared digital space, among other pesky, repetitive behaviors. By 1990, archived MUD chats show that the use of the term “spam,” along with its sketch comedy origins, had been firmly established.
Darren Orf
Writer
Today Leonardo da Vinci’s “Mona Lisa” is probably the most famous painting in the world — and it deserves the accolade. Painted between 1503 and 1519, this portrait (commonly believed to be of Italian noblewoman Lisa del Giocondo) reflects the Renaissance polymath’s deep understanding of his art form and has been analyzed in depth for decades, if not more. Although certain sectors of the art world regarded the portrait as a masterwork by the 1860s, the general public knew little about it until the 20th century. Then, the unthinkable happened — the “Mona Lisa” was stolen.
The “Mona Lisa” once hung in Napoleon Bonaparte’s bedroom.
First displayed by a Leonardo da Vinci patron, French King François I, the “Mona Lisa” was a mainstay of royal residences but rode out the French Revolution (at the end of the 18th century) in a warehouse. Later, it hung in Napoleon’s bedroom in Paris’ Tuileries Palace for four years.
In the early morning hours of August 21, 1911, after spending the night hiding in an art-supply closet in the Louvre, three Italian “handymen” snuck over to the “Mona Lisa,” unhooked it from its protected location, tossed a blanket over their pilfered prize, and slipped away undetected, boarding a train at the Quai d’Orsay station at 7:47 a.m. The theft became an international scandal, and newspapers around the world ran stories about the more than two-year-long search for the missing masterpiece. Finally, in December 1913, the painting was found in Florence, Italy, after an attempted sale by the heist’s ringleader, Vincenzo Perugia — who had actually worked at the Louvre for a time, installing glass cases over the paintings. The treasure then went on a tour of Italy until it returned to the famous French museum in early 1914. Although the “Mona Lisa” and her mischievous smile survived unharmed, the painting’s reputation had changed forever, with the many headlines about the theft making her a household name that has endured to this day.
The hazy background of the “Mona Lisa” is a specific painting style known as sfumato.
Edvard Munch’s “The Scream” was also stolen… twice.
No painting captures existential dread quite like Edvard Munch’s “The Scream.” Created in 1893, Munch’s masterpiece depicts a ghostly figure, not mid-scream as many assume, but instead hearing “the great scream throughout nature,” according to the artist’s own inscription on a lithograph edition of the work. The painting is so famous, it’s one of the few works of art to receive the rare honor of its own emoji. Of course, popularity can also inspire the wrong kind of attention, and in February 1994, on the opening day of the Winter Olympics in nearby Lillehammer, Norway, two thieves stole Munch’s masterwork from Oslo’s National Gallery. The burglars left behind only a brief note: “Thousand thanks for the poor security.” Fortunately, the painting was recovered — identified as genuine thanks to a splash of candle wax on its front — three months later in Åsgårdstrand, Norway, a town where Munch lived and worked for years. Then, in 2004, another version of “The Scream” (Munch painted several) was stolen from the Munch Museum in Oslo; it was recovered two years later. As happened with the “Mona Lisa,” these thefts — though terrible crimes — only added to the painting’s international renown.
Darren Orf
Writer
In the sibling department, every President has had at least one sibling, even if only a half-brother or half-sister. However, a few Presidents are sometimes considered to have been raised as only children — most notably Franklin D. Roosevelt, whose only half-sibling (his father’s oldest son, James) was 28 years FDR’s senior. Bill Clinton’s half-brother, Roger, is about a decade younger than him. Barack Obama also has a 10-year age gap with his younger half-sister Maya, although he learned later in life that he possessed at least five more half-siblings on his father’s side. Meanwhile, Gerald Ford was the only child his mother and father produced, but he was raised with three younger half-brothers after his mother remarried, and as a teen, learned that he also had three younger half-sisters, via his father.
Almost one-third of U.S. Presidents were born in either Ohio or Virginia.
Of the 46 commanders-in-chief so far, seven were born in Ohio, while eight were born in Virginia when it was either a colony or a state. Only 21 states have produced a President.
The no-only-children rule isn’t the only presidential birth quirk. Fifteen Presidents, including Joe Biden, are firstborns. Just seven occupants of the Oval Office have been the babies of their families, among them Andrew Jackson and Ronald Reagan. That means 23 Presidents have fallen somewhere in the middle of the birth order, with the likes of Grover Cleveland and Herbert Hoover being true middle children (they were born to families with nine and three offspring, respectively). John Tyler, the 10th President, fathered the most youngsters himself: 15.
Before she was a First Lady, only child Laura Bush worked as a public school teacher and librarian.
The only U.S. President to get married at the White House was Grover Cleveland.
Grover Cleveland is often remembered for being the sole President elected to non-consecutive terms. Yet he was also the only U.S. President to serve as a groom while in office. His wedding to Frances Folsom fell on June 2, 1886, less than 15 months after his first inauguration. The bride, 22, was a recent Wells College graduate, and Cleveland was the 49-year-old commander-in-chief. Once law partners with Frances’ father, Oscar, Cleveland had known her since she was an infant. After Oscar died in an 1875 carriage accident, Cleveland oversaw the Folsom estate and Frances’ schooling. A decade later, Cleveland proposed in a letter; the pair kept their engagement secret until five days before the wedding. The ceremony occurred in the Blue Room and was attended by 28 guests. Rather than have Frances vow to “honor, love, and obey,” the couple replaced the last word with “keep.” During Cleveland’s time as America’s 24th President, Frances also gave birth to the lone child ever born to a sitting President in the White House: Esther Cleveland came into the world on September 9, 1893. The couple’s second child of five, Esther was born in her parents’ bedroom.
Interesting Facts
Editorial
Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.
Ravens are smart — really smart. Studies have shown that they can use tools, remember human faces, and even plan for the future. This behavior cuts both ways for humans: Edgar Allan Poe’s favorite birds have demonstrated a tendency to both favor people who show them kindness and hold grudges against those who treat them poorly. These preferences aren’t fleeting, either — they may last for years.
Like all corvids, ravens are socially monogamous, meaning mated pairs stick together for life. Other birds that do likewise include the bald eagle, black vulture, and whooping crane.
Raven intelligence is comparable in some cases to that of chimpanzees, which are among the smartest members of the animal kingdom. What’s more, they aren’t the only ones upending the “bird brain” stereotype: Other members of the corvid family — namely crows, jays, and magpies — have displayed exceptional intelligence as well. So the next time you encounter a raven, be sure you get on its good side. You may make a new friend who won’t forget you anytime soon.
English lore has long claimed that the kingdom will fall if ravens ever leave the Tower of London. With that in mind, it’s little surprise that the Ravenmaster has been an official — and important — position at the landmark since the 1960s. The current Ravenmaster is popular on social media and has written a well-received memoir about his experiences tending to the clever birds, whose small stature belies the near-mythical status they occupy in England’s collective imagination.
Michael Nordine
Staff Writer
With an ever-expanding catalogue of more than 171 million items occupying 838 miles of bookshelves, the Library of Congress is the largest library on the planet as measured by collection size. Its prodigious holdings include more than 40 million books and other print materials, 74 million manuscripts, and the largest collection of rare books in North America. “Old King Cole,” which is about a millimeter tall (tinier than a grain of rice), is the library’s smallest book, while a 5-by-7-foot collection of photos of Bhutan is the largest.
The Library of Congress is the country’s oldest cultural institution.
Founded in 1800, the library predates every other federal cultural institution in the U.S. — it’s so old, in fact, that it was brought into existence by the same bill that relocated the capital to Washington, D.C., from Philadelphia.
The library doesn’t just house printed materials, of course. It contains everything from the contents of Abraham Lincoln’s pockets on the night he was assassinated to hundreds of billions of tweets and Amelia Earhart’s palm print. The British Library, with its massive catalogue, is next on the list of the world’s largest libraries, with the top five rounded out by the New York Public Library, Library and Archives Canada, and the Russian State Library.
The Library of Congress was first proposed by James Madison.
The original library was burned down in the War of 1812.
The Library of Congress was comparatively tiny for the first 14 years of its existence, but that didn’t make it any less tragic when its collection of 3,000 books was destroyed along with the Capitol building on August 24, 1814. The conflagration that took it down, part of the War of 1812, necessitated a new location. Enter Thomas Jefferson, who offered his own collection of 6,487 books (then the largest personal library in the nation) as a replacement for the lost volumes. Though he didn’t do so for free — Congress paid him $23,950 — Jefferson did provide the foundation for what the library would eventually become. Sadly, a second fire destroyed most of his contribution as well as nearly two-thirds of the entire collection on Christmas Eve 1851, but the institution rose from the ashes once again.
Michael Nordine
Staff Writer