Original photo by GoodLifeStudio/ E+ via Getty Images
Inspiration can come from the unlikeliest of places. While living in Jamaica in the early 1950s, author Ian Fleming was in search of a name for the secret agent main character in his new book, Casino Royale. “I wanted to find a name which wouldn’t have any romantic overtones,” Fleming later said. “I wanted a really flat, quiet name.”
The film “Thunderball” is named after Ian Fleming’s estate in Jamaica.
The 1995 film “GoldenEye” is named for Fleming’s estate in Jamaica, where he wrote his Bond novels. The name may be a reference to a WWII operation he oversaw as a British naval intelligence officer.
Fleming was an avid bird-watcher, and one of his favorite books was Birds of the West Indies, written by American ornithologist James Bond. “I thought, ‘Well, James Bond, that’s a pretty quiet name,’” Fleming continued, “so I simply stole it and used it.” For years, Bond (who actually went by Jim) had no idea that his name adorned a series of spy novels, but as the popularity of the books grew — and particularly after the premiere of the first 007 film, Dr. No, in 1962 turned Bond into a bona fide pop culture phenomenon — the ornithologist learned about his moniker’s double life. Fleming eventually apologized to Bond, offering his own name for “a particularly horrible species of bird” if Bond ever discovered one. The two met on February 5, 1964, when Jim Bond and his wife showed up at Fleming’s house while on a trip to Jamaica, and left as friends. Fleming even gave Bond a signed copy of the then-unreleased novel You Only Live Twice, inscribed “to the real James Bond.”
Ornithologist Jim Bond’s great-grandfather designed the Brooklyn Bridge.
The inspiration behind James Bond’s code name, 007, remains a mystery.
A 16th-century occultist. A Rudyard Kipling short story. A British bus line. The international dialing code for Russia. A hotel room number. All of these things have been suggested as the inspiration behind Bond’s numerical nickname, 007. In the universe of the novels, the double-0 prefix denotes Bond’s “license to kill,” and the most compelling evidence for its real-world origin also comes from the spy world. In one of his last interviews, Fleming said that he borrowed the “00” because “in the Admiralty, all top-secret signals had the double-0 prefix” at the beginning of WWII. Sounds good, but what about the “7”? Some have posited that it came from part of a code used to encrypt the famous 1917 “Zimmermann Telegram,” in which German foreign secretary Arthur Zimmermann made secret overtures to Mexico calling for an alliance against the U.S. (British decryption of the telegram was considered a major triumph and part of the reason the initially neutral U.S. finally entered the war.) Others think that “7” is just a lucky number. Whatever the case, it makes sense that an international man of mystery has an equally mysterious code name.
Darren Orf
Writer
Darren Orf lives in Portland, has a cat, and writes about all things science and climate. You can find his previous work at Popular Mechanics, Inverse, Gizmodo, and Paste, among others.
top picks from the Inbox Studio network
Interesting Facts is part of Inbox Studio, an email-first media company. *Indicates a third-party property.
Original photo by Cornerstone Photos/ Alamy Stock Photo
In 1954, Sir Hugh Beaver, the managing director of Guinness, thought up a way to reduce pub disputes so bartenders could focus on pouring his company’s signature beers. He suspected that every bar could benefit from a book filled with verified facts and stats about subjects that might arise mid-conversation over a drink. Two events in particular prompted his decision: Earlier in the decade, he and fellow guests at a hunt in Ireland memorably argued about Europe’s fastest game bird, which they had no means of identifying. Then, on May 6, 1954, English athlete Roger Bannister became the first person to run a mile in less than four minutes, causing public interest in records-related news to surge. Norris McWhirter had served as the stadium announcer during Bannister’s historic run, and Beaver hired both him and his identical twin, Ross McWhirter — another sports journalist — to assemble The Guinness Book of World Records. At the time, the pair had already begun working at a London-based agency that supplied facts to newspapers and advertisers.
The U.S. is the second-largest market for Guinness beer.
That claim belongs to Nigeria. The dark brew has been sold there since 1827, although in glass bottles rather than cans. The country that drinks the most Guinness is the U.K., while Ireland comes in third, and the U.S. is fourth.
The McWhirter twins spent about three months working feverishly on their 198-page compendium. Released in the U.K. on August 27, 1955, the book featured about 4,000 records, ranging from the world’s tallest man to the smallest pub. Eight pages of black-and-white photographs broke up the text, along with a few ink drawings. Although initially meant to be given out for free at bars to promote Guinness, the book became so popular that the company began selling it, soon to great success. To date, more than 150 million books from the series — eventually renamed Guinness World Records — have been purchased, educating readers in 40-plus languages. But the brand is no longer beverage-based: Diageo, the alcohol conglomerate that now owns Guinness, sold Guinness World Records in 2001, and the franchise now belongs to the Jim Pattison Group, a Canadian conglomerate.
From 1976 to 1995, New York City's iconic Empire State Building was home to a Guinness World Records museum.
Guinness prompted the Irish government to adjust the trademark of its coat of arms.
All Guinness bottles and cans share the same harp-shaped logo, a nod to a national treasure — a famed 14th-century harp preserved inside the library of Trinity College Dublin. The harp has been incorporated into Guinness labels since 1862, and the beverage titan trademarked the design 14 years later, updating it periodically in the years since. A harp (a reference to the same Trinity College instrument) has also been Ireland’s emblem since the Irish Free State was established in 1922, appearing on its seal of state, coat of arms, and coins. In the early 1980s, Ireland’s office of the attorney general suggested trademarking the harp under international intellectual property jurisdictions with the instrument facing in both directions, but the government worried that the move would invite a lawsuit from Guinness, whose harp has its straight edge on the left. So since 1984, the official nine-string Irish harp has always been pictured with its straight edge to the right.
Interesting Facts
Editorial
Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.
Honey is often credited as a multiuse wonder, known to soothe sore throats, heal burns, and add a little sweetness to drinks and desserts. But if a bottle in the back of your pantry has been collecting dust, you might be wondering whether it’s safe to eat. Don’t worry: As long as it’s stored properly, honey never expires. Its essentially endless shelf life was demonstrated by the archaeologists who unsealed King Tut’s tomb in 1923 and found containers of honey within it. After performing a not-so-scientific taste test, researchers reported the 3,000-year-old honey still tasted sweet.
Earth is home to more than 20,000 species of bees, the vast majority of which do not produce honey. Less than 4% of all bees — around 800 species — are known to turn nectar into honey; in the U.S., that job is most commonly undertaken by Apis mellifera, aka the European honey bee.
Honey’s preservative properties have a lot to do with how little water it contains. Some 80% of honey is sugar, and only about 18% is water. So little moisture makes it difficult for bacteria and other microorganisms to survive. Honey is also so thick that little oxygen can penetrate it — another barrier to bacterial growth. Plus, the substance is extremely acidic, thanks to a special enzyme in bee stomachs called glucose oxidase. When mixed with nectar to make honey, the enzyme produces gluconic acid and hydrogen peroxide, byproducts that lower the sweetener’s pH and kill off bacteria.
Despite these built-in natural preservatives, it is possible for honey to spoil if it’s improperly stored. In a sealed container, honey is safe from humidity, but when left open it can absorb moisture that makes it possible for bacteria to survive. In most cases, honey can be safely stored for years on end, though the USDA suggests consuming it within 12 months for the best flavor.
Ancient conqueror Alexander the Great was reportedly embalmed with honey.
Nearly 500 containers of ancient butter have been found in Ireland.
Finding food offerings inside burial chambers and tombs isn’t unusual in the archaeological world — and can be a useful tool for researchers to understand how people of the past ate. But not all ancient foods are found as grave goods. Take, for example, a barrel of 3,000-year-old butter found in an Irish bog. In 2009, workers harvesting peat in eastern Ireland unearthed a wooden barrel that turned out to be around 3,000 years old, with the butter inside perfectly preserved. While it was an unusual find, the 77-pound bucket of dairy isn’t the first — nor likely the last — to be unearthed; nearly 500 similar containers have been found in Ireland. Historians have dubbed the preserved spreads “bog butter,” and believe they were likely packed and sunk into cool bogs to preserve them or protect them from theft at a time when butter was so valuable that it could be used to pay taxes.
Nicole Garner Meeker
Writer
Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.
Like a lot of strange happenings, it was first noticed in the 1960s: a small seismic pulse, large enough to register on seismological instruments but small enough to go otherwise unnoticed, occurring every 26 seconds. Jack Oliver, a researcher at the Lamont-Doherty Geological Observatory, documented the “microseism” and sussed out that it was emanating from somewhere “in the southern or equatorial Atlantic Ocean.” Not until 2005 was it determined that the pulse’s true origin was in the Gulf of Guinea, just off Africa’s western coast, but to this day, scientists still don’t know something just as important — why it’s happening in the first place.
Not that anyone is in a rush to confirm this theory, but the Richter scale — which measures the size of earthquakes — doesn’t max out at 10, 20, or any other number. Thankfully, most earthquakes are so small as to not even register.
There are theories, of course, ranging from volcanic activity to waves, but still no consensus. There does happen to be a volcano on the island of São Tomé in the Gulf of Guinea near the pulse’s origin point — and another microseism has been linked to the volcano Mount Aso in Japan — which has made the volcanic explanation more popular in recent years. Though there’s no way of knowing when (or even if) we’ll learn the why of this phenomenon, one thing’s for sure: better a microseism than a macroseism.
Tori Amos’ 1992 debut solo album was titled “Little Earthquakes.”
California isn’t the most earthquake-prone state.
That would be Alaska, which isn’t just the most earthquake-prone state in the country — it’s one of the most seismically active areas in the world, with 11% of all earthquakes occurring there. That’s because Alaska is part of the Ring of Fire, a nearly 25,000-mile-long zone along the Pacific Ocean characterized by volcanic and seismic activity. The second-largest earthquake ever recorded (a staggering 9.2 on the Richter scale) took place in the state’s Prince William Sound region on March 27, 1964, lasting about 4.5 minutes and triggering a tsunami that reached as far away as California. Beyond that, three of the eight largest recorded earthquakes in the world have been in Alaska, as were seven of the 10 largest in the U.S. The state has experienced an average of one magnitude 7 to 8 earthquake every year since 1900 and one “great” earthquake (magnitude 8 or higher) every 13 years.
Michael Nordine
Staff Writer
Michael Nordine is a writer and editor living in Denver. A native Angeleno, he has two cats and wishes he had more.
Marie Antoinette’s most famous line has echoed for more than 200 years, reportedly adding fuel to the fire of France’s revolution. The only problem is the French queen’s supposed declaration is a myth — historians don’t think Marie Antoinette ever said, “Let them eat cake,” after being told her subjects had no bread. Researchers point to two main plot holes in the quote’s supposed backstory, the first being its phrasing in English. In fact, the French queen is supposed to have said, “Qu’ils mangent de la brioche,” or “Let them eat brioche,” a reference to a decadent bread made with eggs and butter.
Marie Antoinette helped popularize potatoes in France.
Eighteenth-century botanists adapted potatoes to Europe’s climate, though many Europeans believed they were unsafe to eat — until Marie Antoinette got involved. The queen wore a spray of potato flowers in her hair in 1785, leading spuds to become a fashionable food for high society.
The second problem is that the outline of the tale predates Marie Antoinette’s reign. At least one similar story cropped up around the 16th century in Germany, wherein a noblewoman suggested the poorest citizens in her kingdom eat sweetened bread. However, the first person to print the line about brioche was likely Jean-Jacques Rousseau, a French philosopher who mentioned the story around 1767 in his book Confessions, attributing the comment to a “great princess.” Rousseau’s text was published when Marie Antoinette was still a child in Austria, though it’s possible the story inspired French revolutionaries decades later, and was repeated with the addition of Marie Antoinette’s name as propaganda against the French monarchy. Yet there is no historical evidence — that is, printed materials — proving the queen ever uttered the phrase.
While Marie Antoinette was known for her excessive spending, some historians say the centuries-long smear to her reputation has long overshadowed her philanthropic side. As queen, she established a home for unwed mothers, personally adopted and cared for orphans, and even sold the royal flatware in 1787 to cover the cost of grain for impoverished families — all activities befitting a benevolent ruler who just so happened to love shopping.
The Ohio town of Marietta was named for Marie Antoinette in 1788.
Baking powder wasn’t invented until 1856.
Today, baking a cake can be as quick as whipping together a store-bought mix with eggs and oil, but until the mid-19th century it was an arduous task for home cooks. That’s because baking powder — the leavening agent that gives baked goods their light and fluffy texture — wasn’t invented until 1856. Before then, baking pastries and breads required advance planning, since achieving an airy texture meant using yeast — which wasn’t commercially available until 1822. Before that, bakers had to create their own yeast by fermenting fruit, vegetables, or grains. Even with a successful infusion of yeast, batter had to rise for 12 to 24 hours; some bakers tried other strategies, like whipping eggs thoroughly to add air bubbles, using caustic pearlash (which could add a bitter flavor), or, by 1846, mixing the newly invented baking soda with an acidic liquid like sour milk. In 1856, chemist and Harvard professor Eben Norton Horsford patented the first baking powder, containing monocalcium phosphate, an acidic compound extracted from boiled animal bones. Horsford’s unique product blended that ingredient with baking soda in a shelf-stable, easy-to-use compound that would become popular among chefs and turn the baking powder business into a multimillion-dollar industry by 1900.
Interesting Facts
Editorial
Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.
Along with such phrases as “too much of a good thing” and “the clothes make the man,” we can also thank Shakespeare for the name Jessica. The Bard first used it in his play The Merchant of Venice (likely written around 1596), as the name of the moneylender Shylock’s defiant daughter. Some scholars think Shakespeare may have been inspired by the Hebrew name Iskah from the Bible, which was spelled “Jeska” in some English translations of the Old Testament. The name means both “to see” and “to possess foresight.”
Shakespeare was baptized on April 26, 1564, leading scholars to believe he was probably born on the 23rd (baptisms usually occurred within three days after a birth). He died exactly 52 years later, on April 23, 1616 — at which time his legacy was still only in its infancy.
Though it took several hundred years, Jessica eventually became an extremely popular first name. It consistently ranked among the 10 most popular baby names for girls born in the U.S. between 1976 and 2000, held the top spot from 1985 to 1990, and reclaimed it from 1993 to 1995. Its popularity has waned over the last decade, however, and in 2020 it ranked No. 399. If you’re a Jessica fan, fret not: A successful Merchant of Venice adaptation may be all it takes for the name to reclaim its former glory.
Actress Anne Hathaway shares a name with Shakespeare’s wife.
Some still doubt that Shakespeare wrote his own work.
Though there’s little evidence to support the theory, the humble circumstances of Shakespeare’s life and his lack of a university education have led some scholars to suggest that he was not the true author of his sophisticated, extraordinarily influential body of work. Dozens of other authors have been put forward as the man behind the pen, with Sir Francis Bacon, Christopher Marlowe, and Edward de Vere, the 17th Earl of Oxford, among the most notable names suggested by “anti-Stratfordians.” Others believe that a group of writers collaborated under the name of Shakespeare. The vast majority of scholars reject the theory, but it’s likely that Shakespeare himself would understand why it persists — there’s nothing like a little scandal and intrigue to pique a reader’s interest.
Michael Nordine
Staff Writer
Michael Nordine is a writer and editor living in Denver. A native Angeleno, he has two cats and wishes he had more.
Can you picture an Olympic hopeful waking up at the crack of dawn to spend hours hunched over a drafting table, perfecting their blueprints? Thanks to International Olympic Committee co-founder Pierre de Coubertin, the concept became a reality when the IOC began awarding medals in the categories of sports-related architecture, music, literature, painting, and sculpture at the 1912 Stockholm Games.
The first gold medal in architecture went to the Swiss team of Eugène-Edouard Monod and Alphonse Laverrière for their "Building Plan of a Modern Stadium." By 1928, the architecture competition had been divided into the subcategories of town planning and design, with the Netherlands' Jan Wils winning gold in the latter for his still-standing Olympic Stadium Amsterdam. However, the subjective process of selecting artistic champions ultimately produced some questionable results. Sometimes, finicky judges refused to award gold (or silver, or bronze) medals when the quality of submissions failed to meet their lofty standards. Other times, such as during the 1936 Berlin Games, the host country’s creative teams tallied a suspiciously disproportionate share of winning hardware.
Nero reportedly bribed officials into letting him compete in the 67 CE Olympics (the Games were traditionally restricted to Greeks). He won several arts competitions, and was also named winner of the chariot race despite falling and failing to finish.
Artistic competitions remained part of the Olympics following a hiatus for World War II, with Austria's Adolf Hoch and Finland's Yrjö Lindegren claiming architecture gold in 1948. However, the writing was on the wall for these Jim Thorpes of the compass and T-square, as new IOC President Avery Brundage (who started in 1952) strongly discouraged the proliferation of professionals in the amateur realm. The creative arts were permanently relegated to the sideshow of Olympic exhibitions in 1952, and the hard-earned efforts of champion builders, singers, and writers from the first half of the 20th century were banished to obscurity when their medals were stricken from the Olympic record books.
Greece's Panathenaic Stadium, site of the first modern Olympics in 1896, was built entirely from marble.
The 1900 Olympics represented the high-water mark for bizarre Olympic events.
While obscure sports have come and gone from the Olympics over the years, the 1900 Paris Games stick out for the sheer number of off-the-wall competitions. This can at least partly be explained by the fact that the Olympics coincided with the spectacle of the 1900 Paris Exposition, resulting in events that ranged from weird (horse long jumping) to cruel (live pigeon shooting) to pointless (underwater swimming). Yet these Olympics were also memorable for some of the more inspired moments of innovation, which included multinational teams competing in tennis, polo, football, rowing, and tug-of-war. The 1900 Games also marked the first year that women were allowed to compete, an accomplishment barely dimmed by the fact that only a single fan showed up to watch the women square off in croquet.
Tim Ott
Writer
Tim Ott has written for sites including Biography.com, History.com, and MLB.com, and is known to delude himself into thinking he can craft a marketable screenplay.
Lady Liberty has held her torch high in the New York City skyline since 1886, but at one time, the grand statue did more than just inspire Americans — it was also a lighthouse. The same year French sculptor Frédéric-Auguste Bartholdi oversaw completion of his copper creation (formally named “Liberty Enlightening the World”), President Grover Cleveland approved a plan for the statue to be lit as a lighthouse. Engineers believed the Statue of Liberty’s torch, at 305 feet above sea level, could act as a navigational aid for ships approaching New York Harbor, and set to work installing nine electric lamps within the torch, plus more along Lady Liberty’s feet and in the statue’s interior.
French sculptor Frédéric-Auguste Bartholdi was issued a U.S. patent for his Statue of Liberty design in 1879, seven years before the statue was completed. The design patent protected Bartholdi from replicas of all sizes (including miniature versions), but lasted only 14 years.
At 7:35 p.m. on November 1, 1886, engineers flipped on the power switch, washing the Statue of Liberty in light for the first time. However, the lights stayed on for just one week due to a lack of funding, and it took two weeks of darkness before the U.S. Lighthouse Board could secure an emergency budget. Even once the lights were turned back on, some questioned the statue’s efficacy as a lighthouse: Newspapers reported that while the lights were initially planned to reach 100 miles or more out at sea, in reality the torch was visible just 24 miles from the harbor. By the early 20th century, the lighthouse was considered “useless” for boat navigation, and on March 1, 1902, the U.S. War Department, with approval from President Theodore Roosevelt, extinguished the light permanently.
The island on which the Statue of Liberty stands was originally called Bedloe’s Island.
Lady Liberty’s original torch was destroyed in an explosion.
Despite being nearly 140 years old, most of the Statue of Liberty’s copper frame is original. However, one portion, the torch, was replaced in the 1980s due to extensive damage caused by an explosion. In 1916, amid World War I, German saboteurs attempted to stop the U.S. from supplying Britain with ammunition, stores of which were held on Black Tom Island, not far from Lady Liberty in New York Harbor. The saboteurs set the stockpile ablaze, resulting in an enormous explosion equivalent to a 5.5 magnitude earthquake, which was felt as far away as Philadelphia. The Statue of Liberty sustained more than $100,000 in damage from shrapnel (about $2.8 million today), including structural mangling of the torch that led to its permanent closure (it was once open to visitors). In 1984, Lady Liberty underwent a multiyear restoration that included replacing the severely damaged torch, and today sightseers can see the original up close at ground level in the Statue of Liberty Museum on Liberty Island.
Nicole Garner Meeker
Writer
Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.
There are more people on Earth today than ever before — nearly 8 billion — representing about 7% of the estimated 117 billion people who have ever lived throughout the course of human history. The figure comes from the Population Reference Bureau, which released its first estimate in 1995 and has updated it occasionally in the years since. As with most math on this scale, the calculus wasn’t easy. That’s partly because our knowledge of history is ever-evolving: When the bureau initially calculated the number, modern Homo sapiens were thought to have first appeared around 50,000 BCE, but recent discoveries put the actual date closer to 200,000 BCE.
All the cattle on Earth weigh more than all the humans.
With a biomass of about 386 million tons, humans weigh a lot — but we don’t weigh as much as our bovine neighbors. An estimated 1.3 billion cattle share the planet with us, and their biomass comes out to an absolutely beefy 716 million tons.
Three main factors go into the math: how long humans are thought to have been walking the Earth, the average population during different eras, and the number of births per 1,000 people during said eras. As you might imagine, the growth has been astronomical — there were just 5 million humans in 8000 BCE, 300 million in 1 CE, and 450 million in 1200. And while the bureau acknowledges that its estimate is “part science and part art,” even a figure that’s off by a few billion gives us a ballpark sense of all the people who came before us.
India is projected to overtake China as the world’s most populous country.
The United Nations estimates that will happen within the next five years, though new projections suggest it may happen even sooner. When the U.N. first made its report in 2019, India was home to 1.37 billion people and China had a population of 1.43 billion. China’s birth rate has been declining in recent years, however, hence the updated timeline. Once India becomes the world’s most populous country, it’s projected to maintain that position for the rest of the century.
Michael Nordine
Staff Writer
Michael Nordine is a writer and editor living in Denver. A native Angeleno, he has two cats and wishes he had more.
Original photo by Chris Willson/ Alamy Stock Photo
In 1998, a fur-covered robot hit store shelves just in time for the holiday shopping season, creating a frenzy among parents. Manufacturer Tiger Electronics had released the first real-life robotic pet: Furby. Partially resembling a hamster (thanks to its scruffy acrylic fur) and an owl (complete with pointed ears and a beak), the computerized toy greeted children and sang to them in Furbish, an entirely made-up language. Furby’s main hook was interaction: It could be startled by loud noises, responded to petting, and danced when it was happy, just like a real animal might. But the most innovative feature was that the small robots could supposedly learn English, a gimmick that created a whirlwind of conspiracies, including the idea that Furby was an international spy.
Furby was the first robotic toy to use artificial intelligence.
A slew of robotic toys emerged around 2000, heralding the millennium with computerized novelties. But Furby was considered the first of its kind to use (rudimentary) artificial intelligence, equipped with sensors that allowed it to respond to humans and other Furbys.
Because Furby was the first toy of its kind, most people didn’t understand how it “learned” language, and the initial fervor was so intense that it led the National Security Agency to ban the toys from its premises; it was also banned from the Norfolk Naval Shipyard and the Pentagon. NSA agents believed the robots were embedded with recording devices that could allow them to listen in on sensitive topics and later replay classified conversations. Tiger Electronics disputed the ban, explaining that while the toy was unique, “Furby [was] not a spy,” going so far as to reveal that the toys were pre-programmed with around 200 words — meaning they didn’t actually learn anything — and that they slowly unveiled their vocabulary the longer a child played. Meanwhile, the outlandish Furby fears (including the belief that it could launch a space shuttle) didn’t slow its popularity; more than 40 million of the revolutionary robots were sold in the first three years.
Today, personal electronics sometimes seem like the only way to cope with the grueling ordeal of air travel, helping us pass the time with an in-flight movie or music. But that wasn’t always the case — not so long ago, the Federal Aviation Administration (FAA) prohibited using CD players, laptops, and even Furbys on airplanes. The 1990s ushered in a wave of portable electronics, and with their popularity came a theory that many devices could interfere with a plane’s navigation system, creating chaos in the skies. In an effort to protect passengers and pilots, the FAA banned the use of many electronics during takeoff and landing, including the incredibly popular robotic toy, which had to have its batteries removed before takeoff. No plane control issues were ever attributed to a Furby on board, though there likely was one benefit to powering down the robots while in the air: their silence, since many people found their constant chatter grating.
Nicole Garner Meeker
Writer
Nicole Garner Meeker is a writer and editor based in St. Louis. Her history, nature, and food stories have also appeared at Mental Floss and Better Report.