Post-Its
In 1968, scientist Spencer Silver was working at the Minnesota Mining and Manufacturing Company, also known as 3M. Founded in 1902, 3M quickly evolved beyond mining, and by the mid-20th century, the company had expanded into the adhesives game. At the time, Silver was trying to make “bigger, tougher, stronger” adhesives, and thus considered one of his creations, known as acrylate copolymer microspheres, a failure. These microspheres could retain their stickiness but could also be removed easily — not exactly big, tough, or strong.
While Silver believed this light-hold adhesive could have some sort of use (he patented it just to be safe), he couldn’t put his finger on what that use was, exactly, until one day when fellow 3M scientist Art Fry was in search of a bookmark that could stick to pages without damaging the paper. Fry immediately thought of Silver’s microspheres, and the two scientists soon found themselves writing each other messages around the office on the world’s first Post-it Notes. “What we have here isn’t just a bookmark,” Fry once said. “It’s a whole new way to communicate.”
Microwave Ovens
Today 90% of American households have a microwave oven — and it’s all thanks to magnetron expert Percy Spencer. In the mid-1940s, Spencer was working at the aerospace and defense company Raytheon when he stepped in front of an active radar set. To his surprise, the candy bar in his pocket melted. Spencer conducted a few more experiments, using popcorn kernels and eggs, and realized that microwaves could vibrate water molecules, generating heat that cooked food. Raytheon patented the invention in 1945 and released the first microwave oven, called the “Radarange,” the next year. It weighed 750 pounds and cost $5,000 (about $52,000 today). It wasn’t until the 1970s that both the technology and the price reached that consumer sweet spot, and microwave ovens became a must-have appliance in every U.S. home.
Penicillin
If you ever need to stress to your boss the importance of vacation, share the tale of penicillin. On September 3, 1928, Scottish physician Alexander Fleming returned to his laboratory at St. Mary’s Hospital in London after a vacation of more than a month. Sitting next to a window was a Petri dish filled with the infectious bacteria known as staphylococcus — but it’s what Fleming found in the dish alongside the bacteria that astounded him.
Inside the Petri dish was a fungus known as penicillium, or what Fleming at the time called “mould juice.” Whatever the name, this particular fungus appeared to stop staphylococcus from spreading, and Fleming pondered whether this fungus’s bacteria-phobic superpowers could be harnessed into a new kind of medicine. Spoiler: They could, and in the years that followed, penicillin was developed into the world’s first antibiotic. Fleming shared the 1945 Nobel Prize in medicine for his accidental yet world-changing discovery. “I did not invent penicillin. Nature did that,” Fleming once said. “I only discovered it by accident.”
X-Rays
In November 1895, German scientist Wilhelm Conrad Röntgen was hard at work studying cathode rays in his Würzburg laboratory when a chemically coated screen 9 feet away began to glow. What followed was seven weeks of what Röntgen’s wife, Bertha, later described as a “dreadful time.” Röntgen worked tirelessly, obsessed with discovering the secrets of the phenomenon he called “X-rays” (named because the rays were unknown, as in “solving for x”) — often coming home in a bad mood, and eating silently before immediately retreating to his lab. Eventually, he even moved his bed to his lab so he could work around the clock. As Röntgen would later put it, “I didn’t think; I investigated.”
The result of this investigation was a paper published in late December that same year, titled “On a New Kind of Rays.” The work detailed how these X-rays could penetrate objects, and the medical applications for such an invention were immediately apparent. Within a month or two, the first clinical uses of X-rays occurred in Hanover, New Hampshire, and Röntgen became the recipient of the first Nobel Prize in physics in 1901.
Vulcanized Rubber
On its own, natural rubber isn’t immensely useful — it melts in warm weather, cracks in the cold, and adheres to basically everything. But once rubber undergoes a process known as “vulcanization,” in which natural rubber is mixed with sulfur (or some other curative) and heated to between 140 and 180 degrees Celsius, it gains immense tensile strength and becomes resistant to swelling and abrasion.
Although creating this kind of tough rubber is a relatively complicated process, evidence suggests that an ancient Mexican people known as the Olmecs (whose name means “rubber people”) used some type of vulcanization. But modern vulcanization didn’t arrive until 1839, when American inventor Charles Goodyear accidentally dropped India rubber mixed with sulfur on a hot stove. Recognizing that the rubber held its shape and also gained strength and rigidity, Goodyear soon patented his discovery. Alas, protecting those patents from infringement proved impossible, and Goodyear died in 1860 some $200,000 in debt.
However, Goodyear still saw his life as a success, once writing: “I am not disposed to complain that I have planted and others have gathered the fruits. A man has cause for regret only when he sows and no one reaps.” Thirty-eight years later, American entrepreneur Frank Seiberling started a company to supply tires for the nascent automobile industry. Because creating tires capable of handling the rough terrain of dirt roads relied entirely on the process of vulcanization, Seiberling named his enterprise after the man who made it all possible — calling it the Goodyear Tire & Rubber Company.
Velcro
Amazing inventions come to curious minds, and that’s certainly the case for Swiss engineer George de Mestral. While on a walk in the woods with his dog, de Mestral noticed how burrs from a burdock plant stuck to his pants as well as his dog’s fur. Examining the burrs under a microscope, de Mestral discovered that their tips weren’t straight (as they appeared to the naked eye), but instead contained tiny hooks at the ends that could grab hold of the fibers in his clothing. It took nearly 15 years for de Mestral to recreate what he witnessed under that microscope, but he eventually developed a product that both stuck together securely and could be easily pulled apart. In 1954, he patented his creation, dubbing it “Velcro,” a portmanteau of the French words velours (“velvet”) and crochet (“hook”).
Synthetic Dye
For most of human history, dyes and pigments were sourced from natural resources such as metals, minerals, and even bat guano. It was an expensive process, and one of the most costly colors to create was purple, which had to be sourced from a particular mollusk along the coast of Tyre, a city in modern Lebanon. In fact, the dye was so expensive that the color was reserved for royalty, with monarchs like Queen Elizabeth I even passing laws to ensure as much.
Then came 18-year-old British chemist William Henry Perkin. In 1856, Perkin was working in a lab, where he was trying (and failing) to produce a synthetic form of quinine, a compound found in the bark of cinchona trees and used to treat malaria. When he washed out the brown sludge of one failed experiment with alcohol, the mixture turned a brilliant purple. Calling his creation “mauveine,” Perkin soon realized that not only was this dye cheap to produce, but it also lasted longer than dyes derived from natural sources, which tended to fade quickly.
Perkin’s discovery kick-started a chain reaction of chemical advances that brought cheap, colorful dyes to the fashion industry. Within six years of Perkin’s happy accident, even Queen Victoria herself began wearing colorful garments of bright mauveine.