Original photo by Docinets Vasil/ iStock

Weddings are celebratory occasions filled with dancing, decadent slices of cake, and perhaps one too many libations. But before you show up to the venue and get the party started, it’s worth noting the (usually) unwritten rules you’re expected to follow when invited to someone’s big day.

Many etiquette experts offer advice on topics ranging from how guests should dress to how much to spend on a gift. Following their suggestions ensures everyone enjoys themselves — especially the happy couple. Here are six etiquette tips every wedding guest should vow to uphold upon receiving a save-the-date.

Credit: madisonwi/ iStock

Always RSVP on Time — If Not Early

A typical RSVP due date is three to four weeks before the big day, allowing the couple enough time to make arrangements for seating and food based on the number of people expected to show up. The RSVP deadline is usually listed on the physical invite and/or wedding website, and you should always respond by that date whether or not you plan to attend. As etiquette expert Myka Meier tells Brides.com, “It can be added stress for the couple to have to follow up with guests who do not RSVP.”

There’s also no such thing as responding too early once invites have been sent. Be sure to formally reply by whichever means the couple requests, whether by mail or online, as this keeps all the RSVPs organized in one place. If you only tell the couple in person, your reply could slip their mind later, leaving them with no written record of it.

Credit: staticnak1983/ iStock

Only Bring Invited Guests

When you’re invited to a wedding, take note of the names on the invitation. If it’s just your name, don’t show up with a guest or assume your little ones are welcome. The only folks who should attend a wedding are people listed on the invite. Of course, if the invitation lists “Guest” in addition to your name, it’s acceptable to bring any plus-one of your choosing. 

If you’re asked to attend solo, it may be because the venue has a limited capacity, or the couple may not want children at the event. In fact, the wedding company Zola notes that one in six couples prefer to have adult-only weddings. 

Credit: Halfpoint/ iStock

Avoid Wearing Certain Colors

When deciding what to wear to a wedding, start by looking at the dress code. “Black-tie” suggests a more formal affair, while “garden party” implies more of a colorful, semi-casual vibe. Once you’ve settled on what type of garment to wear, you’ll need to choose a color. 

One of the most widely known wedding etiquette rules is to avoid wearing white and similar colors such as ivory or cream, or even very light pastels that could show up as white in photographs. Wedding planner Brandi Hamerstone tells Martha Stewart, “The bride may or may not wear white … but it’s her color for that day. You don’t want to be mistaken for the bride.” The only exception to this rule is if the couple explicitly requests white attire to be worn for a less traditional affair.

In addition to avoiding white, you should steer clear of whatever color the wedding party wears if you’re made privy to this information, as matching the bridesmaids or groomsmen may cause confusion. Apri Brown, flagship manager of the bridal brand Amsale, tells The Knot, “It’s courteous to avoid wearing the same palette as the bridal party so they can stand out.”

Of course, those colors vary depending on the wedding. To find out which color is taboo, Brides.com recommends asking a member of the bridal party rather than the couple, who likely already have enough on their plate. Alternatively, you can check the website or invitation for the color palette to get some insight on which shades to skip.

Credit: LumiNola/ iStock

Don’t Contact the Couple on Their Wedding Day

On the day of the wedding, you don’t want to bother the couple, even if you’re running late or feeling sick. Calling them to say you may not make it is only going to add to their overall stress on an already hectic day. Instead, Myka Meier tells Brides.com that you should “try reaching a wedding planner, bridesmaid, or groomsman who can help you.” They’ll be able to provide you with some guidance and can pass along your message to the couple if and when the time is right.

For common questions such as what time the reception starts or directions to the venue, you should avoid bothering anyone involved in the wedding. Those details should be listed on the invitation or website, and it’s easy for you to find those answers on your own.

Credit: Dmytro Duda/ iStock

Arrive 15 Minutes Early

Running late can be perceived as disrespectful on any normal day, and to do it on the day of someone’s wedding can be seen as especially rude. Weddings often start promptly at the time listed on the invitation, so showing up even five minutes late may cause a big distraction. 

Etiquette expert Lisa Mirza Grotts tells Business Insider, “Guests should plan to arrive at least 15 minutes early,” as that gives you ample time to get seated and clear the aisle. In Brides.com, wedding planner Carina Van Son adds, “The old adage of ‘15 minutes early is on time, and on time is late,’ applies here.”

If you do arrive seconds before the bride is set to walk down the aisle, don’t try to scurry in front of her or sneak in through a side door. Etiquette expert Christin Gomes tells Brides.com that tardy guests should “rely on the ushers to let you know when you can enter the ceremony.” 

However, while arriving 15 to 30 minutes before the ceremony is perfectly acceptable, any earlier than that may create complications. The staff may be distracted by your presence, which can impede their work as they hurry to put finishing touches on the venue. With this in mind, aim to time your arrival appropriately.

Credit: Vasil Dimitrov/ iStock

Stick to the Gift Registry

Wedding invites often come with the expectation of purchasing a gift in return, whether you end up attending or choose to send a present in your absence. If you’re wondering how much to spend, a 2024 guest study conducted by The Knot suggests the average wedding gift is around $150. Feel free to spend more for a close friend or family member and less for people you don’t know as well, such as a colleague or family friend.

According to the Emily Post Institute, cash is always an acceptable wedding gift. But if you’re looking to buy something more tangible than money, it’s best to stick to the couple’s registry. In fact, The Knot lists this as their top piece of advice when it comes to wedding gift etiquette, warning that if you buy off registry “you run the risk of getting the happy couple something they already have or something they don’t need (and possibly don’t have space for).” 

Keep in mind that some couples may be starting a new life together in a new space, and they may not have room for two of the same frying pan or the painting you think they’ll love. Buying gifts off the registry helps avoid duplicates and headaches, as all those gifts are items the couple has specifically selected.

Bennett Kleinman
Staff Writer

Bennett Kleinman is a New York City-based staff writer for Optimism Media, and previously contributed to television programs such as "Late Show With David Letterman" and "Impractical Jokers." Bennett is also a devoted New York Yankees and New Jersey Devils fan, and thinks plain seltzer is the best drink ever invented.

Original photo by TT News Agency/ Alamy Stock Photo

The world’s most talented songwriters don’t always end up performing the hit songs they write. Maybe the tune isn’t quite right for the album they’re currently working on, or they’re specifically hired to compose a song for someone else. Whatever the case, some tracks are handed off to a different musician, and that version becomes a smash hit. 

World-class talents such as Prince, Willie Nelson, and the Bee Gees have all penned songs that became an indelible part of another artist’s repertoire. Here’s a look at five famous musicians who wrote hit songs for someone else.

Credit: (left) David Corio/ Redferns via Getty Images, (right) Michael Ochs Archives via Getty Images

Prince: “Manic Monday” by the Bangles

The 1986 single “Manic Monday” was recorded by the all-female rock quartet the Bangles, and many assume the group wrote the track. But the song was originally penned two years earlier by Prince, the same year he released the seminal album Purple Rain.

Back in 1983, Prince founded an all-female singing trio called Apollonia 6. The following year he wrote “Manic Monday” for that band’s namesake album, though he ultimately opted to leave it off the record. He later decided it was the perfect song for the Bangles, who’d just made their debut with the 1984 album All Over the Place. 

According to Bangles founder Susanna Hoffs, Prince was a big fan of that album, specifically the song “Hero Takes a Fall.” Prince later showed up unexpectedly at a Bangles gig in San Francisco, where he hopped on stage and played a solo backed by the rest of the band.

In May 1985, Prince formally offered “Manic Monday” to the Bangles (along with the song “Jealous Girl,” which the band rejected). Songwriting credit was given to “Christopher” — a pseudonym meant to reference Prince’s character in the 1986 film Under the Cherry Moon. “Manic Monday” eventually peaked at No. 2 on the Billboard Hot 100 in April 1986. Coincidentally, the No. 1 song that very same week was Prince’s own song, “Kiss.”

Credit: (left) Rick Diamond/ Archive photos via Getty Images, (right) Michael Ochs Archives via Getty Images

Willie Nelson: “Crazy” by Patsy Cline

“Crazy” is undoubtedly the defining song of Patsy Cline’s career. It’s also her only top 10 Billboard hit, having reached No. 9 on the Billboard Hot 100 in October 1961. But Cline didn’t pen this beautiful ballad; that credit belongs to country legend Willie Nelson. At the time, Nelson was in his mid-20s and just kicking off his own musical career.

In the late 1950s, Nelson used his daily commute to compose new song ideas. On one such journey, he wrote “Crazy” in less than an hour, though it was originally titled “Stupid.” The song was far from a hit at first, and he struggled to sell the tune because it was more complex than most country songs at the time. As Nelson told American Songwriter, “Crazy” “wasn’t your basic three-chord country hillbilly song,” and he ended up having “problems immediately… because it had four or five chords in it.”

In the 2023 book Energy Follows Thought: The Stories Behind My Songs, Nelson explained how the song ended up in Patsy Cline’s hands. It began when he brought a scratchy recording of the tune to Nashville’s Ryman Auditorium one night, where he met Cline’s husband, Charlie Dick. Dick loved the song so much that he drove Nelson over to his house at 1 a.m. and woke up Cline so she could listen to it. Cline went on to put her own spin on the tune, which became the hit we know today.

Credit:  (left) Jack Mitchell/ Archive Photos via Getty Images, (right) Michael Ochs Archives via Getty Images

Neil Diamond: “I’m a Believer” by the Monkees

The Monkees started as a fictional band created for their namesake sitcom, which aired on NBC from 1966 to 1968. Despite the fact that they didn’t even play their own instruments at first (they relied on studio musicians for that), they eventually became a real-life pop sensation. 

Monkees songs topped the Billboard Hot 100 chart three separate times, including “Last Train to Clarksville” and “Daydream Believer,” both of which were written by other songwriters. The band also hit No. 1 with 1966’s “I’m a Believer,” which was written by a then up-and-coming Neil Diamond, who went on to become one of the bestselling musicians of all time.

Diamond cracked the Billboard Hot 100’s top 10 for the first time on May 21, 1966, with his song “Solitary Man,” and hit No. 6 with “Cherry, Cherry” that August. His work caught the attention of the Monkees’ musical supervisor Don Kirshner, who reached out to Diamond’s producers and asked if he had any other songs similar to “Cherry, Cherry.” 

Diamond agreed to let the Monkees record “I’m a Believer,” which became a major hit for the group. Diamond later released his own version on his 1967 album Just for You, though it proved to be far less popular. During a 2008 interview with Mojo, he was asked if he was upset that the Monkees’ version performed better. He replied that he was “thrilled because, at heart, I was still a songwriter and I wanted my songs on the charts.”

Credit: (left) Hulton Archive/ Archive Photos via Getty Images, (right) Gary Gershoff/ Archive Photos via Getty Images

The Bee Gees: “Islands in the Stream” by Dolly Parton & Kenny Rogers

On October 29, 1983, “Islands in the Stream” topped the Billboard Hot 100 chart, where it remained for two weeks. This beautiful duet between country legends Kenny Rogers and Dolly Parton was a hit, earning Rogers his first No. 1 pop song since 1980’s “Lady” and Parton her first since 1981’s “9 to 5.” But despite both artists being talented songwriters in their own right, the song was actually written by Barry, Maurice, and Robin Gibb — collectively known as the Bee Gees.

In a 2001 interview with Good Morning America, Barry Gibb said the band originally wrote the tune for Diana Ross, although Gibb’s brother, Robin, interjected to say they actually wrote it for Marvin Gaye. Whatever the truth may be, the song ultimately fell into the hands of Kenny Rogers in 1983, with Barry Gibb producing. 

According to American Songwriter, Rogers was brought in to record the song solo, but ended up disliking the tune after four days of singing it. Barry introduced the idea of adding Dolly Parton — who happened to be downstairs at the studio — and making it a duet. Parton was brought into the room and completely transformed the song into the smash hit it remains today.

Credit: (left) Tony Evans/Timelapse Library Ltd/ Hulton Archive via Getty Images, (right) Michael Ochs Archives via Getty Images

Paul McCartney: “Come and Get It” by Badfinger

Badfinger might never have achieved their success if not for the mentorship of the Beatles. Back in 1968, they were the first band signed to the Beatles’ Apple Records label, and they even took their name from the working title of a Beatles song. (“With a Little Help From My Friends” was originally called “Bad Finger Boogie” since John Lennon performed it with an injured hand.) Badfinger can also credit their first major success to Paul McCartney, who wrote the top 10 hit song “Come and Get It.”

McCartney originally wrote “Come and Get It” for the 1969 movie The Magic Christian, which starred Beatles drummer Ringo Starr. McCartney recorded a demo of the song on July 24, 1969, and even considered including it on Abbey Road. But he decided to give the song over to Badfinger instead, and held auditions to determine which band member would sing lead. (Rhythm guitarist Tom Evans was the winner.) 

While the band suggested putting their own take on it, McCartney instructed them to perform it exactly like the demo and turned down any other suggestions. The result was a success, as “Come and Get It” peaked at No. 7 on the Billboard Hot 100 chart on April 18, 1970, and spent 15 weeks total on the chart.

Bennett Kleinman
Staff Writer


Original photo by Hi-Story/ Alamy Stock Photo

We’ve all learned the basics of minding our manners — say “please” and “thank you,” always hold the door — but etiquette is constantly evolving. What was once considered proper or even mandatory behavior may become laughably outdated, or even inappropriate, over time.

One look at phone call behavior, for example, reveals a shift in etiquette norms. According to a recent survey, when asked if calling someone late at night is acceptable, 83% of participants aged 55+ said no, but only 45% of the youngest group (18 to 34) agreed with them. The acceptance of after-hours calls isn’t the only thing that’s changed. Here are seven etiquette tips from the past that no longer hold up in our modern world.

Credit: FreshSplash/ iStock

Waiting for a Formal Introduction

In centuries past, it was considered impolite to introduce yourself. Instead, it was proper to wait until a third party — a host, mutual acquaintance, or colleague — facilitated the introduction. This rule was closely followed in Victorian-era Britain, when introductions also adhered to strict social rankings. In social situations, people of a lower rank or social standing were introduced to their superiors, unless a woman was involved; a woman was always introduced to a gentleman, regardless of rank or social standing.

Formal introductions, however, have become a thing of the past thanks to modern communication’s reliance on speed and spontaneity. According to the experts at Debrett’s, London’s leading authority on modern manners, not only is it perfectly acceptable to introduce yourself to someone, but it’s often “the most practical solution.” It’s efficient, friendly, and spares everyone from awkward silence. Saying “hello” isn’t just acceptable nowadays; it’s expected. 

Credit: Rawpixel/ iStock

Never Toasting With Water

Like many toasting faux pas, this rule is part old-fashioned etiquette, part superstition. The taboo against clinking glasses with only water (instead of alcohol) in your cup is prominent in maritime folklore, perpetuated by sailors who feared that toasting with water would lead to drowning. But it’s no longer considered rude to toast with water. As The English Manner’s website explains, “Rules about not toasting with anything other than alcohol are nonsense.”

Indeed, modern etiquette experts say “cheers” to your sparkling water. Whether you’re abstaining from alcohol or simply prefer a soda, it’s completely acceptable to participate in a toast with whatever is in your hand. One toasting etiquette tip still stands: Always try to join in the toast, regardless of what you’re drinking. The spirit of the toast matters more than what’s in your glass.

Credit: PrathanChorruangsak/ iStock

Wearing Formal Attire on Airplanes

Passengers once dressed to the nines while flying commercial airlines. Three-piece suits, dresses, hats, and high heels were the norm. The first commercial flights took to the skies during the 1920s, and passengers were fashionably bundled up in their finest jackets and hats, which helped combat the chilliness of early airplane cabins.

Air travel continued to improve throughout the ’30s and ’40s, and passengers continued to don the latest fashion trends on their flights. The 1950s, known as the golden age of air travel, brought with it plane cabins akin to swanky dinner clubs, with prime rib, lobster, and top-shelf alcohol on the menu.

By the 1960s, flying became less glamorous, and passengers began to wear more casual clothing. Within a few decades, suits were a thing of the past unless you were flying on business. Today, travelers favor comfort over couture, a switch that may be due in part to cramped airline cabins, unpredictable flight delays, and more strenuous security measures. After all, sprawling airport terminals are no place for high heels.

Credit: xavierarnau/ iStock

Flirting Discreetly

Throughout modern European history, flirting in public was frowned upon and considered a major violation of polite society etiquette. In the Victorian era, flirtation was highly coded and discreet to avoid breaking any rules. As a result, young single women had chaperones who followed them in public and who would shoo away any potential ill-mannered suitors. 

To further mitigate any untoward behavior, “escort cards,” alternatively known as “acquaintance cards,” were used to initiate courtship without indiscretion. These cards featured brief messages such as “May I. C. U. Home Tonight?” that an interested gentleman would slip into a lady’s hand.

Today, the etiquette for expressing your interest in someone has drastically changed. Speaking to someone directly is usually the best course of action. It may even be considered rude or odd to hand them a business card-like note à la the Victorians. According to behavioral scientist and relationship coach Clarissa Silva in an interview with Verywell Mind, the hallmarks of respectful and successful modern-day flirting include eye contact, smiles, humor, genuine compliments, and active listening — no escort cards necessary.

Credit: LaylaBird/ iStock

Handshakes Determined by Gender Roles

Today, you probably stick out your hand for a friendly handshake without much thought. However, for centuries, strict guidelines governed who should initiate a handshake. Traditionally, gender played a role in this, and men were not expected to shake hands with women — rather, they waited for women to extend their hands first. But “ladies first” is a diminishing etiquette rule in many aspects today.

The Etiquette School of New York suggests that in modern business settings, everyone should be free to shake hands with anyone, unless someone is uncomfortable doing so. In Western business culture, the person with the highest perceived status (regardless of gender) — say, the CEO — should initiate the handshake first. But outside of business settings and in most social situations, inclusivity and mutual respect now lead the way, so don’t feel confined to gender norms.

Credit: LeoPatrizi/ iStock

Men Walking on a Woman’s Right Side

Speaking of archaic gender roles, in medieval Europe, it was once considered good etiquette for men to walk on a woman’s right side. According to Primer Magazine, this would allow a man’s right hand to be free to easily reach his sword (worn on the left) should a threat arise, while the left arm was used to escort the woman.

This tradition evolved during the 19th century as carriages became readily available to the public and were no longer a luxury reserved for the wealthy. With streets busier and more dangerous than ever before, men were expected to walk on the side nearest the street to protect their escorts from carriages, mud splashes, or any impromptu duels requiring a quick sword draw.

As for who walks on which side of the sidewalk today, it generally doesn’t matter. The practice of a man escorting a woman on her right side, however, is still seen in formal wedding ceremonies, hearkening back to the days when a man kept his sword hand free.

Credit: FreshSplash/ iStock

Bringing Your Own Napkin

The time-honored saying “do as the Romans do” doesn’t apply here — unless you want to get some peculiar looks at dinner parties. In ancient Rome, dinner guests would bring their own cloth napkins to wipe their hands and faces and package up any leftovers. These napkins were the earliest versions of “doggie bags,” and it was considered impolite (and impractical) to arrive at a dinner party without one.

While this “BYON” practice has vanished, the spirit lives on in the etiquette practice of bringing a gift to a dinner party. Etiquette experts at the Emily Post Institute say a gift for the host is always appropriate unless you’re close friends who dine together frequently. They suggest gifts such as wine, champagne, flowers, artisanal snacks, or housewares. A nice set of cloth napkins would be a charming nod to Roman tradition.

Rachel Gresh
Writer

Rachel is a writer and period drama devotee who's probably hanging out at a local coffee shop somewhere in Washington, D.C.

Original photo by Mike Bird/ Pexels

Hood ornaments were once a symbol of automotive elegance and brand identity, and no self-respecting luxury vehicle would hit the roads without one for much of the 20th century. These prominent, hood-based miniature sculptures were status symbols, proudly displaying a manufacturer’s identity. 

But at some point, hood ornaments began to disappear — and a casual glance at the traffic rumbling along today’s roads attests to their near extinction. Most modern vehicles feature a flush and comparatively subtle badge or emblem integrated into the hood, bumper, or rear hatch — a far cry from the flashy, protruding mascots of earlier decades. Let’s take a trip through automotive history to examine what exactly caused the decline of hood ornaments. 

Credit: Jordan Domjahn/ Alamy Stock Photo

The Golden Age of Hood Ornaments

The first half of the 20th century was the golden age of hood ornaments — or car mascots, as they were sometimes known — which were originally used for both practical and aesthetic purposes. 

The trend started in 1911, when Rolls-Royce began attaching its iconic Spirit of Ecstasy figurine to its hoods, purely as a status symbol. Around the same time, a new invention called the Boyce MotoMeter became popular. This device was basically a thermometer that screwed directly into the radiator cap, which back then was located on the hood. This allowed the driver to see the engine temperature while driving to help avoid overheating. MotoMeters were both practical and quite ornate, and they helped to further popularize the concept of hood ornaments. 

MotoMeters became obsolete as manufacturers began to incorporate coolant temperature gauges into their vehicles, but the idea of affixing objects to hoods purely for decorative purposes remained. By the 1930s, hood ornaments had reached their artistic zenith. Other luxury car manufacturers followed Rolls-Royce’s example and began designing their own distinctive hood ornaments: Mercedes-Benz with its three-pointed star, Jaguar with its leaping cat, and Bentley with its Flying B. 

By this stage, hood ornaments had evolved beyond simple decoration to become powerful branding tools that instantly communicated a manufacturer’s individuality, style, and prestige. They also started to gain popularity on American-made vehicles, with Dodge’s charging ram, Lincoln’s four-pointed star, and Buick’s bombsight, to name just a few. 

Credit:  Peter Brierley/ Alamy Stock Photo

Practical Concerns 

Throughout the 1960s and 1970s, hood ornaments began to disappear, in part due to safety concerns. As the understanding of pedestrian safety improved, metal ornaments protruding from car hoods were increasingly viewed as a potential hazard in collisions. 

The ornaments weren’t officially banned, but many manufacturers did take note. Some eventually removed their hood ornaments altogether, while others came up with innovative solutions to address the safety issues — Bentley, for example, installed a feature that allowed its Flying B to retract inside the hood in the event of an impact. 

Another blow to hood ornaments came from the modern pursuit of aerodynamic efficiency. As cars became faster and manufacturers increasingly sought ways to reduce drag coefficients and increase fuel efficiency, protruding ornaments became something of an engineering liability. 

While the effect may have been negligible in some cases, hood ornaments such as the chunky Bugatti elephant and the bulky Mack bulldog could certainly hinder a car’s aerodynamics. Rolls-Royce, despite being one of the few high-end car manufacturers to retain its hood ornaments, has still taken aerodynamics into account. Its Spirit of Ecstasy figurine was originally a hefty 6 inches tall but has slowly shrunk over time, with the statuette most recently downsized to about 3.25 inches tall in 2022. 

Another practical concern was simple theft. By the late 1980s, the theft of automobile accessories had risen sharply across the U.S., in part to fuel a growing fad among youths of wearing luxury hood ornaments around their necks. (Mercedes-Benz and Cadillac were particularly popular.) In response, manufacturers such as Mercedes-Benz made removable emblems available, while Rolls-Royce and Bentley began making retractable anti-theft ornaments that disappear down into the hood if tampered with. 

Credit: imageBROKER.com/ Alamy Stock Photo

The Decline of Hood Ornaments

Taken together, these three practical elements — safety, aerodynamics, and theft — have made hood ornaments less appealing to both manufacturers and car owners. Apart from these practical concerns, modern car design trends have embraced a more minimalist approach. This design philosophy emphasizes simplicity, clean lines, and unadorned surfaces — all of which run counter to the extravagant nature of hood ornaments. 

Hood ornaments are rarely seen today, and only a handful of luxury manufacturers, including Rolls-Royce and Bentley, still offer them on current models. While hood ornaments have largely vanished from modern roads, however, their influence can still be seen in car design and brand identity — and they remain powerful symbols of automotive history.

Tony Dunnell
Writer

Tony is an English writer of nonfiction and fiction living on the edge of the Amazon jungle.

Original photo by Pixabay/ Pexels

It’s nearly impossible to imagine life without the World Wide Web, which has had an indelible impact on the way we live, work, and connect. Today, roughly two-thirds of the world’s population has access to this vast network of more than one billion websites. But the internet didn’t always look like this. Initially, it was devoid of websites and was primarily used for email by universities, researchers, and government agencies, with early precursor networks dating back to the 1960s.

Then, on January 1, 1983, the internet was “born” with the establishment of a universal language, a communications protocol called Transmission Control Protocol/Internet Protocol (TCP/IP). While this allowed for file transfers and text-based directories, “dot-coms” did not yet exist. That changed on August 6, 1991, when the world’s very first website — a page explaining what a website is — launched. This invention, defined as a collection of materials stored in a file archive for public access via the internet, changed the world forever.

Since that groundbreaking development, billions of users have visited countless web addresses, most of which begin with the familiar letters “WWW” or “HTTP.” Created in the late 1980s, those acronyms, which stand for “World Wide Web” and “Hypertext Transfer Protocol,” respectively, are the building blocks of the modern internet, designed to allow users to connect via websites across a global network. 

So how did we go from one lonely website to billions of them, and what’s the significance of those acronyms? Let’s jump in and decode these mysteries.

Credit: Catrina Genovese/ Hulton Archive via Getty Images

The World Wide Web’s Humble Beginnings

In 1989, British computer scientist Tim Berners-Lee was working at CERN (the European Organization for Nuclear Research, whose acronym comes from the lab’s original French name) when he had a breakthrough idea that led to the creation of the World Wide Web. “There have always been things which people are good at, and things computers have been good at, and little overlap between the two,” Berners-Lee explained in 1998. 

He noted that computers worked “mechanically in tables and hierarchies” while humans usually opted for “intuition and understanding.” He imagined a system that could bridge the gap.

Berners-Lee’s vision was to develop a way to view and link documents across different computers, using the internet as the backbone. He partnered with Belgian systems engineer Robert Cailliau on the project and coined the term “World Wide Web” in 1989 to emphasize the global reach and decentralized nature of the web. On a NeXT computer (built by NeXT, the company Steve Jobs founded after leaving Apple), the team created a network of hypertext documents viewable on web browsers. 

By the time Berners-Lee was ready to launch in 1991, he had invented “HTML” (Hypertext Markup Language), “HTTP” (Hypertext Transfer Protocol), “URL” (Uniform Resource Locator), and, of course, the World Wide Web itself.
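Those three inventions map directly onto the anatomy of a web address. As a quick illustration, using Python’s standard library and the address commonly cited as the first web page, a URL bundles the protocol, the host, and the document’s path into a single string:

```python
from urllib.parse import urlparse

# The address commonly cited as the home of the first web page.
url = "http://info.cern.ch/hypertext/WWW/TheProject.html"
parts = urlparse(url)

print(parts.scheme)  # the protocol used to fetch the page: "http"
print(parts.netloc)  # the host serving the page: "info.cern.ch"
print(parts.path)    # the document's location on that host
```

Every modern browser performs this same decomposition before it ever contacts a server.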

Credit: Unsplash+ via Getty Images

The World’s First Website

On August 6, 1991, the world’s first website, info.cern.ch, was launched by CERN. The site provided instructions for users to create their own web pages and explained the function of hypertext — coded text that links to content on the internet. Still active today, the site also now explains the history of Berners-Lee’s project.

By 1992, there were only 10 websites on the World Wide Web, but by the late 1990s, amid the dot-com boom and the rise of search engines such as Google, that number had ballooned to 2 million. Unfortunately, most early websites weren’t archived and have since vanished. In 2013, however, CERN revived its original site, unveiling what some may call a digital fossil of the early web. The resurgence of info.cern.ch raised an interesting question: Why didn’t that URL include “WWW” or “HTTP”?

Credit: mariusFM77/ iStock

So Why Do We Use “WWW”?

The “WWW” in a URL is an abbreviation for “World Wide Web,” but contrary to what some may believe, this part of a URL isn’t actually necessary for a website to work. You could type in “google.com” instead of “www.google.com” and still land in the same place. That’s because “WWW” is more of a naming convention than a technical requirement: Back in the 1990s, it helped distinguish websites from other services, such as email servers.
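To make the point concrete, here’s a small sketch in Python (the domain is used purely as an illustration, echoing the example above) showing that “www.” is just an ordinary prefix on a hostname, not a protocol element — peeling it off leaves a perfectly valid domain:

```python
# "www." is a naming convention, not a technical requirement: it is simply
# a prefix on the hostname. This hypothetical helper strips it if present.

def strip_www(hostname: str) -> str:
    """Return the hostname without a leading "www." prefix, if present."""
    prefix = "www."
    return hostname[len(prefix):] if hostname.startswith(prefix) else hostname

print(strip_www("www.google.com"))  # google.com
print(strip_www("google.com"))      # google.com (unchanged)
```

Both forms typically resolve to the same site because site operators configure their servers to answer for both names.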

Today, many websites omit the “WWW” completely, but it’s still widely used as a stylistic choice in certain instances, such as marketing and print materials. When featured on billboards, postcards, TV commercials, and magazine ads, “WWW” clearly signals that the address in question is a website. 

Basically, while “WWW” isn’t essential anymore, it remains one of the most recognizable acronyms on the web, earning it a permanent place in internet tradition and history.

Credit: KTStock/ iStock

What’s the Meaning of “HTTP” and “HTTPS”?

Another acronym often seen in website URLs is “HTTP,” but this one actually does serve a functional purpose. It stands for “Hypertext Transfer Protocol” — that is, a communication protocol allowing browsers and servers to exchange information and permitting users to access websites. It essentially acts as a set of rules that lets your browser “talk” to the server, enabling it to load text, images, videos, and everything else you see and hear on web pages. Without “HTTP,” the World Wide Web couldn’t function.
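For a rough sense of what that “set of rules” looks like in practice, here’s a minimal sketch in Python of the plain-text messages exchanged under HTTP — the host, path, and status line below are illustrative placeholders, not tied to any real site:

```python
# A bare-bones illustration of HTTP's request/response text format.
# "example.com" is a placeholder host used only for demonstration.

def build_get_request(host: str, path: str = "/") -> str:
    """Assemble a minimal HTTP/1.1 GET request a browser might send."""
    return (
        f"GET {path} HTTP/1.1\r\n"   # request line: method, path, version
        f"Host: {host}\r\n"          # which site we want on this server
        "Connection: close\r\n"      # ask the server to close when done
        "\r\n"                       # blank line ends the headers
    )

def parse_status_line(status_line: str) -> tuple:
    """Split a server's status line into version, numeric code, and reason."""
    version, code, reason = status_line.split(" ", 2)
    return (version, int(code), reason)

print(build_get_request("example.com"))
print(parse_status_line("HTTP/1.1 200 OK"))  # ('HTTP/1.1', 200, 'OK')
```

The browser sends a request like the one built above; the server replies with a status line, its own headers, and then the page content.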

You may notice there is sometimes an “S” at the end of the “HTTP” acronym, which stands for “secure.” If a URL starts with “HTTPS,” it means the site in question encrypts your data. This adds a layer of security to protect users and safeguard sensitive information, including passwords and credit card numbers. Most websites today use “HTTPS,” and browsers may even warn you if a site doesn’t use it and therefore isn’t secure.

Credit: Kaboompics/ Pexels

Are There Other URL Protocols?

While “HTTP” and “HTTPS” are by far the most commonly seen protocols at the beginning of web addresses today, they’re just two among many. Internet protocols generally fall into three main categories — communication, management, and security — with “HTTP” belonging to the communication category and “HTTPS” to the security category.

Most alternative protocols are rarely encountered by everyday users, either because they serve technical functions or because they operate in areas of the internet that aren’t easily accessible to the public. That said, you may still have utilized a few of them while surfing the net. For instance, “mailto:” opens your default email application to start a draft addressed to the email address in the URL, while “ftp://” transfers files between computers over a network.
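The protocol (or “scheme”) is simply everything before the first colon in a URL, and Python’s standard urllib.parse module can pull it out. A quick sketch, using placeholder addresses:

```python
from urllib.parse import urlparse

# The scheme (protocol) is whatever precedes the first colon in a URL.
# All of these addresses are placeholders for illustration only.
urls = [
    "https://example.com/page",        # encrypted web traffic
    "http://example.com/page",         # unencrypted web traffic
    "ftp://example.com/files/a.txt",   # file transfer
    "mailto:someone@example.com",      # opens an email draft
]

for url in urls:
    print(urlparse(url).scheme)  # https, http, ftp, mailto
```

Note that urlparse treats all of these uniformly, which is part of why unfamiliar schemes work seamlessly in browsers and apps that support them.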

So if you stumble across a URL that starts with something unfamiliar, it may not be a red flag for a dangerous site; it’s probably just a different protocol designed for a specific purpose. The internet is much more flexible than we probably realize and likely much more expansive than Tim Berners-Lee could have ever imagined.

Interesting Facts
Editorial

Interesting Facts writers have been seen in Popular Mechanics, Mental Floss, A+E Networks, and more. They’re fascinated by history, science, food, culture, and the world around them.

Original photo by Maryam Sicard/ Unsplash

Rice is a dietary staple in many cultures around the globe, especially in Asia, Sub-Saharan Africa, and South America, the regions that consume the most rice. In fact, more than 3.5 billion people depend on rice to provide some 20% of their daily calories, proving just how pivotal this versatile grain can be. 

But despite rice’s widespread popularity, there are plenty of things most people don’t know about this culinary mainstay. Here are six facts about rice to whet your appetite.

Credit: Ella Wei/ Pexels

It Helps Hold the Great Wall of China Together

A 2020 study conducted by researchers at China’s Zhejiang University determined that sticky rice was once used in the construction of various Chinese structures, including the Great Wall of China. The study found this unique mortar was likely developed around 1,500 years ago, combining sticky rice soup with more traditional elements such as lime and water.

The research team wrote that sticky rice has “more stable physical properties” than the standard mortars that preceded it, thus providing additional strength and durability. For example, sticky rice is less permeable to water than lime, so the former could offer more structural integrity amid changing weather conditions.

Not only was this sticky-rice mortar used for portions of the Great Wall, but researchers also noted that given its impeccable performance, it was incorporated into “important buildings such as tombs, city walls, and water resource facilities.” Since the study’s publication, conservation teams have used special sticky rice mortars to restore various centuries-old structures, including the 800-year-old Shouchang Bridge in Huzhou, China.

Credit: Vardhan/ iStock

84% of Rice Is Produced in Just 10 Countries

Roughly 84% of the world’s rice is produced in just 10 countries. India tops the list, surpassing China, which previously held the top spot, according to United States Department of Agriculture data from 2024 and 2025. Each country accounted for roughly 27% of global rice production, with India producing 147 million metric tons compared to China’s 145.28 million.

The other eight countries are also in Asia. They include Bangladesh at 7%, Indonesia at 6%, Vietnam at 5%, Thailand at 4%, and the Philippines, Burma, Pakistan, and Cambodia all at 2%. Brazil ranked 11th by producing 8.2 million metric tons of rice — the most of any non-Asian country.

Credit: Pixabay/ Pexels

Arkansas Produces the Most Rice of Any U.S. State

According to the USDA, the United States accounts for 1.3% of the global rice trade. Domestically, rice production is dominated by the state of Arkansas, which accounts for an incredible 40% of U.S. rice cultivation. Arkansas’ annual output is more than double that of California, which ranks second among U.S. states, followed by Louisiana, Mississippi, Missouri, and Texas. 

Arkansas’ thriving rice industry can be traced back to the late 19th century. It’s thanks, in part, to a man named William Fuller, nicknamed the “father of Arkansas rice.” In 1896, the Arkansas native visited Louisiana on a hunting trip and took an interest in the farmers growing rice in that region.

Fuller took note of how Louisiana’s soil shared similarities with the soil in Arkansas and began experimenting with techniques to grow rice back in his home state. He ultimately succeeded, inspiring other Arkansan farmers to follow suit. By 1905, Arkansas’ rice industry was bustling, and the state became the nation’s leading rice producer by the 1940s.

Credit: Consolidated News Pictures/ Archive Photos via Getty Images

George H.W. Bush Declared September National Rice Month

On August 20, 1991, President George H.W. Bush signed Proclamation 6323, officially designating September as National Rice Month in the United States. Bush signed the order “to promote greater awareness of the versatility and the value of rice, and to celebrate America’s status as a major exporter,” according to the text of the proclamation.

This decree was just one example of an awareness initiative that sought to boost the popularity of rice in the United States during the 1990s. While it’s hard to say whether National Rice Month had a direct impact, numbers show that per capita rice consumption in the U.S. certainly shot up during the decade. 

According to the Cereal & Grains Association, Americans ate 15.75 pounds of rice per person annually during the mid-1980s — a number that increased to 27.09 pounds by the new millennium. The average American continues to consume roughly 27 pounds of rice on a yearly basis, according to the USA Rice Foundation.

Credit: Synergee/ iStock

There Are More Than 40,000 Varieties of Rice

More than 40,000 varieties of rice are grown worldwide, most of which fall into one of two major groups: indica or japonica. Indica rice tends to be longer, thinner, and less starchy than its japonica counterparts and includes popular varieties such as Indian basmati and Thai jasmine rice. The U.S. Department of Agriculture estimates that certain indica rice varieties account for 62% to 66% of the global rice trade, with basmati and jasmine rice (also members of the indica subspecies, but technically classified as “aromatic” varieties for trade purposes) accounting for an additional 23% to 25%.

Japonica rice, on the other hand, only accounts for 9% to 10% of the global rice trade. These varieties tend to be grown in regions with cooler climates such as Japan, Korea, and parts of the European Union. Japonica rice varieties tend to develop a sticky texture when cooked and are often used as sushi rice or in risotto.

Credit: Maryam Sicard/ Unsplash+

All White Rice Starts Out as Brown

Given the significant visual and nutritional differences between white and brown rice, you may be surprised to learn that all white rice is initially brown. This color change is the result of a milling process that strips rice of its husk, bran, and germ, which are naturally browner in color. In turn, the process leaves behind the rice’s endosperm, which is much whiter.

This milling process is used to increase the grain’s shelf life: In its natural state, brown rice tends to have a shelf life of around six months, given that the outer layers spoil more rapidly. White rice, on the other hand, has had those outer layers stripped away and can therefore keep for up to three decades if stored properly. 

But in addition to causing rice to lose its brown color, the milling process also removes many of the fibers and nutrients contained in those outer layers. To account for this nutritional loss, the white rice you see sold in stores has usually been enriched with added nutrients.

Bennett Kleinman
Staff Writer

Bennett Kleinman is a New York City-based staff writer for Optimism Media, and previously contributed to television programs such as "Late Show With David Letterman" and "Impractical Jokers." Bennett is also a devoted New York Yankees and New Jersey Devils fan, and thinks plain seltzer is the best drink ever invented.

Original photo by Creative-Family/ iStock

Southpaws. Lefties. Mollydookers. There’s no shortage of nicknames for the left-handed people of the world, even if much of everyday life seems to have been designed without them in mind. 

Hand preference remains an intriguing puzzle for scientists, with researchers still questioning why a small percentage of people favor their left hand over their right. This quirk has caused lefties to be saddled with historical baggage and ongoing inconveniences — unfortunately, The Simpsons’ Leftorium, where left-handed folks can find everything from can openers to cars suited to their needs, remains fictional.

We may still be working to uncover the origins of this unusual hand preference and to create a more lefty-friendly world, but in the meantime, here are some curious facts about left-handed life. 

Credit: Mariana Sandoval Camargo/ iStock

Being Left-Handed Used To Be a Red Flag

For centuries, being left-handed in a primarily right-handed world wasn’t just inconvenient; it was also seen as suspicious or even dangerous. This bias goes way back: The word “sinister” comes from a Latin word of the same spelling meaning “on the left side.”

In medieval Europe, left-handedness was thought to have ties to witchcraft and, for some religious groups, demonic possession. These suspicions lingered for centuries — even into the early 1900s, left-handed schoolchildren in the U.S. and Europe were retrained to use their right hand instead.

These perceptions didn’t shift in any major way until the mid-20th century. Time magazine suggests one notable turning point came in 1968 with the opening of London’s Anything Left Handed store, which celebrates southpaws rather than condemning them.

By the 1970s, more manufacturers were producing products specifically for lefties, and by the end of the decade, the number of people publicly embracing their preferred hand had risen from 2% in 1939 to approximately 12% in 1979.

Credit: Orbon Alija/ iStock

Only 10% of the World’s Population Is Left-Handed

Only about one in 10 people worldwide are left-handed — or at least that’s the most accurate estimate the latest data can offer. As it turns out, pinning down an exact number is trickier than you might think. 

Some people write with their left hand but throw a ball with their right. Others, thanks to the aforementioned long-held misconceptions and stigmas, were flat-out forced into right-handedness as children, even if they innately favored their left hand.

Despite the margin of error, this 10% statistic has held up across time and cultures spanning more than 10,000 years. Modern tools, various teaching methods, and shifting social norms have all seemed to have negligible effects on this ratio, and that’s unlikely to change anytime soon. 

Credit: D3Damon/ iStock

Genetics Are Partially Responsible

Left-handedness tends to run in families, but it’s not as straightforward as, say, eye color or blood type. Instead, it’s likely influenced by a combination of genetic, prenatal, and possibly even environmental factors. 

Research has identified rare genetic variants, particularly in a gene called TUBB4B, that appear 2.7 times more often in left-handed people than in right-handed people. Other studies suggest a left-handed preference may be detected in the womb as early as 10 weeks into pregnancy based on thumb-sucking activities. 

Handedness also appears to have an environmental component. In a 2019 U.K. study of more than 500,000 participants, the percentage of people identifying as left-handed increased steadily among those born each year starting from around 1920 until roughly 1970, when it leveled off. But this wasn’t simply because more left-handed people began to be born; instead, it was likely due to a decline in the social pressure to switch hands.

The study also suggested that low birth weight and being part of a multiple birth seems to increase the chance of left-handedness. Further research suggests seasonal hormone levels or exposure to infections may even make women born in the summer more likely to be lefties, but the exact reasons for these correlations remain murky and there are likely still other undiscovered factors as well.

Credit: fizkes/ iStock

Men Are More Likely To Be Southpaws

One pattern that consistently shows up in handedness studies is that men are more likely to be left-handed than women. The gap isn’t huge: about 12% of men versus roughly 10% of women prefer their left hand. Given the low number of lefties overall, though, it is statistically significant, and scientists still aren’t entirely sure why this is the case. One theory suggests hormones — specifically, higher levels of prenatal testosterone — may play a role. 

Credit: Wirestock/ iStock

Even Animals Can Be “Left-Handed”

Favoring one’s left or right side isn’t solely relegated to humans; plenty of animals show their own versions of “handedness.” Kangaroos often prefer their left hand for grooming and feeding, and even tiny creatures such as bees can display side biases when navigating around objects. 

Dogs, too, will favor a paw, and there’s a way to test it beyond a paw shake: It’s called the Kong test. Give your dog a food-stuffed Kong or other treat-dispensing toy and watch which paw they use to hold it in place while eating. If they consistently favor one paw, that’s likely their dominant side. 

Some dogs may use both paws equally, but many demonstrate a clear preference. One study found that, among dogs that showed a clear paw preference, about 58% were right-pawed and 42% were left-pawed, making “left-handedness” much more common in dogs than in humans.

Lefties Have an Edge in Certain Sports

Despite most sports gear being designed for right-handed players, lefties often hold a surprising advantage, especially in fast-paced competitive sports such as tennis, baseball, boxing, and fencing, in which reaction times are short. 

Because southpaws are relatively rare, right-handed athletes don’t get as much practice facing them, whereas lefties spend their whole lives playing against righties. This creates an element of unpredictability for righties when facing a lefty: The latter’s movements, angles, and timing can feel unfamiliar to their opponents. 

In baseball, for example, left-handed pitchers are especially prized for their ability to throw off a batter’s rhythm. While left-handedness doesn’t necessarily guarantee athletic success, the list of elite athletes who’ve been left-handed is undeniably impressive: LeBron James, for example, is naturally left-handed but shoots basketballs with his right. Babe Ruth and Wayne Gretzky, two of the greatest athletes of all time in their respective sports of baseball and hockey, also belong to this distinguished club.

Nicole Villeneuve
Writer

Nicole is a writer, thrift store lover, and group-chat meme spammer based in Ontario, Canada.

Original photo by Paris Bilal/ Unsplash

Some animals make noises that seem perfectly aligned with their looks and behavior, whether it’s the roar of a lion, the howl of a wolf, or the happy clicks and whistles emitted by dolphins. We’re well-versed in these iconic sounds, even if we’ve never seen or heard the animals for ourselves. 

But the animal kingdom is full of acoustic surprises, with creatures whose vocalizations defy our expectations entirely. Some stealthy animals we may assume to be basically silent are actually highly vocal, while others produce unexpected sounds that seem to belong to a different species altogether.

The following five animals all make sounds that subvert our expectations, including predators whose calls are far from fearsome and cute critters whose sounds are surprisingly unnerving. 

Credit: Ingo Doerrie/ Unsplash+

Bald Eagles

You’d think the powerful and majestic national bird of the United States would have a soulful, piercing cry, but the reality is quite different. Bald eagles produce surprisingly weak-sounding calls, typically a series of high-pitched whistles, chirps, and piping notes often described as similar to the cries of gulls.

These sounds are probably not what you would expect from a 10- to 15-pound bird of prey with a wingspan of up to 8 feet. Hollywood is partially to blame for our misconceptions: The bald eagle’s real call is considered somewhat lackluster for onscreen depictions, so film editors often replace it with the more dramatic, haunting cry of the red-tailed hawk.

Credit: Unsplash+ via Getty Images

Cheetahs 

Cheetahs are the world’s fastest land mammals, reaching speeds of up to 60 or 70 miles per hour. These big cats are apex predators, using their speed, ferocity, and power to chase down their prey on the wide-open grasslands and savannas of southern and eastern Africa. You may therefore expect them to have an impressive roar akin to some other big cats, but cheetahs cannot roar at all.

Instead, these speedy cats make a variety of chirps, purrs, and high-pitched yelps that sound more like bird noises — a far cry from the fearsome roars of lions, tigers, and jaguars. This is due to the structure of the cheetah’s hyoid bone — a U-shaped bone in the neck — which is ossified (rigid and fixed in one position), allowing the animals to purr but not roar. Lions, tigers, jaguars, and leopards, on the other hand, all have flexible hyoid bones that allow them to produce deeper and far more intimidating noises.  

Credit: Unsplash+ via Getty Images

Red Foxes

Red foxes make a variety of surprising sounds, including breathy barks, frightening screams, howls, and squeals. They also make a strange combination of noises known as gekkering — a series of stuttering, chattering, throaty vocalizations — when fighting or during play. 

The loudest and most chilling sound made by foxes is the scream, or contact call, normally made by vixens when they’re ready to breed. This shriek can sound very similar to a human woman screaming, which can be particularly unnerving if you happen to hear it while walking in the countryside at night. At the other end of the sound spectrum is the infectious and endearing noise foxes make when they’re excited or having fun, which strongly resembles a squeaky human laugh. 

Credit: Flip Side/ Pexels

Koalas

These fuzzy marsupials may look like cuddly teddy bears, but they’re capable of making some fearsome noises. In addition to emitting an array of snarls, squeaks, and screams, koalas can produce a shockingly deep, guttural bellow that sounds like a strange mix of a pig grunting and a motorcycle engine revving. 

Research biologist Benjamin Charlton of the University of Sussex explained to the BBC that a male koala’s bellow is “20 times lower than would be expected for an animal of its size” and more typical “of an animal the size of an elephant.” The unexpected depth of the sound comes from a special extra set of vocal folds outside the larynx that are more than three times longer and about 700 times heavier than those of the larynx, allowing koalas to produce extremely deep pitches with tremendous power. 

Credit: ALENA MARUK/ Pexels 

Porcupines

Porcupines are notoriously prickly, but they produce some of the animal kingdom’s most unexpected and downright adorable sounds. These animals are remarkably vocal, creating a range of vocalizations that includes whines, squeaks, moans, and grunts.

The overall effect is reminiscent of a cute yet slightly frenzied video game character, with elements of Ewokese (the language spoken by the Ewoks of the Star Wars universe). Porcupines also make a clicking noise with their teeth to scare off predators, and they can produce a loud hissing sound by rattling their quills. On the other end of the spectrum, baby porcupines — charmingly called porcupettes — make little squeaks and coos that do perfectly suit their ridiculously cute appearance.

Tony Dunnell
Writer

Tony is an English writer of nonfiction and fiction living on the edge of the Amazon jungle.

Original photo by Jacob Wackerhausen/ iStock

If you’ve ever found yourself tossing and turning in a hotel bed or on a friend’s futon, you aren’t alone. A study by InterContinental Hotels Group (IHG) determined 80% of travelers struggle to sleep when they’re away from home, and nearly half attribute this to being in an unfamiliar environment. 

There is, in fact, a scientific explanation for this widespread phenomenon. It’s called the “first-night effect” (FNE), and it’s something sleep scientists have been studying for decades. This natural reaction to being in a new place has become deeply ingrained in our brain’s architecture due to our evolutionary history. 

In recent years, scientists have uncovered more details about why the FNE occurs and, thankfully, given us tips to ease those sleepless nights away from home. Let’s dive in and look at the facts.

Credit: kieferpix/ iStock

What Is the First-Night Effect?

Essentially, the brain doesn’t fully power down during the first night in a new environment, causing travelers to feel worn out the next day. Research shows one brain hemisphere, usually the left, remains more alert throughout the night, subtly monitoring our new environment for threats to keep us safe. The first-night effect has been well documented for decades, so much so that sleep scientists tend to throw out the first night’s data of any sleep study because of the way the effect skews results.

In 2016, researchers from Brown University explored this phenomenon by monitoring brain activity among participants in a sleep lab. During the first night, the left side of their brains reacted strongly to irregular beeping sounds, while the right side maintained a deeper sleep. By the second night, neither hemisphere responded strongly to the beeping, indicating a deeper sleep for both, with the left hemisphere’s initial wary response subsiding. 

Scientists suggest this is because the environment had become more familiar by the second night and therefore was perceived as less of a threat. The study concluded that “troubled sleep in an unfamiliar place is an act of survival.”

Credit: ohrim/ Adobe Stock

It’s Animal Instinct

The partial brain alertness seen in the first-night effect mirrors how animals remain vigilant in the wild, hinting that this phenomenon is an evolutionary adaptation for survival. Many animals have developed their own versions of this type of asymmetrical sleep to protect themselves from danger. Humans and some mammals exhibit bihemispheric slow-wave sleep (BSWS), wherein both hemispheres are technically asleep, but one side can remain more vigilant. 

However, other animals have evolved one step further. Some species, including whales, dolphins, fur seals, sea lions, and certain birds, exhibit unihemispheric slow-wave sleep (USWS), in which only one hemisphere is asleep at a time, giving them an even greater ability to perceive threats while asleep. This allows some animals, such as birds and dolphins, to literally “sleep with one eye open” to detect predators. 

While these animals rely on this strategy regularly, humans typically only make use of it during exposure to a new environment, reinforcing the idea that this is a deep-rooted instinct meant to protect us, not inconvenience us.

Credit: LumiNola/ iStock

Beyond the First Night

While the first-night effect is most prominent during your first shuteye away from home, research suggests it may extend beyond a single evening. For some travelers, it can cause interrupted sleep for several days, negatively affecting mood and energy levels during a trip as a result. One culprit of this extended FNE could be its impact on rapid eye movement sleep (REM), a crucial phase of the sleep cycle linked to memory and emotional regulation.

In one sleep study, researchers recorded the FNE’s impact on REM sleep while participants slept in a new environment for several nights. Results showed these disruptions to REM sleep extended up to the fourth night.

According to Harvard Medical School, “REM sleep is so important that if you don’t get enough one night, your body will naturally increase it the next — you’ll enter this stage earlier and stay in it for longer. This is known as REM rebound.” Basically, if the FNE wreaks havoc on your REM cycle one night, it can cause further disruptions in your sleep cycle, resulting in a ripple effect and many subsequent sleepless nights.

Credit: LumiNola/ iStock

How To Minimize the First-Night Effect

As Rebecca Robbins, assistant professor of medicine at Harvard Medical School, told National Geographic, “The truth is that when we are in an unfamiliar environment, we fundamentally have a harder time unwinding.” Robbins, a leading researcher in circadian health, is working to identify behavioral changes to help patients improve sleep. But regardless of the FNE’s impacts and longevity, there are some ways to make your brain settle down and rest more deeply while in a new place. 

The trick is to minimize the effect by introducing comforting and familiar sensory cues that signal to your brain you’re in a safe environment. Simple steps such as bringing your own pillow, using a familiar body lotion or essential oil, or playing calming music or a noise machine can create a sense of safety. These small comforts of home may be just enough to trick your brain into thinking you’re not in a foreign place, allowing you to sleep more soundly and, hopefully, feel more rested during your travels.

Rachel Gresh
Writer

Rachel is a writer and period drama devotee who's probably hanging out at a local coffee shop somewhere in Washington, D.C.

Original photo by W W/ Pexels

For many of us, the ocean is a place to visit on a beach vacation or hot summer day, and we don’t give it much thought beyond that. But beneath the surface lies a vast and largely unexplored world — one that’s deeper, stranger, and more surprising than we realize. 

Though we divide it into regions including the Atlantic and Pacific, the ocean is one continuous, global body of water that connects all the continents and shapes nearly every aspect of life on Earth. This vast body of water is jam-packed with strange life, ancient history, and untapped potential. It regulates our weather, sustains our food systems, and produces more than half the oxygen we breathe. 

And yet, despite its vital role in our lives, there’s still so much we don’t know about the ocean. For all our technological progress, the vast sea reminds us that the planet is still full of mysteries to be uncovered. Here are seven fascinating facts that just might change the way you think about the sea.

Credit: Unsplash Community/ Unsplash+

We’ve Only Explored a Tiny Fraction of the Ocean

Although the ocean covers about 71% of the Earth’s surface, an incredible 80% of it remains unexplored, unmapped, and unseen by humans. In an age when GPS can pinpoint your exact location in seconds and satellites track hurricanes from space, much of what lies beneath the surface of the ocean is still as mysterious as it was to early explorers. 

That’s partly because exploring the deep sea is no easy feat — the pressure down there is intense, thousands of times greater than at the surface. On top of that, it’s incredibly expensive to reach those remote depths. Even with today’s advanced sonar and underwater robots, most of the deep sea remains out of reach.

Credit: Unsplash+ via Getty Images

The Ocean Is Home to as Many as 2 Million Species 

The ocean represents Earth’s largest habitat by volume; estimates suggest it’s home to around 2 million marine species. Of those, only around 240,000 have been officially documented — meaning more than 90% of oceanic life remains unknown to science.

Many newly discovered species are surprising and unlike anything above the surface. In recent years, scientists have identified fish with transparent heads, squids that flash bright lights, and crabs that farm bacteria on their claws for food. 

Credit: OntheRun photo/ Alamy Stock Photo 

The Deepest Part of the Ocean Is Nearly 7 Miles Down

The Mariana Trench, located in the western Pacific Ocean, is home to the Challenger Deep — the deepest known point in Earth’s oceans. It plunges nearly 11 kilometers (almost 7 miles) below the surface. To put that in perspective, if Mount Everest were placed in the trench, its peak would still be more than a mile underwater. 

To date, only a handful of human-made submersibles have reached this extreme depth because the conditions are formidable: complete darkness, temperatures near freezing, and crushing pressure that would flatten most vehicles. Yet life still manages to exist in that alien world: Thousands of amphipods and microbe species thrive where life was once thought impossible.

Credit: Pascal Ingelrest/ Pexels

The Ocean Contains More Than 3 Million Shipwrecks

Throughout history, humanity has lost countless vessels to the sea, from ancient trading ships and pirate galleons to military submarines and modern cargo vessels. The United Nations estimates there may be as many as 3 million shipwrecks scattered across the ocean floor, and less than 1% of them have been explored. 

Some of the oldest wrecks date back thousands of years, and others, such as the Titanic, still loom large in our collective imagination long after they were lost to the sea. The ocean preserves many of these wrecks surprisingly well, especially in deep, cold waters where corrosion and decay are slowed. 

Credit: Yanguang Lan/ Unsplash

The Largest Living Structure on Earth Is in the Ocean

The Great Barrier Reef, stretching more than 1,400 miles along the northeast coast of Australia, is the largest living structure on the planet. It’s so large, in fact, that it’s visible from space. 

Made up of around 3,000 individual coral reefs, 600 continental islands, 300 coral cays, and 150 inshore mangrove islands, the Great Barrier Reef supports an ecosystem so large and complex that it’s often compared to a rainforest in terms of biodiversity. The reef is home to more than 1,600 species of fish, 450 types of hard coral, and hundreds of species of sharks, turtles, and marine mammals.

Credit: Wirestock/ iStock

The Ocean Produces More Than Half the World’s Oxygen

When it comes to oxygen production, most of us think of forests and trees, but marine plants — especially oceanic phytoplankton — actually produce more than 50% of the Earth’s oxygen supply. These tiny photosynthetic organisms drift near the surface of the ocean and convert carbon dioxide into oxygen, much like plants on land. 

Marine plants aren’t just important for air quality; they also form the base of the marine food chain, supporting everything from shrimp to whales. Without phytoplankton, life in the oceans — and on land — would be drastically different.

Credit: Adrian Weston/ Alamy Stock Photo 

There Are Underwater Rivers, Lakes, and Waterfalls

It may sound like something out of science fiction, but the ocean contains its own rivers, lakes, and even waterfalls — entire hydrological systems beneath the sea. These strange features form when water with a different salinity and temperature — often brine, which is denser than the surrounding seawater — settles into seafloor depressions, creating distinct, visible pools. 

Underwater waterfalls occur when this heavier water spills over ledges or seafloor escarpments. One of the most famous examples is the Denmark Strait cataract between Greenland and Iceland, which plunges 11,500 feet — making it the largest waterfall anywhere on Earth, either on land or in the sea.

Kristina Wright
Writer

Kristina is a coffee-fueled writer living happily ever after with her family in the suburbs of Richmond, Virginia.