There was a time when it was frustratingly easy to go to an ATM, pull out your money and then completely forget about your card. Once the machine realized you weren’t ever going to retrieve your debit card, it would simply swallow it and send it to wherever banks keep their graveyard of forgotten debit cards.
After you realized your debit card was gone—depending on the circumstances and the night you had, this could’ve just been piling on more pain to your hangover—you’d have to call your bank, cancel your card and wait (sometimes a whole week!) for your new card to be mailed to you. In the meantime, while your card was traveling at the appropriate snail-like speed to your mailbox, you’d have to go to your bank, wait in line for a teller, and then pull out money the old-fashioned way.
It was all a terrible inconvenience, and luckily the banks eventually decided to do something about it. At JPMorgan Chase, the first step toward fixing the problem was to hold your money until you’d retrieved your card, flashing a green light to tell you to take your card out of the machine.
More recently, Chase has added another feature to its ATMs: in addition to the flashing light, an audio signal, almost like a ringtone, plays until you take your card out. A second audio signal alerts you to take your cash once your money has been dispensed.
By adding audio signals to its ATMs, Chase has vastly improved the user experience. It’s an important lesson for designers to consider when building products for clients, with one caveat: the audio signals at Chase’s ATMs serve a purely utilitarian purpose, warning you to take your card or cash. These sounds, while not grating by any measure, aren’t exactly what you’d call pleasant to hear. But that’s not the point.
The Sound (And the Sound of Silence)
For designers, not only has sound become increasingly important to the UX process, but, unlike with an audio signal on an ATM, you have to be especially cognizant of how the sounds in the products you’re building actually sound.
Writing for Wired last year, Kevin Perlmutter argued that companies and designers too often “neglect” sound in their products. Perlmutter noted that while few companies would ever “consider using an online library to select their brand logo or visual identity,” they often “allow their products to go out into the world with cheaply produced or licensed sounds downloaded from a mass market sound effects library,” essentially undervaluing the audio portions of their products.
“The reality is that most brand marketers and product designers are not aware of the positive or negative impact sound has on the consumer experience,” Perlmutter adds. “But the cognitive and emotional effect is greater than you think.”
Of course, it’s important to keep in mind that the “cognitive and emotional effect” of sound relates not just to how a user hears sound, but also to how much sound the user is being asked to listen to. Although sound is an important component of how your product engages with a user, you can’t forget that the user experience can also benefit greatly from silence.
“First and foremost, unwanted sound can be intrusive and annoying,” notes Jonathan Follett, a principal at Goinvo, explaining one of the reasons why sound remains a limited feature of UX design. “Rather than enhancing a user experience, audio can be a distraction and reduce our effectiveness.”
Think about the Chase ATMs: the bank limited their audio signals to alerting you when to retrieve your card and when to take your cash. If it had attached a sound to every single one of your actions at the machine, not only would the sounds become intrusive, but they would also lessen the effectiveness of the two signals designed to remind you about your card and money.
Follett points to another reason sound often gets relegated to second-tier status in UX design, in addition to its risk of being an annoying feature if done wrong: Most designers “have not learned and honed techniques for developing effective audio for user interfaces,” which means that “while the integration of text and graphics is a familiar and common occurrence, the use of audio is unfamiliar territory and rarely considered.”
But while Roman Zimarev, a sound designer and musician, acknowledges that he often hears from people who say that “sounds in non-game apps are rude and annoying,” he argues that the “rationale behind such an opinion [is] due to the many examples of failures.” Zimarev points to the two main uses of sound in UX design, notifications and interactions, where “sound can significantly improve the users’ experience and be useful.”
Notifications, according to Zimarev, are used “to draw the user’s attention to a certain event,” like the audio signals on a Chase ATM; interactions, on the other hand, “react to the user’s actions and make the user’s experience more pleasant,” such as the simple ding you hear when a message is sent on Facebook Messenger.
“The most important thing here is not to overdo it,” Zimarev advises. “You do not want your app to sound like a slot-machine club during the busy hour.”
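Zimarev’s advice—short, sparing, pleasant—can be made concrete with a minimal sketch that renders a brief, decaying “ding” of the kind an app might play when a message is sent. This is purely illustrative: the frequency, length and decay values are arbitrary choices (not recommendations from anyone quoted here), and the sketch uses only Python’s standard library.

```python
import math
import struct
import wave

def render_ding(path, freq=880.0, duration=0.3, rate=44100):
    """Render a short, decaying sine 'ding' to a mono 16-bit WAV file.

    The brevity and quick fade-out are what keep a notification sound
    from feeling intrusive; the specific numbers are illustrative.
    Returns the number of audio frames written.
    """
    n_frames = int(duration * rate)
    samples = bytearray()
    for i in range(n_frames):
        t = i / rate
        # Exponential decay: the tone dies away quickly instead of droning.
        amplitude = math.exp(-8.0 * t)
        value = amplitude * math.sin(2 * math.pi * freq * t)
        # Scale to half of the 16-bit range to leave comfortable headroom.
        samples += struct.pack("<h", int(value * 32767 * 0.5))
    with wave.open(path, "wb") as wav:
        wav.setnchannels(1)   # mono
        wav.setsampwidth(2)   # 16-bit samples
        wav.setframerate(rate)
        wav.writeframes(bytes(samples))
    return n_frames

frames_written = render_ding("ding.wav")
```

A 300-millisecond tone that fades to near-silence within a fraction of a second is the audio equivalent of restraint: present when it carries information, gone before it can annoy.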
Sound and Meaning
A quick aside: It’s well known that Beethoven created a large chunk of his music – including his last five piano sonatas – after he’d gone deaf. What we don’t exactly know is how he was able to pull this staggering feat off.
One theory focuses on the extent of Beethoven’s hearing loss. It’s generally agreed, based on Beethoven’s letters, that his deafness came on slowly; but there are also questions about whether he ever suffered complete hearing loss or retained partial hearing.
Another theory is that even if Beethoven had gone completely deaf, he would have been composing music for a few decades before that happened (and, uh, he was a GENIUS at composing music), so he knew what the notes sounded like in a composition. Donato Cabrera, music director for the California Symphony, points out how music is a language and languages have rules, so since Beethoven knew “the rules of how music is made, he could sit at his desk and compose a piece of music without hearing it.”
“Beethoven was a master of the language of music, which is about the creation of sound, not about listening,” Cabrera adds.
Of course, design, like music, has its own language and its own set of rules to follow. And you don’t have to be a master at creating sound like Beethoven to understand where a sound can greatly enhance the user experience in your product and where it would be an intrusive, unnecessary feature.
After all, some of the skills you develop as a designer involve intuition: the way certain colors look more visually appealing next to each other than others do, how spacing between items on a page improves its readability, and so on. You develop an intuitive sense for how things should look or feel to a user.
In the same way Beethoven could compose music by knowing the rules of musical language, and by understanding how a specific sound is made even if he could no longer hear it well, you know where to insert a sound based on your understanding of your own design language. Sound becomes another element in your UX design strategy, following rules similar to those for your graphic elements and text.
And just as you would with a graphic, Zimarev suggests always asking yourself whether your product really needs sound in the place where you’re planning to put it. By understanding “the function of each individual sound component,” you can cut any sound that isn’t needed; only “the sounds that provide useful information or improve the user’s experience should stay.”
Zimarev adds that designers should approach sound as part of a brand’s identity, no different from a logo or any other visual element. Ultimately, like with the Facebook Messenger ding, you want people to know your brand not just by what they see, but also by what they hear.