Usability test allows professionals to employ ‘drunk vision’ without touching a drop
The days of forcing poor web designers to get sauced in order to perfect a product are over, as a new onscreen user-interface test can simulate blurred vision and the spins, minus the hangover.
Drunk User Testing, developed by renaissance software engineer Ryan Closner, is a free and easy-to-use program that muddles and skews the appearance of a screen; this makes its contents harder to navigate and gives the user a pretty authentic woozy feeling, too. It can be test-driven on the website or installed in your bookmarks bar with a single click (“protip: press the ‘escape’ button to sober up”).
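Closner's actual implementation isn't published here, but the effect the article describes — a bookmarklet that blurs and distorts the page, with the Escape key to sober up — could be sketched with CSS filters along these lines (all function names and numbers below are illustrative assumptions, not his code):

```javascript
// Sketch of a 'drunk vision' bookmarklet effect using CSS filters.
// Illustrative only -- not Closner's actual implementation.

// Build a CSS filter string for a given "drunkenness" level (0 = sober).
function drunkFilter(level) {
  const blur = level * 1.5; // pixels of blur per notional drink
  const hue = level * 10;   // slight color shift for the woozy feel
  return `blur(${blur}px) hue-rotate(${hue}deg)`;
}

// In a browser, a bookmarklet would apply the filter to the page
// and let the Escape key reset it, e.g.:
//
//   document.body.style.filter = drunkFilter(3);
//   document.addEventListener('keydown', (e) => {
//     if (e.key === 'Escape') document.body.style.filter = '';
//   });
```

A real version might also animate a slow CSS transform wobble to simulate the spins; the filter string above only covers the blurred-vision half.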
An experienced designer himself, Closner offers up the app as an antidote to an all-too-common error: when designers have spent so much time building a product usable for the average person “that [they] forgot to cater to the lowest common denominator — the true test of user experience — the user when they’re drunk.”
Many web developers seem to agree that testing a product’s usability under the influence of alcohol is a worthwhile practice. Blogger Terence Eden stresses the importance of noting the potential effects of intoxication while designing a successful interface, arguing that ‘beer goggles’ mimic various kinds of concentration loss:
Users have many demands on their time – being distracted by a phone ringing, or an incoming email, or a bright and shiny object [... this] has the same effect as being drunk. They return to the user interface with reduced thinking capacity.
Anecdotal reports and common experience alike suggest that navigating complex websites and apps while under the influence of alcohol, stress, or distraction can be difficult. As Eden and others suggest, a ‘drunk test’ can weed out a UI’s trickiest hurdles and even inform adjustments to a product’s font size and color scheme to make it friendlier to compromised eyes.
Eden goes on to argue that designers are spending too much time considering the context of a product — a smartphone, for example — and not enough time considering the user’s potential situation. “We rarely say ‘let’s introduce a left-handed option’ or ‘do we need a night reading mode?’ or ‘does the user have time to concentrate on this UI while driving?’” he explains. “The user should be at the heart of our decision making, and her context should feature heavily in our conversations about UI.”