Written by Ronnie Battista.
A pervasive challenge in our industry is conveying the value of a well-planned strategy for creating a user experience. Anyone who has been a UX professional long enough has at some point been asked, “How do you convince companies and executives that they need to invest in UX?” Were the answer to that question as obvious as it feels like it should be, our jobs would all be a lot easier. Of course, there are countless articles on this subject—notably, one by Jared Spool that I read to remind me I’m not the only one who struggles with this message.
So, in this column, I thought I’d share a simple concept I’ve been thinking about in relation to communicating the power of UX strategy. The hypothesis of my experiment: we need to do a better job of communicating that what UX strategy provides is science. When I say science, I mean that the tenets of UX strategy are based on the scientific method.
Though UX professionals think of UX strategy as part art and part science, because it incorporates research, design, and validation, I think many of our stakeholders still see what we do as being more the former than the latter.
While companies and their leaders are increasingly recognizing that there’s value in basing their products and services on UX strategy—even though they might not call it that—there are still some who do not believe it’s necessary. Plus, the growing popularity and promise of the product, or maker, culture is fostering a business climate that often embraces the idea of quicker returns. Some believe that they can rely on internal experts, consultants, SMEs (Subject-Matter Experts), their gut, their spouse’s gut, and any number of other folks who are not actual customers or prospects.
A UX strategy that does not incorporate true qualitative and quantitative research circumvents laws and logic and cannot really be considered user-centered, iterative design. In my view, this is essentially an abandonment of science in favor of alchemy. But, first, let’s distinguish between the two.
All That Glitters Is Not Gold
Google defines alchemy as “the medieval forerunner of chemistry, based on the supposed transformation of matter. It was concerned particularly with attempts to convert base metals into gold or to find a universal elixir”—“a seemingly magical process of transformation, creation, or combination.”
In contrast—also according to Google—“The Scientific Method is a body of techniques for investigating phenomena, acquiring new knowledge, or correcting and integrating previous knowledge. To be termed scientific, a method of inquiry is commonly based on empirical or measurable evidence subject to specific principles of reasoning.”
When I say User Experience is science, I mean its tenets are based on the scientific method, in which you pose a question; do background research; define, then iteratively test and refine a hypothesis by conducting experiments; analyze and draw conclusions from the resulting data; and, ultimately, validate or reject that hypothesis; then communicate the results.
If, in your work, you have ever used user-centered design or UX design methods, the scientific method should feel quite familiar.
I don’t use the term alchemist as a pejorative. I just mean to point out that there are some in the business world who are driven to produce things at speeds that make the timelines of even a decade ago seem slow. Certainly, there are innovations, MVPs (Minimum Viable Products), and Beta versions of products that survive in the marketplace and, ultimately, improve over time. However, these are not nearly as plentiful as the piles of ideas that organizations throw out to fail fast.
We can learn much from putting something into the wild, but when we’re challenged to deliver something that’s “good, fast, and cheap—pick two”—I believe that, if good isn’t one of the attributes we choose, product development feels like alchemy. And the people using your product or service will be the ones who tell you whether it’s good.
Learning from Failure
In 2015, it’s safe to say that the type of UX work we’re now doing has changed significantly over the last decade. We’ve progressed from predominantly tactical execution to an experience-driven, “Outside In” perspective that the most successful companies have employed to build and sustain market share. Yet, billions of dollars still get wasted every year on efforts that would have been more fruitful had companies applied a well-structured, scientifically based strategy.
During the first day of my Rutgers UXD Course, I show a 60 Minutes video about the abysmal failure of the US Department of Homeland Security’s plans to build a virtual fence on the Mexican border in 2005–2006. It presents one of the most jaw-dropping examples of how much money, time, and effort organizations can waste on concepts that the application of some scientific rigor in testing solutions or systems with actual users would have revealed as inadequate. They wanted it fast—and it certainly wasn’t cheap—but without applying the basic techniques of research and experimentation in the field, it wasn’t likely to be good. In fact, it was an experiment doomed to failure.
I’m all about ideation workshops with smart folks collaborating and aligning around a product idea. But unfortunately, there still seems to be too much naive belief that the secret to success is tapping into the experience of others who have been successful—even though it’s something that may not apply to the business problem they’re currently facing.
I recognize the contributions of people like Elon Musk and other visionary leaders who apply the scientific method, take big risks, fail, and learn from their failures. But I’d caution against your giving too much credence to those who have nothing but failure stories to tell—however eloquent and insightful their anecdotes might be. I’d actually like to hear how some of them succeeded—if they ever have—especially those who favor methods that preach the importance of speed over understanding and feel that circumventing actual research is the right approach. While it may be cool to say you’ve walked into your share of glass doors over the years, it’s well worth knowing whether someone ever figured out how to open those doors.
Sure, I’ve failed a lot, too. We all do—and that’s an important part of our growth and learning. But there’s a difference in why we fail. I think it’s important to highlight that alchemy failed because of a fundamental misunderstanding of chemistry and physics. Garbage in, garbage out.
In contrast, when you’ve applied a rigorous process of hypothesis, research, experimentation, and iteration, even a failure yields far greater value than one that results from guesswork. Tell that to your product guy—you know who I’m talking about. The one who came from [insert name of darling agency here] who’s been doing a great job of getting the C-suite all hot and bothered about “a strategic play that’s the Uber of [insert whatever the company does here].” The shame is that, often, by the time the play ends, that product guy is hawking his mystical plans for near-instant global domination at the next company.
So the next time someone pushes back on the notion that you cannot create a successful user experience without taking the core human element into account, be warned: the gold—and the base metal from which all experience is made—is people. If your team is convinced they don’t need to factor in time to understand the people their product or service impacts, learn their context of use, and engage those people in discovering how to make a solution better, consider asking them whether they prefer alchemy over science. Hopefully, they’ll ask you what that means. If they don’t, or they don’t care, wish them well and walk away long before they start chanting over a smoky pile of missed expectations.