title: Fast Path to a Great UX - Increased Exposure Hours
url: https://articles.uie.com/user_exposure_hours/
hash_url: 4c5b3193ced812222ef1a6d53e3470aa
<p>
As we’ve been researching what design teams need to do to create great user experiences, we’ve stumbled across an interesting finding. It’s the closest thing we’ve found to a silver bullet when it comes to reliably improving the designs teams produce. This solution is so simple that we didn’t believe it at first. After all, if it was this easy, why isn’t everyone already doing it?
</p>
<p>
To make sure, we’ve spent the last few years working directly with teams, showing them what we found and helping them do it themselves. By golly, it actually worked. We were stunned.
</p>
<p>
The solution? Exposure hours. The number of hours each team member is exposed directly to real users interacting with the team’s designs or the team’s competitor’s designs. There is a direct correlation between this exposure and the improvements we see in the designs that team produces.
</p>
<h2>
It Makes Perfect Sense: Watch Your Users<br>
</h2>
<p>
For more than 20 years, we’ve known that teams that spend time watching users can see improvements. Yet we still see many teams with regular user research programs that produce complicated, unusable products. We couldn’t understand why, until now.
</p>
<p>
Each team member has to be exposed directly to the users themselves. Teams that rely on dedicated user research professionals to watch the users and then report the results through documents or videos don’t see the same benefits. It’s the direct exposure to the users that drives the improvements we see in the design.
</p>
<p>
Over the years, there has been plenty of debate over how many participants are enough for a study. It turns out we were looking in the wrong direction. When you focus on the hours of exposure, the number of participants stops being an important discussion. We found that two hours of direct exposure with one participant could be as valuable (if not more valuable) than 15 minutes each with eight participants, even though both add up to the same two hours of session time. The two hours with that one participant, seeing the detailed subtleties and nuances of their interactions with the design, can deliver a tremendous amount of actionable value to the team when done well.
</p>
<h2>
First Forays: Field Visits<br>
</h2>
<p>
As we watched different teams go through this process, we started to notice some repeatable patterns. For example, many teams spent little time watching their users. Often these teams had successful, profitable products that had evolved over many years into very complicated designs, chock full of features that users found hard to find and often frustrating to use.
</p>
<p>
Before they began watching users, the teams would frequently find themselves at odds in meetings. They knew that the product was getting more complex, but nobody had any real information about how the product was being used. Stakeholders would ask for features without giving any useful details to the team to implement. An attitude of “Let’s build it, and if we get it wrong, we’ll fix it” would prevail.
</p>
<p>
For teams like these, we often choose a field visit as their first foray into watching their users. Field visits are great because we get to see what the users do in their natural environment. They don’t require us to know in advance what the proper tasks for the design are. We interview the user, uncover their goals and objectives, and then ask them to use the product or service to accomplish those goals.
</p>
<p>
A typical field visit is two hours. With ten to twelve visits, each team member can usually get at least eight hours of exposure to a minimum of four different users, each trying to use the design in interesting ways.
</p>
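<p>
To make the arithmetic concrete, here is a minimal scheduling sketch, assuming visits are shared round-robin across the team and that a few members observe each session. The observer count, member names, and helper function are illustrative assumptions, not anything the article prescribes; the sketch simply checks that every member clears the eight-hour, four-user threshold.
</p>

```python
# A rough sketch (not from the article): split a round of two-hour field visits
# across a team so every member reaches the eight-hour / four-user threshold.
# The observers-per-visit count and member names are assumptions.

from itertools import cycle

VISIT_HOURS = 2          # typical visit length from the article
TARGET_HOURS = 8         # per-member exposure target from the article
TARGET_USERS = 4         # minimum distinct users per member
OBSERVERS_PER_VISIT = 4  # assumption: how many team members sit in on each visit

def assign_visits(team, num_visits):
    """Round-robin team members onto visits; returns member -> list of visit ids."""
    schedule = {member: [] for member in team}
    members = cycle(team)
    for visit_id in range(num_visits):
        for _ in range(OBSERVERS_PER_VISIT):
            schedule[next(members)].append(visit_id)
    return schedule

team = [f"member_{i}" for i in range(1, 11)]   # a 10-person team, as in the article
schedule = assign_visits(team, num_visits=12)

for member, visits in schedule.items():
    hours = len(visits) * VISIT_HOURS
    users = len(set(visits))   # each visit covers one distinct user
    verdict = "meets" if hours >= TARGET_HOURS and users >= TARGET_USERS else "misses"
    print(f"{member}: {hours} hours across {users} users ({verdict} the target)")
```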
<p>
The results are typically a list of easy fixes. One recent 12-visit venture with a 10-member team produced 350 items on their list of quick fixes. The product improvements started showing up in just a matter of weeks.
</p>
<h2>
A Minimum of Every Six Weeks<br>
</h2>
<p>
We saw many teams that conducted a study once a year or even less often. These teams struggled almost as much as teams that didn’t do any research at all. Their designs became more complex and their users reported more frustration as they kept adding new features and capabilities.
</p>
<p>
The teams with the best results were those that kept up the research on an ongoing basis. It seems that a two-hour exposure dose every six weeks was the bare minimum. The teams whose members spent the minimum of two hours every six weeks saw far greater improvements to their design’s user experience than teams that didn’t meet the minimum. And teams with more frequent exposure, say two hours every three weeks, saw even better results.
</p>
<p>
We think there are two reasons the frequency turns out to be important. The first is the way memory works. It’s harder to remember someone you met more than six weeks ago than someone you met last week. If we want our users and their needs to be present in our minds as we’re creating our designs, we need to see them regularly.
</p>
<p>
The second reason has to do with the pain of an ongoing frustration. It’s painful to watch someone struggle with your design. It’s even more painful to come back a few weeks later and see someone else struggle with the same problem again. The more times we’re exposed to those struggles, the more frustrated we get, and the more we want to fix those problems. (And the happier we’ll be when we finally see someone who breezes right through with our new design.)
</p>
<p>
Some problems are particularly gnarly. Seeing these problems repeat, in the field and in the lab, gives us insights into the nuances behind their potential causes. Testing out new design ideas can help us get to a solution faster. A regular exposure program makes that happen all the more readily.
</p>
<p>
By holding to a six-week minimum for our exposure, we leverage these two factors, making our users and their needs the driver of the design work we’re doing on any given day.
</p>
<h2>
Types of Exposure to Users<br>
</h2>
<p>
Field visits aren’t the only form of exposure we found that works. Usability tests, both in-person and remote, can be very effective. (We found a mixture of both works better than 100% remote sessions.) Once you know the tasks users naturally perform with the design (because you discovered them during your field visits), it’s easy to construct realistic scenarios for usability testing.
</p>
<p>
For folks heavily practicing a style of self-design, using the design themselves for real work can also contribute. (For more about self-design, see my recent article, <a href="//articles.uie.com/self_design/">Actually, You Might Be Your User</a>.) Again, validating these results with other methods, such as field visits and usability testing, helps you understand what your users experience that you don’t when using the design.
</p>
<p>
Watching users work with competitive designs is also important. Seeing them work through those same tasks with someone else’s design can help identify where there are gaps in your own design. It also makes it easy to point out where your advantages lie.
</p>
<h2>
The Team of Influencers<br>
</h2>
<p>
Our research had a finding that took us by surprise: Teams that excluded non-design personnel didn’t see the same advantages as teams that included those people.
</p>
<p>
For example, we worked with teams where only the designers and developers were having regular exposure to their users. Stakeholders, such as product managers and executives, along with other non-design folks, like technical support liaisons and quality assurance management, didn’t participate in the field studies or usability tests. While the core design team became very familiar with what users needed and wanted, they were constantly battling with these other individuals who didn’t have the same experiences.
</p>
<p>
The tipping point came when we found teams where all these other folks were participating in the user research studies. No longer did they assert their own opinions of the design direction above what the research findings were telling the teams. Making the execs, stakeholders, and other non-design folks part of the exposure program produced a more user-focused process overall.
</p>
<p>
Exposure is easy to measure. You can just count the hours everyone has spent participating in the studies. We’re seeing teams make it part of their quarterly performance reviews, sending a clear message about the importance of user experience, especially when all the influencers are measured the same way.
</p>
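<p>
As a minimal illustration of that counting (the log format, names, and threshold check below are assumptions for the sketch, not a tool from the article), you can sum each person’s session hours over a rolling six-week window and flag anyone under the two-hour minimum.
</p>

```python
# A minimal sketch of an exposure-hours tracker, assuming a simple list of
# logged sessions. Field names and example data are made up for illustration.

from datetime import date, timedelta
from collections import defaultdict

SIX_WEEKS = timedelta(weeks=6)
MINIMUM_HOURS = 2.0   # the two-hours-every-six-weeks threshold from the article

# Hypothetical session log: (team member, session date, hours observed)
sessions = [
    ("alice", date(2024, 5, 3), 2.0),
    ("bob",   date(2024, 4, 1), 2.0),   # falls outside the window checked below
    ("carol", date(2024, 5, 20), 1.0),
]

def exposure_report(sessions, as_of):
    """Sum each person's exposure hours over the trailing six weeks."""
    window_start = as_of - SIX_WEEKS
    totals = defaultdict(float)
    for person, day, hours in sessions:
        if window_start <= day <= as_of:
            totals[person] += hours
    return totals

totals = exposure_report(sessions, as_of=date(2024, 5, 31))
for person in ("alice", "bob", "carol"):
    hours = totals.get(person, 0.0)
    status = "meets" if hours >= MINIMUM_HOURS else "below"
    print(f"{person}: {hours:.1f}h in the last six weeks ({status} the minimum)")
```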
<h2>
The Challenge: Two Hours Every Six Weeks For Everyone<br>
</h2>
<p>
Granted, we admit our data could be flawed. There could be other factors here. However, we’ve tested every possible theory, spent time reviewing every factor we could imagine, and we keep coming back to this one item: Get every member of the team to spend two hours with users every six weeks and you’ll likely have a great user experience appear before your very eyes.</p>