User experience evaluation

User experience evaluation (UXE) or user experience assessment (UXA) refers to a collection of methods, skills and tools utilized to uncover how a person perceives a system (product, service, non-commercial item, or a combination of them) before, during and after interacting with it. Assessing user experience is non-trivial, since user experience is subjective, context-dependent and dynamic over time.[1] For a UXA study to be successful, the researcher has to select the right dimensions, constructs and methods, and target the research to the specific area of interest, such as games, transportation, or mobile devices.

Dimensions

There are many different dimensions to consider when choosing the best assessment approach:

Laboratory experiments may work well for studying a specific aspect of user experience, but holistic user experience is optimally studied over a longer period of time with real users in a natural environment.

Constructs

In all cases, however, there are certain aspects of user experience that researchers are interested in (measures) and certain procedures and techniques for collecting the data (methods). A number of high-level constructs of user experience can be used as the basis for defining the measures, for example:

  1. Utility: Does the user perceive the functions in the system as useful and fit for the purpose?
  2. Usability: Does the user feel that it is easy and efficient to get things done with the system?
  3. Aesthetics:[2] Does the user see the system as visually attractive? Does it feel pleasurable in hand?
  4. Identification: Can the user identify with the product? Does it make them feel they look good when using it?
  5. Stimulation: Does the system give the user inspiration, or wow experiences?
  6. Value: Is the system important to the user? What value does it hold for them?
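In practice, such constructs are often operationalized as questionnaire items whose ratings are aggregated per construct, in the spirit of instruments such as AttrakDiff[12] or the User Experience Questionnaire.[13] The following Python sketch only illustrates that idea; the item wording, the 1–7 scale and the construct mapping are hypothetical, not a standardized instrument.

    # Hedged sketch: aggregate Likert-style item ratings per UX construct.
    # Item names, the 1-7 scale, and the example responses are invented.
    from statistics import mean

    ITEM_TO_CONSTRUCT = {           # hypothetical item -> construct mapping
        "useful": "utility",
        "fit_for_purpose": "utility",
        "easy_to_use": "usability",
        "efficient": "usability",
        "attractive": "aesthetics",
        "inspiring": "stimulation",
        "valuable_to_me": "value",
    }

    def score_constructs(responses):
        """Average the 1-7 ratings of all items belonging to each construct."""
        per_construct = {}
        for item, rating in responses.items():
            construct = ITEM_TO_CONSTRUCT.get(item)
            if construct is not None:
                per_construct.setdefault(construct, []).append(rating)
        return {c: round(mean(r), 2) for c, r in per_construct.items()}

    # One (made-up) participant's ratings:
    print(score_constructs({"useful": 6, "fit_for_purpose": 5, "easy_to_use": 4,
                            "efficient": 5, "attractive": 7, "inspiring": 3,
                            "valuable_to_me": 6}))
    # {'utility': 5.5, 'usability': 4.5, 'aesthetics': 7.0, 'stimulation': 3.0, 'value': 6.0}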

To properly evaluate user experience, metrics and other factors surrounding a study need to be taken into account, for example:

Methods

An individual method can collect data about a set of specific constructs of user experience. For instance, usability testing is used to collect data about the usability construct.[3] Methods also differ in whether they measure a momentary or episodic experience (i.e., assessing how a person feels about a specific interaction episode or after executing a task) or an experience over time, also known as a longitudinal experience. UXA methods can be classified into three categories: implicit, explicit and creative methods.

Implicit methods

Implicit methods of UX research focus not only on what users say, but also on what they cannot express verbally. Many available tools can assist in implicit evaluation, in particular in gathering implicit or objective data. When available, UX researchers utilize state-of-the-art equipment to uncover all aspects of the experience.

Examples of implicit evaluation methods and tools:

Explicit methods

Explicit methods of UX research explore what the user is consciously aware of, getting them to reflect on their own feelings or thoughts and gathering their views and opinions. Important explicit methods include usability testing and emotion evaluation.

Emotion assessment

When investigating momentary user experiences, we can evaluate the level of positive affect, negative affect, joy, surprise, frustration, etc. The measures for emotions are bound to the methods used for emotion assessment, but typical emotion measures include valence and arousal. Objective emotion data can be collected by psychophysiological measurements or by observing expressed emotions. Subjective emotion data can be collected by using self-report methods, which can be verbal or non-verbal.
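For instance, non-verbal self-report tools often ask for valence and arousal ratings at several moments during use. The sketch below, with invented 1–9 ratings and timestamps, shows one simple way such momentary ratings could be summarized; it is an assumption-laden illustration, not a specific published method.

    # Hedged sketch: summarize momentary valence/arousal self-reports (1-9 scales).
    # All timestamps and ratings are invented for illustration.
    from statistics import mean, stdev

    # (seconds_into_task, valence 1-9, arousal 1-9) for one participant
    samples = [(10, 6, 3), (45, 7, 5), (90, 4, 7), (130, 5, 6)]

    valence = [v for _, v, _ in samples]
    arousal = [a for _, _, a in samples]

    print(f"mean valence {mean(valence):.1f} (sd {stdev(valence):.1f}), "
          f"mean arousal {mean(arousal):.1f} (sd {stdev(arousal):.1f})")
    # A drop in valence paired with a rise in arousal between samples can flag a
    # moment worth reviewing against observations or interaction logs.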

Examples of emotion assessment methods:

Creative methods

Alongside implicit and explicit methods, equally important are the creative methods that the user researcher can utilize to bring together the design team's view and the target market's dreams, aspirations and ideas of an optimal design. These activities are more open and allow people either to co-create with the engineers/designers or to use their imagination to express their ideal system.

Examples of creative assessment methods:

Longitudinal methods

In contrast to identifying a momentary emotion, longitudinal UXA investigates how a person feels about a system as a whole, after using it for a while.
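One common data-collection strategy for such studies is experience sampling (see reference [10]), in which participants are prompted at quasi-random times over days or weeks. The sketch below only illustrates how such a prompt schedule could be generated; the prompt count, waking-hours window and study length are assumptions, not a prescribed protocol.

    # Hedged sketch: generate a simple experience-sampling prompt schedule.
    # Four prompts per day within 09:00-21:00 are illustrative assumptions.
    import random
    from datetime import date, datetime, time, timedelta

    def esm_schedule(start, days, prompts_per_day=4, window=(9, 21)):
        """Return randomised prompt times within a daily waking-hours window."""
        prompts = []
        for d in range(days):
            day = start + timedelta(days=d)
            for _ in range(prompts_per_day):
                minute = random.randint(window[0] * 60, window[1] * 60 - 1)
                prompts.append(datetime.combine(day, time(minute // 60, minute % 60)))
        return sorted(prompts)

    for t in esm_schedule(date(2024, 3, 1), days=2):
        print(t.isoformat(timespec="minutes"))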

Examples of longitudinal UXA methods (excluding traditional usability methods):

Areas of UXA research

Transportation

Automobiles have come a long way since their beginnings in the late 19th century. Electronics have been one of the major contributors to the safety and convenience they offer today. With advances in technology and electronics, car manufacturers have been able to offer a wide variety of services and conveniences. From electronic fuel injection to the global positioning systems now standard in many cars, the auto industry has revolutionized the way people travel from place to place. Understanding how people interact with vehicles today, what contributes to a great driving experience, what their current relationship with the car is, and what place it has in their lives is key to the development of these technologies. This information supports user-centered design practices that generate cohesive, predictive and desirable designs.

Once specific design concepts and ideas are on the table, UXA researchers further explore how people react to them in terms of desirability, findability, usefulness, credibility, accessibility, usability and human-factors metrics. Outcomes of this work include user requirements, concept validation, and design guidelines. Researchers have conducted studies to answer questions such as: could an In-Vehicle Infotainment (IVI) system with a speech-evoked personality change your relationship with your car?,[15] could an in-car system support unwinding after work?,[16] could in-car solutions address the special needs of children as passengers and assist the parents with the task of driving?,[17] and many others. Additionally, workshops and gatherings of researchers around the world take place to discuss current evaluation techniques and advance the field of experience research in the area of transportation. An important professional venue for this work is AutomotiveUI, the International Conference on Automotive User Interfaces and Interactive Vehicular Applications.

UXA methods for transportation

As with other areas of UXA, the method chosen depends largely on the desired outcome and on where the project is in its design cycle. Methods are therefore selected to best suit the research problem, which most often calls for a combination of implicit, explicit and creative approaches. Some methods include:

Video games

A relatively new pursuit in video game play-testing is UX and usability research. An increasing number of companies, including some of the world's biggest publishers, have begun outsourcing UX evaluation or opening their own in-house labs.[21] [22] [23] Researchers use a variety of HCI and psychological techniques to examine the effectiveness of a game's user experience during the design process.[24]

Some companies have also started to use biometrics to measure the relationship between in-game events and the player's emotions and feelings (the UX), such as Player Research and Serco ExperienceLab in the UK,[25] [26] and Valve, Electronic Arts, BoltPeters, and VMC Labs in the US and Canada.[27] [28] [29] [30] Interest in this area comes from both academia and industry, sometimes enabling collaborative work.[31] [32] Game UX work has been featured at professional venues such as the Game Developers Conference (GDC).[33] [34]
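As a rough illustration of how biometric data might be related to in-game events, the sketch below averages a skin-conductance signal in a short window after each logged event. The event names, timestamps, window length and signal values are invented; actual studio pipelines are far more involved.

    # Hedged sketch: relate logged game events to a skin-conductance (GSR) proxy
    # for arousal by averaging samples in a short post-event window.
    # All values below are invented for illustration.
    WINDOW_S = 5.0

    gsr_samples = [(0.0, 2.1), (1.0, 2.2), (12.0, 3.0), (13.5, 3.4),
                   (14.0, 3.1), (30.0, 2.3)]          # (seconds, microsiemens)
    events = [(11.5, "boss_appears"), (28.0, "checkpoint_reached")]

    def mean_response(event_time):
        """Average the signal within WINDOW_S seconds after the event, if sampled."""
        window = [v for t, v in gsr_samples if event_time <= t <= event_time + WINDOW_S]
        return sum(window) / len(window) if window else None

    for t, name in events:
        r = mean_response(t)
        label = f"{r:.2f} µS" if r is not None else "no samples"
        print(f"{name} at {t:.1f}s -> mean GSR over next {WINDOW_S:.0f}s: {label}")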

Web design

User experience evaluation has become common practice in web design, especially within organizations implementing user-centered design practices. Through user testing, the user experience is constantly evaluated throughout the whole product design life-cycle.

Notes and References

  1. Law, E., Roto, V., Hassenzahl, M., Vermeeren, A., Kort, J.: Understanding, Scoping and Defining User Experience: A Survey Approach. In Proceedings of Human Factors in Computing Systems conference, CHI'09. 4–9 April 2009, Boston, MA, USA (2009)
  2. Moshagen, M. & Thielsch, M. T. (2010). Facets of visual aesthetics. In: International Journal of Human-Computer Studies, 68 (10), 689–709.
  3. Pelt, Mason. "Stop overthinking UX and try the coffee shop test". VentureBeat, 23 May 2016.
  4. Baenziger, T., Tran, V. and Scherer, K.R. (2005). "The Emotion Wheel: A Tool for the Verbal Report of Emotional Reactions". Poster presented at the conference of the International Society for Research on Emotion, Bari, Italy.
  5. Pollak, J. P., Adams, P., & Gay, G. (2011). PAM: a photographic affect meter for frequent, in situ measurement of affect. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 725–734). ACM. http://dl.acm.org/citation.cfm?id=1979047
  6. Laurans, G., Desmet, P.M.A., & Hekkert, P.P.M. (2009). The emotion slider: a self-report device for the continuous measurement of emotion. Proceedings of the 2009 International Conference on Affective Computing and Intelligent Interaction. Amsterdam, the Netherlands.
  7. Isbister, K., Höök, K., Sharp, M., and Laaksolahti, J. 2006. The sensual evaluation instrument: developing an affective evaluation tool. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (Montréal, Québec, Canada, 22–27 April 2006). CHI '06. ACM, New York, NY, 1163–1172
  8. Desmet, P.M.A., Overbeeke, C.J., Tax, S.J.E.T. (2001). Designing products with added emotional value: development and application of an approach for research through design. The Design Journal, 4(1), 32–47.
  9. Bolger, N., Davis, A., & Rafaeli, E. (2003). Diary methods: Capturing life as it is lived. Annual Review of Psychology, 54, 579–616.
  10. Csikszentmihalyi M, Larson R. (1987). Validity and reliability of the Experience-Sampling Method. Journal of Nervous and Mental Disease. Sep 1987;175(9):526–536.
  11. Kahneman, D.
  12. Hassenzahl, M., Burmester, M., & Koller, F. (2003). AttrakDiff: Ein Fragebogen zur Messung wahrgenommener hedonischer und pragmatischer Qualität. In J.Ziegler & G. Szwillus (Eds.), Mensch & Computer 2003. Interaktion in Bewegung (pp. 187–196). Stuttgart, Leipzig: B.G. Teubner.
  13. Laugwitz, B., Schrepp, M. & Held, T. (2008). Construction and evaluation of a user experience questionnaire. In: Holzinger, A. (Ed.): USAB 2008, LNCS 5298, S. 63-76.
  14. Toussaint, C., Ulrich, S., Toussaint, M. (2012). HUX - Measuring Holistic User Experience. In German UPA e.V., Usability Professionals 2012 - Tagungsband (pp. 90-94).
  15. Jennifer Healey and Dalila Szostak. 2013. Relating to speech evoked car personalities. In CHI '13 Extended Abstracts on Human Factors in Computing Systems (CHI EA '13). ACM, New York, NY, USA, 1653-1658. DOI=10.1145/2468356.2468652
  16. Zoë Terken, Roy Haex, Luuk Beursgens, Elvira Arslanova, Maria Vrachni, Jacques Terken, and Dalila Szostak. 2013. Unwinding after work: an in-car mood induction system for semi-autonomous driving. In Proceedings of the 5th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutomotiveUI '13). ACM, New York, NY, USA, 246-249. DOI=10.1145/2516540.2516571
  17. Liang Hiah, Tatiana Sidorenkova, Lilia Perez Romero, Yu-Fang Teh, Ferdy van Varik, Jacques Terken, and Dalila Szostak. 2013. Engaging children in cars through a robot companion. In Proceedings of the 12th International Conference on Interaction Design and Children (IDC '13). ACM, New York, NY, USA, 384-387. DOI=10.1145/2485760.2485815
  18. Lallemand, C. (2012) Dear Diary: Using Diaries to Study User Experience
  19. Hone, Kate S.
  20. Areti Goulati and Dalila Szostak. 2011. User experience in speech recognition of navigation devices: an assessment. In Proceedings of the 13th International Conference on Human Computer Interaction with Mobile Devices and Services (MobileHCI '11). ACM, New York, NY, USA, 517-520. DOI=10.1145/2037373.2037451
  21. "Halo 3: How Microsoft Labs Invented a New Science of Play". Wired. https://www.wired.com/gaming/virtualworlds/magazine/15-09/ff_halo?currentPage=all
  22. Bolt, Nate. (22 January 2009) Researching Video Games the UX Way – Boxes and Arrows: The design behind the design. Boxes and Arrows. Retrieved on 21 October 2011.
  23. "THQ Chooses The Guildhall at SMU to House New Usability Lab". MCV. http://www.mcvuk.com/press-releases/56236/THQ-Usability-Lab
  24. Hong, T. (2008) Shoot to Thrill: Bio-Sensory Reactions to 3D Shooting Games, Game Developer Magazine, October
  25. "Biometrics: the science of play". GamesIndustry.biz. http://www.gamesindustry.biz/articles/2012-08-14-biometrics-the-science-of-play
  26. Game usability testing. http://www.playablegames.net
  27. Valve. http://www.valvesoftware.com/
  28. EA Games – Electronic Arts. http://www.ea.com/
  29. VMC Consulting – Tailored Solutions for Your Business. http://www.vmc.com/gamelabs.aspx
  30. Bolt | Peters – Research, design, and products. http://boltpeters.com/
  31. Nacke, L., Ambinder, M., Canossa, A., Mandryk, R., Stach, T. (2009). "Game Metrics and Biometrics: The Future of Player Experience Research" Panel at Future Play 2009
  32. 8–9 April 2010, Seminar Presentation at Games Research Methods Seminar, "Using physiological measures in conjunction with other UX approaches for better understanding of the player's gameplay experiences", University of Tampere, Finland
  33. Ambinder, M. (2011) Biofeedback in Gameplay: How Valve Measures Physiology to Enhance Gaming Experience. Game Developers Conference 2011
  34. Zammitto, V. (2011) The Science of Play Testing: EA's Methods for User Research. Game Developers Conference 2011