Comparison of usability evaluation methods

Usability evaluation methods aim to assess how easy a software product is for its users to use. Because existing methods are subjective and open to interpretation, scholars have studied the efficacy of each method[1][2][3] and its suitability for different domains, comparing which may be most appropriate in fields such as e-learning,[4] e-commerce,[5] or mobile applications.[6]

Think-aloud protocol
Method type: Testing
Applicable stages: Design, coding, testing and release of application
Description: Participants in testing express their thoughts on the application while executing set tasks.
Advantages:
  • Less expensive
  • Results are close to what is experienced by users
Disadvantages:
  • The environment is not natural to the user

Remote usability testing
Method type: Testing
Applicable stages: Design, coding, testing and release of application
Description: The experimenter does not directly observe the users while they use the application, though activity may be recorded for subsequent viewing.
Advantages:
  • Covers efficiency, effectiveness and satisfaction, the three usability issues
Disadvantages:
  • Additional software is necessary to observe the participants from a distance

Focus groups
Method type: Inquiry
Applicable stages: Testing and release of application
Description: A moderator guides a discussion with a group of users of the application.
Advantages:
  • Can save money if done before prototypes are developed
  • Produces many useful ideas from the users themselves
  • Can improve customer relations
Disadvantages:
  • The environment is not natural to the user and may produce inaccurate results
  • The data collected tends to have low validity due to the unstructured nature of the discussion

Interviews
Method type: Inquiry
Applicable stages: Design, coding, testing and release of application
Description: The users are interviewed to find out about their experience and expectations.
Advantages:
  • Good at obtaining detailed information
  • Few participants are needed
  • Can improve customer relations
Disadvantages:
  • Cannot be conducted remotely
  • Does not address the usability issue of efficiency

Cognitive walkthrough
Method type: Inspection
Applicable stages: Design, coding, testing and release of application
Description: A team of evaluators walks through the application, discussing usability issues through the use of a paper prototype or a working prototype.
Advantages:
  • Good at refining requirements
  • Does not require a fully functional prototype
Disadvantages:
  • Does not address user satisfaction or efficiency
  • The designer may not behave as the average user when using the application

Pluralistic walkthrough
Method type: Inspection
Applicable stages: Design
Description: A team of users, usability engineers and product developers reviews the usability of the paper prototype of the application.
Advantages:
  • Usability issues are resolved faster
  • A greater number of usability problems can be found at one time
Disadvantages:
  • Does not address the usability issue of efficiency

Notes and References

  1. Genise, Pauline (August 28, 2002). "Usability Evaluation: Methods and Techniques". University of Texas.
  2. Dhouib, A.; Trabelsi, A.; Kolski, C.; Neji, M. (2016). "A classification and comparison of usability evaluation methods for interactive adaptive systems". 2016 9th International Conference on Human System Interactions (HSI). pp. 246–251. doi:10.1109/HSI.2016.7529639. ISBN 978-1-5090-1729-4. https://ieeexplore.ieee.org/document/7529639
  3. Hocko, Jennifer M. (2002). "Reliability of Usability Evaluation Methods".
  4. Vukovac, Dijana Plantak; Kirinic, V.; Klicek, B. (2010). "A Comparison of Usability Evaluation Methods for e-Learning Systems". doi:10.2507/daaam.scibook.2010.27. ISBN 9783901509742. https://www.daaam.info/Downloads/Pdfs/science_books_pdfs/2010/Sc_Book_2010-027.pdf
  5. Hasan, L.; Morris, Anne; Probets, S. (2012). "A comparison of usability evaluation methods for evaluating e-commerce websites". Behaviour & Information Technology. 31 (7): 707–737. doi:10.1080/0144929X.2011.596996.
  6. Mathur, P.; Chande, Swati V. (2020). "Empirical Investigation of Usability Evaluation Methods for Mobile Applications Using Evidence-Based Approach". Microservices in Big Data Analytics. pp. 95–110. doi:10.1007/978-981-15-0128-9_9. ISBN 978-981-15-0127-2. https://link.springer.com/chapter/10.1007%2F978-981-15-0128-9_9